Channel: LabVIEW topics
Viewing all 69579 articles

Setting a True/False State in subVI


Hey people of the world,

 

I'm currently working on a program that tests a DC motor for torque characteristics. The way it works is you set your voltage and current to actuate the motor, and a torque and speed are measured. This data is logged when the Log button is pressed and continuously updates the indicators; when the Hold Data button is pressed, the last data loaded into the queue is held, to be exported later. My issue is that I would like it to log data, and then when the Hold Data button is pressed (true) the data is held and stays held until Hold Data becomes false. Once it becomes false, I would like it to go back to continuously logging data and updating the display. I believe I have the right idea in how it's set up in my data-manipulation subVI, but it doesn't seem to work. Any suggestions or feedback would be appreciated. Thank you!
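The hold/latch behaviour described above can be sketched as a tiny per-iteration state update. This is plain Python pseudocode; the function and variable names are made up for illustration, not taken from the VI:

```python
def next_display(hold, latest, held):
    """Return (display_value, new_held) for one loop iteration.

    hold   -- state of the Hold Data button (True = freeze display)
    latest -- newest measurement from the queue
    held   -- value frozen on an earlier iteration (or None)
    """
    if hold:
        # Latch the first value seen after Hold went True, then keep it.
        held = latest if held is None else held
        return held, held
    # Hold released: go back to live updates and clear the latch.
    return latest, None

# Simulated loop: Hold Data is pressed during samples 3-5
samples = [1, 2, 3, 4, 5, 6]
holds   = [False, False, True, True, True, False]
held = None
shown = []
for s, h in zip(samples, holds):
    value, held = next_display(h, s, held)
    shown.append(value)
# shown == [1, 2, 3, 3, 3, 6]
```

The key point is that the latched value is shift-register state carried between iterations, separate from the live queue data.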


Reading I2C in LabVIEW


Hey all, I'm having a really difficult time understanding how to read from an I2C-to-USB converter. I have my device hooked up correctly, because it works perfectly with the company's OEM reading software. I cannot get the device to read in LabVIEW, however. Here's some info on the product I'm using, the Sensirion SFM3000: http://www.sensirion.com/fileadmin/user_upload/customers/sensirion/Dokumente/GasFlow/Sensirion_Gas_Flow_SFM3000_I2C_Functional_Description_V1.pdf

 

My biggest questions I suppose:

 

How can I create a 16-bit array?

How can I send an unsigned 16-bit value to the device when the VISA/I2C VIs only allow 8 bits?

How can I get LabVIEW to read the returned array?
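On the 16-bit questions: byte-oriented I2C writes can carry a 16-bit word by splitting it into two bytes, and a 16-bit reply is rebuilt from two received bytes. A Python sketch of the arithmetic (the 0x1000 command word below is only a placeholder; the actual command codes are in the SFM3000 datasheet):

```python
# Splitting a 16-bit command word into two 8-bit bytes (MSB first, the
# order I2C devices normally expect), and recombining two received bytes
# into an unsigned 16-bit value.

def u16_to_bytes(word):
    return [(word >> 8) & 0xFF, word & 0xFF]   # [MSB, LSB]

def bytes_to_u16(msb, lsb):
    return (msb << 8) | lsb

tx = u16_to_bytes(0x1000)      # -> [0x10, 0x00]: write these as two bytes
rx = bytes_to_u16(0x3E, 0x80)  # -> 0x3E80 (16000 decimal)
```

In LabVIEW the equivalents are Split Number / Join Number (or Type Cast on a U16), so the 8-bit limitation of the write VI is not actually a blocker.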

 

Thank you!

Using LabVIEW to make a custom wave for the DAQ Assistant to output


Hey there, I have a table of x values which I have made into a waveform.

 

I was curious whether this waveform could be output by my NI DAQ via the DAQ Assistant, or is there an alternative method for outputting a table of values via a DAQ card?
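Whichever output mechanism is used (DAQ Assistant or DAQmx VIs), one common preparation step is resampling the table onto the analog-output sample clock so it forms one contiguous buffer. A Python sketch of that idea; the table values and lengths here are invented:

```python
# Resample a table of values onto a fixed-length output grid by linear
# interpolation, so it can be handed to an analog-output task as one buffer.

def resample(table, out_len):
    n = len(table)
    out = []
    for i in range(out_len):
        # Position of output sample i in table coordinates
        pos = i * (n - 1) / (out_len - 1)
        lo = int(pos)
        frac = pos - lo
        hi = min(lo + 1, n - 1)
        out.append(table[lo] * (1 - frac) + table[hi] * frac)
    return out

wave = resample([0.0, 1.0, 0.0], 5)   # -> [0.0, 0.5, 1.0, 0.5, 0.0]
```

In LabVIEW this corresponds to building a waveform (Y array plus dt) and wiring it to a buffered AO task configured for the same sample rate.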

 

Thank you

missing description and tips


Where are the description and tips located? I have several programs that are now missing the description and tips for controls and indicators. Is there a misplaced file associated with the VI that contains the description and tip text?

Running LV 2012, version 12.0f3, 32-bit, on Windows 7.

The descriptions and tips are missing for several programs, meaning days of work if I can't correct the problem. I'm not sure when it happened; possibly after an update or patch...

 

 

Number to string with decimal separator


Hello Community,

 

I would like to convert numbers to strings with thousands (digit-grouping) separators. So if the number is 52351, then the string I need is 52,351.

Another example: 18653284.9653235 should become 18,653,284.9653235.

 

What would be the easiest way to do this?
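For comparison, Python's `,` format option does exactly this digit grouping; the same two examples from the post:

```python
# The "," option in a format specification inserts grouping (thousands)
# separators; the decimal part is left untouched.
print(f"{52351:,}")              # 52,351
print(f"{18653284.9653235:,}")   # 18,653,284.9653235
```

In LabVIEW the equivalent is string handling around Format Into String (there is no single built-in grouping specifier I know of), but the transformation itself is just the one shown here.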

 

thanks!

Flatten to XML doesn't work well with waveform attributes


Hello,

 

I have an application with a data class containing waveforms acquired from DAQmx.  I'm using "Flatten to XML" and "Write to XML File" to save the class object.  Reading the class back in with "Unflatten from XML" fails with error 1403: "Attempted to read flattened data of a LabVIEW class. The data is corrupt. LabVIEW could not interpret the data as any valid flattened LabVIEW class."

 

When I dig into the XML files, I see an entry for attributes, but it contains no data:

<Name>attributes</Name>

<Default>
<Name></Name>
</Default>

 

As a workaround, I'm able to strip out the attributes before I save the data, but then I have to store that data somewhere else. Is there an easy way to save a class so that all the data is preserved?

 

I've attached an example project in LV2013 SP1; run "test class flatten and unflatten.vi". I get the error when I try to unflatten the waveform with the attribute, but not the one without.

 

-Jeff

 

 

LM92 I2C temperature sensor with NI-8451


Hello !

 

I am trying to communicate with an LM92 temperature sensor via the NI USB-8451 using the I²C protocol. I have an issue with the addressing: the 7-bit sensor address is 1001011 (A1 and A0 set to 1), but the LM92 works in little endian (LSB first), and LabVIEW adds the R/W bit at the end of the address, so the complete frame sent to the LM92 is 1001011 + R/W.

 

I'm trying to find a solution to invert the complete frame before sending it, in order to communicate with the LM92.

LabVIEW returns error -301742 because the address is not the right one, and I can't find how to send the address in little endian.

 

I just need to read the temperature register of the sensor, so I don't need to write anything to it.
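For reference, I²C clocks the address byte MSB first, with the R/W bit appended as bit 0. A Python sketch of how the 7-bit address 1001011 (0x4B) combines with R/W, plus a bit-reversal helper in case the frame really must be mirrored (whether that is actually needed is exactly the open question here):

```python
# Composing the byte that actually appears on the I2C bus: the 7-bit
# address shifted left by one, with R/W in the least significant bit.
ADDR_7BIT = 0b1001011            # 0x4B, A1 = A0 = 1, from the post
READ, WRITE = 1, 0

def address_byte(addr7, rw):
    return (addr7 << 1) | rw

read_byte  = address_byte(ADDR_7BIT, READ)   # 0x97
write_byte = address_byte(ADDR_7BIT, WRITE)  # 0x96

def reverse7(addr7):
    # Mirror a 7-bit value, if the frame really must be sent LSB first.
    out = 0
    for _ in range(7):
        out = (out << 1) | (addr7 & 1)
        addr7 >>= 1
    return out
```

It is worth checking whether the 8451 API expects the 7-bit address (0x4B) or the pre-shifted 8-bit form (0x96/0x97); passing one where the other is expected produces exactly this kind of no-response addressing error.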

 

Does anyone have an answer?

Creating executable - How to add config-ini?


Hello,

I created a new GUI which reads a config.ini file at startup.

In the VI, I specified the path to this INI file to be in the same directory as the calling VI.

Will this also work if I create an executable, so that the INI file is searched for in the same directory as the executable?
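In a built executable, a path derived from the current VI's location typically points inside the .exe itself, so the usual advice is to resolve the INI path at run time (e.g. via the Application Directory VI) rather than relying on the development-time layout. The same idea in Python, purely as an analogy:

```python
# Analogy: a path stored relative to the source file breaks once the
# program is packaged, so derive it from the running program's own
# location instead.
import os
import sys

def config_path():
    # Directory of whatever launched us: the executable when packaged
    # ("frozen"), otherwise the script being run.
    if getattr(sys, "frozen", False):
        base = os.path.dirname(sys.executable)
    else:
        base = os.path.dirname(os.path.abspath(sys.argv[0]))
    return os.path.join(base, "config.ini")
```

The design point is the same in both environments: compute the base directory once at startup from the running program, then join the file name to it.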

 

Thanks


Display x against y in a graph


Hi everyone, can someone help me display a numerical value (Y) against a numerical label (X), such as a serial number? All the configurations I tried seem to add numbers in between the serial numbers on the x-axis. Attached is an Excel example of what I am trying to do.

 

 

Problem using HW-access in executable


Hello,

I created a C# DLL to communicate with a DIO module using Modbus (over LAN).

The DLL functions are called in separate VIs.

Finally, these VIs are called in my GUI, and everything runs fine when executing in the development environment.

But after creating an executable of it, the DI is not reading anything any more.

What can be the problem here?

After the program starts, I run an initialize VI for the DIO module that returns a reference. This reference is saved in a global variable.

Is it possible that there are problems with global variables in an executable?

 

Thanks for any tips.

Manipulating the cell size in an Excel spreadsheet


Problem 1 of 2 (problem 2 is in the same project but a different function, so I will place it in a separate post).

 

Using LV 2013 on Windows 7 with Excel 2010

 

I have a routine where I insert assorted data into an Excel spreadsheet based on a pre-formatted Excel template. The problem is, when one group of data is placed in a range of cells (which ends up being a column because of the way the data array is formatted), that particular column's width is reduced.

 

Because one of the cells automatically formats as the Excel date/time format (as desired), the narrower width of the cell makes the value unreadable. (Just like when a real numeric value is longer than the cell is wide, all you see is hash marks.)

 

I verified the column width in the template is where it should be; it is definitely reduced when the data is entered. I also tried using the "Excel Set Cell Dimension.vi" to change the cell width back to what I need, but it does not work, and I'm not sure why. When I tried placing the function at the beginning of the routine, I think it created an error and the rest of the report/Excel functions failed to work; I didn't monitor the error wire to confirm this.

 

My next approach will be to write a macro in the template file and call it at the end to fix the cell width, but this seems like the long way around.

 

Any thoughts on the best way to fix this? Or why the “Excel Set Cell Dimension” function does not work?
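As a point of comparison only (this uses openpyxl, a Python library, not the Report Generation Toolkit the routine is built on), the fix amounts to setting the column-width property explicitly after the data is written; the column letter and width below are placeholders:

```python
# Setting an explicit column width after writing data, so a date/time
# value is not displayed as #### hash marks. The Report Generation
# Toolkit drives the equivalent Excel property through ActiveX.
from openpyxl import Workbook

wb = Workbook()
ws = wb.active
ws["B2"] = "2014-05-01 12:00"          # the kind of value that shows as ####
ws.column_dimensions["B"].width = 18   # wide enough for a date/time
wb.save("report.xlsx")
```

The ordering matters in either toolchain: set the width after the insert that shrinks it, since the insert is what resets the formatting.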

 

Thanks…..

How do I create an indicator to read power factor


The problem is not that I can't read power factor; our RTU gives me an analog input in numeric format for our power factor. I need to know if there is some sort of scaling or indicator (such as an analog dial-type gauge) that can be scaled to read power factor in the format below. We want to see lag/lead in a way that mimics our mechanical analog meter at the power plant. Any suggestions?

 

-0.5

-0.6

-0.7

-0.8

-0.9

1.0

0.9

0.8

0.7

0.6

0.5  
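One way to drive a stock gauge with that mirrored lead/lag scale is to map the power factor onto a linear gauge position and then relabel the tick marks with the PF values. A sketch of one possible mapping; the convention that negative means leading, and the left/right orientation, are assumptions taken from the list above:

```python
# Map a lead/lag power factor onto one linear gauge axis whose scale
# reads -0.5 ... -0.9, 1.0, 0.9 ... 0.5, with unity PF at the centre.

def gauge_position(pf):
    if pf < 0:                 # leading: -0.5 maps to the far left
        return -(1.0 + pf)     # -0.9 -> -0.1, just left of centre
    return 1.0 - pf            # lagging: 1.0 -> 0.0 (centre), 0.5 -> +0.5
```

The gauge itself then just spans -0.5 to +0.5 with custom tick labels, which LabVIEW gauges support via marker text.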

 

Programmatically Terminate Power Save Mode


Hello,

 

I am trying to figure out how to programmatically wake my computer when the screen saver is running and the PC is in power save mode. In the LV Developer Zone, I found the attached VI, Simulate_Mouse_Clicks.vi. The VI works to keep the computer from starting the screen saver and going into power save mode. But when the PC is already in power save mode and the screen saver is already running, the computer does not wake up. The "Terminate Power-Off Mode" Call Library Function node does not appear to wake the computer from power save mode when the screen saver is already running. My application requires the PC to be awakened by the LabVIEW VI when it is in power save mode and the screen saver is running.
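For what it's worth, Windows exposes SetThreadExecutionState in kernel32 for this display/idle-timer control, and a Call Library Function Node can call the same entry point. A Python sketch of the call; note that whether this wakes a display that has already powered down, as opposed to only preventing the power-down, varies with Windows version, which may be exactly the behaviour being fought here:

```python
# ES_DISPLAY_REQUIRED resets the display idle timer, which is what
# simulating mouse clicks tries to achieve indirectly. Guarded so the
# sketch is harmless on non-Windows systems.
import ctypes
import sys

ES_CONTINUOUS       = 0x80000000
ES_DISPLAY_REQUIRED = 0x00000002

def keep_display_on():
    if sys.platform == "win32":
        ctypes.windll.kernel32.SetThreadExecutionState(
            ES_CONTINUOUS | ES_DISPLAY_REQUIRED)
```

Calling it once with ES_CONTINUOUS makes the request persist until the flags are cleared, so it does not need to be called in a loop.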

 

Does anyone have experience with this problem, and can you offer a solution?

 

Regards,

Bill

 

Terminate power off mode function.png

LabVIEW crashes when modifying OOP parent class private data control or typedef contained in that control


I've been seeing strange behaviour when I modify the private data control of a class, especially if it is a parent class.  It seems that those changes are not always propagated to all the VIs in the project.  This sometimes causes my project to crash with an exception error; sometimes the problem is more subtle, as it will simply write data to the wrong elements in the control (when bundling/unbundling).

 

I solve the problem by opening the typedef or class control by itself (i.e. not as part of the project), and then saving it.

 

The next time I open the project, all problems are solved.  This is a difficult error to track down, but I now know to keep a list of typedefs and class controls that I have modified (using Subversion helps here); when this strange behaviour or a crash happens, I simply close the project, open each modified typedef or class outside of the project, and save them individually.

 

Anyone seen something like this too?

Delete files from multiple paths


On my block diagram I have an input path which leads to a case structure that is controlled by radio buttons. The outer case structure turns true after a user-selected time period. I am now adding an additional path of images that need to be deleted or moved.

 

I now need to add the image path to this case structure, but it also needs to go through the same file-age parameters from the Get File Info VI. I have tried using the Build Array function, but it did not work to append the filenames.
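The structure being described, one age filter applied to files gathered from several folders, can be sketched like this in Python (the folder names and the one-week threshold are placeholders):

```python
# Collect candidate files from several folders, then run every one
# through the same modification-time age check before deleting.
import os
import time

MAX_AGE_S = 7 * 24 * 3600   # placeholder threshold: one week

def old_files(folders, max_age_s=MAX_AGE_S, now=None):
    now = time.time() if now is None else now
    hits = []
    for folder in folders:
        for entry in os.scandir(folder):
            if entry.is_file() and now - entry.stat().st_mtime > max_age_s:
                hits.append(entry.path)
    return hits

# for path in old_files(["data_logs", "images"]):
#     os.remove(path)
```

The LabVIEW analogue is a For Loop auto-indexing over an array of the two base paths, so both paths pass through the same Get File Info age comparison before the delete case.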


Parallel VIs with VISA


I'm trying to create a parallel VI with two DMMs (digital multimeters) making voltage and current measurements, using a VISA example provided by Agilent.  I thought the VI I wrote was parallel, but when it executes it sets up both DMMs, then gathers measurements from the first DMM, then the second.  Is there a way to get readings from both DMMs in parallel?  I was initially using one DMM, but the setup when switching between current and voltage was taking too much time.  I suspect there is a dependency I'm not aware of between the two portions of code.
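In LabVIEW, two code regions only run in parallel when no wire creates a data dependency between them; an error cluster chained from one instrument's VIs into the other's is a common accidental serializer, so that wiring is the first thing to check. The same concept in Python, with dummy stand-ins for the VISA queries:

```python
# Two independent "instrument reads" launched concurrently; neither
# result feeds the other, so both queries are in flight at once.
from concurrent.futures import ThreadPoolExecutor
import time

def read_dmm(name):
    time.sleep(0.05)          # stands in for a slow VISA query
    return f"{name}: reading"

with ThreadPoolExecutor(max_workers=2) as pool:
    f1 = pool.submit(read_dmm, "DMM1")
    f2 = pool.submit(read_dmm, "DMM2")
    results = [f1.result(), f2.result()]
```

The design point carries over directly: give each DMM its own error wire (merging them only after both reads) so the diagram contains two independent branches.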

 

Regards,

 

-L4Y

State Machine Timing - on Compact RIO


Does a Timed Loop execute one process at a time, while a While Loop can run more than one process simultaneously (depending on whether the machine has a multi-core processor)?

 

 

Would it be possible to put a Timed Loop inside a While Loop on an RT machine (specifically the NI cRIO-9076)?  I want to do a state machine with event-based timing (see attachment) or a state with periodic timing (see attachment).

 

 

The state that contains the Timed Loop will be used to acquire data; thus it needs to be deterministic.

 

 

Can someone tell me if this is doable?  

 

-Would I have to change the priority on the while loop to 100?

 

Thanks a lot.

 

*Also, I am using the RIO Scan Interface.
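The periodic-timing idea itself, scheduling each iteration against an absolute deadline so timing error does not accumulate, looks like this in Python. A Timed Loop on the cRIO provides this natively (plus priorities and the Scan Engine interaction), so this is only the concept, not a substitute:

```python
# Drift-free periodic loop: advance an absolute deadline each
# iteration instead of sleeping a fixed amount after the work.
import time

PERIOD_S = 0.01   # placeholder period

def run_periodic(iterations, work):
    next_deadline = time.monotonic()
    for _ in range(iterations):
        work()
        next_deadline += PERIOD_S
        delay = next_deadline - time.monotonic()
        if delay > 0:
            time.sleep(delay)

count = []
run_periodic(5, lambda: count.append(1))
```

On a desktop OS the sleep is only approximate; the RT scheduler behind a Timed Loop is what makes the deadline deterministic.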

strange characters in VIs


I don't know when it started, but my VIs have strange characters... How do I fix this?

I tried various fonts in the environment options, but it made no difference.

 

 

Generate a PWM signal from LabVIEW to control a valve


Hi all,

I'm trying to build this setup using LabVIEW.

I want to generate a PWM signal from LabVIEW to drive a solenoid valve. The signal should be driven out of a DAQ (NI USB-6211). The DAQ would be connected to solid-state relays, and the relays to the valve. My primary problem at the moment is generating a proper PWM signal from LabVIEW. Can anyone please help with that?

 

I've attached two VIs. The one titled "submit" is a very basic VI that shows a visualization of a square signal and a DAQ Assistant, but I'm not sure if this is working. How do I physically obtain the signal from the DAQ? What settings do I have to manipulate?

The second VI (DAQmx) is somewhat a replica of NI's PWM example; I don't particularly understand how it works, and I'm not sure it's even working.

 

If anyone could offer any help or advice, I'd be very grateful. I really need a reliable PWM signal to drive the valve. Thank you very much.
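Whatever output mechanism is used, the core of a software-defined PWM is building one cycle's worth of samples from the sample rate, PWM frequency, and duty cycle; regenerating that buffer on a hardware-timed output task then yields a continuous pulse train. A sketch of the buffer construction (all numbers are placeholders):

```python
# One cycle of a PWM sample buffer: the first duty-fraction of
# samples are high, the rest low.
def pwm_cycle(sample_rate, pwm_freq, duty, high=5.0, low=0.0):
    samples_per_cycle = int(sample_rate / pwm_freq)
    n_high = int(round(duty * samples_per_cycle))
    return [high] * n_high + [low] * (samples_per_cycle - n_high)

cycle = pwm_cycle(sample_rate=1000, pwm_freq=50, duty=0.25)
# 20 samples per cycle, the first 5 at the high level
```

For a solenoid driven through relays, it is worth checking which of the 6211's outputs support hardware timing (its counters can also generate pulse trains directly), since a software-timed loop will jitter.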

Help with signal noise in NI 9237 (FPGA mode)


I am using a cRIO-9012 controller, 9112 chassis, and modules as follows:

 

1) NI 9237

2) NI 9214

3) NI 9263

4) NI 9265

5) NI 9477

6) blank

7) blank

8) blank

 

I managed to get my temperature measurements / control working on the NI 9214 using the Wait on Interrupt / Acknowledge Interrupt nodes, similarly to what is shown in the shipping example for that module.  When I programmed the NI 9237, I followed a similar structure, since I don't require the high data-rate capability of the module.  Thus, I am performing a single sample, asserting an interrupt, waiting for it to be read and acknowledged on the LabVIEW RT controller, and then proceeding with another iteration in the FPGA loop.  I understand that this is not efficient, but as I mentioned, I don't require a high rate of acquisition.

The problem I am seeing is that the measured signals appear to be very noisy.  I am using two pressure transducers (50,000 psi Honeywell TJE) connected to channels 0 and 1 of the NI 9237, with the other two channels open.  I have the RJ50 version of the module, with TEDS cables supplied by Honeywell / Hoskin Scientific, and I am able to successfully read and parse TEDS information.  I don't suspect a cable problem, as step changes in pressure (tested using shop air <200 psi) are reflected correctly in the displayed signal; however, the signal appears to be randomly noisy, far outside both the 24-bit resolution of the NI 9237 and the specs of the pressure transducer (352 ohm full bridge, 2.84 mV/V).  Scaled data shows +/- 30 psi random noise, which I have never seen before with similar transducers.

I wonder if it is somehow due to my programming implementation, as the shipping FPGA example for the NI 9237 entails buffered acquisition using a FIFO, and I am merely polling the FPGA I/O node once per iteration of my while loop, with an interrupt asserted each time.  I also thought it might be related to the excitation voltage, since I am using internal excitation and the module may not support both bridges at 10 V, but setting this down to 3.3 V has no apparent effect on the output.  I'm stumped.  Thoughts?
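A quick sanity check with the numbers quoted above supports the suspicion that this is pickup or implementation rather than quantization; assuming the NI 9237's ±25 mV/V input range, one 24-bit code corresponds to a small fraction of a psi:

```python
# Back-of-envelope resolution check with the figures from the post.
FULL_SCALE_PSI = 50_000
SENS_MV_PER_V = 2.84        # transducer output at full scale
RANGE_MV_PER_V = 25.0       # assumed module input range (+/-25 mV/V)
BITS = 24

code_size_mvv = 2 * RANGE_MV_PER_V / 2**BITS          # one LSB in mV/V
psi_per_code = FULL_SCALE_PSI * code_size_mvv / SENS_MV_PER_V
# roughly 0.05 psi per code, orders of magnitude below +/-30 psi
```

So the observed noise is several hundred times the quantization step, which points toward wiring/shielding, excitation, or the single-sample polling pattern rather than the converter itself.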

 

Sean

 
