Channel: LabVIEW topics

LabVIEW 2015 unresponsive


Hello all,

 

I am having a recurring issue with the LabVIEW environment: when I right-click on a project item or attempt to save, the environment becomes unresponsive for long periods of time (right now I have had an unresponsive window for more than 5 minutes), which totally destroys productivity. I am running LabVIEW 2015 on a Windows 10 x64 machine. Any thoughts or suggestions for improving this?

 

Thanks, cirrus


Error code -200077 in DAQmx


Good Morning, 

 

I am receiving error code -200077 in my code, with the source being:

Property Node DAQmx Timing (arg 5) in DAQmx Timing (Sample Clock).vi:4730001 -> Init2.vi -> GPScan2.vi

Property: SampClk.Rate
Requested Value: Inf
Valid Values Begin with: 22.250739e-309
Valid Values End with: 10.0e6

Task Name: _unnamedTask

 

The code is built from a research paper I read and other resources. The issue seems to arise with the DAQmx Trigger property node in the first portion of the code. The equipment I am currently using is a BNC-2120 and an NI PCI-6036E; attached is a zip folder with the code I am using. Any help is appreciated.
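For anyone mapping this to text-mode DAQmx, the same timing configuration in the Python nidaqmx API looks roughly like the sketch below (the device name and rate are placeholders). Error -200077 fires when the value written to SampClk.Rate is out of range; here it was Inf, which suggests the rate is being computed upstream and something like a division by zero slips in before the DAQmx Timing call.

import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    # "Dev1/ai0" is a placeholder: use the PCI-6036E's name from MAX.
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    # The rate must be finite and positive; wiring Inf raises -200077.
    task.timing.cfg_samp_clk_timing(
        rate=1000.0,
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=1000,
    )
    task.start()
    data = task.read(number_of_samples_per_channel=100)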

 

-Dan

Write to DBL previous values


Hello,

 

I am working with LabVIEW 2013. I am receiving strings from a robot, parsing the data, and plotting the actual position vs. the commanded position. When the robot encounters an error, it sends LabVIEW an error message. All the string parsing is done in the same place, so the graph just plots all 0s when an error message is received.

 

I have it set up so that when it detects an error message, it turns a Boolean on. What I need to do now is have it write the previous known values for the "actual position" instead of setting them to 0s. Is this possible?
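The logic I am after, sketched in Python terms since I can't paste G code inline (in LabVIEW the held value would live in a shift register or feedback node; the names here are illustrative only):

# Hold-last-value: when the error flag is set, plot the previous
# known position instead of the parsed zeros.
last_good = 0.0  # initialized once, like a shift register's left terminal

def position_to_plot(parsed_value, error_flag):
    """Called for each received string (4 times a second)."""
    global last_good
    if not error_flag:
        last_good = parsed_value  # normal message: update the held value
    return last_good              # error message: re-use the old value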

 

Attached is a snippet of my code. I am receiving this string 4 times a second.

 

Thanks,

SM

Conditioning Counter Input in LabVIEW/DAQ Assistant: Hall Effect RPM


Hi All,

 

I'm new to LabVIEW and I'm trying to read the RPM on a training bike. I'm using a USB-6001 multifunction DAQ and have followed some tutorials using the counter setup with it. In a test panel (one sample on demand), I am able to get counts off the sensor, but I get multiple counts for every pass: when the wheel is turning slowly I'll see 7...15...25...35, etc. So I'm getting batches of counts on each revolution, and I suspect noise. I've read about a minimum pulse width DAQmx property that may filter out the noise, but I'm not sure how to apply it through the DAQ Assistant or DAQmx.
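From what I can tell, in text-mode DAQmx the minimum pulse width is a digital-filter property set on the channel after the count-edges channel is created. A minimal sketch using the Python nidaqmx API (the counter name and threshold are placeholders, and digital filtering is hardware-dependent: if the USB-6001 does not support it, DAQmx returns an error when the property is set):

import nidaqmx

with nidaqmx.Task() as task:
    ch = task.ci_channels.add_ci_count_edges_chan("Dev1/ctr0")
    # Ignore pulses shorter than 1 ms: a real magnet pass at bike-wheel
    # speeds lasts far longer, while bounce/noise glitches are brief.
    ch.ci_count_edges_dig_fltr_enable = True
    ch.ci_count_edges_dig_fltr_min_pulse_width = 0.001  # seconds
    task.start()
    count = task.read()  # on-demand read of the filtered edge count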

 

The circuit to the 3-wire sensor is straightforward: ground, +5 V in, and signal out. The signal stays at 5 V until the magnet gets close, then drops to zero. I have a 10 kΩ resistor bridging the +5 V supply and the sensor output to pull it up to 5 V, so there shouldn't be any float.

 

I'm using the standard counter example VI (attached), with the same results as seen in the test panel.

 

Thanks for any help!

Icon Editor Crash


I am using LabVIEW 2015 64-bit. Every time I double-click an icon to edit it, the icon editor flashes for a moment, then seems to crash and is replaced by the old icon editor from years ago. Everything else seems to be working fine.

 

I updated my version of LabVIEW yesterday to see if that would fix it, but it still happens.

 

Does anyone know why the icon editor doesn't work, or how I can get it to work?

 

Unknown internal error -124201 using WebDAV to return remote file path info


Hi All,

 

I am trying to use the LabVIEW WebDAV "Path Info" VI on my cRIO-9068 to see if a file exists on a remote data server. I am running into an issue where that VI fails and returns an "unknown internal error" of -124201. I have been unable to find any info on ni.com regarding this error, and I have no clue how to troubleshoot this issue, since other WebDAV VIs work correctly in the same environment.

 

Does anyone have any insight into this error regarding the WebDAV Path Info VI?
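For cross-checking outside LabVIEW: a WebDAV path-info query is essentially a PROPFIND request with a Depth: 0 header, so the same check can be issued directly. A minimal Python sketch (the URL and credentials are placeholders); if this succeeds where the VI errors out, the problem is likely in the client VI rather than on the server:

import requests

resp = requests.request(
    "PROPFIND",
    "http://dataserver.example.com/webdav/results/run42.dat",  # placeholder
    headers={"Depth": "0"},
    auth=("user", "password"),                                 # placeholder
    timeout=5.0,
)
file_exists = (resp.status_code == 207)   # 207 Multi-Status: path exists
file_missing = (resp.status_code == 404)  # 404 Not Found: no such path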

 

Below is some example LabVIEW code:

 

File check Block diagram.jpg

File Check Front panel.jpg

 

I am using LabVIEW Real-Time 14.0.1 with WebDAV Client & Server 14.0.0 on my cRIO.

 

Thanks

Is there a way to get the total samples from a TDMS file?


I'm looking at the TDMS Get Properties function documentation and I see that there's a property, wf_samples, which "represents the number of samples in the first data chunk". Is there a way to get the number of samples in, say, the second data chunk? My goal is to get the total number of samples across all the data chunks for one channel.
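One way to sanity-check the total from outside LabVIEW is the Python nptdms package, which hides the chunking entirely (the file, group, and channel names below are placeholders):

from nptdms import TdmsFile

tdms_file = TdmsFile.read("data.tdms")          # placeholder file name
channel = tdms_file["Group Name"]["Channel 1"]  # placeholder names
total_samples = len(channel)  # samples for this channel across ALL chunks

In LabVIEW itself, wiring -1 to the count input of TDMS Read reads all remaining values for the channel, so Array Size of that output gives the same total (memory permitting).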

LabVIEW 8.5 VI won't communicate with MAX, no error


I replaced my hard drive, and now my VI won't communicate with MAX (Windows XP). My app was working; now it won't. If I run a task or test panel in MAX, I am able to communicate with my devices. Should I re-install DAQmx? LabVIEW 8.5.


[Bug Report] Cannot Set the Cursor Position in a Graph Precisely Enough


I am pretty sure that what I am reporting as a bug will be re-qualified as a feature, but it is disturbing to new users.

Take a graph (like the one created in the attached VI) with a plot that has too many points within the visible area to realistically expect all of them to be identifiable/plotted (e.g. use autoscale X once), and try to navigate the curve using the cursor navigation arrows: this works fine if you are not too nervous. The cursor moves in 0.025 increments.

 

Screen Shot 2016-07-08 at 18.49.13.png

If you have autoscaled the graph, though, the plot has 2000 data points displayed over an area covering 415 pixels (nearly 5 data points per pixel, so several consecutive data points map to the same pixel). If you are careful, you will notice that occasionally the cursor X position is incremented (or decremented) by the right amount, yet there is no visible displacement of the cursor. This is particularly noticeable around the peaks and troughs of the sine curve.

So far so good.

 

Now that we have established that it is possible to navigate a cursor along a curve at the data resolution, we (I) would expect to be able to do exactly the same using the cursor coordinates displayed in the table to the right.

No such chance!

Try typing 3.275 for instance, and you will see the cursor jump to 3.225. Try 3.125 and you'll get 3.100.

If you zoom the graph so as to obtain a better ratio of visible data points to pixels:

 

Screen Shot 2016-07-08 at 18.56.33.png

things work fine. You can type any valid data point coordinate (a multiple of 0.025) and the cursor jumps there, with its position staying at the value you typed in.

The exact same phenomenon occurs programmatically.

Type a target coordinate in the X control and press Set X: it works fine if the graph is zoomed enough (the displayed cursor coordinate will match your requested position), but if you reset the graph to autoscale, it fails for some values (as shown below):

 

Screen Shot 2016-07-08 at 18.58.49.png

I cannot expect my users to be willing to zoom around their target position in order to set a cursor's position, then autoscale back to full range, and repeat for as many cursors as are needed in the type of application they are interested in (defining regions in a graph). I can also not expect them to delicately navigate to their target using the oversensitive cursor navigation arrows.

I perfectly understand their frustration at not being able to type in a cursor target location and watch it jump to where they instructed the software to go.

 

Now, I can use a trick (the Set X(*) button shown below) and obtain the effect I am looking for:

 

Screen Shot 2016-07-08 at 19.10.39.png

 

 

How do I do that? Simple and stupid (check the code):

- programmatically zoom around the target position

- move the cursor

- reset the graph range to the original values

 

Notice that there is no visual signature of these scale range changes.

 

So my first question is: why do I have to do that and why is LabVIEW not doing it by default?

 

More aggravating: since there is no event telling me that THE USER HAS EDITED THE CURSOR COORDINATES, I cannot use the trick described above for the most natural case of user interaction with a cursor on a high-data-density plot, namely when the user types in the desired cursor coordinates.

 

Hence my second question: why is LabVIEW not moving the cursor to where it is instructed, even though admittedly the move cannot always be rendered graphically (something the cursor navigation arrows already handle perfectly well)?

 

To me, this sounds like a bad UI choice. 

sync


Hi,

 

I am trying to figure out what is the best option for doing the following:

1) Generate a potential signal (which starts a trigger). I am currently doing this with an NI 6501.

2) Generate a potential and acquire a current. I am using a CHI600D potentiostat through the LabVIEW interface provided by the company (the code calls a DLL to access the instrument's functions).

I need to know which current corresponds to each trigger (sent with the NI 6501): for the first trigger signal I get x1 amps, for the second x2 amps, and so on.

For that, the trigger signal is sent whenever I reach a certain time multiple from the CHI, so I can trigger every 100 ms (for example).

 

The problem: it is too slow. I am able to do this roughly every 50 ms; I need to do it at least every 1 ms.

 

I am able to buy whatever hardware I need for this simple task. I know the problem is in the way I acquire data from the potentiostat. I recently found a Keithley 6487 that no one was using, and I was thinking of using it instead of the CHI, through a GPIB-USB-HC controller, but I am not sure. I can also buy another instrument; that is not a problem.

 

The question is: how can I sync the trigger with the current (intensity) signal I get?

 

LabVIEW 2014 SP1 32bit

Win10

 

Thanks for your time. Any ideas are welcome. 

 

Cheers,

 

Yan

File not found because Excel file becomes RandomName.tmp and original file disappears


Hello All,

 

I am using an .xlsx file to store my test results, and I have observed that at some point the file becomes a .tmp file and then the original file no longer exists.

 

Original filename is VCP Report.xlsx

 

Please see the attached code for reference:

 

 

excel file error.PNG

 

I am sure the name of the file I supply is correct; the file is just missing somehow. It has happened many times.

 

file error.PNG

 

The attached VI is LabVIEW 2015, 64-bit.
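For context on the symptom: saving an .xlsx typically works by writing a new temporary file and then renaming it over the original, so if the rename step is interrupted (the file locked by another process, an antivirus scan, a network-share hiccup), only the .tmp survives and the original is gone. A minimal sketch of that save pattern in Python, purely to illustrate the mechanism (this is not the actual toolkit code):

import os
import tempfile

def safe_save(path, data):
    """Write to a temp file first, then atomically replace the original."""
    directory = os.path.dirname(path) or "."
    fd, tmp_path = tempfile.mkstemp(dir=directory, suffix=".tmp")
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())      # make sure the bytes hit the disk
        os.replace(tmp_path, path)    # the step that fails in the symptom above
    except OSError:
        os.remove(tmp_path)           # don't leave the .tmp behind
        raise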

 

If you can suggest a solution, that will be great.

 

Thank you.

How to align control A to control B?


Hello. I am attempting to align controls on a front panel (see attached). It appears that I need something more than the alignment tools on the toolbar, or a different strategy. What I typically do in UML tools is align a single column of controls (such as the "Pulse Width" column) by placing the highest and lowest controls where I want them and then distributing the column evenly. Then I align the top and bottom rows to the top and bottom controls and distribute the other columns evenly. However, LabVIEW appears to use an averaging algorithm to align items rather than using one item as the baseline (e.g. the first or last item selected). So I end up in an iteration exercise that leaves me with carpal tunnel syndrome and a desire to post on the forum for a better way :-)

 

Thanks,

-Jamie

RT: what about a VI running in debug mode?


I have a puzzle: what is the difference between [Debug] mode and [Stand-alone] mode? Because of my work, I need to change the RT program very often, so I would rather not run the RT program in [Stand-alone] mode. Will [Debug] mode affect the RT program's efficiency?

How to save a cluster of two elements in Excel


Hi,

 

I'm having a problem saving the Rainflow count cycles output value; it is a 1D array of clusters of two elements.

Please help me save it to Excel!
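The layout I am after is just two columns, one row per cluster; sketched below in Python/CSV terms (the field names are made up). In LabVIEW this corresponds to unbundling each cluster into two 1D arrays and writing the resulting 2D array with a spreadsheet-file write; Excel opens the .csv directly:

import csv

# 1D array of two-element clusters, e.g. (cycle range, cycle count)
cycles = [(12.5, 3.0), (8.1, 10.0), (15.2, 1.0)]  # made-up sample data

with open("rainflow.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["range", "count"])  # header names are assumptions
    writer.writerows(cycles)             # one row per cluster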

 

Thanks!!!

RT FIFO with complex data?


Scenario:

We have a deterministic loop on an RT target (NI PXI). The rate only needs to be 1 Hz, with as little jitter as possible. This deterministic loop will execute picoamp measurements using many Keithley 6487 units via low-level SCPI functions: measurement configuration, then an "INIT" command to all devices (about 2 ms per device), and about 300 ms later a "FETCH" of the data from all Keithleys (about 10 ms per device). The deterministic loop will have a state machine and accept commands coming from lower-priority loops (like zero check, reset, etc., for one Keithley or several). The measured data will be in the form of a double array, so it is fine for RT FIFO communication between the loops.

However, I wonder how to extend the features: during measurements, warnings may be generated by the Keithley VIs in string form. Also, acknowledgements/feedback for certain actions performed on the Keithleys should be reported back to the lower-priority loops, and in their original form these are strings too. Strings are not supported by RT FIFOs, but I have found the trick below for converting them (see the screenshot).

 

To broadcast all the info properly, I think I should send the measurement data as a double array together with a same-sized string array holding any SCPI-related warnings/acknowledgements from the Keithley units. Since I cannot use a cluster with an RT FIFO either, is there a trick to "compress" or convert the complex cluster type into something RT FIFO compatible?

 

Right now I am considering a workaround: I could create two RT FIFOs, one for the double data array and another using the trick below to convert from strings (both fixed-size). I could convert the Keithley info string array into a single comma-delimited string, so I could use the "String to Byte Array" function.
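The fixed-size flattening I have in mind, sketched in Python terms (the channel count and message-slot size are my own assumptions; the essential constraint is that every element pushed into the RT FIFO has a constant byte length):

import struct

N_CHANNELS = 8   # doubles per measurement (assumption)
MSG_BYTES = 64   # fixed-size slot for the Keithley info string (assumption)

def flatten(values, message):
    """Pack doubles plus a string into one fixed-size block for an RT FIFO."""
    msg = message.encode("ascii")[:MSG_BYTES].ljust(MSG_BYTES, b"\x00")
    return struct.pack("<%dd" % N_CHANNELS, *values) + msg

def unflatten(block):
    values = struct.unpack_from("<%dd" % N_CHANNELS, block)
    message = block[N_CHANNELS * 8:].rstrip(b"\x00").decode("ascii")
    return values, message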

 

Any advice and suggestions are very welcome! :)

 

disclaimer: I am a beginner in the field of RT...

 

fifo test with strings.png


Cross-platform web app development in LabVIEW


I developed a VI that reads a serial port on a Windows 32-bit machine and displays the data on a LabVIEW GUI. It is working fine.

Can I now develop it further into a cross-platform web app that works on both Windows and macOS?

DLL loading with VI


Hi,

 

I have a project which includes some DLLs, and the project works fine without any issue. But if I open a VI from that project on its own and run it, it shows a DLL exception error (the VI works fine in the project instance). My requirement is to run the VI independently instead of in the project instance every time.

From some NI forums, I found that placing the required DLLs alongside the VI should work, but it does not. I tried placing the DLLs alongside LabVIEW.exe and that works, but I don't want to put the DLLs next to LabVIEW.exe. Is there another way to load the DLLs?

PS: I also tried running the VI with the DLL path specified; that does not work either.

Loop won't stop when stop button is pressed


Hi,

 

I have a LabVIEW program that includes a number of loops. One of them (which surrounds a series of "TDMS Write" functions) has a Boolean stop that passes a signal through two OR functions into the loop's conditional terminal (please see the area of interest). My problem is that when I press the stop button, the loop does not stop, and so it continues to record and write data that I do not need to the file.

 

Could someone please help with this?

 

Regards,

Aaron Broady

How to export the timebase from a cDAQ-9174 chassis to a cDAQ 9234 module?


Dear all,

 

I'm trying to export the 80 MHz timebase of a cDAQ-9174 to a 9230 board and a 9234 board in software.

 

System configuration:

- NI cDAQ 9174 with:

--> Slot 1: 9234 board

--> Slot 2: 9230 board

--> Slot 3: 9211 board

--> Slot 4: 9862 board

 

Software:

LabVIEW 2014

NI DAQmx 14.5

 

I would like to use the 80 MHz timebase of the cDAQ-9174 as the master timebase for the 9234 and 9230 boards. Measurements from these two boards must be synchronized, and I would like to set the sampling rate to get one measurement every millisecond. However, the internal clock does not allow setting the sampling rate "as I want it to be": only dedicated values Fs = Fm/256/n are available, with n between 1 and 31 (from the board specification).
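To make the quantization concrete (assuming the 9234's internal master timebase of 13.1072 MHz from the module spec; check the value for your hardware), the reachable rates can be enumerated directly:

# Allowed data rates: Fs = Fm / 256 / n, n = 1..31
FM = 13.1072e6  # 9234 internal master timebase (assumption from the spec)
rates = [FM / 256 / n for n in range(1, 32)]
print(min(rates))                                 # ~1651.6 S/s (n = 31)
print(min(rates, key=lambda r: abs(r - 1000.0)))  # closest to 1 kS/s: same value

So with the internal timebase the slowest reachable rate is about 1.65 kS/s; exactly one sample per millisecond is not available, hence the question about the chassis clock (failing that, acquiring at a supported rate and decimating in software is an option).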

 

I tried to use the 80 MHz timebase clock with the NI LabVIEW example "Voltage-Continuous Input.vi", but the connection is not allowed:

Error: "Requested Sample Clock source is invalid"

 

I tried to find a way to configure the boards in NI MAX, but I did not find a solution there.

 

Is there a way to connect the 80 MHz clock to the 9230 and 9234 boards, or to set the sampling rate so that it is possible to have one measurement every millisecond?

 

Thanks for the time you give to these questions.

 

Kind regards.

 

Jimmy

9232 module in a 9191 Ethernet - install


I developed an application using a 9232 module in a 9191 chassis (Ethernet connection).

 

The software works on my laptop (the development station); it also works on another laptop with LabVIEW.

 

I need to know what needs to be installed on a laptop without LabVIEW (not a development station).

 

Thanks

 

Shimon Zerbib
Software Design Engineer | Test & Measurement
Web: www.checkbench.co.il
Email: shimon.zerbib@checkbench.co.il
Mobile: (+972) 54 20 52 337

 
