Channel: LabVIEW topics

Agilent 34460A VISA problem


Dear users,

I have encountered a problem with my Agilent 34460A multimeter. Here is the background:
I am using the driver provided on the NI website.

LabVIEW version: 8.6

NI-VISA version: 5.4.1


Usually I can initialize the instrument and work with it, calling the Close.vi provided in the library at the end. However, if I stop the program midway, it is extremely difficult to start it again without errors, and at the moment I cannot predict when it will work. If I stop it midway and then run it again, it either throws an error at VISA Clear:
[Attached image: Initialize.JPG]
and this particular error is documented as "The value of some parameter (which parameter is not known) is invalid", which I do not find very helpful. The weird thing is that I can still communicate with the device from the MAX panel:
[Attached images: ClearMax.JPG, IDN.JPG, MAX.JPG]

which makes me wonder why LabVIEW throws an error. My first guess was that, since I do not close the VISA connection, it might still be open; my second guess was that I leave the instrument in an unknown state, which it might not like.
Exploring the first option, I tried restarting the computer and the instrument. This did not help.
Exploring the second option, I reset the multimeter to factory settings using the front panel. Again, no success.
Reading the forums, I noticed some people mention that preventing the PC from turning off the USB hub to save power might help, but it did not.

Then at some miraculous moment it starts working again.

I really do not know how to proceed, and how to solve this problem.

Any ideas?

Edit: I noticed that even when the initialization succeeds, the program hangs in the middle of the error query:
[Attached image: errorQuery.JPG]

It looks like it is stuck on VISA Read, and I cannot stop the VI; I have to unplug the instrument. If I disable the reset, it hangs even earlier, at VISA Clear.
[Attached image: DisabledReset.JPG]
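In case it helps anyone reproduce the session handling outside LabVIEW, here is a minimal pyvisa sketch of the same sequence (device clear, identification query, drain the error queue, and always close the session, even after an abort). The resource string is a placeholder and the SCPI commands are standard; treat this as an illustration of the idea rather than the NI driver's exact behaviour.

```python
import pyvisa

# Placeholder resource string; use the one NI MAX shows for your 34460A.
RESOURCE = "USB0::0x2A8D::0x0101::MY00000000::INSTR"

rm = pyvisa.ResourceManager()
dmm = rm.open_resource(RESOURCE)
dmm.timeout = 5000                 # milliseconds; avoid hanging forever on a read

try:
    dmm.clear()                    # device clear, the same operation as VISA Clear
    print(dmm.query("*IDN?"))      # confirm the instrument responds
    dmm.write("*CLS")              # clear the instrument's status/error registers
    print(dmm.query("SYST:ERR?"))  # drain the error queue instead of hanging on it
finally:
    dmm.close()                    # always release the session, even if the run is aborted
```

The key point is the finally-style close: aborting a LabVIEW VI mid-run skips Close.vi, which may leave the session and the instrument's buffers in exactly the half-open state that makes the next VISA Clear fail.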


Agilent 34970A VISA session randomly disconnecting


I have an Agilent 34970A that I'm using to record temperature and voltage values.  The DAQ is connected through an NI GPIB-USB-HS.  I have downloaded the NI-VISA drivers and the Agilent 34970A instrument library.

 

I am able to run my program for ~1 min before an error occurs and the DAQ becomes completely unresponsive through both LabVIEW 2018 and MAX:  Error -1073807342 occurred at VISA Read in Agilent 34970.lvlib:EZ Voltage.vi.  Possible reason: VISA: (Hex 0xBFFF0012) Invalid resource reference specified. Parsing error.

 

I must unplug the device from the computer, replug the USB, and scan for the instrument through MAX to reestablish a connection.

 

I have replaced the GPIB adapter and the DAQ and checked all connections, so I can rule out a physical connection issue.  Something is happening to invalidate the VISA reference and I cannot figure out what it could be.
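One pattern worth trying while debugging (shown here in Python with pyvisa rather than LabVIEW, with a placeholder GPIB address and a generic READ? query standing in for whatever the EZ Voltage VI sends): catch the invalid-resource error, close the dead session, and reopen it instead of replugging the USB cable. If reopening also fails, that points at the GPIB-USB-HS or the bus rather than the VISA session itself.

```python
import time
import pyvisa
from pyvisa.errors import VisaIOError

RESOURCE = "GPIB0::9::INSTR"      # placeholder; use the address NI MAX shows

rm = pyvisa.ResourceManager()

def open_session():
    inst = rm.open_resource(RESOURCE)
    inst.timeout = 5000           # ms
    return inst

daq = open_session()
while True:
    try:
        print(daq.query("READ?").strip())    # stand-in for the measurement call
    except VisaIOError as err:
        print("VISA error, reopening the session:", err)
        try:
            daq.close()                      # discard the dead session
        except VisaIOError:
            pass
        time.sleep(1.0)
        daq = open_session()                 # reopen instead of replugging USB
    time.sleep(5.0)
```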

H.264 raw byte streams to VLC


Hi everyone

I have a laboratory setup for communication-system prototyping with two FlexRIOs and a PC. I'll be more verbose here.

Tx:

IP camera --> video encoder (SDI-to-H.264) --> FlexRIO (Tx) controller Ethernet port --> UDP Read in LabVIEW RT --> FPGA modulation backend --> on air

Rx:

From air --> FPGA demod backend --> RT UDP Write --> FlexRIO (Rx) Ethernet port --> PC Ethernet

PC:

UDP Read --> raw H.264 byte stream.

 

What I now need to do is stream those packets to VLC and play back the video. Since VLC won't just understand the raw byte stream if I send it via localhost, I have looked into something known as NAL unit handling. My question is: do I need to perform any byte reordering for VLC to understand the stream?
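For what it's worth, an Annex-B H.264 elementary stream is already a plain byte stream, so no reordering of the bytes inside the NAL units should be needed; what matters is that the datagrams arrive complete and in order, and that VLC is told the input is raw H.264. Below is a minimal Python sketch of the PC side under those assumptions: it simply relays the datagrams to a local port where VLC is listening (started with something like `vlc --demux h264 udp://@:1234`; the port numbers are placeholders and the demuxer name should be checked against your VLC build).

```python
import socket

IN_PORT = 60000                    # placeholder: port the FlexRIO stream arrives on
VLC_ADDR = ("127.0.0.1", 1234)     # placeholder: port VLC is listening on

rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("", IN_PORT))
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

START_CODES = (b"\x00\x00\x00\x01", b"\x00\x00\x01")   # Annex-B NAL unit start codes

seen_start_code = False
while True:
    data, _ = rx.recvfrom(65535)
    # No reordering of the bytes themselves: an Annex-B stream is byte-oriented.
    if not seen_start_code and any(code in data for code in START_CODES):
        seen_start_code = True
        print("Annex-B start code found; the encoder output looks like a raw byte stream.")
    tx.sendto(data, VLC_ADDR)      # forward the packet unchanged, in arrival order
```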

Any help would be appreciated. 

 

Regards.

LabVIEW DAQ/Simulated Signal for ECG


I am connecting ECG pads to my chest and sending the pulses to LabVIEW with my DAQ. I then export the best waveform from the DAQ into an Excel file, where I use the same parameters to repeat my signal so that I get two pulses, and feed those numbers into a Simulate Signal Express VI. Now I need to determine the time duration of each segment (i.e., the time in seconds of the P wave, the ST segment, etc.) as well as the amplitude of each wave (i.e., the height in volts of the P wave, QRS complex, T wave, etc.). However, I do not know how to create this VI without the LabVIEW Biomedical Toolkit or any advanced signal processing toolkit (I do not have access to these). Does anyone know how I would build this VI? Attached are some ideas I have; however, I keep getting errors. Thanks.
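Without the Biomedical Toolkit, the usual approach is plain threshold/peak analysis on the sampled waveform: find each peak, then walk left and right until the signal drops below a fraction of the peak to get that wave's duration. Here is a rough sketch of the idea in Python/NumPy; the sample rate, file name, and window limits are assumptions for illustration, and the same steps map onto LabVIEW primitives such as Array Max & Min plus a threshold search.

```python
import numpy as np

fs = 1000.0                                        # assumed sample rate in Hz
ecg = np.loadtxt("ecg_cycle.csv", delimiter=",")   # one exported cycle in volts (placeholder file)

# R peak: the largest sample in the cycle.
r_idx = int(np.argmax(ecg))
r_amplitude = float(ecg[r_idx])

# P wave: largest sample in a window that ends 50 ms before the R peak
# (the window limit is an assumption; adjust it to your waveform).
p_region_end = max(r_idx - int(0.05 * fs), 1)
p_idx = int(np.argmax(ecg[:p_region_end]))
p_amplitude = float(ecg[p_idx])

def wave_duration_s(signal, peak_idx, threshold):
    """Time between the points where the signal falls below 'threshold'
    on either side of the peak at 'peak_idx'."""
    left = peak_idx
    while left > 0 and signal[left] > threshold:
        left -= 1
    right = peak_idx
    while right < len(signal) - 1 and signal[right] > threshold:
        right += 1
    return (right - left) / fs

p_duration = wave_duration_s(ecg, p_idx, 0.1 * p_amplitude)
print(f"P wave: {p_amplitude:.3f} V, {p_duration * 1000:.1f} ms; R peak: {r_amplitude:.3f} V")
```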

check if entry in numeric array is empty


I have a numeric array (I32) that gets data inserted into it as the program runs.  I am looking for a way to determine whether a specific index in that array has been populated yet; in other words, whether an index in a numeric array is empty.  Index Array is close to what I need, but if the indexed entry is empty it returns 0 (the default value for this numeric array), which is also a potentially valid data entry, so it is not sufficient for checking whether the indexed element was originally empty or genuinely 0.
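The usual workaround is to stop relying on the value alone: either keep a parallel Boolean array of "written" flags next to the data array, or (if the data could live in a floating-point array) pre-fill with NaN, which can never be a real reading. Here is the flag-array idea sketched in Python; in LabVIEW it would be a second Boolean array updated with Replace Array Subset alongside the data, or an array of clusters holding a value plus a flag.

```python
SIZE = 100
data = [0] * SIZE              # I32 data; 0 is a legal measurement value
populated = [False] * SIZE     # True once an index has actually been written

def insert(index: int, value: int) -> None:
    data[index] = value
    populated[index] = True    # record that this index now holds real data

def is_empty(index: int) -> bool:
    return not populated[index]

insert(5, 0)                      # writing a legitimate 0
print(is_empty(5), is_empty(6))   # False True
```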

 

Any help on this is appreciated.

Draw a graph from an array

Modbus RTU over UDP/IP


I am trying to communicate with a device over Modbus UDP/IP. Here is some of the information I was given by the manufacturer:

 - The module (server) will automatically open UDP port 2001 for receiving data from the client.

 - The client in turn must open UDP port 2000 to receive data from the module.

        - Does this mean I need to keep port 2000 open on the machine running LabVIEW while it accesses the device?

 - The Modbus data packet is wrapped inside a UDP datagram before being sent over the network.

        - From my understanding, this just means it encapsulates the standard serial Modbus RTU frame in a UDP datagram?

 - I am unable to ping the IP address that I have set on the device (can you even ping a device that only speaks UDP?)

 

I've tried the Simple UDP example as well as some Modbus examples, but I'm not sure how to begin. I've also set a static IP on the device using its HMI.
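As a starting point, here is a minimal Python sketch of what the transaction looks like if the manufacturer's description is taken literally: a standard Modbus RTU frame (PDU plus CRC-16) dropped into a UDP datagram, sent to port 2001 on the device, with the reply arriving on local port 2000. The IP address, unit ID, and register addresses are placeholders; the same structure (build frame, UDP Write to port 2001, UDP Read on port 2000) maps directly onto LabVIEW's UDP functions. Also note that ping uses ICMP, so a device that only speaks UDP may legitimately not answer pings.

```python
import socket

def crc16_modbus(frame: bytes) -> bytes:
    """Standard Modbus RTU CRC-16, appended low byte first."""
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc.to_bytes(2, "little")

DEVICE_IP = "192.168.1.100"   # placeholder: the static IP set through the HMI
DEVICE_PORT = 2001            # the module listens here, per the manufacturer
LOCAL_PORT = 2000             # replies come back to this port on the PC

# Example request: read 2 holding registers starting at address 0 from unit 1 (function 0x03).
pdu = bytes([0x01, 0x03, 0x00, 0x00, 0x00, 0x02])
frame = pdu + crc16_modbus(pdu)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", LOCAL_PORT))   # keep port 2000 open so the module's reply is received
sock.settimeout(2.0)
sock.sendto(frame, (DEVICE_IP, DEVICE_PORT))
reply, _ = sock.recvfrom(256)
print(reply.hex())
```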

 

 

Parse Date with "," correctly from CSV file


In the attached VI, I have a CSV file whose first column contains a time stamp with embedded commas.  When I generate the array from the original file, I get two extra columns, with part of the date spilling into the following columns because of the commas in the time-stamp field.  Is there a simple way to parse the date so it stays in the first column rather than being split across three columns, as in the attached VI?

 

Note that this is a snippet of the full data file.  There are thousands of rows.
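One simple way around it, assuming the time stamp is always split into exactly three fields by its two embedded commas: read each row, rejoin the first three fields into one string, and keep the rest as data. Here is that idea in Python; in LabVIEW the equivalent would be reading the file as lines, splitting each line on commas, and concatenating the first three elements before building the 2D array.

```python
import csv

rows = []
with open("data.csv", newline="") as f:          # placeholder file name
    for fields in csv.reader(f):
        # Assumption: the time stamp always occupies the first three fields
        # because of its two embedded commas; everything after is real data.
        timestamp = ",".join(fields[:3])
        rows.append([timestamp] + fields[3:])

for row in rows[:5]:
    print(row)
```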

 

Thanks for your help and time.


Automating the 33500B function generator with LabVIEW


Hello,

 

I am using LabVIEW to control a 33500B function generator. I need to use burst mode to generate a 20-cycle sine burst (1000 Hz) every second, and I need to keep the code running so that I can update the amplitude every second (without clicking the Run button again and again). Right now I am using a while loop to keep the code running. I have the following issues:

 

1) The execution time for one loop iteration is about 1.3 s, which is longer than my desired burst period (1 s), so I get a 20-cycle sine burst every 1.3 s instead of every 1 s. My guess is that, because everything sits inside the while loop, the function generator is re-initialized and re-configured on every iteration, which is what costs the 1.3 s. Is there any way to speed up the initialization and configuration process? (A configure-once approach is sketched just after question 2 below.)

 

2) I found that another burst of sine waves appears before the 20 desired cycles. This undesired burst has the same frequency and amplitude as the desired one and lasts about 1.5 ms; the desired burst arrives about 5.8 ms later. I have no idea why or how this undesired burst is generated. I have attached the VI and a photo of the waveform I got.
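Here is a minimal sketch of the configure-once idea, in Python/pyvisa rather than LabVIEW. The resource string is a placeholder and the SCPI commands are typical 33500B burst commands; verify them against your unit's programming guide. All the setup happens before the loop, and each iteration only updates the amplitude and fires a software-triggered burst, so the per-iteration work drops to two short writes.

```python
import time
import pyvisa

RESOURCE = "USB0::0x0957::0x2C07::MY00000000::INSTR"   # placeholder address

rm = pyvisa.ResourceManager()
fg = rm.open_resource(RESOURCE)
fg.timeout = 5000

# Configure once, outside the loop.
fg.write("FUNC SIN")
fg.write("FREQ 1000")
fg.write("BURS:MODE TRIG")
fg.write("BURS:NCYC 20")
fg.write("TRIG:SOUR BUS")      # burst fires on a software trigger (*TRG)
fg.write("BURS:STAT ON")
fg.write("OUTP ON")

for amp in [0.5, 0.8, 1.0, 1.2]:      # example amplitudes to step through
    fg.write(f"VOLT {amp}")           # only the amplitude changes each second
    fg.write("*TRG")                  # fire one 20-cycle burst
    time.sleep(1.0)                   # crude 1 s pacing; a timed loop would be tighter

fg.write("OUTP OFF")
fg.close()
```

In LabVIEW terms: keep the Initialize and Configure Burst VIs outside the while loop, leave only the amplitude write and the trigger inside, and call Close after the loop finishes.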

 

Thanks in advance!

Shanshan

Help to convert a VI built in LabVIEW 2015 into a LabVIEW 2014 VI

Moved to Version Conversion board

Get Date/Time Delta Time Calculation


Hello,

 

For a rough check, I decided to try measuring the RTC drift of an Android device by using ADB and comparing it to the Get Date/Time In Seconds function. I first use Get Date/Time In Seconds to set the Android device's clock, with the understanding that there will be a small delta between the read and the setting. I then read the device's time/date, compare it with the output of Get Date/Time, and calculate the difference.

 

I know I could just do everything manually but I'm controlling a temp chamber and logging the data. Plus, what's the fun in that? Smiley Happy

 

So here is the part that's driving me crazy. The camera time comes back pretty consistently in line with the delay I put in the while loop, but the Get Date/Time In Seconds output comes back ~100 ms different every time. I see this whether the while-loop delay is 10 seconds or 1 minute. Does anyone know what could be causing this behavior and maybe suggest a way to overcome it? I'm at a loss. I attached an image of the block diagram, the output of the date query from the device, and the calculated delta t. I'm using LabVIEW 2018.
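For comparison, here is the same measurement done outside LabVIEW with plain Python and adb (only `adb shell date +%s` is assumed to be available). The point of the sketch is the bracketing: the device query itself takes a noticeable fraction of a second, so reading the PC clock both before and after the adb call and using the midpoint removes most of that round-trip bias, which is one plausible source of a fairly constant ~100 ms offset.

```python
import subprocess
import time

def android_epoch_seconds() -> float:
    # 'adb shell date +%s' returns the device's clock as Unix seconds.
    result = subprocess.run(["adb", "shell", "date", "+%s"],
                            capture_output=True, text=True, check=True)
    return float(result.stdout.strip())

while True:
    pc_before = time.time()
    device_time = android_epoch_seconds()
    pc_after = time.time()
    pc_mid = (pc_before + pc_after) / 2.0      # midpoint of the bracketing PC reads
    drift = device_time - pc_mid
    print(f"drift: {drift:+.3f} s (adb round trip {pc_after - pc_before:.3f} s)")
    time.sleep(10.0)
```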

 

Object Tracking VI with VDM Bounding box error


Hi! I'm really new to LabVIEW and only started a week ago.

 

So I really need help making code that will take 3 objects, track their x,y coordinates, record the time of each measurement, and output their velocity and rotation.

 

To tackle this, I used LabVIEW's object tracking example and modified it with LabVIEW's histogram example so that it records a first picture and boundaries on the object, then uses the object's rectangular coordinates as the basis for the bounding box. I am using the traditional mean-shift method at the moment. However, as shown by the error in the picture, I get an "invalid bounding box passed" error. I am not sure how to fix this and would appreciate any examples or advice.

 

Thank you so much!

 

Sincerely,

Venus Luong

 

Need help with school project (This is not an honor code violation)


Hey, so I just created an account here. Before I get a flood of comments, let me answer some basic ones:

 

1. My team asks my professor for help with this constantly, and he usually shrugs his shoulders and says: "Yeah I don't really know HOW to do that but THIS is what you gotta do somehow." -- aka not helpful.

 

2. This is not a violation of any honor code. I am not asking anyone to do my school project for me; I am simply asking for some basic guidance and LabVIEW tutoring from LabVIEW experts. This is the best place I know to do that.

 

3. I wish my school had a LabVIEW class or something that could actually help me prepare for the kind of project I am doing, but alas, we all kind of just get thrown into this with little more than a crash course in how to make a basic GUI.

 

Now, to begin:

 

I am working in a team of 3 to create a data collection system using Arduino and LabVIEW. The project consists of creating a robust 3D-printed shell for the electrical components, creating an electrical system that can collect the data we need, managing that data in LabVIEW through a publish/subscribe strategy keyed on the Arduino's MAC address, and creating a "professional" LabVIEW GUI that makes the collected data very clear and easy to read and navigate.

 

Problem: none of us have ever used LabVIEW before, I have the hardest professor for the class, and we have roughly 3 weeks before we have to have our deliverable product.

 

We will be collecting humidity, light, and temperature data with the Arduino.

 

Questions:

 

1. We need to figure out how to send data from a server directly to an Excel spreadsheet file. Where do we even begin with this? Again, my professor offers no help besides: "Yeah, you have to retrieve the data from the server and somehow store it in Excel." How do we do this? Are there any tutorials you know of that explain how to take data in LabVIEW and automatically send it into an Excel file? (A small sketch covering both questions follows after question 2.)

 

2. We need to figure out how to display data based on a specific selection, like "temperature from 8-9 pm Thursday + 7-9 am Friday." Again, is there anything out there that can help me better understand how one might do something like that? Would it be smartest to try to retrieve that data back from Excel somehow and graph it, or is there a better approach?
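Not LabVIEW, but here is a tiny Python sketch of the overall idea for both questions, under the assumption that "send it to Excel" can simply mean appending timestamped rows to a CSV file (which Excel opens directly) and that "display a selection" means filtering those rows by a time window before graphing. Every name in it is made up for illustration; in LabVIEW the same roles are played by Write Delimited Spreadsheet (or the Report Generation Toolkit) for the writing, and an array filter plus a graph for the display.

```python
import csv
from datetime import datetime

LOG_PATH = "sensor_log.csv"        # placeholder file; Excel opens CSV directly

def log_sample(humidity: float, light: float, temperature: float) -> None:
    """Append one timestamped row; ISO 8601 timestamps stay sortable as text."""
    with open(LOG_PATH, "a", newline="") as f:
        csv.writer(f).writerow([datetime.now().isoformat(), humidity, light, temperature])

def temperature_between(start: datetime, end: datetime):
    """Return (timestamp, temperature) pairs whose timestamps fall in [start, end]."""
    selected = []
    with open(LOG_PATH, newline="") as f:
        for ts, _hum, _light, temp in csv.reader(f):
            t = datetime.fromisoformat(ts)
            if start <= t <= end:
                selected.append((t, float(temp)))
    return selected

log_sample(45.2, 310.0, 21.7)
print(temperature_between(datetime(2019, 4, 4, 20, 0), datetime(2019, 4, 4, 21, 0)))
```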

 

These are the two most pressing things for us right now. I'm sorry if my questions seem vague; I'm not even sure what I am supposed to ask or what terms would make the questions clearer, as I barely even know how to make a button light up in LabVIEW lol.

 

ANY help is appreciated. ANY redirection to helpful tutorials on something like this would be appreciated. I basically just need to learn LabVIEW well and very fast, as my professor isn't giving us any VIs to copy like the other class sections are getting. We have to do things from scratch, and I barely know where to start.

 

Thank you!

 

P.S. Pray for the grades of everyone in my class Smiley Sad

Owning the lifetime of VI launched with "Run VI Method"


The lifetime of a VI launched with the Run VI method is managed by LabVIEW (meaning the caller does not own the lifetime).

 

Practically, as soon as the target VI terminates, LabVIEW disposes of all the resources created by the target, which means - among other things - that every reference created within the target's scope dies (see attached example).

 

[Attached image: 2019-04-05_18-55-32.png]

 

I have a use case where this is undesirable, and I would love to have the flexibility of the Run VI method (generically addressing controls on the target VI) while keeping ownership of the VI call chain in the launcher (as the Call By Reference node does).

 

Note: as far as I know, what I am asking is not possible but I would love to be told otherwise.

 

Thanks

 

PJM

 

Note: cross-posted on the LAVA forum.

Control and simulation loop


How can I pause a Control & Simulation Loop at each iteration for troubleshooting purposes?


Control and simulation loop help


Hi, I am running a control simulation and I want to check the value of a parameter on every iteration. I have tried using a breakpoint, but during execution the VI stops responding and does not show the parameter's value. Is there a function, similar to the Halt Simulation function, that pauses the simulation at the current iteration? Kindly help.

LabVIEW program stops after a few while loop iterations


I have made a LabVIEW program to acquire waveform data and save it every 10 seconds. The program runs fine at the beginning but stops after a few loop iterations. The error message I get is "I/O error". I have to restart my oscilloscope before the program will run again. What could be the reason? The VI is attached.

Thanks.

Read a particular column in an Excel spreadsheet


I've created a project that measures temperature and water level. I want the data to be saved into an Excel spreadsheet. I have attached the project.

 Thank you for your help.

Data acquisition with USB-6363: buffer overflow due to high sampling rate


Hi,

I would like to acquire data from a USB-6363 device using LabVIEW. The waveform I'd like to acquire consists of multiple frequency components, and I would like to monitor how the magnitude of a Fourier component at a specified frequency changes over time. My plan was therefore to read in the data and Fourier transform it afterwards. However, I already run into problems while reading in the data. Attached you can see my block diagram. The application starts to run, but after several iterations no data is acquired anymore and I get a -200361 error indicating a memory overflow. It seems the USB connection is not capable of moving the data off the DAQ device fast enough.

How can I prevent that without reducing my sample rate? I need to sample at 500 kHz since my signals have frequency components up to around 100 kHz. However, I don't need to track changes over time at such a high rate. In principle, I would like to acquire a fixed number of samples (e.g. 6000) at a 500 kHz sampling rate, Fourier transform them, determine the magnitude of my observed frequency, and then capture the next chunk of samples. The time-tracking resolution (the time between chunks) can be much lower, at most 1 kHz. Do you have any ideas how to achieve this?
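The chunked analysis itself is straightforward once each block of samples is in memory; here is a sketch of the "Fourier transform one chunk and read off one bin" step in Python/NumPy (the DAQmx read is left out, and the signal below is a synthetic stand-in for one 6000-sample chunk). On the acquisition side, the usual fix for -200361 is a continuous task with a generous buffer where each loop iteration reads a whole chunk (e.g. 6000 samples) rather than single points, so the host keeps up with the device; that is a common pattern, not something specific to this VI.

```python
import numpy as np

fs = 500_000          # sample rate in Hz
n = 6000              # samples per chunk, as described above
f_target = 100_000.0  # frequency component to monitor, in Hz

def magnitude_at(chunk: np.ndarray, f: float) -> float:
    """Single-sided magnitude of the FFT bin closest to frequency f."""
    spectrum = np.fft.rfft(chunk)
    freqs = np.fft.rfftfreq(len(chunk), d=1.0 / fs)
    k = int(np.argmin(np.abs(freqs - f)))
    return 2.0 / len(chunk) * float(np.abs(spectrum[k]))

# Synthetic stand-in for one chunk read from the USB-6363.
t = np.arange(n) / fs
chunk = 0.5 * np.sin(2 * np.pi * f_target * t) + 0.1 * np.random.randn(n)
print(magnitude_at(chunk, f_target))   # ~0.5 V for this synthetic signal
```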

 

Peter

How to display each value inside a string on an indicator?

$
0
0

Hi,

I am a beginner in LabVIEW. I want to split the values stored inside a string and display each value on an indicator. The values are acquired from an Arduino through serial communication and arrive as a single string.

Note: I have tried using the Scan From String function, but it didn't work, so I don't know whether I am using it the wrong way or whether there is another function I should use.
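If the Arduino sends the values as one delimited line (a comma delimiter is an assumption here), the whole job is a split plus a numeric conversion, shown below in Python. The LabVIEW equivalents would be Spreadsheet String To Array with a comma delimiter, or Scan From String with a format string that matches the actual line, e.g. %f,%f,%f for three floats.

```python
# Example line as it might arrive from the Arduino over serial (placeholder values).
line = "23.5,47.1,0.82\r\n"

# Split on the delimiter, drop empty pieces, and convert each piece to a number.
values = [float(v) for v in line.strip().split(",") if v]
print(values)          # [23.5, 47.1, 0.82]
```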

Thanks.


