
Problems "doing too much" with a USB-6009

I've developed a little routine to measure muscle twitches using a small triaxial accelerometer.  We have a stimulator that produces an excitation pulse to start the recording, which consists of three voltage channels sampled at 1 kHz for 400 points.

 

Our prototype uses an NI USB-6009 Multifunction I/O device, with the TTL Pulse from the stimulator going to the PFI terminal on the 6009.  Here is the code to set up the Triax Task:

[Attached image: Triax Task.png]

When we are ready to record, we do a DAQmx Start Task, then do a DAQmx Read (Analog 2D Dbl NChan N Samp), which waits for the trigger pulse, then delivers us 400 data points.
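
In text form, the working setup does roughly the following (sketched here with the nidaqmx Python API rather than the actual LabVIEW code; the device name, channel range, and PFI terminal are placeholders):

import nidaqmx
from nidaqmx.constants import AcquisitionType, Edge

# Three accelerometer channels, 1 kHz, 400 samples, armed by a digital start trigger.
with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:2")        # placeholder device/channels
    task.timing.cfg_samp_clk_timing(
        1000.0, sample_mode=AcquisitionType.FINITE, samps_per_chan=400)
    task.triggers.start_trigger.cfg_dig_edge_start_trig(
        "/Dev1/PFI0", trigger_edge=Edge.RISING)               # the stimulator's TTL pulse
    task.start()                                              # arm and wait for the trigger
    data = task.read(number_of_samples_per_channel=400, timeout=30.0)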

 

This works fine, but I want to Have My VI and Demo It, Too.  The Prototype, with both the 6009 and the pulse generator, is currently "in use", so I decided to take another 6009, hook it up to another accelerometer, and use LabVIEW to "fake" the pulse generator.  I tried two things, and neither of them worked.

 

The first thing I tried was to use a "non-triggered" form of the Task.  I modified the Task definition by removing the DAQmx Start Trigger function.  I then started the Task, and in a For loop, waited 500 msec and did a DAQmx Read.  When the For Loop was set to 1, I got an array of 3 channels, 400 points (as expected).  But when the loop was set to 2, the second iteration generated Error -200278, Attempted to Read Sample: 400; Property: RelativeTo; Corresponding Value: Current Read Position; Property: Offset; Corresponding Value: 0.

 

I don't know what this means nor why it failed, but figured it had something to do with missing a "Restart" signal that had come in on the PFI line.  OK, so I'll "add to the Task" by modifying it to include a Digital Out channel that I can toggle on and off to create a TTL pulse I can wire to the PFI line, since I know triggering works.  Here's the code:

[Attached image: Modify Triax Task.png]
 This, however, immediately generates Error -200559, complaining that I'm adding a DO channel to a task configured for AI lines.  [I'd also tried creating a brand new task for DO only, and got yet another error that I figured meant I couldn't have two tasks running in a single USB-6009 -- not illustrated here.]

 

So I can't get the 6009 to trigger itself, and I can't do multiple Finite Reads without a trigger.  Oooh, all this writing has given me an idea -- what if I go back to the Finite Read model, but put a Start Task and Stop Task "around" the DAQmx Read (i.e. inside the For loop) -- would that work?
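
In text form, that restart idea would look roughly like this (again sketched with the nidaqmx Python API, device name and channels as placeholders): configure the finite task once, then start, read, and stop inside the loop so that each iteration is a fresh finite acquisition.

import time
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:2")        # placeholder device/channels
    task.timing.cfg_samp_clk_timing(
        1000.0, sample_mode=AcquisitionType.FINITE, samps_per_chan=400)
    for _ in range(2):                                         # the For Loop from the test
        time.sleep(0.5)                                        # the 500 ms wait
        task.start()                                           # re-arm the finite acquisition
        data = task.read(number_of_samples_per_channel=400)
        task.stop()                                            # back to the committed state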

 

I'm going to post this here, go do the test, and come back with the results.  This will either be Success (and a Solution), or Failure, and a cry for help ...

 

Bob Schor

 


Error loading EthernetIPInterface.dll: Missing export 'FreeSid'

I have a Real-Time Application that I have built and that is being deployed to an NI PXIe-8133 controller.  I have successfully built the application with no errors, and I can run the built application from a development computer and it works as intended.  However, when I deploy the application to the PXI controller and run it as startup on the PXI, I receive the following error: Error loading 'ethernetIPInterface.dll': Missing export 'FreeSid' from 'ADVAPI32.dll'. I have attached a screenshot of the PXI, as well as a picture of the software that is installed on the PXI.

 

I noticed that version 1.2.0 of NI-Industrial Communications for Ethernet/IP contains a fix for an error that is pretty similar to what I am experiencing.  The readme file for version 1.2.0 reads: Bug ID 305672 Resolved a problem that EthernetIPInterface.dll is not exported when building an EIP windows application.  This sounds similar to my problem, except that my error says the export 'FreeSid' is missing, and I am already using version 1.2.0 of NI-Industrial Communications for Ethernet/IP.  I have thought about upgrading, but I would rather not if I don't have to.

 

If anyone has any thoughts that would be great.

 

Thank you,

 

Shoe

.NET Invoke does not update array passed to it - Makes new!

Hello everyone! This is my first post on these forums. I am having a problem that is stumping me.

 

It is as follows:

In my code I am invoking a function located in a .NET 2.0 assembly. The only documentation I have says the following:

 

"int ReadHPTDCHit(HPTDCHit buffer)
Copy TDC data into buffer. The number of data words that were read is returned as an integer. All available data up to the size of the buffer is read."

 

However, it would appear that when I get my HPTDCHit buffer back from the invoke node, it has not used the array I supplied it with. Memory consumption increases with every call of ReadHPTDCHit. To my limited knowledge, it seems that the array I supply ends up handled by neither LabVIEW nor the DLL.

I have attached a picture showing a minimum working example of my problem. It is analogous to what my main program is doing. To get it to work, I needed some more "fluff" around the problematic part to properly start and stop my hardware. This can be ignored.

 

Explanation of the VI: a 1000-element array of HPTDCHit .NET references is created. This will be the buffer that the function requires. It is passed to the function repeatedly, and the return from the function is reused. One would imagine that the original array is forever reused, but what I experience is that a new array is returned from the invoke node on every call. The original array is lost to the wind, but the memory is still allocated.

A possible solution is of course to feed the original array into a Close Reference function on every loop iteration, but that would seriously degrade the performance of the readout. A quick test shows that dereferencing 1000 .NET references takes 2.4 ms. The amount of data I am expecting (a few MHz) could easily overflow with such a delay: at 2 MHz, 4800 HPTDCHit objects are generated every 2.4 ms.

 

Any suggestions on how to solve this would be greatly appreciated.

 

Printing on a DYMO LabelWriter 450 Turbo...?

Hello everyone,

Currently I'm in need of printing bar codes using a DYMO LabelWriter 450 Turbo. I've read a couple of posts with a related issue; in one, someone was trying to print using the LabelWriter 450. That post had a VI attached: the person developed an ActiveX-based VI in which they managed to print labels with different barcode values (a task that I also want to accomplish).

 

I've read the SDK manual from the DYMO webpage to understand the block diagram code, and a few questions popped up:

 

String Indicators:

GetDymoPrinters: here the selected printer appears.

GetObjectNames: nothing appears when I run the VI.

 

String controls:

  • Filename: the path of the layout label file. Its extension can be ".label" or ".lwl"; in my case I saved my label template as ".label".
  • Field: what do I enter here?
  • Text: is this where I write the new value that I want printed on the label?

 

Any tips on this matter are well appreciated.

 

Thanks in advance to all.

Problems with Feedback Node

I'm trying to control the position of a stepper motor with LabVIEW. I'm using the Square Waveform VI, counting the number of pulses and controlling the duty cycle in order to control the position of the motor (open loop). However, when running my code, it seems that the feedback node does not output anything once the frequency input to the Square Waveform VI is above roughly 3 Hz. If I remove the duty cycle input to the Square Waveform VI, it works fine. Because of this problem, the "actual position" indicator does not change. I am looking directly at the feedback node with the Signal 4 graph.

 

Thanks in advance!

Mason

Read and plot single column text file

The assignment is:

Plot the entire ECG signal and one period of the ECG signal. Point out two distortions that are observed in this signal and explain why they have occurred.

 

All I have is a text file with 1 column and 4000 rows. What do I do? Is it possible to read it as a spreadsheet file?
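
If it helps to see the idea in text form, here is a minimal Python sketch of the same thing (the file name and the 1 kHz sample rate are assumptions; in LabVIEW the equivalent would be a spreadsheet-file read wired to a waveform graph):

import numpy as np
import matplotlib.pyplot as plt

ecg = np.loadtxt("ecg.txt")            # assumed file name: 1 column, 4000 rows
fs = 1000.0                            # assumed sample rate in Hz; use the real one
t = np.arange(len(ecg)) / fs

plt.figure()
plt.plot(t, ecg)                       # the entire ECG signal
plt.xlabel("Time (s)")
plt.ylabel("Amplitude")

plt.figure()
plt.plot(t[:1000], ecg[:1000])         # roughly one period; adjust to the actual beat length
plt.xlabel("Time (s)")
plt.ylabel("Amplitude")
plt.show()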

 

Please help me, this is my first time working with LabVIEW.

Acquired Image from Webcam and Convert into Binary Image

Hello guys!

 

I would like to post a question, but for some unknown reason there isn't any button for me to do that, so I will post my question here instead. I tried to follow the example code from: https://decibel.ni.com/content/docs/DOC-32584

to acquire an image from a webcam and convert it into a binary image. I downloaded the code from that link and it works fine. However, when I tried to replicate the code in another empty project it does not work: the binary image remains black all the time. Please give me some suggestions.

 

 

Cannot find LabVIEW function

I can't find the LabVIEW functions that are marked with the red and blue squares in the uploaded picture. I believe they are IMAQ functions, but I couldn't find their VI palette.


Intentionally Cause LabVIEW to Not Respond

Perhaps not a common question (I didn't see it in the forums), but is there an easy way to intentionally get LabVIEW to not respond?  I have a watchdog timer that communicates with my test equipment through LAN (outside of LabVIEW), and it is working great, but I want to test that if LabVIEW specifically goes into a "not responding" state, where the window grays out and I cannot access anything, the watchdog timer in the background sees this and flags it.  I've already confirmed its operation by intentionally creating various errors in LabVIEW, but I want to test this final case before I move on.  I was thinking of some sort of intentional memory leak, but I don't have days to wait to see if the memory locks up.  Any help would be appreciated.

 

Ben

 

PS. One other thing: getting LabVIEW to not respond can be done outside of LabVIEW as well (it doesn't need to be in the LabVIEW code itself). For example, is there a process under Task Manager that I can kill to get LabVIEW to not respond?

question about Conditional Disable Symbols

Hi,

I found a problem when using Conditional Disable Symbols.

I have a LabVIEW project in which I set a symbol 'debug mode', which I guess lots of engineers use in a similar way when debugging VIs.

All works fine.

My question is on this:

I'm not 100% sure, but it looks like when I want to switch modes, if I only change the symbol value and save the project, it does not take effect in the VIs that contain the related Conditional Disable structures. Instead, I have to open all the related VIs before changing the symbol, change the symbol, and then save everything (project properties and VIs).

In other words, if I do not open the VIs but only change the project property, the change does not happen in the VIs.

Is that really the case? If I have many VIs, do I have to open and save all of them each time I change the symbol?

 

Thanks!

LabVIEW FPGA 2012: the compilation cannot be performed by the compiler worker

I am using LabVIEW 2012. I have installed the LabVIEW 2012 FPGA Module, Xilinx tools 10.1, the LabVIEW FPGA Driver for Xilinx Spartan-3E XUP, and LabVIEW Real-Time 2012.

I am not able to compile the code. I am getting these messages:

[Attached images: FPGA 1.png, FPGA 2.png]

 

 

Subtract filtered signals

Hi crossrulz, I need help subtracting two filtered signals that are located at different frequencies.

I have a stereo multiplex signal with 3 components (L+R, L-R, and the pilot). I have filtered out L+R (50 Hz to 15 kHz) and L-R (23 kHz to 53 kHz), and now I need to move the L-R signal from 23-53 kHz down to 50 Hz - 15 kHz so that I can subtract them.
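
As I understand it, the usual way to move that band down is to multiply by a 38 kHz carrier (twice the 19 kHz pilot) and then low-pass filter; here is a rough numpy/scipy sketch of that idea (the sample rate and carrier phase are assumptions, and this is not my actual LabVIEW code):

import numpy as np
from scipy.signal import butter, filtfilt

def shift_lr_to_baseband(lr_band, fs, carrier_phase=0.0):
    """Mix the 23-53 kHz L-R band down to 0-15 kHz.

    lr_band: the band-pass-filtered L-R samples (DSB around 38 kHz)
    fs:      sample rate in Hz (assumed well above 2 * 53 kHz)
    """
    t = np.arange(len(lr_band)) / fs
    # Multiply by the 38 kHz subcarrier; the factor of 2 restores the amplitude
    # after the low-pass filter removes the image around 76 kHz.
    mixed = 2.0 * lr_band * np.cos(2.0 * np.pi * 38000.0 * t + carrier_phase)
    b, a = butter(4, 15000.0 / (fs / 2.0))    # keep only the 0-15 kHz part
    return filtfilt(b, a, mixed)

# With l_plus_r as the 50 Hz - 15 kHz signal, the channels would then be:
#   l = (l_plus_r + shift_lr_to_baseband(l_minus_r_band, fs)) / 2
#   r = (l_plus_r - shift_lr_to_baseband(l_minus_r_band, fs)) / 2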

I will attach my program in LabVIEW 8.5.

thank you for your help

Control a device with serial communication

Hello,

 

I'm working on a project that consists of controlling 2 axes over serial communication (RS-232). I have already implemented control of one axis: I enter a distance in mm, and my program calculates the correct frame to send to the device that controls the axis.

 

131072 points correspond to a displacement of 2.5 mm.

For the moment, I just want to control 1 axis.

 

But now I want the program to send, by itself, several coordinates that are stored in a txt file. We need to respect:

- at least 500 ms between sends from the program to the device, because of the serial communication.

- if the first coordinate is 0, the second 20, and the third 15, the axis should go from 0 to +20 and then by 15 - 20 = -5.

 

The encoder needs to communicate the position of the axis all the time (in points), so when the axis is going from 20 to 15, it needs to report back so the program knows whether it can send the next position. I put a "Response of the displacement from device" control in the VI that simulates the displacement: if the current displacement is equal to 2, it means the axis has arrived at the position (I will change this later to the correct value, since you don't have the device); otherwise the motor keeps turning until it arrives. A rough sketch of this polling logic is shown below.
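
Here is that logic in text form (Python with pyserial rather than my LabVIEW code; the port, baud rate, frame format, and the placeholder "arrived" value of 2 are assumptions):

import time
import serial  # pyserial

POINTS_PER_MM = 131072 / 2.5          # from above: 131072 points = 2.5 mm

def build_frame(points):
    """Hypothetical stand-in for the real frame/checksum code (calcul_checksum.vi)."""
    payload = "MOVE %d" % int(points)
    checksum = sum(payload.encode()) % 256            # placeholder checksum
    return ("%s,%02X\r" % (payload, checksum)).encode()

ser = serial.Serial("COM1", 9600, timeout=1)          # assumed port and baud rate
coordinates_mm = [0.0, 20.0, 15.0]                    # would come from displacement.txt

previous = 0.0
for target in coordinates_mm:
    ser.write(build_frame((target - previous) * POINTS_PER_MM))   # relative move
    time.sleep(0.5)                                   # at least 500 ms between sends
    while True:                                       # poll the encoder until it arrives
        response = ser.readline().decode(errors="ignore").strip()
        if response == "2":                           # placeholder "axis has arrived" value
            break
    previous = target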

 

The problem is that my program stops working when the value of "Response of the displacement from device" is not equal to 2, and when it is equal to 2 it sends all the coordinates at once, which is not what I want...

 

I hope you understand; sorry for my bad English. You need to download all 5 files for it to work:

 

- Projet.vi is the program.

- calcul_checksum.vi is a VI that calculates the checksum.

- displacement.txt is the txt file that contains coordinates.

- XY. is the program that finds the X and Y coordinates, starting from the G1 lines, and puts them in an array.

 

thank you, best regards.

LabVIEW: (Hex 0x627) Keyboard error

Hi,

I am using a myRIO-1900 and want to control a few things using the arrow keys.

I have made a VI for this, but when deploying it to the myRIO, I get this error:

 

LabVIEW:  (Hex 0x627) The function name for the lvinput.*:getKeyboardState:C node cannot be found in the library. To correct this error, right-click the Call Library Function Node and select Configure from the shortcut menu. Then choose the correct function name.

 

[Attached image: deploy error.PNG]

I tried mass compiling it, but I still get the same error.

I have also attached the VI to detect key press.

Does anyone have ideas on how to fix this? Help will be appreciated.

 

Thanks

LabVIEW WINCC OPC

I have a transmission test device. It uses a Siemens SIMOTION D425 and S120 to apply the torque load, and WinCC to acquire the motor torque and rotational speed. I want to use LabVIEW to read the motor torque and rotational speed from the WinCC OPC server, and to be able to send instructions to control the S120. I don't know how to approach this or where to start. What would the specific implementation process be? Does anyone have a tutorial?

Thank you very much.


firing an LED value change event

On the host VI I have an LED which turns on when the cRIO target enters a particular state.

I have been unable to fire an event on the host when the LED changes its state from false to true.

Any help or an example on how to generate events without user intervention will be greatly appreciated!

Questions about "Sound output write"

Hello,

 

The function I want to realize is quite simple: read data from the DAQ; if a falling edge is detected, play a sound file using "Sound Output Write"; if a rising edge is detected, stop playing.
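
In text form, the behaviour I'm after is something like the following sketch (plain Python with hypothetical read_trigger_level(), start_playback(), and stop_playback() stand-ins for the DAQmx read and the Sound Output Write calls); the point is that the playing state is latched so one edge starts or stops playback exactly once:

import time

THRESHOLD = 2.5            # assumed TTL threshold in volts

def read_trigger_level():
    """Hypothetical stand-in for a single-sample DAQmx read of the trigger line."""
    return 5.0

def start_playback():
    """Hypothetical stand-in for starting the sound output."""
    pass

def stop_playback():
    """Hypothetical stand-in for stopping the sound output."""
    pass

previous_high = read_trigger_level() > THRESHOLD
playing = False
while True:
    current_high = read_trigger_level() > THRESHOLD
    if previous_high and not current_high and not playing:    # falling edge: start once
        start_playback()
        playing = True
    elif not previous_high and current_high and playing:      # rising edge: stop once
        stop_playback()
        playing = False
    previous_high = current_high        # remember the last level so an edge is not re-detected
    time.sleep(0.001)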

 

However, when I run the VI and a falling edge is detected, sometimes it plays the sound file 2 or 3 times, and sometimes the VI gets stuck.

Could anyone help me find the problem?

Thank you.

 

PS: I know using "Play Sound File" is easier, but "Play Sound File" has a tiny delay before playing, and I want the VI to start playing immediately, without any delay.

VISA Write problem

I have a problem with tunneling the strings coming from a For Loop to the VISA Write VI. The For Loop generates drive commands (for example DRV 50,75) at a certain frequency. Those commands have to be sent directly with the VISA Write VI. The tunneling is the problem: if I choose the tunnel mode "Last Value", only the last command is sent. How can I solve this problem?
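
In text form, what I want is to send each command as it is generated, inside the loop, rather than getting only the last value after the loop finishes; here is a rough sketch with the pyvisa Python API (the resource name, commands, and pacing are assumptions):

import time
import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("ASRL1::INSTR")               # assumed serial VISA resource

commands = ["DRV 50,75", "DRV 60,80", "DRV 70,85"]    # example drive commands
for cmd in commands:
    inst.write(cmd)          # each command is written inside the loop, not after it
    time.sleep(0.1)          # assumed pacing between commands

inst.close()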

Thanks and regards...

Problems using Clusters or Arrays of Clusters with Unit Test Framework

Hi All
I am having problems setting up some unit tests with a complex control.  The control is a Cluster of Arrays of Clusters of Arrays, etc.  When I start creating the test, the Unit Test Framework recognises the control's structure.  But when I resize the first array, the structure of the control is no longer recognised and I can no longer access all elements in the control.  See the attached screenshots 1 and 2, as well as the attached code, to see what is happening with Display Data Store.vi. I have tried to ensure that there are no special characters in the control, and I have even removed all spaces too. I am using LabVIEW 2014.
I would appreciate it if someone could help me understand how to get the Unit Test Framework working for complex controls such as these.

Bug or feature: IMAQ Particle Filter 3 with ROI connected erases particles outside ROI

Hi,

 

I just ran into something odd. When using the ROI Descriptor connector on the IMAQ Particle Filter 3 (using LV2013) particles lying outside the ROI are erased no matter what the filter criteria are. The particles within the ROI are scrutinized and accordingly kept or erased. This is regardless of the "Keep/Remove" connector value.

As the help states that the connected ROI "defines the Region of Interest (ROI) within which the particle filter is performed" I would expect the function to keep all particles in the image that are outside the connected ROI untouched. Have a look for yourself with the attached VI.

Am I barking up the wrong tree here?
