I am working on a LabVIEW-based project: a bomb disposal robot. Could someone kindly tell me, or send me a link explaining, how to interface a micro servo motor with LabVIEW 2014?
robotics
Feature Request: Notification for Build Status Complete
I would find it extremely convenient to have an option (or options) to notify the user when the Build Status for a project is complete.
Sometimes when building executables or installers--or both--the compiler can take a while, depending on the size of the project. During this time I don't just stare at the progress bar, but I have found that even if I leave the window open in plain view (off to the side or something), it is not obvious when the process is complete (the 'Done' button becomes enabled, and that's about it).
Options to notify the user could be any number of things:
- Beep when complete
- Bring the dialog to the front (only works if it's not visible already)
- Flash the window on the taskbar as a notification (maybe)
- Pop up a modal dialog
- Some combination of the above
- Also possible: a checkbox on the Build Status dialog to choose whether to receive a notification
Responding to two user events with same name
LV 2013, Win 7, & LVRT 2013, PharLap OS
I have a situation where I create an ARRAY of User Events, one for each "domain".
A "domain" is an area of DAQ: there is a SCXI domain, a CDAQ domain, an EtherCAT domain, several domains dealing with TCP instruments etc., about 25 domains all told. The name of the event is "New Channels This Domain".
When a new configuration comes to the PXI from the host, I sort out the channels by their domain and generate a user event for whichever domains have channels that have changed: more channels, fewer channels, differences. If a domain has no channels that have changed, I do not generate an event for it.
There is a handler for each domain: in that handler, I pick out the event for this domain from the array and register to receive it. I have an event structure that responds to the NEW CHANNELS THIS DOMAIN event. Since the events come from an array, no matter which domain I select, the event's name is "New Channels This Domain".
All this works fine.
Now I have a situation where it makes sense to have one handler for TWO domains.
I can pick out TWO elements of the array just fine, and register them both. (See attached pic) But now, in the EDIT EVENTS FOR THIS CASE list, I have TWO events called "New Channels This Domain". I can select one for one case, and the other for the other case and it seems to work.
--- Is there any heartache ahead with this scheme?
--- If I change the events in the clusters before, is it going to confuse LabVIEW?
--- Is there something I can do to change the name of the event after I pick it out of the array?
--- Is there something I SHOULD do at that point?
Like I said, it seems to work, but I'm leery of it staying that way - I've had event structures disturbed before (LV2010) when changing cluster order.
Project cannot find Stop Task.vi
First off, I have LabVIEW 2013 SP1 and 2014 installed on Windows 7. I do not understand why National Instruments keeps multiple versions of its software on a computer each time you upgrade, but here is my problem.
I created a project in which I created a couple of executables. Everything was going well. I was then looking into a way to save the scales with the executable, so I created an installer. Everything looked good; the executable works. Now, on the original computer, I open the project hoping to edit the VI to make a new executable, and it cannot find anything.
When I attempt to open a VI in the project, it starts searching for files. It cannot find DAQmx Clear Task.vi, DAQmx Stop Task.vi, Convert 1DWfm to DDT.vi, DAQmx Read (Analog 1D Wfm NChan NSamp).vi, DAQmx Read.vi, DAQmx Start Task.vi, DAQmx Timing (Sample Clock).vi, DAQmx Timing.vi, DAQmx Create Channel (AI-Temperature-Thermocouple).vi, DAQmx Create Channel (TEDS-AI-Temperature-Thermocouple).vi, DAQmx Create Channel (AI-Voltage-Basic).vi, DAQmx Create Channel (TEDS-AI-Basic).vi, DAQmx Create Virtual Channel.vi, and DAQmx Create Task.vi.
After this, it opens the VI. It shows a warning that a driver or toolkit component is not found: missing resource file "daqmx.rc". Where is this located? How do I load it? Why are these items suddenly missing when this is the computer I created the project on?
I have reinstalled 2014 since then, but noticed an issue when installing from the driver disc. I did NOT uninstall the software first. Do you feel a full uninstall is required, i.e., starting from scratch?
Any help is greatly appreciated.
'Control terminals on connector pane not on top level block diagram.' comment on CLD report
Hello All
Could anybody please enlighten me: what does this comment on my CLD report mean?
'Control terminals on connector pane not on top level block diagram.'
Does it mean that some terminals are hidden within case structures and are not visible on the block diagram without going into those case structures, or does 'top level block diagram' mean that the controls on Main.vi must also be connected to its connector pane?
Thank you
K.Waris
Find peak regions in an array
Hi,
I have a question on how to detect peak regions. Let's say I have a parameter named floor value (for example, 0.0002) in the following graph.
As you can see, there are 2 regions above that floor value (the graph has 64 points).
How can I detect the peak regions (2 in this example)? If I simply search for values above 0.0002, there are many individual points.
How can I get the following data from this graph: peak 1 starts at 18 and ends at 26; peak 2 starts at 32 and ends at 37?
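To show the logic I am after in text form (LabVIEW is graphical, so this is just a sketch with made-up toy data, not my real 64-point array): scan the array, record a region start when the values rise above the floor value, and record the region end when they drop back below it.

```python
def peak_regions(data, floor):
    """Return (start, end) index pairs for each contiguous run above floor."""
    regions = []
    start = None
    for i, v in enumerate(data):
        if v > floor and start is None:
            start = i                        # rising crossing: region begins
        elif v <= floor and start is not None:
            regions.append((start, i - 1))   # falling crossing: region ends
            start = None
    if start is not None:                    # still above floor at the last point
        regions.append((start, len(data) - 1))
    return regions

# Toy data: two humps above a floor of 0.0002
data = [0.0001] * 10 + [0.0005] * 5 + [0.0001] * 5 + [0.0004] * 4 + [0.0001] * 6
print(peak_regions(data, 0.0002))  # → [(10, 14), (20, 23)]
```

In LabVIEW this would presumably map to a For Loop over the array with a shift register holding the "currently inside a region" state.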
Thanks
stationary graph
Hi,
I am very new to LabVIEW and am having trouble right from the beginning with a graph. I am collecting data from a temperature probe and want to continuously collect data. I have my graphs set up in a while loop, and they are collecting data fine, except that the entire line of the graph moves when the temperature changes instead of previous data points staying stationary. How can I make the graphed line stationary except for the data it is currently collecting?
Thanks!
corrupted data display from server to client TCP/IP
I have attached a zip file that has the project and all the VIs and subVIs. The two main VIs are New Client.vi (the client) and multiconnection-server-with waveformGen.vi (the server).
The problem I am having is that when I fire up the server and then fire up the client in waveform mode, I get cDAQ data first, before the data switches to waveform. My guess is that the server runs in cDAQ mode by default, so by the time the client can send the command to switch to waveform, it has already received cDAQ data. How can I stop it from doing that?
Also, once I get the waveform data displayed on my graph, I notice that the waveform does not move at all. The timestamp is changing and moving across, but the waveform remains stationary. Why, and how do I change that? Is that a sampling issue or a graph property issue?
You will also notice that I can change the sampling info when the server is set to Waveform, but I can't do that if I set it to cDAQ. I tried using globals to get the sampling info and feed it to the DAQmx Read, but somehow it causes an error.
My files are attached with this post. I am using a cDAQ-9172 with two NI 9239 cards. You might have to reconfigure the Daqmx config.vi. The config file for the waveform channels is attached separately.
Your help would be greatly appreciated!
Writing to remote file from cRIO
On a Real-Time cRIO, I would like to create a reference to a file on a remote Network Attached Storage device. I know it is possible to do this from a Windows XP PXI chassis by wiring \\<ip address>\<file directory> into the Open File utility, but this does not seem to work from a Real-Time cRIO chassis. Is it possible to create a reference to a remote file from a Real-Time cRIO?
micro epsilon CTL-CF1-C3 infrared Sensor
Has anyone interfaced with the Micro-Epsilon CTL-CF1-C3 infrared sensor? I have been successful talking to it via Micro-Epsilon's CompactConnect utility, but not with LabVIEW.
I have tried using MAX. The sensor is recognized under the Devices and Interfaces tree, but in the VISA Test Panel, under the View Attributes tab, Is Port Connected (VI_ATTR_ASRL_CONNECTED) states that it is an Invalid Property.
Particle in a Box
Hello everyone,
I'm very new to LabVIEW and programming languages in general. In our class, we have been asked to create a program that describes the motion of a particle in a 2D box. The particle is to be subjected to user-selected forces, accelerations, etc. The problem is, our instructor pretty much left everything for us to do and didn't really explain how to do anything.
For the simple case of a particle moving around in a box, I understand the logic behind it. Give the particle an initial position. To that position, add the displacement that occurs after time dt, per the linear equation r(j+1) = r(j) + v*dt. Then, if the particle reaches the boundary, reflect the corresponding component of the velocity. Can someone please explain how I would go about implementing any of this?
I've attached a copy of all that I have. Am I going in the right direction, or am I completely wrong?
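To spell out the update loop I described (the Euler step plus wall reflection), here it is as a Python sketch; the box size, time step, and initial conditions are arbitrary placeholders I picked, and in LabVIEW this would map to a While Loop with shift registers holding the position and velocity.

```python
def step(pos, vel, dt, box=(1.0, 1.0)):
    """One Euler step: r(j+1) = r(j) + v*dt, reflecting velocity at the walls."""
    new_pos = [p + v * dt for p, v in zip(pos, vel)]
    new_vel = list(vel)
    for k in range(2):
        if new_pos[k] < 0.0:              # crossed the lower/left wall
            new_pos[k] = -new_pos[k]      # mirror position back inside
            new_vel[k] = -new_vel[k]      # reflect that velocity component
        elif new_pos[k] > box[k]:         # crossed the upper/right wall
            new_pos[k] = 2 * box[k] - new_pos[k]
            new_vel[k] = -new_vel[k]
    return new_pos, new_vel

# Run many steps and confirm the particle always stays inside the unit box
pos, vel = [0.5, 0.5], [0.31, -0.47]
for _ in range(10000):
    pos, vel = step(pos, vel, dt=0.01)
assert all(0.0 <= p <= 1.0 for p in pos)
```

Adding user-selected forces would just mean updating vel with a*dt before the position update, in the same loop.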
Thank you,
Javier
Using Reference trigger in a for loop
I am using NI PXI and RFSA library. My question is:
Are there any examples of how to use reference trigger in a for loop?
I want my application to make several measurements during a certain period of time (1 second). These measurements are triggered when the input signal reaches a desired value. After a short measuring session, I want my app to stop fetching samples from RFSA and wait for another trigger to begin the session again. All of this is done in a for loop.
Should I stop the session using Abort.vi every time I want to wait for another trigger? I managed to make my app work only with this chain of VIs:
RFSA config -> niRFSA Configure Trigger -> niRFSA Initiate-> niRFSA Fetch IQ -> niRFSA Abort -|-> niRFSA Initiate ... and again fetch and abort
I think that's not the correct way to do it; however, that's all I managed to create using the examples in LabVIEW. I appreciate any kind of help.
Abort task that has not yet completed writing
I am generating some analog signals through DAQmx; these waveforms can be >10 seconds long by necessity (the sample rate is very low). I would like to be able to stop the output of the waveform when I press a button. What happens now is that when I use the Clear Task VI, LabVIEW waits some time and throws an error (Error -200292: Some or all of the samples to write could not be written to the buffer yet. More space will free up as samples currently in the buffer are generated.). The task is subsequently cleared, which is what I want, but I don't want the wait time. Is there some way to interrupt the writing of the task? I know it is theoretically possible, because I can stop the VI and the output stops, but that is not really a feasible solution.
Thanks!
Critically Damped 2nd order lag
Y(s) = W^2 / (s + W)^2 * X(s) (pardon my Greek)
I thought I had a fairly good demo of this filter: no overshoot, no statistical deviation from a Simulink model implementation. (OK, I have a "Simpson's Rule" error between the Simulink model and the discrete implementation. No, I cannot show the code.)
The question comes down to: "How do I educate the engineer?" What I heard in the discussion was, "What good is LabVIEW if I cannot drop in a discrete transform block?"
OK, I can configure a 2nd order filter easily enough, but how do I prove that the implementation of an n-order filter in LabVIEW meets his criteria for a 2nd order function out of MATLAB? (Yup, that lvann*.dll sure does not document methods specific to filter order.)
Setting up a discrete "Classic" filter results in a non-critically damped system. That really is bad in this case!
So, I'm not as "trustworthy" as MathWorks or National Instruments in developing this incredibly limited filter, but I cannot find a generic filter to replace the Simulink block.
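For the sake of a concrete comparison target: the transfer function above can be discretized by hand with the bilinear transform and run as a plain biquad difference equation. This is only my own sketch (W and the sample period T below are placeholder values), not NI's or MathWorks' implementation, and I have skipped frequency pre-warping for simplicity.

```python
def critically_damped_coeffs(w, T):
    """Bilinear-transform H(s) = w^2/(s+w)^2 into biquad coefficients."""
    a = w * T / 2.0
    k = a / (1.0 + a)
    b = [k * k, 2 * k * k, k * k]       # numerator: k^2 * (1 + z^-1)^2
    p = (1.0 - a) / (1.0 + a)           # double real pole at z = p
    den = [1.0, -2.0 * p, p * p]
    return b, den

def filt(b, den, x):
    """Direct-form I difference equation, zero initial conditions."""
    y = []
    for n in range(len(x)):
        xn = lambda i: x[n - i] if n - i >= 0 else 0.0
        yn = lambda i: y[n - i] if n - i >= 0 else 0.0
        y.append(b[0]*xn(0) + b[1]*xn(1) + b[2]*xn(2)
                 - den[1]*yn(1) - den[2]*yn(2))
    return y

# Step response: should rise monotonically to 1 with no overshoot
b, den = critically_damped_coeffs(w=10.0, T=0.001)
y = filt(b, den, [1.0] * 2000)
assert max(y) <= 1.0 + 1e-9 and abs(y[-1] - 1.0) < 1e-3
```

Because the double pole stays real after the bilinear transform, the step response has no overshoot, which is exactly the critically damped behavior the engineer expects from the MATLAB/Simulink block.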
-All Ears!
Increase UDP sending size over 64k bytes and get error -113,sending buffer not enough
Dear all,
I have a case where I must send data of over 64k bytes in a socket with UDP. I got error -113, which says: "A message sent on a datagram socket was larger than the internal message buffer or some other network limit, or the buffer used to receive a datagram was smaller than the datagram itself." I searched for this issue, and the closest answer I found is below:
http://digital.ni.com/public.nsf/allkb/D5AC7E8AE545322D8625730100604F2D?OpenDocument
It says to change the buffer size with wsock.dll. I used the same method to increase the send buffer to 131072 bytes, by setting the option name to SO_SNDBUF (0x1001) and giving it the value 131072, and that worked fine without error. However, I still get error 113 when sending data with UDP Write.vi. Does UDP Write.vi reset the buffer size? Is there anything else that could cause the error?
I attached example code. In UDP Sender.vi you can see that I change the send buffer size to 131072 and send data that includes a 65536-byte payload. There is also a UDP Receiver.vi, and there are some missing VIs which you can get from the link above, but they are not necessary.
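For what it's worth, UDP itself caps a single datagram at 65,535 bytes including headers (roughly 65,507 bytes of payload), so no socket buffer setting will let one write send a 64k+ datagram; the usual workaround is to split the payload into numbered chunks and reassemble on the receive side. A rough Python illustration of that splitting (the chunk size and the 4-byte index header are my own choices, not anything from the NI example):

```python
import struct

MAX_CHUNK = 60000  # comfortably under the ~65507-byte UDP payload limit

def split_datagrams(payload):
    """Split payload into chunks, each prefixed with (index, total) for reassembly."""
    chunks = [payload[i:i + MAX_CHUNK] for i in range(0, len(payload), MAX_CHUNK)]
    return [struct.pack(">HH", i, len(chunks)) + c for i, c in enumerate(chunks)]

def join_datagrams(datagrams):
    """Reassemble chunks by their index header (assumes none were lost)."""
    parts = {}
    total = 0
    for d in datagrams:
        i, total = struct.unpack(">HH", d[:4])
        parts[i] = d[4:]
    return b"".join(parts[i] for i in range(total))

data = bytes(range(256)) * 512           # 131072 bytes, well over one datagram
frames = split_datagrams(data)
assert len(frames) == 3                  # 131072 bytes -> 3 chunks of <= 60000
assert join_datagrams(frames) == data
```

In LabVIEW the same idea would mean one UDP Write per chunk and reassembly in the receiver; since UDP guarantees neither ordering nor delivery, a real implementation also needs the index header to detect lost or reordered chunks.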
save waveform from TDS3000
Hello & Good day
I think this question has been asked before and was solved by Dennis_Knutson:
(http://forums.ni.com/t5/LabVIEW/save-waveform-of-tds3000-with-time-axis/m-p/2318464#M728026)
I did the same thing to save my waveform. But when I compared it with the data exported from the graph (right-click on the waveform and export to Excel), the data is not the same.
Here I have attached 2 Excel files:
1 = from the Save push button (when pressed, it saves the data)
2 = from right-clicking on the waveform and exporting to Excel
6430 Current Sweep below MicroAmp Range Instrument Hangs Up
Hey,
I am using the Keithley 6430 to source I and measure V (with the pre-amp connected). All works well as long as I stay above the microampere range (meaning 1E-6 amps to 105 mA), but when I try to source below 1E-6 amps, the instrument just hangs on the start current value and never moves further, and after a certain time has passed, absurd readings are displayed as a result. The VI and a screenshot are attached.
Does anyone know why this happens? (Sweeping voltage and measuring current works fine; no problems.)
If I configure the sweep using the instrument front panel (manually), it works without any problem.
Any feedback would be appreciated ..
Thanks
Example VI for reading Touchstone data files
Can someone please suggest an example VI for reading a Touchstone data file (A__S_params.s2p) as a starting point? I want to get the magnitude and phase data of S21 to design a filter. I have tried using the Touchstone DataPlugin with the (DataPlugins - ATML) project from the LabVIEW Example Finder to first convert the Touchstone file to TDMS format, but have not been successful. Thanks.
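In case a plain text-parsing fallback helps while debugging the DataPlugin: a 2-port Touchstone file is plain text, where comment lines start with "!", an option line starts with "#" (giving the frequency unit and data format, e.g. MA for magnitude/angle or RI for real/imaginary), and each data row is the frequency followed by S11, S21, S12, S22 as number pairs. Here is a rough Python sketch for the MA-format case only (my own simplified reader with made-up sample data, not a full Touchstone parser; real files can wrap rows across lines):

```python
def read_s2p_ma(lines):
    """Extract (freq, |S21|, angle_S21_deg) from MA-format .s2p lines."""
    out = []
    for line in lines:
        line = line.split("!")[0].strip()    # drop comments
        if not line or line.startswith("#"):
            continue                         # skip blank and option lines
        vals = [float(v) for v in line.split()]
        # row order: freq, S11 mag/ang, S21 mag/ang, S12 mag/ang, S22 mag/ang
        out.append((vals[0], vals[3], vals[4]))
    return out

sample = """! toy two-port file
# GHZ S MA R 50
1.0  0.9 -10  0.1  80  0.1  80  0.8 -12
2.0  0.8 -20  0.2  60  0.2  60  0.7 -25
""".splitlines()
print(read_s2p_ma(sample))  # → [(1.0, 0.1, 80.0), (2.0, 0.2, 60.0)]
```

The same parsing steps (strip comments, skip the option line, index into each row) should translate directly to string functions on a block diagram.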
executable opc connection
Hi all
I'm trying to build an executable which reads values from an RT target (cRIO) and then sends this data to an OPC server.
I managed to read the values from the RT target, but unfortunately the connection to the OPC server seems not to work. I created a library with the OPC server and the variables in the project. If I run the VI on the PC with LabVIEW installed, everything works fine, and the executable also works on this PC. But when I move to the other PC where I want the executable to run, the program runs with no error, yet the data never gets to the OPC server.
Does anyone have an idea why this happens?
Thanks
Manuel
fpga labview 8.2
I am having issues programming my FPGA using LabVIEW FPGA. I already have a bitfile that I am using with the FPGA reference, but the FPGA is not getting configured. I am using TestStand and calling LabVIEW code to set up a reference to the FPGA. That reference is then saved to a station global.
Also, the FPGA PXI card that I am using does not appear in MAX as a RIO device. I tried reinstalling NI-RIO 2.1, but that did not help. The FPGA PXI card is a PXI-7833R. Any suggestions?