Hi everybody!
I know the question looks really simple, but I don't know how to clear my tree. Every time I run my VI, the elements are added again; I would like the tree to be empty each time I run my VI.
Thanks!
Lucas
Hello,
In our lab we have a "normal" test stand using a cRIO (extended by an EtherCAT). Currently we are developing on our "production system" (actually just via the PC which controls the test stand), but we are going to start long-time tests in the near future. It will then no longer be possible to develop on this system.
Since we are using shared variables to access the data of the test stand, we have trouble developing on another PC because we are lacking the shared variables. Consequently the code won't execute and we are not able to test it, which is obviously a big problem since the "waiting" time of the test stand is limited. We already considered using git to set up two branches (one with and one without the shared variables), but that naturally involves a lot of cherry-picking and rebasing, plus the risk of overwriting the subVIs that use the shared variables. In short, it is not a nice way; some might call it the "nuclear option" ...
So has anybody experienced similar problems and how did you overcome them?
Thanks in advance!
Hi,
when I wire a UNC path (a path beginning with //?/UNC/) into the start path, the dialog never shows and the node throws error 43 (aborted by user).
For quite a while I was really confused, because the VIs that check the start path beforehand (e.g. Check if File or Folder Exists.vi) handle the UNC path just fine.
It seems I'd have to check specifically for a path beginning with //?/UNC/ to keep File Dialog from tripping.
Of course it would be best if it accepted the path.
Am I overlooking something, or is there a workaround for this?
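For now, the workaround I have in mind is just stripping the extended-length prefix before wiring the path into File Dialog. Sketched here in Python rather than G (the function name is my own, and I'm using the Windows backslash form of the prefix, \\?\UNC\):

```python
def strip_extended_prefix(path):
    r"""Turn '\\?\UNC\server\share' into '\\server\share',
    and '\\?\C:\data' into 'C:\data'; other paths pass through."""
    unc = "\\\\?\\UNC\\"          # literal prefix \\?\UNC\
    if path.startswith(unc):
        return "\\\\" + path[len(unc):]
    ext = "\\\\?\\"               # literal prefix \\?\ (local drives)
    if path.startswith(ext):
        return path[len(ext):]
    return path
```

The same string manipulation is easy to do in G with Match Pattern or String Subset before the File Dialog call; of course it would be nicer if the node just accepted the prefixed path.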
Hello,
I am trying to build a simple program to acquire analog inputs on multiple channels. I need to set pre-trigger samples in the program, and it should be triggered by one of the AI signals. The problems are as follows:
1. When I use a reference trigger, I get an error saying my device supports a reference trigger for only a single channel and all the other channels should be removed.
2. When I use a Start Trigger VI followed by a "Reference Pre-Trigger" property node, it does not return values for the pre-trigger period. I have attached the VI I used to create the pre-trigger samples.
Hardware
NI PCIe-6361
BNC-2110
Can anyone tell me what the issue is in the hardware/program, and how I can acquire pre-trigger samples for multiple analog channels?
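For reference, here is the configuration I am attempting, expressed with the nidaqmx Python API instead of the attached VI (the device name, rate, trigger level, and sample counts below are placeholders, not my actual values):

```python
PRETRIGGER = 1000                  # samples kept from before the trigger
POSTTRIGGER = 4000                 # samples acquired after the trigger
TOTAL_SAMPLES = PRETRIGGER + POSTTRIGGER

try:
    import nidaqmx
    from nidaqmx.constants import AcquisitionType, Slope
    HAVE_DAQMX = True
except ImportError:                # allows the sketch to load without the driver
    HAVE_DAQMX = False

def acquire():
    """Finite multi-channel acquisition with one analog reference trigger."""
    if not HAVE_DAQMX:
        return None
    with nidaqmx.Task() as task:
        # All four channels in one task; the trigger source below is the
        # first channel in the scan list.
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0:3")
        task.timing.cfg_samp_clk_timing(
            100_000.0, sample_mode=AcquisitionType.FINITE,
            samps_per_chan=TOTAL_SAMPLES)
        # One reference trigger for the whole task, not one per channel.
        task.triggers.reference_trigger.cfg_anlg_edge_ref_trig(
            "Dev1/ai0", pretrigger_samples=PRETRIGGER,
            trigger_slope=Slope.RISING, trigger_level=1.0)
        return task.read(number_of_samples_per_channel=TOTAL_SAMPLES)
```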
I'm trying to create a Modbus slave program to communicate. I followed this tutorial, http://www.ni.com/tutorial/13911/en/, except that I set it up as a slave over serial instead of TCP.
The tutorial worked great, but I could only bind single Modbus variables to a global variable. I read in the help topics "Create Bound Variables" and "Using Modbus I/O Servers (DSC Module or Real-Time Module)" that variables starting with "A" denote an array, so I created a simple bound variable to the coil range to start. Then I deployed it and created a simple VI to set the coils, but while testing, the coils would not update. I verified this by reading the bound variable, and I also used the Distributed System Manager but could not read any values.
I'm using LabVIEW 2018 SP1 and RT 2018 SP1; basically everything else is version 18.5. The hardware is an 8840 controller in a PXI-1044 chassis running Windows 10 32-bit.
I've attached a couple of pictures to hopefully make sense of what I'm trying to do, but the unit is stand-alone, so I had to take them with my phone.
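For comparison, the coil block I am trying to mirror can be sketched with the pymodbus Python library (this assumes pymodbus 3.x; the sizes are made up, and this is only an analogue of the LabVIEW I/O server, not the same code path):

```python
try:
    from pymodbus.datastore import (ModbusSequentialDataBlock,
                                    ModbusSlaveContext, ModbusServerContext)
    HAVE_PYMODBUS = True
except ImportError:                # lets the sketch load without pymodbus
    HAVE_PYMODBUS = False

def make_coil_context(n_coils=16):
    """Server context holding one block of n_coils coils, all initially off."""
    if not HAVE_PYMODBUS:
        return None
    coils = ModbusSequentialDataBlock(0, [False] * n_coils)
    slave = ModbusSlaveContext(co=coils)   # co = coil table (function code 1/5/15)
    return ModbusServerContext(slaves=slave, single=True)
```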
Hello everybody,
I have a PCI-9820 card, which is connected to the laptop; it has 2 analog channels.
I'm a beginner and I want to acquire the signal with LabVIEW, but I can't find any examples of that.
I will appreciate any help.
Thank you
TL;DR: What is LabVIEW's procedure/time-frame for releasing memory back to the OS? Is there any way to ensure that LabVIEW gives up unused memory resources (i.e., to the OS) from queues, variant attributes, DVRs, etc. after they are destroyed/deleted/empty? If not, is there at least some way to have the memory released once the VI is finished running (idle)?
<rant and background information>
When it comes to memory management in LabVIEW, there seem to be two main aspects: acquiring memory from the OS when it is needed, and releasing that memory back to the OS when it is no longer needed.
It can be hard to determine when and how these two processes take place. While neither is well documented or frequently discussed online, releasing memory to the OS is particularly enigmatic. While automatic memory management and garbage collection are wonderful things (and I'm glad LabVIEW has them), they can be a hindrance in certain situations, especially when their behavior is not predictable and clear to the user.
LabVIEW is relatively liberal in retaining memory, which is often cited as a speed benefit since it can prevent repeated deallocation and reallocation of memory resources. However, in cases where a queue or other data structure must temporarily grow to a significant fraction of the system memory, it is important that at least some of the memory be returned to the OS after the operation completes, or else the entire system can come to a grinding halt. In many cases, LabVIEW will hold on to this huge chunk of unused memory even when the system memory is completely full, and will not release the memory after the program is finished running.
</rant>
I am working with a SEA 9745 module on a cRIO, and I am having issues getting a consistent connection/registration to the AT&T cell network I have a SIM card for. I have managed to get it connected a few times, and I have sent and received two test messages with the module, so I know the module and antennas work and that the PUK number I am providing to the example VIs is being accepted.
My issue is that I can't seem to get the module consistently registered on the cell network. It always tells me that I am having an antenna issue, even though the few times the drivers have worked it shows the signal strength on the antenna to be good. I was able to connect to the module via the web-page configuration tool. When I look at the RADIO tab I can see that it keeps trying to connect: it connects for a few seconds, but then loses the connection.
If you have worked with this device or this type of device, I would appreciate some feedback on what you think is wrong, especially if you have run into this issue too. I have sent an email to customer support, but the company is overseas and feedback is delayed.
Thank you for your help!!
Joe
Hi, I'm totally new to using the datalogger; I would be very thankful if someone could help.
I'm looking to connect an amplifier transducer:
https://www.fylde.com/wp-content/uploads/2018/03/FE379TA.pdf
to the NI USB-6212. There is only a single output from the transducer (with a BNC connection); I don't understand which input on the logger I should connect the wire to.
Thanks so much.
I am trying to achieve PMU functionality using a cRIO-9068 with NI 9242, NI 9238, and NI 9467 modules. I want to set up a LabVIEW application to observe the signals and timestamp them. I downloaded the power_monitoring_starter_kit and followed the instructions to delete the example modules and add my modules to the starter kit. Once I do everything and run the main VI, I get an error: "Error 61024 occurred at niLvFpga_Open_cRIO-9111.vi:5110001". I have attached a couple of images for reference. My guess is that I am using a cRIO-9068 while the program targets a cRIO-9111, which is creating the issue. I need help resolving this. Also, I have only a very basic knowledge of LabVIEW.
Earlier, I went in to edit an XY graph that was part of the DMC LabVIEW UI Suite. I had a misclick and was surprised when LabVIEW hard-crashed. I tried to replicate it with other standard XY graphs and was unable to. I don't know what is unique about this graph produced by DMC. Regardless, I thought I'd post.
Steps to replicate:
1. Create a new VI and add the DMC XY Graph
2. Right click on the graph and do Advanced -> Customize
3. Mouse over the Plot Legend array until you can select a single element inside of the array.
4. Drag that element outside of the array.
5. Release element. LabVIEW will crash.
Additional info:
LabVIEW 2018 32 bit f2 patch
Windows 10 Professional, Version 1803
DMC XY control is attached.
I am trying to understand how the following can be done: I am trying to recreate a similar step in my application to reset all fields (controls and indicators) to their default initial values. Why is it important to create the UI references as shown in the figure below, and how do I do it if I have multiple controls?
How can I implement an array of microphones in LabVIEW with a myRIO?
I have a VI that essentially parses a text file into commands for Front Panel objects. The reason for this is to iterate through various combinations.
Anyway, I'm getting these two errors depending on slight changes to the attached VI:
Error 1 occurred at Dequeue Element (invalid character).
Error 1122 occurred at Dequeue Element (invalid refnum).
What is causing this error?
Also-- how do I synchronize my state machines so that the three loops essentially correspond to one line of text in sample_error.txt?
Anything else I can do to my VI to make it more seamless?
There are a few VIs in the attached folder. The one I'm talking about is queue_data_error.
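To illustrate the synchronization I am after, here is the pattern sketched in Python instead of G (the data is made up; each queue.Queue stands in for a LabVIEW queue feeding one loop, so all three loops stay in lockstep on the same line of text):

```python
import queue
import threading

LINES = ["cmd1", "cmd2", "cmd3"]   # stand-in for the lines of sample_error.txt

def producer(queues):
    """Read each line once and fan it out to every consumer's queue."""
    for line in LINES:
        for q in queues:           # each loop gets its own copy of the line
            q.put(line)
    for q in queues:
        q.put(None)                # sentinel: tells each consumer to stop

def consumer(q, results):
    """One 'state machine' loop: blocks on its queue like Dequeue Element."""
    while True:
        item = q.get()
        if item is None:
            break
        results.append(item)

queues = [queue.Queue() for _ in range(3)]
results = [[] for _ in range(3)]
threads = [threading.Thread(target=consumer, args=(q, r))
           for q, r in zip(queues, results)]
for t in threads:
    t.start()
producer(queues)
for t in threads:
    t.join()
```

The sentinel also avoids the invalid-refnum situation: no loop releases a queue while another loop is still dequeuing from it.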
I need to install NI Circuit Design Suite Application Software support for LabVIEW 2018 and NI LabVIEW Control Design and Simulation Module support for LabVIEW 2018. Where can I find this software?
Thank you
I can't seem to get the Configure VI to work; I need ideas and help.
From the manual:
the select-parameter command is FUNCtion:impa C
the query-parameter command is FUNCtion:impa?
I'm wondering if I might also need the command to set the equivalent mode (series or parallel)?
VI is attached.
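To show the exchange I am attempting, here it is sketched with PyVISA in Python rather than the Configure VI (the resource name is a placeholder; the command strings are quoted from the manual):

```python
IMPA_QUERY = "FUNCtion:impa?"      # query-parameter command from the manual

def impa_set_cmd(value):
    """Build the select-parameter command, e.g. 'FUNCtion:impa C'."""
    return "FUNCtion:impa " + value

try:
    import pyvisa
    HAVE_VISA = True
except ImportError:                # lets the sketch load without PyVISA
    HAVE_VISA = False

def set_and_readback(resource="GPIB0::17::INSTR", value="C"):
    """Send the select command, then query it back to verify the setting."""
    if not HAVE_VISA:
        return None
    rm = pyvisa.ResourceManager()
    inst = rm.open_resource(resource)
    inst.write(impa_set_cmd(value))
    return inst.query(IMPA_QUERY).strip()
```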
Hello friends, I am an engineering student currently starting to work with LabVIEW NXG 3.0, but I do not know where to download the NI-DAQmx drivers for this version of LabVIEW, and I do not know how to install them either.
I need help, please.
Greetings and thanks.
Hi all,
So I'm aware that similar topics have been discussed before, but my question is slightly different.
What my application is supposed to do: receive keyboard inputs (from a bar-code reader) and take appropriate action. The PC that runs the application will not be accessible to the end user; only the bar-code reader will be. The user doesn't see the UI/PC. Therefore my application needs to stay on top of any other window that may pop up and keep the focus on itself so the bar-code input can be read at all times. So what I do in the main code is, every three seconds, call the case that takes care of bringing the UI to the front.
But I'm having trouble keeping the UI window on top of all other windows and keeping its focus.
I took the Windows user32.dll approach as you'd guess since this needs to be controlled via Windows itself.
I have attached a test VI taken out of my main code. Below is a snippet.
I tried a few different DLL calls inside user32 but was not able to achieve what I wanted.
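For concreteness, the kind of call I mean is SetWindowPos with HWND_TOPMOST, sketched here with Python's ctypes instead of a Call Library Function node (the constants are the documented Win32 values; the hwnd argument is illustrative):

```python
import ctypes
import sys

# Documented Win32 constants for SetWindowPos
HWND_TOPMOST = -1       # place the window above all non-topmost windows
SWP_NOMOVE = 0x0002     # keep the current position
SWP_NOSIZE = 0x0001     # keep the current size

def keep_on_top(hwnd):
    """Mark a window always-on-top without moving or resizing it.
    Returns False on non-Windows platforms or if the call fails."""
    if sys.platform != "win32":
        return False
    user32 = ctypes.windll.user32
    return bool(user32.SetWindowPos(hwnd, HWND_TOPMOST, 0, 0, 0, 0,
                                    SWP_NOMOVE | SWP_NOSIZE))
```

In the Call Library Function node the arguments map the same way: the window handle from FP.NativeWindow, HWND_TOPMOST as the insert-after handle, zeros for position/size, and SWP_NOMOVE | SWP_NOSIZE as the flags.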
Would appreciate any feedback.
Thanks.