Hi guys,
I'm learning LabVIEW NXG at the moment.
How can I protect a LabVIEW NXG file (*.gvi) with a password?
Thanks,
Muaadh
Hello,
What's wrong?
So I have this problem where I need to be able to autogenerate the selection of lines to output waveforms on.
Normally, to select multiple lines, one would write, for example, 'cDAQ1Mod3/port0/line0:7' or 'cDAQ1Mod1/ao0,cDAQ1Mod1/ao1,cDAQ1Mod1/ao2,cDAQ1Mod1/ao3' into 'DAQmx Create Virtual Channel', but how can I do that dynamically?
If it were a string, my approach would have been to have an array where all the output lines are listed, concatenate those lines that are needed, and input the result to the block.
Is this possible with a 'channel constant'?
Alternatively, is it possible to 'disable' lines that do not form part of the 'output selection'?
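For what it's worth, a DAQmx physical channel input will generally accept a plain string, so the string-concatenation approach described above should work. Here is a minimal sketch of the idea in Python (LabVIEW itself is graphical, so this only illustrates the logic; the device and line names are made-up examples):

```python
# Sketch: building a DAQmx-style physical-channel string dynamically
# from a list of available lines and a parallel list of booleans
# saying which lines are selected. Device/line names are hypothetical.

def build_channel_string(device, lines, selected):
    """Join only the selected lines into the comma-separated
    format DAQmx expects, e.g. 'cDAQ1Mod1/ao0,cDAQ1Mod1/ao2'."""
    return ",".join(f"{device}/{line}"
                    for line, use in zip(lines, selected) if use)

channels = build_channel_string(
    "cDAQ1Mod1", ["ao0", "ao1", "ao2", "ao3"],
    [True, False, True, True])
print(channels)  # cDAQ1Mod1/ao0,cDAQ1Mod1/ao2,cDAQ1Mod1/ao3
```

In LabVIEW terms this would be a Concatenate Strings (or a loop with a conditional append) feeding the physical channels input, replacing the channel constant.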
I have a simple XL Report that creates and saves a file in a specific location on Disk.
Works perfectly in the development environment.
It does not work when built as an executable.
LabVIEW 2011, Windows 10.
Any suggestions?
KM
I'm having problems in upgrading an application from LV2015 sp1 to LV2019 sp1. I detected memory leak after running the executable recompiled by new version of LabVIEW. After some debugging I found that this problem is related to the asynchronous call VI. Calling the VI several times without closing the reference to it seems to behave differently in each version of LabVIEW. The figure below shows the minimal code needed to reproduce this issue. The test.vi just makes a simple addition operation and it is called with call and forget flag enabled.
I tested this code on both versions of LabVIEW. The 2015 version is installed on Windows Server 2008 R2 and the 2019 version on Windows Server 2012. The handle count curve looks very similar on both versions of LabVIEW. However, the memory allocation is totally different. On the 2015 version, memory seems to be allocated in chunks, steadying after some time, while on 2019 the allocation happens more smoothly but keeps increasing indefinitely.
Performance LabVIEW 2019
Performance LabVIEW 2015
As a workaround I could open and close the VI reference every time it is called, but this could increase latency on the application execution.
Is this behavior expected from LabVIEW 2019?
Does anybody have experience with creating a normal DLL in LabVIEW (not an interop assembly) and loading this DLL in .NET Framework or .NET Core?
I am trying to share LabVIEW code with .NET developers via a DLL that I have created in LabVIEW, but have run into problems when simply attempting to add the DLL to the list of references in .NET. Any comments and help on this topic would be appreciated. Thanks.
Besides "don't use that motion controller". I'm kind of stuck with it sadly. A person before me bought them and they're sufficiently fancy that buying new ones is pretty cost prohibitive. The device has an RS232 interface, expects ASCII commands, uses no flow control, and has multiple nodes that are controlled by the one RS232 port. The expected command structure is as follows:
Writes:
[Node Number] Command [Argument] CR
Reads:
Response CR LF
There are a couple of ways I can see how to handle this, but I'm wondering what the best practice is here. Especially because the final program will probably do tens of millions of commands through the next couple of years, so even a small speed boost should save me quite a bit of IRL time.
Option 1:
Because the response has all of the relevant info before the carriage return, I can simply do a standard write, then a read of way more bytes than I expect with CR as the termination character, then a buffer clear to throw away the LF. This is probably the simplest option code-wise, but it feels very hacky and I'm unsure of how fast it'll run in practice.
Option 2:
Don't use termination characters on reads. Instead, do a write, then a while loop: wait, check Bytes at Port, read the bytes available, check for the LF, end the loop if the LF is there, otherwise store the data in a shift register and repeat; concatenate the strings after the loop terminates and output the result. I don't like this because it uses the Bytes at Port node, which I've only heard bad things about, and while I have a decent idea of what the pseudocode for this would look like, I don't have a full mental picture of what the program would actually be in LabVIEW. Plus, I have trouble believing this would actually be faster than simply clearing the buffer, given how many more operations it involves and that I'll realistically wait more time between iterations than I actually need to.
Or is there something I'm just missing and there's an even better way to do this?
And as a semi related question, what's the consensus on setting the end mode for writes/reads to termination characters during port configuration? It seems like activating it would save me a little bit of coding time, but if there are compelling reasons to not use it I'd like to hear them.
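To make the framing concrete, here is a small sketch of the protocol described above in Python (pure string handling, no serial port; the node numbers and commands are invented examples). The second function shows the Option 2 idea of accumulating a buffer and splitting it on CR LF:

```python
# Sketch of the command framing above: writes are
# "[node] command [argument]\r", responses end in "\r\n".
# Node numbers and command names are hypothetical.

CR, CRLF = "\r", "\r\n"

def build_command(node, command, argument=None):
    """Assemble one outgoing command string, CR-terminated."""
    parts = [str(node), command] + ([str(argument)] if argument is not None else [])
    return " ".join(parts) + CR

def split_responses(buffer):
    """Split a receive buffer into complete CR-LF-terminated
    responses plus the unterminated remainder (kept for the
    next read) -- the accumulation step of Option 2."""
    *complete, remainder = buffer.split(CRLF)
    return complete, remainder

print(repr(build_command(3, "MOVE", 1000)))   # '3 MOVE 1000\r'
responses, rest = split_responses("OK\r\nERR 2\r\nPART")
print(responses, repr(rest))                  # ['OK', 'ERR 2'] 'PART'
```

Since the LF always immediately follows the CR here, reading with a CR termination character and discarding one extra byte (Option 1) recovers exactly the same responses as this buffer-splitting approach.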
Please, somebody help me out. I am currently a student and a beginner to this entire LabVIEW thing. I was given an assignment on the LabVIEW CLD security system topic, but with a twist of using the client-server concept by implementing the TCP/IP protocol.
I have no idea even how to begin. Can someone please help me? I am very desperate. Please help me out.
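One way to start is to get the bare client-server round trip working before adding any security-system logic. LabVIEW's TCP Listen/TCP Open Connection/TCP Read/TCP Write primitives follow the same pattern as plain sockets, so a minimal echo exchange looks like this sketch in Python (port and message are arbitrary; this is a pattern illustration, not the assignment solution):

```python
import socket
import threading

# Minimal echo server + client showing the TCP round trip:
# listen -> accept -> read -> write on the server side,
# connect -> write -> read on the client side.
# Port 0 lets the OS pick a free port.

def serve_once(server_sock):
    conn, _ = server_sock.accept()          # wait for one client
    with conn:
        data = conn.recv(1024)              # read the request
        conn.sendall(b"echo: " + data)      # reply to the client

server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=serve_once, args=(server,), daemon=True).start()

with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello")
    reply = client.recv(1024)
print(reply)  # b'echo: hello'
```

Once the equivalent two VIs (a listener loop and a client) exchange a string in LabVIEW, the security-system commands can be layered on top as the message content.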
Hey!
I am currently trying to program a dice using a random number generator. I have programmed it to light up
1 LED when showing 0
2 LEDs when showing 1
3 LEDs when showing 2
4 LEDs when showing 3
5 LEDs when showing 4
6 LEDs when showing 5
The number-generating part of the program works well. The problem happens when it is supposed to light up the LEDs: instead of lighting up multiple LEDs, it only lights up one.
For example: The number generator generates the number 4, in this case it is supposed to light up 5 LEDs. But instead it only lights up one LED. Sometimes it doesn't even light up at all and sometimes it lights up different LEDs that show the wrong number.
This program used to work perfectly before, but I made some small changes and after that it stopped working. I have also reverted the changes, but I still can't get it working as I don't know what broke it.
All help is appreciated!
I have attached a screenshot of the program.
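The intended mapping above (value n lights the first n+1 LEDs) can be written out as plain logic, which may help pin down what broke. A common cause of the described symptom is wiring the random value to a single LED's index instead of treating it as a count of LEDs to light. A minimal sketch in Python (names are mine, not from the attached VI):

```python
import random

# The mapping from the post: a random value n in 0..5 should
# light the first n+1 LEDs. Returning a boolean per LED mirrors
# wiring one boolean to each LED indicator.

def led_states(value, num_leds=6):
    """True for every LED whose index is <= the rolled value."""
    return [i <= value for i in range(num_leds)]

value = random.randint(0, 5)
print(value, led_states(value))
# e.g. 4 -> [True, True, True, True, True, False]
```

In LabVIEW terms this is a per-LED comparison (index <= value) rather than a case structure driving a single LED.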
Good evening everyone!
Before I dive into my questions, I'll provide some background regarding the code. I have been tasked with developing a random number generator that generates 20 values to be written to a CSV file. In addition, I must provide an input that lets users choose whether to generate new random values or recall values from the CSV file. This is where the Write/Read Delimited Spreadsheet nodes come into play.

I have a few questions regarding these nodes. With the write node, my data always starts in the second row of my CSV file. Why is the data entry starting on the second row? This is a problem because when I read the file, I get a row of zeros in my 2D array output (it should be the random numbers). Also, my waveform graph comes out looking crazy when reading from the CSV file; is there any way I can clean this up? I have attached my logic and CSV file in a zip folder. Thanks for the help everyone.
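The symptom described (a blank first row on write, which reads back as a row of zeros) usually means something empty is written before the data. The intended round trip is simple; a sketch in Python using an in-memory buffer in place of the file on disk (the value range is arbitrary):

```python
import csv
import io
import random

# Round trip: write 20 random values as one CSV row, read them
# back, and confirm they match. If an empty row is written first,
# reading row 0 yields blanks that coerce to zeros -- the symptom
# in the post.

values = [round(random.uniform(0, 10), 3) for _ in range(20)]

buf = io.StringIO()              # stands in for the .csv file on disk
csv.writer(buf).writerow(values) # write data starting at row 0

buf.seek(0)
row = next(csv.reader(buf))      # first row read back
recovered = [float(x) for x in row]
print(recovered == values)  # True
```

In the LabVIEW VI, the things to check are whether an empty array or string is written before the data, and whether the "append to file?" input is writing below leftover contents of an old file.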
When scripting a named bundler or unbundler, adding elements to the node can be done two ways: the methods "AddInputAfter" or "AddOutputAfter", and "Resize:AddChunk" (with an argument to add elements to the top or bottom).
The cluster unbundler on an in-place structure, in contrast, offers the method AddOutputAfter, but the "Resize" method is not the same. There are no arguments, and the help indicates that it literally resizes (in pixel dimensions) the bundle node. I think it does not do this, as resizing this node in this way is not possible.
So, while I can programmatically insert elements before the first element on a named un/bundler, it appears I cannot do this with the similar node on the in-place structure.
Any suggestions for a workaround? I could add the items to the end and then rearrange, which requires some rewiring of the existing terminal (more code, but doable).
Hello Guys
We have a question for you,
We would like to represent multiple blocks with a single block; is it possible to do this with LabVIEW?
Thanks
Marcello and Maria
Hi everybody, I would like to ask how to create a 'Choose Implementation' dialog box in LabVIEW; a step-by-step manual on how to create this dialog box would be very useful.
I wanted to connect to ODBC in FileMaker. When going to add a DSN, "FileMaker ODBC" did not show up. I had to download "FM18_xDBC_18.0.1.exe". When I ran this program, it flashed and disappeared. Apparently it just unzipped itself to the same directory and made a folder "FileMaker 18 xDBC\ODBC Client Driver Installer"; in that folder are two installers, FMODBC_Installer_Win32 and FMODBC_Installer_Win64. They both apparently need to be installed, according to "https://www.youtube.com/watch?v=8T_M8lDwGg0". Here is a guide to finishing: "https://fmhelp.filemaker.com/docs/edition/en/fm_odbc_jdbc_guide.pdf"
more later
Hello all,
I have a trapezoidal force trajectory and have a Cursor.PosX which is supposed to indicate the end of the plateau, and I have been told to create an active cursor to indicate the beginning of the plateau. I am a true beginner and am being led through this with few instructions. I am using LabVIEW 2017. I'm sure it's relatively simple, but I haven't found an answer in my searches on search engines, this forum, and the stickies here. I have attached my VI. Please let me know if I can answer any questions or if you have some good resources for beginners!
I have been looking for an example that will allow me to generate two different tones right and left channel using a sound card.
All of the examples I can find generate a mono signal of the exact same frequency and amplitude on both channels. The Sound Output Configure.vi lets you pick one or two channels, but if I select one channel, which channel is selected, right or left? How do I select the right channel for a 1 kHz tone and the left channel for a 5 kHz tone?
BTW: I need to generate the tones, not play back a waveform or a recorded stereo sound file.
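The usual approach is to configure two channels and build the stereo buffer yourself, with a different tone in each channel. A sketch of constructing such a buffer in Python (sample rate and durations are arbitrary; the per-frame left/right pairs stand in for the two-channel array that a sound-output write expects):

```python
import math

# Build an interleaved stereo buffer with a different tone per
# channel: left = 5 kHz, right = 1 kHz, as asked in the question.
# Each element of the result is one (left, right) sample frame.

RATE = 44100  # samples per second

def stereo_tones(f_left, f_right, n_samples, rate=RATE):
    left  = [math.sin(2 * math.pi * f_left  * i / rate) for i in range(n_samples)]
    right = [math.sin(2 * math.pi * f_right * i / rate) for i in range(n_samples)]
    return list(zip(left, right))

frames = stereo_tones(5000.0, 1000.0, 4410)   # 0.1 s of audio
print(len(frames), frames[0])  # 4410 (0.0, 0.0)
```

In LabVIEW this corresponds to generating two sine arrays and combining them into a 2D array (one channel per row or column, depending on the write VI's expected layout) before writing to the sound output, rather than wiring one mono waveform to both channels.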
I am fairly new to LabVIEW and I am trying to use an NI-9860 with a cRIO-9045. The module doesn't work with FPGA, so I have it listed as a Real-Time resource. Research I have done says I need to use XNET to use this module. I am able to install XNET on my computer, but when I try to install it on the cRIO it always fails at the same point and I get this error message.
Without XNET installed on my device, I don't think I can use this module for CAN messages. What could possibly be the cause of this error and how could I fix it?
Thanks in advance for any advice or help for this problem!
I entered a formula with several variables with no problems (all green), but how do I define the variables outside of the Formula Express VI, since it has no inputs to wire up other than the error-in node? I tried using indicators and naming them the same as my labels, but that doesn't work and I can't find any examples. Thanks for any help someone may provide.