Channel: LabVIEW topics

Shared variable 1:N


Hello,
 
I have two questions about shared variables. First, let me put my application in context.

I want one server with a bidirectional connection to N clients, where the clients run on different computers. The clients will act as the user interface of the server, which executes the actual actions. The idea is to trigger remote actions on the server and to check and monitor them from the clients.

The server will host all the shared variables and send information to the clients. However, if any client changes a parameter, the other clients must see that change as well.

Example: a client selects the setup used to execute the test cases.
For this to work, the first step on each client is to enter the server's IP address.

The questions are:

1- Can the clients have an event structure that detects when the server has changed the value of any shared variable?

My idea is a loop that continuously reads each variable and checks whether it has changed, but I think this method is inefficient because I would need a loop for every shared variable...
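Conceptually, the alternative to polling is publish/subscribe: the variable notifies registered subscribers when its value changes (this is the model behind the shared variable events that the LabVIEW DSC module exposes to the event structure). A minimal language-agnostic sketch of the idea, written here in Python with hypothetical names:

```python
from typing import Any, Callable, List

class SharedVariable:
    """Toy publish/subscribe variable: callbacks fire on value change."""

    def __init__(self, value: Any = None) -> None:
        self._value = value
        self._subscribers: List[Callable[[Any], None]] = []

    def subscribe(self, callback: Callable[[Any], None]) -> None:
        self._subscribers.append(callback)

    @property
    def value(self) -> Any:
        return self._value

    @value.setter
    def value(self, new_value: Any) -> None:
        if new_value != self._value:
            self._value = new_value
            # Push the change to every subscriber instead of making
            # each client poll the variable in its own loop.
            for callback in self._subscribers:
                callback(new_value)

# A client registers one callback per variable of interest; no polling loop.
received: List[Any] = []
setpoint = SharedVariable(0)
setpoint.subscribe(received.append)
setpoint.value = 42
```

The point of the pattern is that one registration replaces one polling loop per variable, which is exactly the inefficiency described above.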

2- Can the server change properties of shared variables? The idea is that the server makes a control invisible on one client, for example, and the same control then becomes invisible on the other clients as well.

Aimar Roura


Deleting, copying files from USB device


I have a USB device that I am trying to write a binary firmware file to. The device will initially have a generic binary file on it. The sequence of events should be:

Delete generic binary file

Copy new binary file (stored locally on the PC)

Do other test commands
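For what it's worth, LabVIEW's advanced File I/O palette has Delete and Copy functions that cover the first two steps; the same sequence as a minimal Python sketch (the paths are hypothetical, with temp folders standing in for the USB device and the local PC folder):

```python
import os
import shutil
import tempfile

# Hypothetical stand-ins for illustration: temp dirs play the role of
# the USB mass-storage device and the local PC folder.
usb_root = tempfile.mkdtemp()
local_dir = tempfile.mkdtemp()

generic_bin = os.path.join(usb_root, "generic.bin")
new_firmware = os.path.join(local_dir, "firmware_new.bin")
with open(generic_bin, "wb") as f:
    f.write(b"old generic image")
with open(new_firmware, "wb") as f:
    f.write(b"new firmware image")

# Step 1: delete the generic binary file from the device.
os.remove(generic_bin)

# Step 2: copy the new binary file (stored locally on the PC) over.
shutil.copy(new_firmware, usb_root)
copied = os.path.join(usb_root, "firmware_new.bin")
```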

 

Just for reference, the USB device will initially enumerate as a Mass Storage device; then, once the new firmware is loaded, it will enumerate as a generic USB device.

 

Are there already built-in VIs to do this? Or maybe some documentation on it? My searches haven't turned up anything useful.

 

Thanks.

NI 9401 Multiple Digital Outputs


Hello, LabVIEW community. I'm using an NI 9401 for digital output on a cDAQ-9178: line 6 sends a 160 Hz clock signal to the MUX, and line 5 toggles the MUX reset terminal (which functions as power control). Toggled ON, the MUX runs at the specified frequency; toggled OFF, it stops. When I try to put the two together, it does not work. Any suggestions or modifications would be greatly appreciated. Thanks.

I'm confused about the MathScript module


I have LabVIEW 2017 and 2018 Professional Development (32- and 64-bit for both); does this not come with the MathScript module? I have data-processing code in MATLAB and would like to use it in my LabVIEW code, which controls motion and DAQ. I have downloaded both the LV 2018 32- and 64-bit MathScript installers but cannot install them; the message says I have to have LV 2018 installed before I can install the MathScript add-on...

 

I'm confused about this; can anyone help me out?

How to Save Array and Calculations even if I close the VI


Hello everyone,
I am building a data system in LabVIEW. I have finished the VI, but I am facing a problem: when I close the VI, I lose my data and calculations. I tried using Read From Spreadsheet File; I get the old array and old data back, but the new calculations are not related or added to the old array.
How can I fix this problem in a simple way? My VI is in the attachments.
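The usual pattern is to load the previously saved array when the VI starts, append the new results to it, and write the combined array back before the VI stops (the Write Delimited Spreadsheet VI has an "append to file?" input that helps here). A minimal sketch of the idea in Python, with a hypothetical file name:

```python
import csv
import os
import tempfile

# Hypothetical log file location for illustration.
log_path = os.path.join(tempfile.mkdtemp(), "data_log.csv")

def load_rows(path):
    """Read the previously saved array, or start empty on the first run."""
    if not os.path.exists(path):
        return []
    with open(path, newline="") as f:
        return [[float(x) for x in row] for row in csv.reader(f)]

def save_rows(path, rows):
    """Write the combined (old + new) array back to disk."""
    with open(path, "w", newline="") as f:
        csv.writer(f).writerows(rows)

# Run 1: no history yet, so we start from an empty array.
rows = load_rows(log_path)
rows.append([1.0, 2.0])
save_rows(log_path, rows)

# Run 2 (after "closing the VI"): old data is loaded, new data appended.
rows = load_rows(log_path)
rows.append([3.0, 4.0])
save_rows(log_path, rows)
```

The key point is the load-at-start step: new calculations get appended to the old array instead of overwriting it.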

LabVIEW Vision IMAQ Color Learn VI - problem


Dear Everyone,

 

I have the following problem. 

 

I have a school project to write a program that counts colored objects shown to the camera, classified by color. I solved this using NI_Vision_Development_Module.lvlib:IMAQ ColorLearn. Wired in parallel to its 'Color Spectrum' output are a '1D array (64-bit real)' indicator, an 'Index Array', and an 'Array Max & Min'. I wanted to handle only 6 kinds of colors (red, blue, green, yellow, black, white). My idea was to show these colors to the camera and check which element of the 'Color Spectrum' has the maximum value. It was working fine; the cells followed this pattern: i=0 red, i=4 yellow, etc. With 'Array Max & Min' I then get the index of the color with the maximum value in the array, and with a 'Case Structure' I do the counting. It worked perfectly, and I tested it in many environments.

Seven days after the last save of the program (the final, working version), I opened it again today, but the counting no longer works: the colors are mixed up. I show red to the camera and in the 'Color Spectrum' it appears at i=1, not at i=0 as before; yellow is at i=3, and so on for the other colors. I have loaded back many backup saves and they all show the same issue. The easiest fix would be to reinitialize my array_index/color_name pairs, but I would like to know what happened; I have never seen such a thing before.

 

I am using LabVIEW 2016 (64-bit) Student License on Windows 7 SP1 (64-bit).

 

Thanks for your suggestions.

 

Best regards,

Lajos

 

 

Read variables from a server (local) - LabVIEW 2013


LabVIEW 2013 in Windows 7

 

Hi there, I'm new to LabVIEW.

 

For school, I have to read variables from a local server.

A project that does this already exists, but I have to adapt it to my variables.

In the old project, the variable is read with "Read Variable" from a Shared Variable (see the image "Reading variable.png"). To adapt the old project, I just browsed to and selected my new variable.

When I run the simulation, LabVIEW gives me back this error (image "Error.png"): Error -1950679035

 

Does anyone know the reason? Help, please.

Attached are also my Project Explorer and my Shared Variable Properties.

 

The new project had already worked, but now I get this error.

And now the old project doesn't work anymore either.

 

The main objective is to read variables from a local server (localhost) and use them in LabVIEW. Could someone explain to me, step by step, what I have to do, please?

 

 

Thank you very much for your help.

 

 

Problem with memory usage when loading jpeg files into array


Can anyone help me understand what is going on here?

 

"Memory1.PNG" contains the VI in question. I tried to use tbob's answer on this thread to load images into the picture array, "Slides". The folder "Demo Slides" contains 672 jpeg files. The total size of the folder is ~45 MB.

 

"Memory2.PNG" depicts, using Task Manager, how memory usage changes on different events. Running Untitled 1.vi causes an immediate jump that stays constant until the end of the program, at which point there is a second, huge jump! Then, if I hit "Save" on Untitled 1.vi at any point after the program has stopped running, some of the memory seems to be deallocated. Finally, only after closing LabVIEW entirely is the remainder of the memory freed.
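One thing worth checking first: JPEG files are compressed on disk, but a picture array holds decompressed bitmaps, so 45 MB of JPEGs can legitimately expand to far more memory once loaded. A quick back-of-the-envelope calculation (the 1024x768 frame size here is a hypothetical example, since the actual image dimensions aren't given in the post):

```python
# Hypothetical frame size -- the post doesn't state the images' dimensions.
width, height, bytes_per_pixel = 1024, 768, 3  # 24-bit RGB
n_images = 672

per_image_bytes = width * height * bytes_per_pixel  # decompressed size
total_mb = n_images * per_image_bytes / 1024 / 1024
# 672 decompressed frames at this size would occupy ~1512 MB in memory,
# even though the JPEG files on disk total only ~45 MB.
```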

 

"Memory3.PNG" shows the Profile Buffer Allocations results.

 

New to LabVIEW. First time posting. Any and all criticism is welcome! Thank you!


LabVIEW S/N activation


Hello,

I had LabVIEW 2014 activated on my old laptop and it worked fine. I now have a new laptop and installed LabVIEW 2014 on it using the same DVDs. When I try to activate it with the same serial number as on the old laptop, it says that the S/N is invalid. Why is that?

How to up-cast array data to a pre-allocated array


Hello,

 

I am writing code to stream images and my main goal is to improve performance. 

 

I have an array of i16s that I need to cast into an array of i32s. In worst case scenarios I have 4096x4096 * 4 (67 million) data points, attempting to stream at 5 FPS.

 

Is it possible to move this i16 array into a preallocated i32 array?

Currently, I am preallocating arrays and using Replace Array Subset; however, I am seeing that when I convert the i16 array to i32 before placing it into the preallocated array, a buffer allocation happens. This happens whether or not I explicitly cast the array. I know that using the memory manager and allocating dynamic resources can be costly (remember, I only care about timing here; I'm already in 64-bit LV).


Is it possible to simply use the replace array subset without the extra allocation?
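For comparison, the desired behavior is a copy-with-cast: the i16-to-i32 widening happens during the copy into the preallocated buffer, with no intermediate i32 array. A small sketch of the concept in Python/NumPy (not LabVIEW; array sizes shrunk for illustration):

```python
import numpy as np

# Preallocate the i32 destination once, outside the streaming loop.
dst = np.empty((4, 4), dtype=np.int32)

# Incoming i16 frame (simulated data; the real case is 4096 x 4096).
src = np.arange(16, dtype=np.int16).reshape(4, 4)

# np.copyto widens i16 -> i32 element by element during the copy itself,
# writing straight into the preallocated buffer; no intermediate
# i32 temporary is allocated.
np.copyto(dst, src)
```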

 

Host to Target DMA transfer timeout


Hi All,

 

I've programmed a 7975R FPGA to perform a cross-correlation between data read from two separate disk drives (HDD 8264/8265 RAID arrays). The data sets are quite large (>100 GB), so the FPGA is used instead of standard processors to gain a significant (>100x) advantage in processing time. To perform the stream, I first fill up the host buffer for both transfers and start the DMA FIFO transfers; on the host end, the stream is then managed by two DMA FIFO transfers running in separate threads (while loops). The two threads are effectively identical, using the TDMS Advanced Asynchronous Read VI to perform the transfer.

 

Now the problem is that I keep getting FIFO timeouts, which I believe are due to underflow on the FPGA end: the host processor cannot buffer data to the FPGA through the DMA engine quickly enough. I first compiled the FPGA at a rate of 125 MHz, which times out immediately. When I lowered it to 80 MHz, it transfers successfully for some length of time but consistently times out (underflows) after ~1-2 minutes. I lowered the FPGA rate even further (down to 40 MHz) and it performs roughly the same, which I found surprising. I am now trying 10 MHz, but this is too low for our application in that it would take far too long to process the data.

 

I am using an 8133 controller. The data is streamed with 16-bit resolution per sample. I tried changing many of the parameters (FPGA FIFO size, host buffer size, write region size) and there are slight differences in the results, but after a few minutes of running, the application times out. At 40 MHz the transfer rate is 40 M x 2 (hard drives) x 2 B (bytes per sample) = 160 MB/s, which is well within the specs of the system (roughly < 800 MB/s or so). The behavior is a bit confusing as well, in that I don't gain much in the time-to-timeout when I go from, say, 80 to 60 to 40 MHz. Other posts are generally concerned with transfers from the target to the host; I can't find many resources on dealing with this when going from host to target.

 

Happy to post any code, but first it would be nice to hear thoughts on what I just described. 

 

NKM

COM ports are not recognized


Hello,

 

I'm still a beginner with LabVIEW and therefore need some help. I'm using LabVIEW 2015.

I would like to use a USB device in LabVIEW, but it does not appear in the VISA resource control. Nor does any other COM port that is shown in the Windows Device Manager. In NI MAX, all COM ports are marked red and flagged as not present. When I connect the USB device to a colleague's computer running LabVIEW 2017, everything works fine: all the COM ports appear there as they should.

 

Thank you very much for the help

 

How to receive raw ADC values


Hi everyone, I am a new user.

 

I have a cRIO-9038 with LabVIEW.

How can I get the raw ADC values for data logging? I would like to keep the log files as small as possible.

 

It seems to me that I can only obtain scaled, real-valued data.
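For scale: logging raw integer codes instead of scaled doubles shrinks the files considerably, since a 16-bit code takes 2 bytes while a double takes 8. A small illustration (Python's struct is used here just to show the byte counts; the sample values are made up):

```python
import struct

# Made-up raw 16-bit ADC codes for illustration.
samples = [0, 1023, -512, 255]

# Raw i16 codes: 2 bytes per sample.
raw_bytes = struct.pack(f"<{len(samples)}h", *samples)

# The same values stored as scaled doubles: 8 bytes per sample.
dbl_bytes = struct.pack(f"<{len(samples)}d", *[float(s) for s in samples])

ratio = len(dbl_bytes) // len(raw_bytes)  # doubles take 4x the space
```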

Thanks!!

Shared Library Interface (SLI) is not working properly in LabVIEW NXG 3.0.1


Hi,

 

It is no longer possible to use drag & drop while holding the Ctrl key to use the functions of an SLI in your GVIs.
I tested this on 2 PCs and a virtual machine.
I tested it on the shipped "External Code (DLL) Execution" example. There is also a bug where the example does not load the DLL properly, so I had to choose the path to the DLL manually. After that, the GVIs compile correctly, but I cannot use the functions in the SLI in my own GVIs (unless I copy & paste the blocks).

 

This is more of an issue when I want to use my own DLL in my application. The workaround for now is to create the GVIs in NXG 2.1, where drag & drop still works, and then use them in NXG 3.0. But that is very annoying.

 

Best regards,

Felix

About triggering a function generator


SD card


Hello,

I am currently working with an sbRIO-9607 board. I made a small expansion board that connects to it, and on this second board there is an SD card. When I plug it in, I can't find the SD card from the sbRIO-9607. I would like to know what steps I need to take to detect my SD card and then communicate with it.

 

Thank you for your help...

Add more input to the same code


Hi everyone, I need your help again. As you can see in the picture, there are 8 motors with 3 inputs each: velocity and acceleration are numeric inputs, and direction is Boolean. What I am trying to achieve is to restructure this completely so that in the future motors can be added without copying the code for each motor again and again; it should be possible to add any number of motors to the existing code. I have been thinking about this as hard as I can, but if someone can give me a small idea of how to approach it, please enlighten me. Thank you in advance.
Motors.png
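The standard LabVIEW answer is to gather the per-motor parameters into an array of clusters and iterate over it with a For Loop, so adding a motor means adding an array element rather than duplicating the block diagram. The same idea sketched in Python (the drive function is a hypothetical stand-in for the per-motor code):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class MotorConfig:
    velocity: float      # numeric input
    acceleration: float  # numeric input
    forward: bool        # Boolean direction input

def drive(cfg: MotorConfig) -> float:
    """Hypothetical stand-in for the per-motor code: returns a signed
    velocity command instead of talking to real hardware."""
    return cfg.velocity if cfg.forward else -cfg.velocity

# One list element per motor: adding a ninth motor means adding an
# element here, not copying the per-motor code again.
motors: List[MotorConfig] = [
    MotorConfig(velocity=10.0, acceleration=1.0, forward=True),
    MotorConfig(velocity=5.0, acceleration=0.5, forward=False),
]
commands = [drive(m) for m in motors]
```

The loop body is written once and applied to every element, which is the property you are after.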

Deep learning: Does the 'IMAQ DL Model Run' VI execute on CPU or GPU?


I recently upgraded to LabVIEW 2018 to test the deep learning interface to frozen TensorFlow models. After some testing I'm pretty satisfied with the functionality; however, performance-wise it looks like the graph is executed on the CPU, not on the GPU. I am using a fully convolutional network on pretty large images, so GPU execution gives a significant performance boost when executing in Keras in Python. Does anybody know if there is any possibility of switching between CPU and GPU execution in LabVIEW?

Windows 10: LabVIEW error when communicating with two FTD2XX devices


Hi everyone,

 

I have two FTD2XX USB-to-serial converters: an I-MON interrogator, which appears as virtual COM #3, and a Thorlabs Z812 motor stage controlled using Kinesis/LabVIEW. When only one is connected, say the interrogator, it works fine; once the Thorlabs motor is also connected, I receive the error (picture attached). I'd appreciate any help with this.

Best PING, one that understands "TTL expired in transit"?


I wrote a simple network monitor that uses the "ping %s -n %d -w %d" Windows command-line command, as suggested in, for example:

https://forums.ni.com/t5/Example-Programs/Native-LabVIEW-Host-or-IP-Ping-Using-LabVIEW/ta-p/3511264

 

But this ping wrongly reports OK when there is the ping error "TTL expired in transit"; the same problem exists on the Windows command line, see screenshot.

 

I found several suggestions when searching for "ping" in the forums; which one should I use to avoid this problem?
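Part of the trap is that Windows ping prints "Reply from …" even for this error (which matches what you observed on the command line), so whichever approach you pick, it helps to scan the text output for the failure strings explicitly rather than trusting a simple success check. A minimal sketch of such a check (Python here for brevity; the sample reply lines are typical Windows ping output):

```python
def ping_reply_ok(output: str) -> bool:
    """Treat a reply as success only if it looks like a real echo reply.

    Windows ping prints 'Reply from ...' even for errors such as
    'TTL expired in transit', so a plain 'Reply from' check is not enough.
    """
    failure_markers = (
        "TTL expired in transit",
        "Destination host unreachable",
        "Request timed out",
    )
    if any(marker in output for marker in failure_markers):
        return False
    # A genuine echo reply carries a round-trip time field.
    return "Reply from" in output and "time" in output

# Example lines as Windows ping would print them:
ok = ping_reply_ok("Reply from 10.0.0.1: bytes=32 time=1ms TTL=64")
bad = ping_reply_ok("Reply from 10.0.0.254: TTL expired in transit.")
```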

 

By the way, most humans misinterpret this error as well, as I know from my own experience!

/Ola

 

bild.png

 



