Channel: LabVIEW topics
Viewing all 69098 articles

Optimization VIs and parameter bounds


Hi,

 

I have a question I can't seem to solve: I want to use the optimization VIs (Global Optimization or Unconstrained Optimization). Some of my parameters denote a distribution's two endpoints, so it is important that one of them is higher than the other. Is there a way to enforce this relation, so that, say, parameter no. 1 is always less than or equal to parameter no. 2?

I tried applying a Min & Max function to select the lower and the higher of the two parameters; however, I think it would converge better if I could enforce their relation directly.
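One common workaround is to reparameterize the problem: instead of optimizing the two endpoints (a, b) directly, optimize a and a slack variable d, and reconstruct b = a + d², which guarantees a ≤ b for any value the optimizer tries. A minimal Python/SciPy sketch of the idea (the quadratic objective is a made-up stand-in for the real fit):

```python
from scipy.optimize import minimize

def loss(a, b):
    # Hypothetical objective: pretend the "true" endpoints are 1 and 4.
    return (a - 1.0) ** 2 + (b - 4.0) ** 2

def wrapped(p):
    a, d = p
    b = a + d ** 2          # b >= a by construction, for any d
    return loss(a, b)

res = minimize(wrapped, x0=[0.0, 1.0])
a = res.x[0]
b = a + res.x[1] ** 2       # recover the ordered endpoints
```

The same transform can be wired in front of the model VI in LabVIEW, so the optimizer only ever sees unconstrained variables.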

 

Thanks


How can I communicate with this serial machine using LabVIEW?


Hello all, I'm trying to communicate with a machine that uses serial (RS232), in a LabVIEW program. Here are some important points:

 

  1. I'm using a Serial/USB converter, since I don't have any RS232 ports on this PC (they often don't have them on laptops anymore!), and Windows (10) successfully found and installed the driver for the converter. It now shows up in Device Manager under Ports, as COM3.
  2. I am positive the hardware of the converter works. I have a Python script (shown below) from a colleague, which was used to communicate with the machine to fetch a value. If I run this script in Python on this machine, with the converter and everything, it fetches the correct value. So we know that part is working.
  3. NI MAX sees the device as well, calling it ASRL3::INSTR "COM3".

Here it is in device manager:

com3_devman.PNG

 

Here it is in NI MAX:

NIMAX_com3.PNG

Here's the python script:

 

import time
import serial  # pyserial

# Open COM3 at 1200 baud with a 0.5 s read timeout
sii = serial.Serial('COM3', 1200, timeout=0.5)
print("BEGIN")

sii.write(b"T\r")   # send 'T' followed by a carriage return
time.sleep(1)       # give the device time to answer
temp = sii.read(100).decode("utf-8")  # read up to 100 bytes

sii.close()         # release the COM port

print(temp)
print("done")

 

I only include that to show an example of something that does work.

 

Here's the LabVIEW VI I'm trying to use (.vi attached):

serialread_vi.PNG

 

(edit: I forgot to say, for the VISA resource name, I'm using ASRL3::INSTR.)

and it always gives me this error:

 

NI_serialerror.PNG

 

So here's the problem. I'll run the Python script and it works; then I go to LabVIEW and try my VI (attached, same as the image above), and it doesn't work, giving that error. If I then go back and run the Python script again, it gives this error:

 

serial.serialutil.SerialException: could not open port 'COM3': PermissionError(13, 'Access is denied.', None, 5)

So clearly port COM3 is being opened, but not released, by LabVIEW or something. I really don't know much about ports at all, though.

 

Similarly, I can sometimes access it in the NI MAX VISA test panel, but usually at this point, if I try, I get this error:

 

visa_test_error.PNG

 

What can I try? Is there some way to "reset" a COM port? I've tried something I read about: right-clicking the port in Device Manager and disabling/re-enabling it, but it doesn't help (after doing that, I get the same Python error). Rebooting works, but that's not a solution I can use.

 

Further, any idea why my LabVIEW VI isn't working, when I appear to be doing the same thing as the Python script? The script sends "T\r" (T with a carriage return), which is what I'm doing in the VI. My one suspicion is that the script has that 'b' prefix, making it a byte string... I'm not really sure how to do this in LabVIEW. I know about the String To Byte Array VI, but that produces a byte array, and VISA Write needs a string...
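For what it's worth, a LabVIEW string is already a byte stream, so writing the two-character string "T\r" with VISA Write sends exactly the same bytes as Python's b"T\r"; no byte-array conversion should be needed. A quick Python check of that equivalence:

```python
# "T\r" encoded as ASCII is byte-for-byte what the pyserial script sends.
cmd = "T\r"
raw = cmd.encode("ascii")
print(list(raw))  # 'T' is byte 84, carriage return is byte 13
```

One difference to watch for instead: VISA serial sessions often have a termination character enabled for reads (commonly line feed, 0x0A), which pyserial does not use, so the two stacks can behave differently even with identical payloads.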

 

What can I do? This is driving me nuts. Thank you for any advice, it is much appreciated.

 

 

Does anyone know about Zebra printers?


Hi,

I am using a Zebra ZT410 printer for printing barcodes, and it is integrated with LabVIEW (using the .prn file format).

Is it possible to use any alphabetic font with this printer (other than the built-in Zebra fonts)?

LabVIEW Report Generation for Word without installing any word-processing software


Hi everyone,

Is there a way to generate a Word report in LabVIEW without installing any word-processing software, but only the required libraries? (For example: I must not be able to open MS Word on my computer, but I must be able to use the Report Generation Toolkit in LabVIEW with a ".dot" file as the template for my report.) I am doing this to ensure that the user (who is provided with a separate PC as a processor) does not misuse the MS Word license that is provided to him.

So, in short, the word-processing software must not be available to the user for editing (general purposes), but it must be available when the LabVIEW libraries need it for report generation.

               

               Thank you in advance

 

Optimum parameter selections for data acquisition and generation tasks in parallel loops.


Hi everyone, 

I have made this VI, which performs the following operations:

1) Data acquisition using DAQmx Read at 500 kS/s (producer loop).

2) Data logging in a parallel loop (consumer loop) to multiple files in TDMS format.

3) Data generation at the analog output using DAQmx Write at 500 kS/s, in non-regeneration mode. In the first second, I send a 1 ms burst of a 5 kHz sine wave followed by zero voltage for the remaining 1 s - 1 ms = 0.999 s. In the second second, I send a 1 ms burst at 5 + 1 = 6 kHz followed by zero voltage for the remaining 0.999 s, and in this way I increase the burst frequency by 1 kHz per second until it reaches 50 kHz. It is a kind of frequency sweep, after which I want to stop generation and acquisition simultaneously.

4) The acquisition and generation are completely synchronized by sharing the 'ao' sample clock with the analog input and by triggering the read operation with the 'ao' start trigger.

 

I have a few doubts related to this VI:

1) Is it good to keep DAQmx Read and DAQmx Write in two parallel loops? In most online examples I have seen, DAQmx Read and DAQmx Write are kept in the same loop. In my case the generated data needs to be updated every second, so a fairly complex function-generator subVI has to feed DAQmx Write inside the loop. When I keep both DAQmx Read and DAQmx Write in the same loop, possibly because of the delay caused by this function-generator subVI, I get the error "The application is not able to keep up with the hardware acquisition". I think DAQmx Read is not called fast enough (because of the slow write operation) in that loop. That is why I put DAQmx Read and DAQmx Write in two parallel loops. It works fine, but I don't know whether this is a good and efficient way of doing things, because I have not seen any online example of this kind.
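For what it's worth, the two-loop setup described in doubt 1 is the classic producer/consumer pattern: the acquisition loop only enqueues data blocks, and the slower logging/generation work dequeues them at its own pace. A minimal sketch of the idea in Python (the block contents and sentinel-based shutdown are made up for illustration):

```python
import queue
import threading

data_q = queue.Queue()
results = []

def producer():
    # Stands in for the DAQmx Read loop: acquire a block, enqueue it.
    for block in range(5):
        data_q.put([block] * 4)   # pretend each block is 4 samples
    data_q.put(None)              # sentinel: acquisition finished

def consumer():
    # Stands in for the TDMS logging loop: dequeue and "write" each block.
    while True:
        block = data_q.get()
        if block is None:
            break
        results.append(block)

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
```

In LabVIEW the queue plays the same decoupling role, so a slow consumer only grows the queue instead of starving DAQmx Read.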

2) What should the values of these parameters be: samples per channel in 'DAQmx Timing', and number of samples per channel in 'DAQmx Read'? The sampling frequency is 500 kS/s.

Until now, in all my previous data-acquisition projects, I have always left these two parameters unwired and no error ever occurred. But in this project, if I don't assign any value to them, weird errors occur.

Please have a look at my VI (attached) and suggest solutions. I am using a USB-6356 board.

Ghost dependency: LabVIEW 2016 bug?


Hello,

 

Our applications are developed as low-level packed libraries which are then called by a main project.

We get a dependency warning on each low-level packed library that calls library "X": it tries to link the VI "ReadCalibration".


The strange thing is that on each low-level packed library, and on the main project containing "X", Find Callers/Find SubVIs returns "No items found": the VI is not used in the main app yet, so why is this item in the dependencies? Opening packed library X in the tree view shows the VI above the library's public area (the VI is located in a Private folder, above Public).

 

What's important to know is that the VI is in the private access scope of the library. I tried changing it to public to access it from the main project, but the warning remains.

I tried Replace Item, which makes the warning instantly reappear.

 

Moreover, this warning is not a problem at all: I can still build all the PPLs and, in the main project, the EXE and installer...!!

But it's ugly, and perhaps a LabVIEW 2016 bug?

Unfortunately I can't share my code, but I can follow up on any tips you might have to solve this!

 

BR,

Vincent

Add a customized ROI tool button to the NI Vision palette


Dear All,

I usually use the NI Vision module to display camera images, and I also use the ROI tool buttons (cross, line, rectangle) to define a region of interest, but I would like to add a new button (or modify a button) in this palette. Is that possible?

Thanks for your help,

Best regards,

Problem reading a signal at the DIO of a myRIO-1900


Hello,

I am controlling a myRIO from the PC keyboard.
I am using a network-published shared variable to share data from the host VI (k.vi) to the target VI (k_myrio.vi).

When I press the RIGHT arrow key on the keyboard, DIO 21 and DIO 25 should give me 3.3 V (when measured with a DMM), but this is not happening: I am getting 1.65 V. Why is this happening?
The same thing happens with the LEFT arrow key.
If I add more variables, I get even less than 1.65 V, i.e. 0.75 V.
 

I have attached both VIs:
VI with the computer as target = k.vi

VI with the myRIO as target = k_myrio.vi

and Keyboard is the project.

Please provide a solution ASAP.


Open SubVi


Hello everyone.

 

I am trying to open a LabVIEW subVI in a while loop, but the only thing that happens is that the subVI pops up, disappears, pops up... and so on.

Which makes sense when looking at the while loop...

 

Now I am wondering how I can make the subVI stay. Would one use a variable for that, one which contains the status of the window?

 

Thanks in advance,

 

best regards,

 

Michael

 

Untitled.png

IMAQ Border size Error


Hello,

I am currently working on an application that automatically finds gold particles in electron-microscopy images.

To achieve that, I built an image-recognition algorithm using NI Vision Assistant 2015 SP1 and integrated it into a UI based on LabVIEW 2015.0f2 (x32).

 

The algorithm works great for small test images (ca. 10400 x 6200 px). But when it comes to working with the actual images (ca. 18900 x 15500 px), the algorithm quits with an error.

The error occurs in IMAQ Border Size and has the ID -1074396159.

 

I'm using the German version of LV, so I'll translate the error description freely:

"IMAQ Vision: (Hex 0xBFF60401) There is not enough memory available for this task."

 

I checked the Task Manager, and when the error occurs there is still around 40% of my RAM (8 GB in total) free. So I assume the problem is that, because the RAM is fragmented, there isn't a large enough contiguous block available to store the image array.

 

Any suggestions on what I can do? Splitting the image into multiple parts works, but it's not a satisfying solution, as it definitely reduces the usability of my program. Also, slicing the image makes it hard for the algorithm to find gold particles close to the borders of the slices.

 

Is there a way to assign more memory to LabVIEW, for example as a swap file?

I don't have a PC with more than 8 GB RAM, but if I had one, do you think it would work then, or is there some kind of upper limit to the image size?

 

Also, as stated above, I'm using the x32 version of LabVIEW, as I'm not sure whether the targeted system will be able to execute x64 applications.

 

I attached a version of my image-recognition algorithm to this post.

This is my first post here, so if anything is missing, please let me know.

 

With best regards

 

Seine_Dudeheit

How to synchronize DAQmx AI/AO finite samples?


Hello,

 

I'm trying to implement an audio generator/analyzer with a NI USB-4431 DAQ card. A sine wave should be generated at the analog output and be measured at the same time at the analog input. There used to be an example "Multi-Function-Synch AI-AO.vi" in previous LabVIEW versions, but unfortunately it's not shipped with LabVIEW 2016 anymore. I found an example HERE, but I have a slightly different setup.

 

The example uses Continuous Samples, whereas I'm working with Finite Samples (because I don't have a stop button in my application). Currently I'm using the error wires to sequence the output and input tasks. Is this the right way to do it? Or is there a better solution?

 

DAQmx_Sync_AO_AI.png

 

Thanks,

Dietmar

 

FPGA reference is lost upon stopping


Hi all,

I'm having a problem with my myRIO-1900, specifically with the FPGA functionality. I have looked as far and wide as I can think of, but have not yet found a solution to my problem (or even any post that concerns it).

So what is my problem, you ask? In short: when I boot up LabVIEW, compile my FPGA VI, and run my master VI (on the RIO RT target), it works just fine. However, after I stop the master VI (using a soft stop, not abort), the FPGA reference is lost, and the only way to restore functionality is to recompile the FPGA VI.

 

Long explanation now. I'm using a slightly modified version of the default FPGA personality, which is expanded to include 4 encoders on the A and C channels and do manual indexing of the encoders. Also I removed some excess code from the B-connector and Accelerometer/button, since I don't need those and the compiler began spouting errors about potential memory shortage (although the final placing was just fine). Anyway, those changes are probably not the cause of my error, since I can run the FPGA-VI in interactive mode just fine.

I'm also using a Master-VI to do a whole bunch of calculations and merge information from a large set of inputs. The problem is in the interaction between the two. It seems as though Open FPGA VI Reference only refers to the correct FPGA VI directly after compiling, but once the connection between the two is closed once, this reference is lost and communication fails. The error LabVIEW gives me is 'Error -63195 occurred at niLvFpga_Close_Dynamic.vi. Hex 0xFFFF0925 The handle for device communication is invalid or has been closed. Restart the application'.

Thing is, restarting is insufficient. Rebooting the RIO-1900 doesn't help. Closing and restarting the project doesn't work. Erasing the bitfile, re-downloading, restarting also doesn't help. As of now, the only thing that works for exactly 1 attempt is recompiling and running immediately after.

Other things I've tried:

-The problem is not in the VISA session being closed: I added the solution from http://digital.ni.com/public.nsf/allkb/CB82AC9CBC6C3F2386257241007A06EF but to no avail.

-I've tried this with one of the examples (customized FPGA Signal Generator). This one doesn't seem to have the problem, but copying the approach for opening and closing the FPGA didn't solve it.

-I've played around with the settings for automatically starting the FPGA-VI. Auto-run upon loading to the MyRIO, upon calling the reference, manually starting... all the same. 

-Both 'Close' and  'Close and reset if last call' from Close FPGA VI Reference have the same effect.

 

So, I'm at the end of my rope. I've included the FPGA and master VI so you can see the parts that open and close the reference, but I'm not sure how much more minimal I can make the example beyond just opening and closing the FPGA VI reference. (Some of my code is not supposed to be shared with the outside world, so I can't post the whole batch.)

 

I would greatly appreciate any help, pointers, tips, what have you. 

 

Force LabVIEW to ignore directories?


Hi all,

 

is there a way to deny LabVIEW access to certain folders?

I'm currently changing my software from one architecture to another and in the process, certain files may exist twice (I don't want to delete my old work in case I need some of it later on).

So I want LabVIEW to only access my new folder structure with the new VIs, not the old ones. Can I hide/blacklist a folder somehow?

 

Thanks!

 

VISA Read function


Hi everyone,

 

In my application I communicate with a device over USB UART. I send a command that returns a 48-character string, but the VISA Read function returns only 7 characters. What do I have to do so that the Read function returns all 48 characters?
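One likely cause is that a single read returns only whatever has already arrived in the serial buffer. The usual fixes are to enable a termination character, or to read in a loop until the expected byte count has accumulated. A Python sketch of the accumulate-until-N loop (the chunked stub is made up, standing in for a device that answers 7 bytes at a time):

```python
def read_exact(read_fn, n, max_tries=100):
    """Keep calling read_fn until n bytes have accumulated (or give up)."""
    buf = b""
    tries = 0
    while len(buf) < n and tries < max_tries:
        buf += read_fn(n - len(buf))
        tries += 1
    return buf

# Stub standing in for VISA Read / serial read: at most 7 bytes per call.
response = b"x" * 48
pos = 0
def chunked_read(n):
    global pos
    chunk = response[pos:pos + min(n, 7)]
    pos += len(chunk)
    return chunk

data = read_exact(chunked_read, 48)
```

In LabVIEW the equivalent is a while loop around VISA Read that concatenates into a running string until its length reaches 48 (or a timeout elapses).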

 

Best regards

Re: Multiple readers of an RT FIFO?


Hi everyone

 

I'm using RT FIFO queues to transfer logging data (DAQmx) and settings data, start and stop booleans, etc., from the host to my PXIe controller. In most cases it works fine. However, when I want to send a "stop" boolean to stop the data logging, the command never (or only occasionally) arrives on the target side. When no logging data is being transferred, the boolean arrives. When I start the logging, the "start" boolean arrives as well; that works fine.

I guess the logging consumes too much of the RT FIFO queueing resources?! I've also tried enlarging the pre-allocated queue size...

 

Thanks for any helpful suggestion :-).

 

cheers


Which design pattern to drive several instruments?


Hi,
I wrote LV code to read from and write to several instruments such as flow meters, temperature and pressure regulators... It works perfectly, but I'm sure the code itself can be improved by using design patterns.

The code is composed of several independent loops, one for each instrument. They all work the same way:

  1. initialisation of the instrument (open the COM port, read some parameters, etc.)
  2. while loop with an event structure, to read by default or write when there's an action from the user
  3. close the communication when an error occurs, when the user pushes a button, or on the general stop

This general structure just works serially, with error clusters linked by wires. One of my coworkers advised me to use the QMH pattern instead. Is it a good choice, given that it seems quite complicated at first sight? Are there any other improvements I could add?
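For context, the QMH idea is a single loop that dequeues (message, payload) pairs and dispatches on the message name, so initialisation, reads, writes, and shutdown all become messages instead of hard-wired sequence steps. A minimal Python sketch of the dispatch (the message names and log strings are made up):

```python
import queue

msgs = queue.Queue()

# Hypothetical message stream for one instrument's handler loop.
for m in [("init", None), ("read", None), ("write", 42.0), ("stop", None)]:
    msgs.put(m)

log = []
while True:
    msg, payload = msgs.get()
    if msg == "init":
        log.append("opened COM port")          # stands in for instrument init
    elif msg == "read":
        log.append("read parameters")
    elif msg == "write":
        log.append(f"wrote setpoint {payload}")
    elif msg == "stop":
        log.append("closed communication")
        break
```

The event structure then only enqueues messages, which keeps the UI responsive while the handler loop does the serial I/O.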

 

Please find the VI attached; a lot of subVIs are missing, but you can get the idea.

Thanks for your help

 

LabVIEW 2016 Run time error


Hello all,

 

I'm writing this post to see if any of the other users on here are seeing a similar problem to mine...

 

When building an exe in LabVIEW 2016, the runtime engine generates the following error when the exe starts for the first time, and randomly on later starts (see attached). The error is exactly the same, word for word, every time I see it. Pressing OK bypasses the error and the program starts without incident. It's always at the beginning when you run the exe; I've never seen it during execution. I've been dealing with this problem for about 6 months now.

 

Here is what I've tried so far:

 

1. I'm talking with NI support and (it appears) there are not any other tickets referencing a similar error, I'm still working with them.

2. I've built and run multiple executables on various machines around the office, and the error follows. It doesn't appear to care what the exe has in it (I've seen it on basic adder programs and Actor Framework builds of ~200 VIs), and it doesn't care which computer does the compiling; I've tested on two different machines and it still shows up. The error also appears if debugging with the 'wait till remote debug connect' option checked: it pops up before my software starts, while it is still waiting for the debug connection.

3. I've moved to the f1 version of the run time.  

4. I've nuked my NI install.  Went into the NI uninstall window and clicked 'remove all'. Reinstalled this morning.

5. I've installed all of the recommended updates from the NI update tool.

 

As you can see, I'm a little frustrated that I can't find a solution to this. It makes deploying software a nightmare, because I never know if and when a customer might see the error, and I just have to say 'It's a problem with the runtime engine and I'm working on a solution. It won't affect your program. Please ignore it for right now.' It makes LabVIEW look bad, and I don't like throwing them under the bus, but the error does appear to be agnostic of my code; I was convinced when I tested the wait-till-remote-connection option and it still failed.

 

Right now I'm at the point where I would like to do a big distribution of code around my company, which I then can't directly support: my code is ready, but I can't get this error to go away.

 

I've attached a program I built that generates the error for me. If you'd like to try it and see if you get the same, that would be appreciated. Unfortunately I can't distribute much internal code since it's IP, but I will try to provide any samples I can.

 

Thanks for taking the time to read this, and let me know if you have any ideas or questions.

 

Regards,

 

Tim

 

 

Output frequency is different than the set value


I am generating a sine wave using the "Simulate Signal" block and sending it to the DAQ Assistant to excite a shaker. The settings of Simulate Signal and the DAQ Assistant are shown, and the VI is also attached. The frequency being sent to the DAQ is the same, but when I measure the output frequency of the shaker, it always comes out higher. I am using an NI 9260 (BNC) card, and the shaker is a passive device.

Can someone please point out where the problem is?

Thanks and regards

 

 

Efficiently Finding Max Intensity Coordinates of Image


Hello,

 

I am currently attempting to find the (x, y) coordinates of the brightest pixel in a U8 greyscale image. Currently I am using the LabVIEW IMAQ Light Meter (Point) inside a double for loop that iterates through all of the image's pixels, then finding the maximum indices of the resulting array. This, however, has proven to take ~1 minute per image, which is FAR too slow for the project we are working on (ideally it would take <0.1 seconds per image).

 

Anyone know of a more efficient way to find the coordinates of the maximum-intensity pixel?
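For comparison, in an array-oriented language this is a single vectorized reduction rather than a per-pixel loop; the LabVIEW analogue would be converting the image to a 2D array (IMAQ ImageToArray) and using Array Max & Min, which avoids per-pixel subVI calls entirely. A NumPy sketch of the idea (the image size and bright-pixel position are made up):

```python
import numpy as np

img = np.zeros((480, 640), dtype=np.uint8)   # stand-in for the U8 image
img[123, 456] = 255                          # plant a brightest pixel

# One vectorized pass over the whole image instead of a double for loop.
y, x = np.unravel_index(np.argmax(img), img.shape)
```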

 

Thanks,

 

-Steve

1D Array Threshold Value Search


I have a 1D array of data. I would like to search the array for the first time it exceeds a threshold value.

 

I have tried using the "Threshold 1D Array" VI but am not getting the expected results. It looks like that VI searches for two consecutive values above the threshold, which occurs much later in my array.

How can I search for the first occurrence of a single point that exceeds a threshold I set?
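For comparison, in an array language the first single-point crossing is one comparison plus one search: build a boolean array and take the index of its first true element. A NumPy sketch (the data and threshold are made up):

```python
import numpy as np

data = np.array([0.1, 0.4, 0.9, 0.2, 1.5, 1.6])
threshold = 0.8

above = data > threshold
# argmax on booleans returns the first True; guard the no-crossing case.
idx = int(np.argmax(above)) if above.any() else -1
```

In LabVIEW the same shape works: compare the array against the threshold, then use Search 1D Array to find the first True.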


