Channel: LabVIEW topics

Changing text colors in string indicator (on Operator Interface) passed from UIMessage


Hi,

I have an Operator Interface built based on the Simpler Operator example found in this forum. I modified it to have a "Test Result" box, a string indicator that displays the test result (Pass/Fail/Terminated/Error, etc.).

I was able to have TestStand output the right test result to that string indicator, but I would like to know if I can change the color of the text that shows up inside it. For example, green for "Pass", red for "Failed", etc.

I tried using a Case structure, but I'm not sure how to set up the cases to check whether the "Test Result" string indicator outputs "Pass", "Fail", "Terminated", etc.
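A minimal sketch of the lookup such a case structure would implement (Python stand-in for the LabVIEW logic; the result strings and RGB values here are assumptions, not from the original example):

```python
# Hypothetical result-string -> RGB color lookup, the equivalent of a
# case structure with one case per result string plus a default case.
RESULT_COLORS = {
    "Passed":     0x00AA00,   # green (assumed value)
    "Failed":     0xFF0000,   # red (assumed value)
    "Terminated": 0xFF8800,   # orange (assumed value)
    "Error":      0xFF8800,   # orange (assumed value)
}

def result_color(result: str) -> int:
    # Default case: black for any unrecognized result string.
    return RESULT_COLORS.get(result, 0x000000)

print(hex(result_color("Failed")))   # -> 0xff0000
```

In LabVIEW itself, the chosen color would then be written to the string indicator's text-color property via a property node.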

 

Any advice would be appreciated. Thanks!


User Event Data typedef shows as not existing


Hi all,

 

I'm using a producer/consumer architecture with events for user interaction. Here's one of my loops:

 

As you can see, I have a dynamic event registration coming from that RFNMS VI. When I register this user event, it shows the control as existing:

Yet for some reason inside the loop it can't find the typedef.

 

Any ideas?

Signal Generation and CompactDAQ (NI-9263) to output Sine Wave on Oscilloscope


I know this should be basic but I am having trouble generating a sine wave using LabVIEW. I have watched tutorials that use the signal generator and DAQ Assistant tools to generate the signal but only graph the result in LabVIEW.  I followed the tutorials as instructed but when I hook up the output from the NI-9263 to an oscilloscope I don't see any signal. It works when I use the DAQ Assistant test run but not when I use the signal generator. Below I attached photos of the setup I used.  
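For reference, the sample generation itself is just a sine evaluated at the sample rate before the samples are handed to the analog-output task. A minimal Python sketch (the 1 kHz tone, 100 kS/s rate, and 1 V amplitude are assumed values, not from the post):

```python
import math

# Compute one block of sine samples, the way a signal generator fills a
# buffer for an analog-output device. Default parameters are assumptions.
def sine_block(freq_hz=1000.0, rate_hz=100_000.0, n=100, amp_v=1.0):
    return [amp_v * math.sin(2 * math.pi * freq_hz * i / rate_hz)
            for i in range(n)]

samples = sine_block()        # exactly one cycle at these defaults
print(len(samples))           # -> 100
```

If the samples compute fine but nothing appears on the scope, the problem is usually in the output task configuration (channel, continuous vs. finite generation) rather than the waveform math.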

 

 

Error 1003 at invoke node when exe is run on another PC


I have built a LabVIEW application with a main VI calling subpanel VIs. I use shared variables, obtained from a PLC, within the subVIs. I load all the subVIs at exe launch and keep them in memory so the variables are always updated.

In development, everything works fine, and also the built exe works fine on the development machine.

But when running the exe on a different PC, I get Error 1003 when it tries to load the subpanel VI's.

I have set them to always be included, so I'm not sure why they wouldn't be found or can't be run on a different PC. I am using the DSC module functions on some of the VI's.

 

Attached is a screenshot of the main VI calling the subpanel VI's. 

I am pretty new at LabVIEW and I'm probably not doing certain things the best way; any suggestions with instructions are welcome.

I tried putting the subpanel VIs in the same directory as the main VI and using relative paths, but it makes no difference. I assume the exe is missing access to the subVIs, but I don't know how to fix or troubleshoot this, as everything works fine on the dev machine.

I did look around other threads with error 1003 and tried some suggestions, but none worked and none seem to have the issue with the exe not working only when loaded on other machines.

Where do I look?

Thanks, Dan

Memory allocation of in/out pointer from dll


The question I have is fairly generic so I will keep my example the same way.

 

Let's say I have a DLL with the following function (excluding all the DLL export boilerplate):

  • void myFunction(int* value);

 

For the sake of this discussion lets assume that the import shared library wizard works properly and the function vi works as expected.

 

I mainly live in the C/C++ world, so I know that I would need to allocate the memory for this item before calling the function. Such as:

  • int* value = new int();
  • myFunction(value);

Or

just passing by reference:

  • int value;
  • myFunction(&value);

 

Being new to LabVIEW, it is not obvious how this memory gets allocated.

 

Since this is an "in/out" I have three options:

  1. Only pass a value IN: such as a numeric constant, in which case I can assume this allocates the memory for me.
  2. Pass something IN and OUT: same as #1, I can assume memory is allocated because I passed in a value.
  3. Only consume what comes OUT: this is the case I am interested in. Does LabVIEW allocate this memory, or do I need to do it?

With all that said, the function seems to work. Even with multiple instances I have not seen any issues; however, it seems a bit odd to me.

In summary, does LabVIEW allocate the memory of an in/out pointer when passing it through a shared library node? Is that true for all primitive data types, including structures of primitive data types? Specifically, ones with defined size (i.e., not dynamic arrays like int[])?
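The ownership question can be made explicit with Python's ctypes, which works the same way LabVIEW's Call Library Function Node does for a scalar passed as "Pointer to Value": the caller allocates the int-sized buffer on its side and the library writes through the pointer. A sketch (myFunction below is a stand-in for the real DLL export, simulated rather than loaded):

```python
import ctypes

# Stand-in for: void myFunction(int* value); the real export would be
# loaded with ctypes.CDLL. Here we just simulate the write-through.
def myFunction(ptr):
    ptr.contents.value = 42   # the "DLL" writes through the pointer

value = ctypes.c_int()                # caller allocates, like 'int value;'
myFunction(ctypes.pointer(value))     # pass the address, not a copy
print(value.value)                    # -> 42
```

The design choice to note: the memory lives in the caller for the duration of the call, which is why fixed-size scalars and structs "just work", while anything the DLL itself allocates (or dynamically sized arrays) needs explicit handling.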

NI VISA Issue with USB


Hi,

 

I am trying to detect a LeCroy oscilloscope connected to my computer through a USB port. When I go to VISA Interactive Control in NI MAX and click on the USB address of the oscilloscope, it gives the error in the attached screenshot and then VISA crashes. What could be the reason? It detects all the GPIB ports without any issue; this problem occurs only when I double-click on instrument addresses connected through USB. Any advice will be appreciated.

 

Thanks,

Sachin Madhusoodhanan

 

Bug / Unexpected behaviour of 'TCP Write' (LabVIEW 2016).


I found a bug, or at least an unexpected behavior, of the TCP Write command in LabVIEW 2016 and would like to know if anyone else has seen it, and whether it actually is a bug or somewhat of a 'feature'.

 

The Setup:

ThinkPad running Win10 x64 + LabVIEW 2016 x64.

Lantronix XPort (server that LabVIEW will connect to).

 

 

The Lantronix XPort is a small server that connects to the local wired Ethernet. It is basically an Ethernet-to-RS232 'adapter' that will accept a TCP/IP connection from any programming language, or even Telnet and SSH if needed.

Connected to the RS232 port of this server is a piece of custom electronics used to control a Cryogenically cooled low noise Radio Astronomy Receiver (not a really important detail in itself).

The commands are very simple and straight forward. In this example I'll use this command:

 

5\sRDIN\s\r\n  <-- note: shown with LabVIEW's '\' Codes display.

 

The command ends with a 'CR' and 'LF'.

The 'LF' tells the hardware to start processing the command and for some reason this did not work at all. It seemed like it was still waiting for the 'LF' (or \n) character.

Sure enough.... hooking up an Oscilloscope to the RS232 lines showed that the 'LF' (\n) was not being sent. It was stripped off before the command was sent over the TCP/IP connection.

When adding in a second 'LF' the communication works as expected. The working command looks like this:

 

5\sRDIN\s\r\n\n

 

Is this the way TCP Write is expected to behave or is this indeed a bug?

Why is the first 'LF' stripped away? Is there a setting somewhere in LabVIEW or is this a Windows thing?
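For what it's worth, the TCP transport itself does not strip LF bytes; a loopback sketch in Python shows the trailing \n arriving intact (a local socket pair stands in for the XPort connection, so this only demonstrates the transport, not the XPort's own serial-side settings):

```python
import socket

# Loopback check: TCP delivers the payload byte-for-byte, trailing LF
# included. A socket pair stands in for the client/server connection.
a, b = socket.socketpair()
cmd = b"5 RDIN \r\n"          # same framing as the instrument command
a.sendall(cmd)
received = b.recv(64)
a.close(); b.close()
print(received)               # -> b'5 RDIN \r\n'
```

If the \n really disappears on the wire, something between TCP Write and the RS232 pins is eating it; the XPort's serial-tunnel settings (e.g. a configurable terminator) would be a place to look.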

I did notice some odd behavior when connecting via Telnet to this device. Connecting from the Windows command prompt with "C:\>telnet hostname port" works as expected, but starting from the Telnet prompt, "Microsoft Telnet>open host port", results in some kind of incomplete connection.

 

Has anyone else seen this strange behavior of TCP Write?

 

Thank you,

Christian Holmstedt

 

 

Newport SMC 100 and VISA read in 1073807298


I am working with a Newport motor controlled with a SMC100 controller.  I got the drivers for this motor here: 

ftp://download.newport.com/MotionControl/Current/MotionControllers/SMC100/Software/

 

I am getting very inconsistent errors that I believe are timing-related, but I haven't had much success fixing them consistently. The VI that I'm using as a base is the SimpleControlExample and the VIs contained within. The error that sometimes gets thrown is -1073807298, and this sample VI points to the problem as "VISA Read in read_write.vi->tell current position.vi->Simple_Contol_Newport_mod.vi".

 

The first thing I tried was to modify the timing in the read_write.vi and this seemed to solve the problem.  A few hours later though, it came back (I'm saving files that work as I go along to check whether the problem is modifications I'm making, and this is not the case here).  In the upper level vi, I added timing delays which also worked temporarily, but the problem eventually crept in again.  It's always in the same spot - the Read VISA within the read_write.vi in the tell current position.vi

 

I looked up several possible solutions for this error number and tried all that I found. I am using a third-party serial-to-USB converter, but I have no problems initializing the port or homing the motor. I tried adding a VISA Set I/O Buffer Size, as was suggested elsewhere, and this didn't fix the problem either. When this error pops up, sometimes the motor will still go to the set position despite the error, sometimes not.

 

Happy for any thoughts or suggestions!


video file transmission


Hello, I want to transmit a video file from one system to another using two USRPs (N2932). How can I do this?

Draw a Graph in PDF format


Hi,

 

In my test, I am plotting a Waveform Chart for different (real-time) parameters. After completing my test, I want to save the graph (all the data) as a PDF file. Any suggestions would be appreciated.

Re: Use listbox to select multiple channels and display on the graph


Hi,

I need the same thing for an XY graph, but I need to select which channel is displayed on the X axis and which on the Y axis (i.e., channel 1 could be displayed on either the X or the Y axis). I tried using an array of ring controls, but the array grows endlessly; I need exactly 16 channels. I have attached a model VI for 2 channels; I need the same thing using a listbox.

Building a phase-locked loop (PLL) in LabVIEW


I want to build a phase-locked loop in LabVIEW to recover a modulated carrier. Has anyone done this? I'd like a reference for how to implement the PLL. Thanks.

How to find the distance between a USRP and the object reflecting its RF wave, using a 2901 and LabVIEW 2014


Hi,

I want to find the distance between my USRP and the object from which its 2.1 GHz wave reflects back. Please guide me on how to do this in LabVIEW.

 

I am thinking of making a radar that sends a pulse, receives the echo, and measures the elapsed time, thus finding the distance, but I am unable to do it in LabVIEW. I just need some guidance to help me make this LabVIEW VI.
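The range math itself is simple: the pulse travels to the target and back, so the distance is half the round-trip time multiplied by the speed of light. A quick Python check of the formula:

```python
# Round-trip ranging: distance = (speed of light * round-trip time) / 2.
C = 299_792_458.0  # speed of light, m/s

def distance_m(round_trip_s: float) -> float:
    return C * round_trip_s / 2.0

print(distance_m(1e-6))   # 1 us round trip -> ~149.9 m
```

The hard part on a USRP is measuring that round-trip time accurately, since at 2.1 GHz even a few nanoseconds of timing error is about a meter of range error.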

 

Regards,

 

Increment Contents of FPGA Block Memory


I am attempting to perform sequential high-speed data acquisitions using LabVIEW FPGA. I would like to process the data on the FPGA and record the number of similar events in block memory (basically a histogram), then transfer the data to the host after each sampling period is complete. The time between sampling periods must be small, a few ms. The desired block memory size is 65536 elements, each element being U64.

 

I have done this using DMA FIFOs, but there is a real possibility of overrunning the FIFO due to exceptionally high data rates and long acquisition periods. Additionally, I would be forced to serially analyze each data element in the host (Windows), a process that proves to be far too slow. 

 

The real problem is having to increment (read and write) the data element in the block memory without conflicts related to simultaneous access. Even though I have configured the memory to never arbitrate, I get randomly corrupt data. I have found the following suggestion in many locations on the website (this is one example location):

 

http://zone.ni.com/reference/en-XX/help/372614J-01/lvfpgaconcepts/fpga_memory_items/

 

Of course, when using this design in an SCTL (as I do), one needs to include the appropriate number of Feedback Nodes in the read method, depending on the block memory read latency selected. While I have implemented this basic design both inside and outside an SCTL, I have yet to get it to function without data corruption (additional or missing events in a record).
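The read-modify-write hazard can be modeled behaviorally. Below is a Python sketch (a read latency of 1 cycle and 16 bins are assumed parameters) of the write-forwarding that the Feedback-Node pattern is meant to implement: if the incoming bin address matches a write still in flight, use the in-flight value instead of the stale RAM read.

```python
# Behavioral model of a block-RAM histogram with read latency and
# write-forwarding. Without the forwarding check, back-to-back samples
# of the same bin would each read the stale count and drop increments.
def histogram(samples, bins=16, read_latency=1):
    ram = [0] * bins
    pending = []  # (addr, data) writes in flight
    for addr in samples:
        read = ram[addr]                  # stale: ignores in-flight writes
        for waddr, wdata in reversed(pending):
            if waddr == addr:             # forward newest in-flight write
                read = wdata
                break
        new = read + 1
        pending.append((addr, new))
        if len(pending) > read_latency:   # write retires after the latency
            waddr, wdata = pending.pop(0)
            ram[waddr] = wdata
    for waddr, wdata in pending:          # drain remaining writes
        ram[waddr] = wdata
    return ram

print(histogram([3, 3, 3, 3], bins=4))    # -> [0, 0, 0, 4]
```

The number of pipeline stages to check must match the configured read latency exactly, which is one plausible place for the randomly corrupt counts to come from.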

 

I am trying to run this process at 50 MHz (100 MHz top-level clock). My current development system is as follows:

  • PXIe-1082 Chassis
  • PXIe-8133 Controller
  • PXIe-7820R DAQ Module
  • 2016 LabVIEW Professional Development System
  • 2016 LabVIEW FPGA

 

I have the distinct feeling that this is a simple, standard block memory action and I am just missing something exceptionally obvious. I have attached the FPGA VI and hope that someone has a reasonable suggestion (that won't make me feel too stupid). The latest version of the code of interest is the next-to-last SCTL (vertically arranged).

 

Thanks in advance!

Tony.

HELP: NI-488.2 Runtime 16.0: Service 'NI Configuration Manager' (mxssvr) could not be stopped. Verify that you have sufficient privileges to stop system services.


I receive this message when attempting to update my VISA installation along with some other NI device drivers.

 

NI-488.2 Runtime 16.0: Service 'NI Configuration Manager' (mxssvr) could not be stopped. Verify that you have sufficient privileges to stop system services.

 

What would anyone suggest doing? I have attempted to hit 'Retry' and nothing worked; I fear that if I hit Cancel it will cancel my whole download (which, I must say, is at 40% after 18+ hours). I look forward to hearing what anyone has to say! (Screenshot attached: NI_ERROR.PNG)

 

 



not executed


I made a PID controller and added a subVI of a fuzzy controller. The idea is to add the outputs of the fuzzy and PID controllers. But when I run the VI, it does not execute. The VI is attached. Can someone help?

Trim Whitespace.vi is not re-entrant. Why?


The "Trim Whitespace.vi" is a very useful utility and gets sprinkled all over a project.  It can be called from various simultaneous parts of a project as well as from independent top level VIs.  As such I would think that it should be re-entrant?

 

I have been seeing choppy response from two different simultaneous top level VIs and one of the shared sub-VIs is "trim whitespace".

 

This seems to just be a configuration oversight? This utility seems to have been written long ago (~2002) and has been blocking execution ever since. I have lots of CPU cores, and forcing single-threading seems unnecessary here.
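As an aside, a subVI whose logic keeps no shared state is safe to mark re-entrant; a Python stand-in makes the point that concurrent calls to a pure trim cannot interfere with each other:

```python
from concurrent.futures import ThreadPoolExecutor

# A trim with no shared state is naturally re-entrant: each call works
# only on its own argument, so concurrent calls cannot collide.
def trim(s: str) -> str:
    return s.strip(" \t\r\n")

with ThreadPoolExecutor(max_workers=8) as ex:
    results = list(ex.map(trim, ["  a  ", "\tb\r\n"] * 100))
print(results[:2])   # -> ['a', 'b']
```

A non-reentrant shared clone, by contrast, serializes every caller through one instance, which would produce exactly the choppy behavior described.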

EHL and MHL-DAQ and MHL-Log


Hi, I am looking for some feedback on the basic QMH (Queued Message Handler) design pattern I have attached.

My main question:

Have I done anything 'wrong' in my code?
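For comparison, the core of a QMH is small enough to sketch in a few lines. Here is a Python stand-in (the message names are made up) showing the enqueue/dispatch loop the pattern is built around:

```python
import queue

# Minimal queued-message-handler sketch: a producer enqueues
# (message, data) pairs; the handler loop dispatches on the message
# name until it sees "Exit".
q = queue.Queue()
for msg in [("Init", None), ("Acquire", 3), ("Log", "done"), ("Exit", None)]:
    q.put(msg)

log = []
while True:
    msg, data = q.get()
    if msg == "Exit":
        break
    log.append((msg, data))   # dispatch stub: one "case" per message
print(log)   # -> [('Init', None), ('Acquire', 3), ('Log', 'done')]
```

The usual review points for a QMH are the same in any language: one consumer per queue, a well-defined exit message, and no shared state mutated outside the handler cases.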

 

I can't get data from my Ohaus scale in LabVIEW


Hello everyone,
I want to have my bioreactor system automatized with labview. The problem is that I have an Ohaus scale (R41ME30) connected through a RS232 port but I can't get the weight data or tare it. I can do these things using the software real term so I don't think it is a connection problem, just that I bad labview programmer! There must be something wrong in my code. Could anyone help me with this? I have attached my code and the scale manual. Note that the main code (Rumen reactor control 2) has an initialize subVI to start tp communicate with devices and a Measurement subVI where I wrote the code to read the values in the scale. The pump subVi has the commands to tare the scale.

 

Thank you very much
