Channel: LabVIEW topics

Adding AI and AO channel to VI


Hello,
I started working with LabVIEW a few weeks ago and have already created a few VIs. Unfortunately, I recently ran into an issue: I can't add AI or AO channels from the project to the VI. I used to drag and drop them from the project file into the VI and it worked. Do you have any idea what the problem could be?
I use an NI 9066 chassis with NI 9213 and NI 9205 modules.
The firmware is updated to the latest version, but NI MAX still tells me to update the firmware. Attached you'll find a screenshot of NI MAX and of the project file.

If there is anything else needed, feel free to ask!


Various data type conversions


I am working on designing a digital controller to drive a real-time target. I need to use an observer and am having nothing but problems with incompatible data types. I looked in the Numeric >> Conversion palette, but nothing seems relevant to my particular case. Any ideas? I am admittedly relatively new to LabVIEW in general. This is the first time I have attempted to use CD Ackermann, CD Construct State Space, and the DT Observer.

 

Please see the screenshots and the VI. This is a SISO system with 3 states.
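In case it helps to see the same calculation in text form, here is a rough sketch in Python with the numpy and control packages, purely to illustrate the data types involved: the Ackermann step takes plain 2-D matrices, while constructing the state-space model returns a model object, and mixing the two is a common source of type mismatches. The matrices and pole locations below are placeholders, not taken from the attached VI.

import numpy as np
import control

# Placeholder 3-state SISO system (substitute your own A, B, C, D)
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [-1.0, -3.0, -3.0]])
B = np.array([[0.0], [0.0], [1.0]])
C = np.array([[1.0, 0.0, 0.0]])
D = np.array([[0.0]])

sys = control.ss(A, B, C, D)                     # state-space model object

# State-feedback gain via Ackermann's formula (matrices in, matrix out)
K = control.acker(A, B, [-2.0, -3.0, -4.0])

# Observer gain: apply Ackermann's formula to the dual system (A^T, C^T)
L = control.acker(A.T, C.T, [-8.0, -9.0, -10.0]).T

print(K)
print(L)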

 

Thanks for any help.

NI Partner Meeting


The NI Partner Meeting isn't running at the moment. What's the problem?

Thursday, August 20, 2020

10:00 AM – 11:00 AM

(GMT-05:00) Central Time (US & Canada)

Top Level VI has DELAYED Shutdown


Hi,

I'm working on an application that uses subpanels to display the GUIs of several subroutines running in the background. The subroutines are run by reference at startup using the Invoke Node Run method and are later shut down using the Control Value Set method. A user can select any specific subpanel. The Remove Subpanel method is called before each insertion of a subpanel VI.

 

The issue I'm having is that the main top-level VI does not shut down immediately when you click the exit button, and the delay increases with how long the main VI has been running. I didn't notice this issue initially because I only ran the VI for a short time to check my code.

 

I don't see any errors when I shut down the application with execution highlighting enabled. I only notice a buffering-type delay before the VI exits, and I can't find the source of the delay. When I tried adding the Remove method to the "stop and close VI refs" routine, I got an invalid-method error message.

 

I would appreciate any recommendations and suggestions. The application is large and, for proprietary reasons, I cannot attach the code. Please see the attachments for screenshots of the code and of the main VI front panel during a delayed exit.

Increasing Execution Speed


I am using LabVIEW 2015 SP1, version 15.0.1f1 (32-bit).

I am receiving data from a digital scope via USB 3 as an array of data points separated by 8 ns, which means the PC has to process roughly 256 MB/s.
I search for the trigger in the incoming array, then extract and sum a portion of the data based on the position of the trigger. That still worked fine, but when I added a search for a second trigger the PC could no longer process all the data in time, so I lost a lot of data.

 

So my question is: how can I speed up the processing of this massive data stream from the scope? Can I solve the problem with a more powerful PC? Currently I am running an i7-4790 CPU @ 3.6 GHz with 8 GB of RAM on Windows 7 64-bit.
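For what it's worth, the usual cure is to do the trigger search and summation on whole buffers at once (vectorised array operations) and to decouple acquisition from processing with a producer/consumer structure, rather than touching samples one at a time. A minimal sketch of the per-buffer processing in Python/NumPy, just to illustrate the idea; the threshold and window length are placeholders:

import numpy as np

def process_chunk(chunk, threshold=0.5, window=1000):
    # Find all rising-edge trigger positions in one vectorised pass
    above = chunk >= threshold
    triggers = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    # Sum a fixed-length window after each trigger, skipping windows
    # that would run past the end of the buffer
    sums = [chunk[t:t + window].sum()
            for t in triggers if t + window <= chunk.size]
    return triggers, sums

In LabVIEW terms the same idea is a producer loop that only reads from the scope and enqueues buffers, and a consumer loop that does the two trigger searches with array primitives, so a slow processing iteration never stalls the acquisition.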

 

Would LabVIEW's processing speed correlate directly with the CPU benchmarks that can be found on the internet? Could a different version of LabVIEW, for example a 64-bit one, perform faster?

Discover LabVIEW tools - Have you tried VIPM.io?


Hello everybody!

 

Wondering how many people have tried the new vipm.io site. We have added a ton of features to make it easy to discover LabVIEW tools, and there are some cool ones coming soon.

 

Check it out and let me know what you think 😀

 

(attached image: VIPM-Search-SocialMedia2.png)

My First VI


Hey, this is my first post, so please go easy on me.

 

I'm a programmer who recently became very interested in LabVIEW. I've already made my first attempt: a VI for my geodesist GF that decodes binary lidar data into a more readable form. I know it's probably awful to look at and could be done much more efficiently, but hey, it works! (See the attachment.)

 

My question is: where do I go next? I feel like the beginner tutorials are too easy, and there aren't that many resources about LabVIEW on the internet.

 

I'm considering making LabVIEW my new favourite thing, so help me out: where do I start seriously?

Baud Rate into an XNET Cluster


The figure inserted below shows 500000 being added to the XNET Cluster.

When I try to do the same, I find that Baud Rate is not in the pull-down menus.

So how do you get it to do what the figure shows?

Is there a problem, or am I just doing something wrong?

 
 

Fig 4.5 Create Cluster and Frame in CAN

Different Conditional Disable Symbols for Different BuildSpecs in Same Project


I wanted to have different build specifications in my project set different values for a Conditional Disable symbol. I searched, but everything I found said it was impossible. Well, just because something's impossible doesn't mean you're not allowed to do it. See the attachment. (So there!)

Maintain waveform

Distance measurement using a camera

SQL Database and LabVIEW EXE


Hello,

 

We have created a series of large data files in Excel.  We are retrieving data from these spreadsheets.

 

We are considering moving to an SQL database or equivalent, but we have no experience in this field. We are searching for the best long-term technical solution for managing large sets of lookup data with a LabVIEW EXE.
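Just to show the shape of what a database lookup replaces, here is a minimal sketch using SQLite from Python; the table and column names are made up for illustration, and from a LabVIEW EXE the same kind of query would typically go through the Database Connectivity Toolkit or a similar interface.

import sqlite3

# Hypothetical lookup table, imported once from the existing spreadsheets
conn = sqlite3.connect("lookup.db")
cur = conn.cursor()
cur.execute("CREATE TABLE IF NOT EXISTS calibration ("
            "part_number TEXT PRIMARY KEY, cal_offset REAL, cal_gain REAL)")
cur.execute("INSERT OR REPLACE INTO calibration VALUES (?, ?, ?)",
            ("S16-10", 0.12, 1.003))
conn.commit()

# An indexed query replaces scanning a large Excel file row by row
cur.execute("SELECT cal_offset, cal_gain FROM calibration WHERE part_number = ?",
            ("S16-10",))
print(cur.fetchone())
conn.close()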

 

Ideas, guidance, assistance would be appreciated.

 

Thanks.

Writing to and outputting multiple buffers

OpenG add on


Hi,

 

I am unable to download the OpenG add-on from the NI website. Any help is much appreciated.

LabVIEW NXG conversion


Can anyone help review this GVI and check what the error is? I converted it from a LabVIEW 2019 .vi into a .gvi.


NXG 5, "Big" *.dni crash


Dear Community

 

I have observed a reproducible crash in NXG 5 when I add too many items to a .NET interface node.

 

It already starts when the file size of the *.dni is in the range of 500 KB.

OK, I know I can split it into several smaller .dni files, but is this the expected behaviour?

 

In detail: it happens, for example, when I use the HDF5 .NET implementation, HDF.PInvoke.

Some things already work, but when I try to add the namespace H5T to a *.dni, it crashes completely.

It already happened when I selected an assembly (namespace) with just 204 items (H5T).

 

I also observed this behaviour when I use too many items from mscorlib.dll.

 

Is there a workaround or some other good trick out there, or am I just missing something when selecting?

 

 

Have a nice weekend

Automated vision inspection test for LED Emission


Dear all,

I need to create a test case to check whether the indicator LEDs on my test hardware are working properly and following the expected sequence of operation. My test equipment has a series of six (multicolour) LEDs that are enabled and disabled depending on the operation being carried out internally. I have an Intel RealSense depth camera D435 for this testing. Please guide me on interfacing with and operating that camera, and on comparing the captured image against a backup image (template) to produce a pass/fail result using Vision Acquisition, IMAQdx, or Vision Builder.

I have mentioned below the library file link shared on GitHub, which works fine with LabVIEW 2018. Please help and guide me to complete this test case; I haven't worked with the Vision Acquisition tools before.
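Outside of the NI tools, the pass/fail comparison itself is essentially template matching. A rough sketch of that logic in Python with OpenCV, only to illustrate the idea; the file names and the threshold are placeholders, and in Vision Builder or the Vision Development Module the equivalent step is a pattern/colour match against a stored template.

import cv2

# Placeholder file names; in the real setup the frame would come from the D435
frame = cv2.imread("led_frame.png", cv2.IMREAD_GRAYSCALE)
template = cv2.imread("led_on_template.png", cv2.IMREAD_GRAYSCALE)

# Normalised cross-correlation template matching
result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(result)

# Placeholder threshold: tune it against known-good and known-bad images
PASS_THRESHOLD = 0.8
print("PASS" if max_val >= PASS_THRESHOLD else "FAIL", max_val, max_loc)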

 

 

 

 

PID with LabVIEW model


Hi. I'm actually not sure whether this is related to LabVIEW, but I have no idea where else to ask this question, so please help me if possible, or let me know if I should remove this thread.

I'm trying to create a LabVIEW subVI that simulates the level of a water tank.

(screenshot: meihk_0-1598010130667.png)

(SimulateInlet's range is 0-100 and increases the tank level while Inlet is true; ControlValve's range is 0-100 and decreases the level.)

The subVI above is used inside a while loop in the main VI and executes every 3 seconds:

(screenshot: meihk_1-1598010378223.png)

 

I also have code in C that looks like this:

float error = Level - levelsetpoint;   /* levelsetpoint is fixed, e.g. at 30 */
sum_error += error;
controlvalve = Kp*error + Ki*sum_error + Kd*(error - previouserror);
previouserror = error;

(the code is run every 3 seconds)

The problem is that I don't know how to find the values of Kp, Ki, and Kd.
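One way to get starting values is to simulate the loop off-line and tune by trial and error (start with Kp only, then add Ki, then Kd), or apply a rule of thumb such as Ziegler-Nichols. A minimal sketch in Python, assuming a deliberately simple tank model in which the level rises with the inlet and falls with the valve opening; every constant below is a placeholder, not a tuned value:

# Discrete PID (same structure as the C code above) against a toy tank model
DT = 3.0                                  # 3 s sample time
Kp, Ki, Kd = 2.0, 0.1, 0.5                # trial gains, placeholders
setpoint = 30.0

level = 0.0
sum_error = 0.0
previous_error = 0.0
inlet = 50.0                              # constant simulated inlet, 0-100

for step in range(200):
    error = level - setpoint
    sum_error += error
    valve = Kp * error + Ki * sum_error + Kd * (error - previous_error)
    valve = min(max(valve, 0.0), 100.0)   # ControlValve limited to 0-100 (no anti-windup here)
    previous_error = error
    # Toy plant: level rises with the inlet and falls with the valve opening
    level += 0.02 * DT * (inlet - valve)
    print(step, round(level, 2), round(valve, 2))

Plotting the simulated level for different gain sets shows quickly whether a combination oscillates, overshoots, or settles, before trying it on the real subVI.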

 

 

 

 

 

 

Starting/ending two digital output signals with finite samples exactly at the same time


Hello everybody,

 

My goal is to generate two digital output (DO) signals with a finite number of samples. They have to start at the same time, and after N samples they have to end at the same time as well.

The first DO signal is a clock signal. The second DO signal is the digital data signal.

In my application I have a daisy chain of 16 shift registers, each storing 8 bits, which requires a digital data output signal of 16*8 = 128 bits in total. With my DO clock signal I push the 128-bit data signal into the shift-register chain, one bit on each rising edge of the clock. The clock has to start and stop at exactly the same time as the digital data signal, otherwise data will be thrown out of the shift-register daisy chain.

 

I used another main clock signal to trigger both my DO clock signal and my DO data signal (with a DAQmx Start Trigger, Digital Edge) so that they start at the same time, but it didn't help: sometimes the data signal starts before the clock signal or vice versa. How can I ensure that both signals (the digital clock output and the digital data output) start and end fully synchronised, i.e. start and stop at exactly the same time, every time? What am I doing wrong, and how can I change my VI to achieve this?
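For what it's worth, the usual way to guarantee sample-for-sample alignment is to generate both lines from a single buffered DO task, so they share one buffer and one sample clock and cannot start or stop independently. A rough sketch of that idea using the Python nidaqmx API, only to show the structure; the device name, lines, and update rate are placeholders, and in LabVIEW the equivalent is one DAQmx task whose channel spans both lines and whose waveform interleaves the clock and data bits.

import nidaqmx
from nidaqmx.constants import AcquisitionType, LineGrouping

BITS = [1, 0, 1, 1] * 32                 # placeholder 128-bit data pattern

# Interleave clock and data: bit 0 of each sample drives the clock line,
# bit 1 drives the data line. Two samples per data bit give one clock period,
# with the data already valid when the clock rises.
samples = []
for bit in BITS:
    samples.append((bit << 1) | 0)       # clock low, data set
    samples.append((bit << 1) | 1)       # clock high: shift register latches the bit

with nidaqmx.Task() as task:
    # One channel spanning both lines, so they share the same buffer and sample clock
    task.do_channels.add_do_chan(
        "PXI1Slot2/port0/line0:1",       # placeholder: line0 = clock, line1 = data
        line_grouping=LineGrouping.CHAN_FOR_ALL_LINES)
    task.timing.cfg_samp_clk_timing(
        rate=1_000_000,                  # placeholder update rate
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=len(samples))
    task.write(samples, auto_start=False)
    task.start()
    task.wait_until_done()

With both lines in one task there is nothing left to trigger against each other, so the first and last samples of the clock and data signals are inherently aligned.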

 

Some details about my VI:

The while loop at the top gets the data for the digital data output signal from the user, so it is not relevant to the problem. After the user sets the data that is to be pumped into the shift-register daisy chain, the generation of the output signals happens inside a flat sequence structure.

 

I am using:

LabVIEW 17.0.1f3 (64-Bit)

Hardware -> NI PXIe-1073 with NI PXIe-6358

 

Any help is highly appreciated.

Best regards,

Ecafer

Naming results file programmatically


Hello,

 

I'm working on a test setup where the user enters data about the UUT they're about to test: things like model number, serial number, tester name, and whether the test is an initial test or a final test. I'd like all of this data to be used automatically to create the output file name.

 

Something like: S16-10_SN1234_Mike_Initial Test

 

What's the easiest way to do this?
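The core of it is just formatting the entered fields into one string. A minimal Python sketch with made-up variable names; in LabVIEW the equivalent is Format Into String or Concatenate Strings feeding Build Path, with the values read from the front-panel controls.

# Placeholder values, as entered by the user on the front panel
model = "S16-10"
serial = "1234"
tester = "Mike"
phase = "Initial Test"

file_name = f"{model}_SN{serial}_{tester}_{phase}.csv"   # the extension is an assumption
print(file_name)                                         # S16-10_SN1234_Mike_Initial Test.csv

It is also worth stripping or replacing characters that are not valid in file names (for example / \ : * ? " < > |) before building the path.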

 

Thanks,

Jay

 
