Channel: LabVIEW topics
Viewing all 69758 articles

Graceful deploy to cRIO


Is there a more graceful way to deploy a built executable to cRIO? Is that what the new Undeploy feature of LabVIEW 2021 is all about?

 

I keep getting this dialog on deployment.

 

Photon_Dan_0-1632159448279.png

 

Attempts to programmatically end the execution of the Real-Time Startup Application prior to new deployment do not seem to work.

The next issue is that clicking OK on that dialog always leads to this next one.

 

Photon_Dan_1-1632159541693.png

Which leads to...

Photon_Dan_2-1632159581814.png

...making me have to run the Deploy again. That brings up this dialog, which is somewhat similar to the first, but this time there is an Apply button.

Photon_Dan_3-1632159695177.png

At that point, I see some Deployment progress with a bunch of files scrolling by. Often, I also see the Waiting dialog like this.

Photon_Dan_4-1632159758732.png

After that, the deployment succeeds and we reboot the cRIO to run the new Startup Application.

 

I have been living with the behavior for some time. Is there a better way?

 


NI 6583 maxes out at 15MHz?


I am using the NI 6583 to read in a clock, which ideally needs to go up to 80 MHz. In my setup, the clock is essentially tied to a counter.

 

My clock is connected to the PFI/STROBE/DDC A pin of the 6584 cable, and my VI is essentially a timed loop (with DDC A as the frequency source). Inside the timed loop there is just a feedback loop repeatedly NOT-ing a Boolean, plus a CLIP that counts every 80E6 positive edges.

 

However, I noticed that anything above 15 MHz errors out, giving essentially this error (https://knowledge.ni.com/KnowledgeArticleDetails?id=kA00Z000001DbHOSA0&l=en-US). According to the datasheet, it should go well beyond 80 MHz (9 ns is ~111 MHz), and my clock is running well before I start my code.

 

Any guidance would be appreciated! Thanks!

Accelerometer data acquisition


Hello!

I have a problem with my accelerometer data acquisition. I am using an ADXL203EB two-axis accelerometer connected to a myDAQ. I used the single-sample (on-demand) acquisition mode, and I wonder how to control the acquisition rate with this method. I tried continuous sampling at my desired rate of 100 Hz, but the results come out very odd!

Thank you for your help in advance!

 

Ali

MIFSystemUtility DLL cannot be loaded


Meldung.PNG

Hello,

 

I installed NI Package Manager and tried to install "LabVIEW Student 2019", but I always get the error message "MIFSystemUtility DLL cannot be loaded". I have read some suggested solutions and tried to follow them as described, but I still get the same error message. I don't know what else to do.

 

I followed the steps described in this link:

https://knowledge.ni.com/KnowledgeArticleDetails?id=kA00Z0000015CbwSAE&l=en-IN

 

Can someone please tell me how I can install LabVIEW?

I am using Windows 10.

 

Thanks in advance

 

Program working in 2019 version but giving 0 values in 2015 version


Hello

 

I have code that uses the 'iirnotch' filter and 'trapz' function through the MathScript RT Module. I wrote it in LabVIEW 2019 and it worked fine, but when I saved it for LabVIEW 2015 and ran it on another computer, it gives 0 for all outputs. What should I do?

Communicate with USRP's FPGA over 1 Gb Ethernet


Hello,

 

I'm interested in communicating with an X310 USRP's FPGA over 1 Gb Ethernet.

I have checked the NI forums and found an example of setting up a USRP for 10 Gb Ethernet support. Do you have any resources or advice on setting up a USRP for communicating with its FPGA over 1 Gb Ethernet?

 

Thanks in advance

Project for my class


Hello community, I would really appreciate your help with a project. It is about seats in a cinema: when a seat is selected, the image of that seat (and of the seat map) changes. I have an example, but my version has to be different from it. I would appreciate any help, as this project is very important to me. Thank you very much.

 

How to convert a .c file to a .so, so I can integrate the .so with myRIO


Hello

In my project I have developed many functions in MATLAB, and now I want to deploy them to a myRIO. My plan is to convert the .m files to .c using MATLAB Coder, compile the .c files into a .so using Eclipse (real-time), and then call the .so from LabVIEW via a Call Library Function node. But I don't know how to compile the .c files into a .so. Can anyone please explain the steps to build the .so and integrate it in LabVIEW?


LabVIEW BLE Toolkit read characteristic values


Hello everyone, 

 

I have loaded a script to an ESP32 board that sends characteristic values (IMU data) to a central device. In LabVIEW, I used the LabVIEW BLE (Bluetooth Low Energy) toolkit and the ble_read_write_characteristics.vi. 


My device connects successfully and its name appears in the respective indicator field. My problem is that I don't know how to read the data sent from the ESP32 board. When I use my phone instead of LabVIEW, I receive the data as notifications. So, is there a way to do the same in LabVIEW?

Thank you

Problem reading the file path from the Write to Measurement File module


I have an NI 9229 that generates data, and Write to Measurement File writes the acquired data to a text file normally. It is configured to generate a new output file every hour. After an output file is generated, a Python node reads the path of that file and performs certain tasks. However, the path disappears as soon as the output file is generated. Also, I think the Python node should only start working after the output file has been generated. Kindly help me solve these problems.
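On the Python side, one way to make the processing start only after the hourly file exists is simply to poll for it. Here is a minimal sketch; the function names are hypothetical, `process_file` is a placeholder for whatever the real Python node does, and the disappearing-path problem on the LabVIEW side may still need a separate fix:

```python
import os
import time

def wait_for_file(path, timeout_s=10.0, poll_s=0.1):
    """Poll until `path` exists and is non-empty, or raise on timeout."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if os.path.exists(path) and os.path.getsize(path) > 0:
            return path
        time.sleep(poll_s)
    raise TimeoutError(f"{path} did not appear within {timeout_s} s")

def process_file(path):
    # Placeholder for the real per-file processing,
    # e.g. here it just counts the logged lines.
    with open(path) as f:
        return sum(1 for _ in f)
```

The LabVIEW equivalent would be to wire the "Saving Data" / new-file event (or the file path output) through a case structure so the Python node only runs once a completed file path is available.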

TDMS: Best way to store small integers with scaling?


I'd like to log high-speed data in a TDMS file. The data arrives as I16 (non-DAQmx), with some scaling + offset.

 

If I call "TDMS Set Scaling Information" and then write I16 arrays to the TDMS file, they all get converted to DBL behind the scenes (NI_DataType == 10). This seems a bit wasteful and causes file bloat.

 

Is there a way to store scaling information without triggering an implicit conversion to DBL? Or would I have to scale all the data myself and then write smaller types (say, I32 or SGL) into the TDMS file without writing the scaling info?
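For reference, the manual-scaling route described above can be sketched outside LabVIEW. This is only an illustration in Python/NumPy; the property names are hypothetical stand-ins for whatever custom channel properties you choose to hold the gain and offset, rather than NI's scaling objects:

```python
import numpy as np

# Hypothetical channel metadata stored as plain TDMS properties,
# so the raw data can stay I16 instead of being promoted to DBL.
props = {"my_scale_gain": 0.0025, "my_scale_offset": -1.25}

raw = np.array([-32768, -1, 0, 1, 32767], dtype=np.int16)  # as acquired

# Writer side: write `raw` (2 bytes/sample) plus the two properties.
# Reader side: reconstruct engineering units on demand.
scaled = raw.astype(np.float64) * props["my_scale_gain"] + props["my_scale_offset"]

# The raw I16 array is 4x smaller than the DBL data the
# implicit conversion would have written.
assert raw.nbytes * 4 == raw.astype(np.float64).nbytes
```

The trade-off is that generic TDMS readers (Excel plugin, DIAdem) won't apply custom properties automatically, whereas NI's scaling information is applied for you.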

Enabling the timeout in Open Application


I would like to control B.exe from A.exe through Open Application on the same computer (localhost).

I expected A.exe to wait up to 60 seconds (the default timeout) for B.exe to become ready, but it gave up after only a few seconds.

Under these conditions, the user has to take care of the sequence and timing of launching the applications to avoid a connection error.

I suspected the network port had to be opened before calling Open Application, but I got the same result after a clean boot of the computer.

*) This problem also appears in the LabVIEW development environment.

 

How can I enable the timeout (60 sec)?

 

labmaster.

Unpacking I32 into 10-bit format, then from 10 bits to 16-bit integers using the MSB


Hi there, 

 

I have an ADC card that reads a waveform and outputs data streams. The acquired samples are returned in an int32 array, where each int32 value packs several 10-bit raw samples. To read the data correctly, I need to unpack the output data into 10-bit samples, then convert them from 10-bit to 16-bit integers using the most significant bits.

 

Meanwhile, I have no idea how to unpack this int32 array into 10-bit values and then into MSB-aligned 16-bit integers, since a 10-bit conversion is not a built-in function in LabVIEW. The only similar post I found is this one

https://forums.ni.com/t5/LabVIEW/How-do-you-read-in-and-work-with-I24-binary-data-into-LabView/td-p/3126027?profile.language=en

which converts 24-bit data to 32/16-bit, but I'm still not sure how to apply that to my case. Any help would be appreciated.
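As a sketch of the arithmetic (not LabVIEW code), here is the unpacking in Python/NumPy. The assumptions of three 10-bit samples per int32 (with 2 bits unused) and LSB-first packing order are guesses; check the ADC card's manual and adjust the shifts accordingly. In G, the same idea is shift-right, AND with 0x3FF, and shift-left by 6:

```python
import numpy as np

SAMPLES_PER_WORD = 3   # assumption: 3 x 10-bit samples per int32, 2 bits unused
BITS = 10

def unpack_10bit(words):
    """Unpack 10-bit samples from int32 words, MSB-aligned into int16.

    The packing order (LSB-first within each word) is an assumption;
    adjust the shift list if the card packs MSB-first.
    """
    w = np.asarray(words, dtype=np.uint32)
    shifts = [i * BITS for i in range(SAMPLES_PER_WORD)]          # 0, 10, 20
    raw = np.stack([(w >> s) & 0x3FF for s in shifts], axis=-1).ravel()
    # MSB-align: the 10-bit sample occupies the top bits of 16, so the
    # full-scale range matches a 16-bit converter's range.
    return (raw << (16 - BITS)).astype(np.uint16).view(np.int16)
```

Whether the result should then be interpreted as signed two's complement or offset binary depends on the ADC's output coding, which the card's manual should specify.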

 

Thank you for your help in advance!

 

Marcus

Error 7 occurred at New VI in MemberViCreation.lvlib


When I try to select a VI to override from the parent class (FrontendInterface, as shown in the picture), this error occurs and I do not know why. Has anyone had the same problem? Any idea how I can solve this issue?
Labview_Error7.jpg

Looking for Cleanest Way to Check Indices


I am processing some data in which one of the data items is an index. It should be that each index is present exactly once.  I can assemble the indices into an array, sort it, then check that the element values match their indices.  But I also want 2 lists: One of indices that are missing, and another of indices that occur more than once.

Here is an example of a valid set of indices:

cic1.png

 

Here is an example of an invalid set of indices:

cic2.png

 

What's a nice clean way of getting the lists of invalid items?
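For what it's worth, both lists can be produced in one pass by counting occurrences instead of sorting. Here is a sketch in Python; in G the equivalent would be building a histogram of the indices and checking each bin for 0 or >1:

```python
from collections import Counter

def check_indices(indices, n=None):
    """Return (missing, duplicated) for indices expected to be
    exactly 0..n-1, each present once. If n is omitted, it is
    taken as len(indices)."""
    if n is None:
        n = len(indices)
    counts = Counter(indices)
    missing = [i for i in range(n) if counts[i] == 0]
    duplicated = sorted(i for i, c in counts.items() if c > 1)
    return missing, duplicated
```

A valid set yields two empty lists; otherwise the two outputs are exactly the lists described above.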


Vibration control of a Random Vibration test


I have a PXI system with:

  1. PXI-4472B analog input module
  2. PXI-6221Multifunction DAQ
  3. PXI-6733 Analog output module

 

Additionally I have a vibration shaker along with its power amplifier.

 

I want to perform a random vibration test using these devices. This test is demonstrated here.

Briefly;

  1. I'm given a power spectral density (PSD) curve of the random vibration, called the reference PSD.
  2. I need to create a time history of a random signal whose PSD matches the given PSD as closely as possible.
  3. This time signal should be output to the power amplifier of the shaker.
  4. An accelerometer measures the acceleration generated at the shaker table; this is called the control accelerometer.
  5. The PSD of the control accelerometer's measurements is calculated and compared with the reference PSD, and the difference is used to construct a feedback control law that minimizes it, so that the measured PSD tracks the reference PSD as closely as possible.

So my questions are:

  • Are there any existing modules in LabVIEW or the LabVIEW Sound and Vibration Toolkit that accept the reference PSD and generate the corresponding time-history signal?
  • Are there any existing modules in LabVIEW or the LabVIEW Sound and Vibration Toolkit that perform the necessary feedback control to minimize the difference between the reference PSD and the measured PSD?
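For step 2, a common approach is to give each FFT bin a magnitude derived from the reference PSD and a random phase, then inverse-transform. A minimal Python/NumPy sketch of that idea follows; it is a generic random-phase method, not necessarily what the Sound and Vibration Toolkit does internally:

```python
import numpy as np

def time_history_from_psd(freqs, psd, fs, n, seed=None):
    """Generate an n-sample, zero-mean random time signal whose
    one-sided PSD (units^2/Hz) approximates `psd` defined at `freqs`,
    for sample rate `fs`, via random-phase inverse FFT.
    """
    rng = np.random.default_rng(seed)
    f = np.fft.rfftfreq(n, d=1.0 / fs)
    # Interpolate the reference PSD onto the FFT bin frequencies.
    s = np.interp(f, freqs, psd, left=0.0, right=0.0)
    # One-sided PSD -> FFT magnitude:  Pxx[k] = 2*|X[k]|^2 / (fs*n)
    mag = np.sqrt(s * fs * n / 2.0)
    phase = rng.uniform(0.0, 2.0 * np.pi, size=f.size)
    spectrum = mag * np.exp(1j * phase)
    spectrum[0] = 0.0                  # force zero mean
    spectrum[-1] = spectrum[-1].real   # Nyquist bin must be real
    return np.fft.irfft(spectrum, n)
```

By Parseval's theorem, the signal's variance comes out close to the area under the reference PSD, which is a quick sanity check before closing the control loop around the measured PSD.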

 

Thanks

RS-485 program running dynamically doesn't read the holding registers


I have a program where I need to measure an RS-485 signal using an oscilloscope. For that, there must be continuous transmission between the master and slave (the PC and the RS-485 device) while I read the signal in parallel with a PicoScope. So I wrote a separate program for the RS-485 transmission and run it dynamically. When I run it, the parallel dynamic program runs, but instead of an output I get error -1073807339 (a VISA timeout) from Read Holding Registers. When I run the program with Highlight Execution to investigate, the error does not occur and Read Holding Registers reads the data. So I put in a delay instead of Highlight Execution, but there is still no output from Modbus Read Holding Registers, just error -1073807339. What is the reason for this? Kindly help me, thank you. I have attached the program.

PXI-4065


Hi,

I would like your help.

I want to know whether I can observe a PWM signal using the NI PXI-4065.

Best regards.

 

Igus dryve D1


Hey,

 

How can I manually add hardware that doesn't show up automatically? I am trying to connect an Igus dryve D1 via Modbus. If I search for it via IP, it says a password is needed, but there is no password; and if I set a password, it always says it is incorrect. I have the NI-VISA drivers installed. I am able to connect to and configure the Igus dryve D1 via the browser, but it doesn't show up in LabVIEW NXG.

 

Thank you for helping

 

Koobn

Can/do network streams use an ethernet cable between RTT and host?


I think this must be simple, and that’s why I can’t find it explicitly said anywhere, but lots of googling still hasn’t helped.

 

I have a cRIO-9045, which will mostly run headlessly. Occasionally a user will need to connect a PC to view the UI and put the program in maintenance mode. They’ll then disconnect and the program will keep running. (pretty standard)

The company we're working with wants the cRIO to have a permanently connected Ethernet cable in the installation, and the user will occasionally come along with their laptop and connect to the other end of it. I feel like this can't be how things are normally done, but we need to meet ATEX (explosion-proof) standards and they feel it'll be easier.

 

This is where I've realised I'm not sure how the different networking methods line up with the hardware setup. I've been under the impression that the standard way to connect remotely doesn't use a direct cable between the RIO and host: somehow they connect across a network and find each other using their IP addresses. But clearly I don't really understand how this works.

 

So I have a couple of questions:

  • When we’re talking about network streams, or TCP, do these assume a physical cable connecting the two or not? Do I need to do something differently to the examples (such as this) because I have a physical connection?
  • Currently, if I want to deploy to my cRIO I use an ethernet cable, but the connection occurs seemingly automatically. In an exe, would I just need to poll for this connection? Is that possible?
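On the first question: network streams and TCP only assume IP connectivity between the two endpoints; whether that is a direct cable, a switch, or a larger network is invisible at that level. On the second: yes, an exe can simply retry the connection until a peer appears. A minimal Python sketch of that polling idea (in G, the equivalent would be TCP Open Connection, or the network-stream endpoint create, inside a retry loop):

```python
import socket

def poll_for_peer(host, port, timeout_s=2.0):
    """Make one TCP connection attempt; return the socket, or None.

    Call this periodically from the headless side (or the laptop side).
    A direct cable vs. a switched network makes no difference here, as
    long as both ends have compatible, routable IP addresses.
    """
    try:
        return socket.create_connection((host, port), timeout=timeout_s)
    except OSError:
        return None
```

The "automatic" behaviour seen when deploying from the project is the same mechanism: LabVIEW resolves the target's IP address and opens TCP connections to it; a built exe just has to do the connect-and-retry explicitly.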

 

It would be convenient to use the examples as they are given, but I can’t tell whether they are geared towards my setup or not. Thanks in advance for any help. 
