Hello,
I'm looking for the simplest way to convert an array of decimal values (ASCII-coded) into a string.
For example, an array containing 50 46 49 should become the string 2.1.
Thanks
Sepp
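For what it's worth, in LabVIEW the Byte Array To String primitive does exactly this conversion. Outside LabVIEW, the same byte-to-text step can be sketched in a few lines of Python (the values below are the ones from the question):

```python
# Each element is the decimal ASCII code of one character:
# 50 -> "2", 46 -> ".", 49 -> "1"
codes = [50, 46, 49]

# bytes() packs the integers as raw byte values; decoding them
# as ASCII yields the text they encode.
text = bytes(codes).decode("ascii")
print(text)  # -> 2.1
```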
Hello, I'm stuck reading OPC UA data and I need help, please.
I successfully created an OPC UA server and wrote matrix data to it, as you can see in the first picture.
But when it comes to reading that data back, I can't do it as desired:
I want to get it back in matrix form.
The second picture, attached, shows the block diagram for the receiving function.
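Without seeing the pictures, a common situation is that the matrix comes back from the server as a flat 1D array plus its dimensions; rebuilding the matrix then only needs the original row/column counts (in LabVIEW, Reshape Array plays this role). A minimal Python sketch of the idea, with made-up example values:

```python
flat = [1, 2, 3, 4, 5, 6]  # values read back as a flat 1D array
rows, cols = 2, 3          # dimensions the matrix was written with

# Slice the flat array into consecutive rows of length `cols`.
matrix = [flat[r * cols:(r + 1) * cols] for r in range(rows)]
print(matrix)  # -> [[1, 2, 3], [4, 5, 6]]
```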
Well, I don't know what else to say other than that I have no idea where to go with this one, since it's an undefined error: Error 11 on TCP Wait On Listener. I didn't catch this error earlier because it usually occurs only after the VI has been running for 10 minutes or more.
I've attached my VI; the TCP logic is in the loop in the upper right. LabVIEW 2020 and a cRIO-9053.
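Independent of what Error 11 turns out to be, one resilience pattern for long-running listeners is to treat a timeout on the wait/accept step as "no client yet" rather than as a fatal error, and only stop the loop on genuinely unexpected errors. A rough Python sketch of that pattern (this is an illustration, not the original VI's logic):

```python
import socket

# An accept step that treats a timeout as "no pending connection"
# and lets the caller simply loop again, instead of aborting.
def accept_one(listener, timeout_s=0.1):
    listener.settimeout(timeout_s)
    try:
        conn, _addr = listener.accept()
        return conn
    except socket.timeout:
        return None  # no client yet; not an error

listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free port
listener.listen(1)
result = accept_one(listener)    # no client connects in this demo
print(result)  # -> None
listener.close()
```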
Hey NI community,
I have two VIs on two separate targets to achieve the following tasks:
1) create a .tdms file in real time (on real time crio target)
2) convert the .tdms file to .blf (on windows target)
You will see in this snippet how the two VIs are arranged within the project:
I had to do it this way since the Create tdms file in real time vi uses functions (like DAQ Assistant) that run on real time target, whereas the BLF functions in Convert .tdms to .blf vi run only on Windows.
There is also an intermediate step between the ones I listed above that requires me to manually copy the tdms file created in the first vi from the remote directory it gets saved to, to a directory on the host computer (I use FTP and WinSCP for file transfer).
I was wondering if there is a way to deploy the Convert .tdms to .blf VI from within the Create tdms file in real time VI, since I would like to have a single program that reads sensor data, logs it in real time, and produces a .blf file as its final deliverable. (I know there is a way to programmatically transfer files between the host and real-time systems, so the intermediate step should be taken care of through that.)
Looking forward to hearing from you.
Thank you!
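On the manual-copy step: since the cRIO already serves the file over FTP, that transfer can be scripted instead of done by hand in WinSCP. A hedged Python sketch using the standard library's ftplib; the IP address, credentials, and paths below are placeholders, not values from the project:

```python
from ftplib import FTP

# Placeholder values -- substitute your cRIO's address and file paths.
CRIO_IP = "192.168.1.100"
REMOTE_FILE = "/home/lvuser/data/log.tdms"
LOCAL_FILE = "log.tdms"

def fetch_tdms(ip, remote_path, local_path, user="anonymous", pw=""):
    """Download one TDMS file from the real-time target over FTP."""
    with FTP(ip) as ftp:
        ftp.login(user, pw)
        with open(local_path, "wb") as f:
            # RETR streams the remote file into the local copy.
            ftp.retrbinary("RETR " + remote_path, f.write)
```

The same step could also be done natively in LabVIEW with its file-transfer capabilities, as noted above; the sketch just shows that the intermediate step is automatable.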
I am trying to ensure backward compatibility for a cluster saved as part of a binary file. This cluster may change over time as new info becomes available. To achieve backward compatibility, the cluster is made the private data of a class, and I flatten the class to XML before saving it to a section of the binary file (which holds other data types apart from the cluster). When I want to read the cluster back, I simply unflatten from XML to the new class (with its private data changed). This works sometimes, but not when the cluster is populated with certain data types such as 2D pictures; in that case LabVIEW says the data is corrupt and cannot be unflattened to any LabVIEW class. Could this be a bug? I am using LabVIEW 17.0f2 (64-bit).
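Separately from whether the 2D-picture failure is a bug, the versioning idea itself can be illustrated outside LabVIEW: store a version number next to the flattened payload, and patch in defaults for fields that older files predate. A Python/JSON sketch with made-up field names:

```python
import json

# Illustrative only: versioning a flattened "cluster" so old files
# stay readable after the cluster grows. Field names are invented.
def save(settings, version=2):
    return json.dumps({"version": version, "data": settings})

def load(blob):
    rec = json.loads(blob)
    settings = rec["data"]
    # Version-1 files predate the "gain" field; supply a default.
    if rec["version"] < 2:
        settings.setdefault("gain", 1.0)
    return settings

old_blob = json.dumps({"version": 1, "data": {"offset": 0.5}})
print(load(old_blob))  # -> {'offset': 0.5, 'gain': 1.0}
```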
Hi,
Can anyone explain to me the ramifications of having multiple parallel loops all accessing the same open FPGA reference? As an example, this technique is used in the CompactRIO Project Template.
Are there any hidden mutexes on the reference? What happens if multiple loops all try to read from the same variable?
Thanks,
Tom
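For intuition on what a hidden mutex would mean in practice: parallel loops sharing one reference don't corrupt each other, they just get serialized, each one blocking briefly while another holds the lock. A small Python sketch of that behavior (illustrative only; it says nothing about what the FPGA interface actually does internally):

```python
import threading

lock = threading.Lock()        # stands in for a hidden mutex on the reference
shared_value = {"n": 0}        # stands in for the shared FPGA variable
results = []

def reader(loop_id):
    # Each "parallel loop" takes the lock before touching the shared
    # reference, so accesses are serialized rather than simultaneous.
    for _ in range(1000):
        with lock:
            results.append((loop_id, shared_value["n"]))

threads = [threading.Thread(target=reader, args=(i,)) for i in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(results))  # -> 3000: every access completed, none lost
```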
Hello, I hope you are all doing well.
Let me lay out my work and my questions:
1. I generate a 60 Hz sinusoidal signal with a signal generator.
2. I acquire that signal with an NI-9225 voltage module, using a cRIO-9024.
3. I process the acquired signal.
I first implemented this process in Scan Mode programming and had no major problems.
Now I have to embed this program in the cRIO's FPGA. So far I have run signal-acquisition tests with FPGA programming, but the result is always 0. I am attaching an image so you can see the FPGA program.
I hope someone can help me; thanks in advance.
Hello!
I'm having an issue with changing the Hostname on a cDAQ-9189 chassis using NI MAX.
I was able to change the Hostname and Name when initially setting up the chassis, but now when I try to change the Hostname (a very minor change: from "cDAQ-9189-BB07-CHAS01" to "cDAQ-9189-BB01-CHAS01") after the restart it reverts back to the old Hostname.
Here's what I've done so far to troubleshoot it:
After every change, I hit "Save" and then "Restart" as prompted. I've also made a few attempts where I added a Comment to the device, and these get blanked out as well. NI-MAX 19.5 would show the settings revert back to their pre-edit values even before the chassis restart was under way. In NI-MAX 20.5, it at least showed my requested changes until it finished restarting, at which point the settings would revert. No warnings are shown before or after the restart.
Additional info: This chassis has both methods of time sync turned off and has a static IP address on a "DAQ Only" network (subnet 192.168.127.x/24) behind an unmanaged switch; a second Ethernet adapter connects the PC on the DAQ Only network to our corporate network, which uses subnet 10.9.18.x/24. Along with this 9189, there are nine 9185 chassis daisy-chained together, with the 9189 as the first chassis in the chain, connected directly to the switch. I'd have skipped the switch entirely, but we have some serial servers on this DAQ Only network as well, also with static IP assignments. There are no duplicate addresses on the network.
I saw a post from 2015 that talks about this issue, but that post wasn't resolved: Can't rename can chassis in MAX (9188XT) - NI Community
Is this a known/accepted issue with NI-MAX? Is there something I'm doing wrong? I was using long names (up to 25 characters), which isn't valid for NetBIOS, but NetBIOS compliance isn't important, is it?
Also: I tried changing one of the 9185 chassis Hostnames and it accepted the change without issue. Both the 9189 and 9185 had names equal in length, with characters valid for NetBIOS even if the lengths were too long.
At this point I think my only option is to do a factory reset on the chassis (5+ seconds on Reset button on chassis) and see what works after that. Any ideas on what to try other than a factory reset? If a factory reset is the only option, can it be triggered programmatically or is the reset switch the only way?
Thanks,
Erik
I've been studying and implementing the NI QMH in my programs, but I've realized there is a lot more interest in, and many more resources for, the DQMH. If I jump into DQMH before I'm any good at the regular QMH, will I regret it? Or am I just wasting my time learning QMH before DQMH?
I just need some guidance from someone who can see the bigger picture.
Thanks.
Hi,
I have a program that creates and writes to a TDMS file.
The file name is built using the Format Date/Time String function to include the date and time.
For example,
abc_date_time PM.tdms
abc_date_time AM.tdms
If the PC's display language is Chinese, the AM/PM part of the TDMS file name is written in Chinese characters instead of "AM"/"PM" (and likewise inside the file's content).
My program then cannot properly read and load the file.
Can anyone enlighten me?
How do I handle this when the PC is set to another display language?
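The AM/PM text comes from the operating system's locale, which is why a Chinese-language PC writes it in Chinese characters. One way around it is to keep every locale-dependent token (such as %p) out of the format string and use a 24-hour clock (%H) with digits only, so the file name is identical on every PC. A Python sketch of the idea using a fixed example time:

```python
from datetime import datetime

# Locale-dependent: %p yields "AM"/"PM" on English systems, but
# localized text (e.g. Chinese characters) elsewhere.
# Locale-independent: 24-hour clock, digits and separators only.
stamp = datetime(2023, 5, 4, 14, 30, 0).strftime("%Y-%m-%d_%H-%M-%S")
filename = f"abc_{stamp}.tdms"
print(filename)  # -> abc_2023-05-04_14-30-00.tdms
```

The same reasoning applies to LabVIEW's Format Date/Time String, which uses similar format codes.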
Hi,
We have a VIAVI MTS6000A bit error tester that we can send SCPI commands to, and receive responses from, through PuTTY. But none of these SCPI commands works through the LabVIEW VISA Write or Read functions. The communication is over a LAN cable. I was able to create a VISA instrument name in NI MAX using the IP address and port number, and I can open and close a VISA session without error. But even a *IDN? query through the VISA test panel in NI MAX doesn't return anything, while the same command works fine in PuTTY. I am fairly new to LabVIEW and my networking knowledge is a bit sketchy. Does anyone have an idea what is going wrong here? I am not sure whether the instrument supports the LabVIEW VISA drivers.
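Since PuTTY over the LAN works, the instrument evidently speaks SCPI over a raw TCP socket. In VISA, such a resource is typically addressed as TCPIP0::&lt;ip&gt;::&lt;port&gt;::SOCKET, and the classic reason *IDN? hangs there is the termination character: reads need the termchar enabled and every write needs a trailing newline, which PuTTY adds for you when you press Enter. The newline requirement is visible in a plain-socket Python sketch (the address and port below are placeholders):

```python
import socket

def scpi_query(ip, port, command, timeout_s=2.0):
    """Send one SCPI query over a raw TCP socket and return the reply."""
    with socket.create_connection((ip, port), timeout=timeout_s) as s:
        # SCPI-over-socket commands must end with a newline terminator;
        # a missing "\n" is the usual reason *IDN? never answers.
        s.sendall((command + "\n").encode("ascii"))
        return s.recv(4096).decode("ascii").strip()

# Example (placeholder address): scpi_query("192.168.0.50", 5025, "*IDN?")
```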
Sometimes I have trouble configuring the tick marks of an XY graph by changing the X Scale Range property with a property node.
I would like to check whether the X-scale properties (tick marks of the XY Graph) are copied to a new destination (error plot).
Can you check why my code (tick increament.vi) is not working?
Is it a bug in LabVIEW 2018?
labmaster.
Hi,
when I launch LabVIEW, the following problem appears:
I need your help to fix the problem.
Hi everybody,
The operating system I am using is Windows 10, with LabVIEW 2019 (32-bit) and a cRIO-9104, but I found that it cannot connect in NI MAX and shows error code -52005. What should I do?
Hello everyone. I want to build a human-detection system using the IMAQ tools in LabVIEW. Please give me some advice on where to start; for example, should I go with color detection or something else? If you have experience with this, please assist. I only need to detect a human who comes in front of the camera. Any advice or support is appreciated.
Hello everyone,
I've been working in LV 2020 on a project with legacy code with no problems, until I checked the "Add files to new project library" option in the Destination build properties.
I can't find the LabVIEW project library named in that option, and now the latest and all previous EXEs are broken due to several missing external functions in a DLL.
I already undid that change, and also did a build after adding the DLL to the project and to the "Always Included" section of the Source Files page, but that didn't fix my EXE files.
However, the EXE works on my coworker's computer.
Could someone please help me with this issue?
Thanks
Iratxe
hello everyone,
I have a question. I am building a temperature-measurement setup with an Arduino, with the result displayed in a LabVIEW GUI. I got communication with the Arduino working, but the result is an error. FYI, I tried with only the Arduino hardware, without the sensors. Is it because I didn't install the sensor that it behaves like this? I can see the value come out on the Arduino serial monitor.
Thanks in advance.
I would like to assign different sample rates to each channel (or at least each module), across modules of different types (AI Force Bridge, AI Temp, AI Voltage, ...).
Is it possible?
Without creating additional sample clocks?
How (a picture would help me understand immediately)?
Is it efficient?
Are there other approaches?
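How far this goes depends on how many timing engines the hardware offers, but a common workaround when only one sample clock is available is to acquire every channel at the highest rate needed and decimate the slower channels in software. A Python sketch of the decimation step, with made-up rates and data:

```python
# Everything is acquired at the fast rate; the "slow" channel keeps
# only every Nth sample to emulate a lower sample rate in software.
fast_rate = 1000   # Hz, rate of the shared sample clock (example value)
slow_rate = 100    # Hz, desired rate for e.g. a temperature channel
factor = fast_rate // slow_rate

samples = list(range(20))         # 20 samples from the fast channel
slow_samples = samples[::factor]  # keep every 10th sample
print(slow_samples)  # -> [0, 10]
```

A low-pass filter before decimation would avoid aliasing if the slow channels contain higher-frequency content; the slicing above only shows the rate reduction itself.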
At approximately the 12.5-second point in the attached file there is a change in pitch. I have been trying several methods of detecting that change programmatically, to no avail. Does anyone have a suggestion?
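One programmatic approach is to split the signal into short windows and track a pitch estimate per window; a sudden jump in the estimate marks the change. The sketch below uses a crude zero-crossing count as the pitch proxy on a synthetic two-tone signal (the 12.5 s file itself isn't reproduced here), just to show the detection logic:

```python
import math

def zero_cross_rate(window):
    # Crude pitch proxy: the number of sign changes per window
    # is roughly proportional to the dominant frequency.
    return sum(1 for a, b in zip(window, window[1:]) if a * b < 0)

fs = 8000
# Synthetic test signal: 200 Hz for 1 s, then 400 Hz for 1 s.
# (A small phase offset keeps zeros off exact sample instants.)
sig = [math.sin(2 * math.pi * 200 * t / fs + 0.3) for t in range(fs)] + \
      [math.sin(2 * math.pi * 400 * t / fs + 0.3) for t in range(fs)]

win = 800  # 100 ms analysis windows
rates = [zero_cross_rate(sig[i:i + win]) for i in range(0, len(sig), win)]

# A jump in the per-window rate marks the pitch change.
change = next(i for i in range(1, len(rates)) if rates[i] > 1.5 * rates[i - 1])
print(change * win / fs)  # -> 1.0  (seconds)
```

For real audio, a smoother estimator (autocorrelation or an FFT peak per window) would be more robust, but the windowed jump-detection structure stays the same.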
Hello,
I am trying to update a small VI from the NI-6534 (Traditional DAQ 7.5) to the NI-6535 (unfortunately Traditional DAQ is not supported) with DAQmx 8.8.
I have been wondering how DAQmx (8255) handshaking actually works.
I know that in Traditional DAQ there were fixed control lines on the NI-6534 (REQ, ACK, ...) and you could simply configure ports as outputs with handshaking parameters.
However, I do not see any of these signals on the NI-6535 (I think they are called PFI something); moreover, the DAQmx Timing VI (Handshake 8255) doesn't specify any source for REQ or ACK.
So the main reason I created this post is to ask how DAQmx handshaking works, and how these control lines are used, since I need the ACK and REQ signals.
I admit that I am not sure of any statement above, so please correct me if I have misinterpreted anything.
Please find both card pinouts enclosed.
Thanks in advance
DRALV