Hi,
I have been using the NI sbRIO for 3 years now in college for educational purposes, and it has served me as a great learning tool to build robots, create algorithms, and deploy code with minimal development time and an intuitive learning curve for students.
But there has always been a question lurking in my mind: how does LabVIEW code get compiled into equivalent machine (binary) code to run on the RT processor? Is there a compilation tool (like the Xilinx HLS tool) that converts a LabVIEW diagram into a bitstream?
I know it's a very noob question to ask, but my mental model comes from microcontrollers (AVR, PIC), where we first write C/C++ code, compile it to generate a HEX file, and then program the controller with a programmer (roughly the flow sketched below).
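Just to make that mental model concrete, here is a minimal sketch of the kind of flow I'm used to, assuming an ATmega328P with the avr-gcc/avrdude toolchain (the part number, programmer, and pin choice are only examples, not anything specific to my setup):

/*
 * Roughly the toolchain steps I mean (illustrative commands, assuming
 * avr-gcc, avr-objcopy and avrdude are available):
 *
 *   avr-gcc -mmcu=atmega328p -Os -o blink.elf blink.c    compile C to an ELF binary
 *   avr-objcopy -O ihex blink.elf blink.hex              extract the Intel HEX file
 *   avrdude -c usbasp -p m328p -U flash:w:blink.hex      "burn" it with a programmer
 */
#define F_CPU 16000000UL          /* assumed 16 MHz clock, needed by _delay_ms() */
#include <avr/io.h>
#include <util/delay.h>

int main(void)
{
    DDRB |= (1 << PB5);           /* configure PB5 (typical on-board LED pin) as output */
    while (1) {
        PORTB ^= (1 << PB5);      /* toggle the LED */
        _delay_ms(500);           /* wait half a second */
    }
    return 0;
}

So the question is basically how the LabVIEW/sbRIO toolchain compares to that kind of compile-then-flash workflow.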
The following questions linger whenever I create a project and deploy it to the sbRIO:
1. How does LabVIEW convert graphical code into an equivalent bitstream for the FPGA?
2. Is it a good technique to first convert a LabVIEW VI to VHDL and then perform synthesis, mapping, and bitstream generation? Isn't this process equivalent to software that converts a VI into C/C++ code, creates a HEX file, and then gets "BURNED" into a controller?
3. Does graphical code get converted directly into machine language on the RT side? I know for sure that a simple VI running on a desktop is compiled directly into machine code.
I may not have explained my questions clearly, but if someone could answer them I'd be really grateful. :)