Hi! I've done a little searching for answers here but am still confused, so I figured it was time to write my own question.
I have an engine control application with an entire (32ch) 9205 module's worth of medium-speed analog inputs. I'd like to add a lowpass filter to each channel, and I'd like to be able to adjust the cutoff frequency from the RT at runtime (or, ultimately, from a field-technician host utility).
I'm having a problem with the RT implementation around the Butterworth Coefficients VI. I assumed I could put the RT filter-configuration VI in a for loop to generate a configuration for each channel. I did get this code to run successfully once (I think), but with a large RT processor penalty. After I attempted to change the cutoff frequency from the RT, LabVIEW lost communication with the target (an NI 9068) and the RT app appeared to hang, with no error and no front-panel indication of high processor use (though high CPU load is my suspicion).
The unresponsive state then repeated every time I tried to run my top-level RT VI, and the issue went away as soon as I removed the Butterworth Coefficients for loop from the application.
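In case it clarifies what I mean, here's roughly what I believe that per-channel coefficient loop amounts to, sketched in C. This assumes a 2nd-order lowpass designed via the bilinear transform; all the names, the 10 kHz scan rate, and the 100 Hz cutoff are placeholders of mine, not anything taken from the NI VI:

#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define NUM_CHANNELS 32

/* One set of 2nd-order (biquad) lowpass coefficients per channel. */
typedef struct {
    double b0, b1, b2;  /* feedforward */
    double a1, a2;      /* feedback (a0 normalized to 1) */
} BiquadCoeffs;

/* Textbook bilinear-transform lowpass design; Q = 1/sqrt(2)
   gives the Butterworth response. */
static void butterworth_lowpass(double fc, double fs, BiquadCoeffs *c)
{
    const double q    = 1.0 / sqrt(2.0);
    const double k    = tan(M_PI * fc / fs);
    const double norm = 1.0 / (1.0 + k / q + k * k);

    c->b0 = k * k * norm;
    c->b1 = 2.0 * c->b0;
    c->b2 = c->b0;
    c->a1 = 2.0 * (k * k - 1.0) * norm;
    c->a2 = (1.0 - k / q + k * k) * norm;
}

int main(void)
{
    BiquadCoeffs coeffs[NUM_CHANNELS];
    const double sample_rate_hz = 10000.0;  /* placeholder scan rate */

    /* The equivalent of my RT for loop: recompute one coefficient
       set per channel whenever a cutoff changes. */
    for (int ch = 0; ch < NUM_CHANNELS; ch++)
        butterworth_lowpass(100.0 /* example cutoff, Hz */,
                            sample_rate_hz, &coeffs[ch]);

    return 0;
}

Written like that, it's only a handful of trig and multiply operations per channel, which is part of why the processor penalty surprised me.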
The FPGA code appears to happily compile and run.
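For completeness, my understanding is that the FPGA side just applies the standard direct-form biquad to each sample with whatever coefficients it's handed. A floating-point C sketch of that update (again, the naming is mine; the real FPGA code would be fixed-point):

/* Per-channel filter state: the previous two inputs and outputs. */
typedef struct {
    double x1, x2;
    double y1, y2;
} BiquadState;

/* One Direct Form I biquad update per sample. A real FPGA
   implementation would be fixed-point, but the structure is the same. */
double biquad_step(double b0, double b1, double b2,
                   double a1, double a2,
                   BiquadState *s, double x)
{
    double y = b0 * x + b1 * s->x1 + b2 * s->x2
             - a1 * s->y1 - a2 * s->y2;

    s->x2 = s->x1;  s->x1 = x;
    s->y2 = s->y1;  s->y1 = y;
    return y;
}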
I've tinkered with the details but am unsure how to design this better, so I turn to ye magicians. Is this a more complicated task than I think it is?
Attached is stripped-down code that represents the RT and FPGA setup.