Hi all,
My experience with LabVIEW is pretty limited, and so far I've mostly been editing and debugging code that has been handed down to me. I've been using USB DAQ cards to sample at around 10 samples/second, and haven't cared too much about precise timing. I'm looking at a new application for NI software and hardware, and have identified the USB-6343 card as the one to use (500 kS/s input, 900 kS/s output). The plan is to generate an accurate output signal (accurate in timing and amplitude) that is slightly dependent on the input signal. I do not plan to log data to a file at this rate, just input, process, and output.
Is it relatively straightforward to sample and output analog signals at these high sample rates? Currently I just use timed loops with millisecond precision, but this would require microsecond-precision timing. Is that something that is pretty standard to achieve, or are these sampling rates achieved in a completely different way? Can anyone provide links to point me in the right direction?
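To show my mental model of what "hardware-timed" might look like, here's a rough sketch using the Python nidaqmx API (just because it's easier to paste as text than a LabVIEW diagram; I'd actually build this in LabVIEW). The device name "Dev1", the channel, and the chunk size are placeholders, and I'm assuming the card's onboard sample clock paces the acquisition rather than a software loop. Please correct me if this is the wrong idea:

```python
# Rough sketch only -- "Dev1", channel, and buffer sizes are placeholders.
import nidaqmx
from nidaqmx.constants import AcquisitionType

SAMPLE_RATE = 500_000   # samples/s, paced by the card's hardware sample clock
CHUNK = 50_000          # samples pulled per loop iteration (0.1 s of data)

with nidaqmx.Task() as ai_task:
    ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0")

    # Hardware-timed, continuous acquisition: the onboard clock times every
    # sample, so the software loop only has to keep up on average instead of
    # hitting microsecond deadlines itself.
    ai_task.timing.cfg_samp_clk_timing(
        SAMPLE_RATE,
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=CHUNK * 10,   # buffer size hint
    )

    ai_task.start()
    while True:
        # Blocks until CHUNK samples are available, then returns them.
        data = ai_task.read(number_of_samples_per_channel=CHUNK)
        # ...process data and feed a similarly configured AO task here...
```

My (possibly wrong) understanding is that the output side would be a second task configured with the same sample-clock timing, so the loop itself never needs microsecond precision.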
I'm sorry if this question has already been asked, but hopefully it's generic enough to not be a burden. This is a project I'm taking up in my spare time, so I'm just beginning to plan things out, which includes figuring out how difficult the software side of things is going to be.
Thanks.
-A