I recently refactored a legacy VI that was originally written back in the 7.1 days. It used about 20 occurrences to trigger numerous parallel loops, each containing a simple operation.
The code had a complexity of 7.3 (the 2012 code complexity property).
I refactored the code to use a clean producer/consumer architecture, got rid of front-panel elements used for temporary data storage, introduced a state machine for handling UI updates, and generally optimised whatever I could according to my understanding of good coding practice.
After all that, the code complexity is 6.8. I was expecting a MUCH larger drop than that.
What exactly determines code complexity? What has a large impact and what has a small impact? When I compare my modified VI with a version of the OLD VI that a colleague changed (far less extensively than I did), his version gives a code complexity of 4.3, even though it still has 20 parallel loops coordinated with occurrences, each with its own front-panel element, and locals everywhere.
On top of that, the memory footprint of my version is about 50% higher than before, and the code size is double what it was. How can this be?
The one major change I made was using user events to communicate between my producer and consumer. My API has a total of 40 user events, which is not exactly a lot.
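For reference, the structure I ended up with is roughly the following text-language sketch (a minimal analogue with placeholder names, not my actual G code): one producer generating events and one consumer loop dispatching on them, instead of 20 occurrence-triggered loops.

```python
# Rough text-language analogue of the refactored architecture
# (placeholder names, not the actual G code).
import queue
import threading

events = queue.Queue()  # stands in for the user event queue

def producer():
    # Each put() is the analogue of generating a user event
    for i in range(5):
        events.put(("data_ready", i))
    events.put(("stop", None))

def consumer():
    # Single consumer loop dispatching on event type (the state-machine idea)
    while True:
        name, payload = events.get()
        if name == "stop":
            break
        print(f"handling {name} with payload {payload}")

t = threading.Thread(target=producer)
t.start()
consumer()
t.join()
```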