Channel: LabVIEW topics

LabVIEW Vision IMAQ ColorLearn VI - problem


Dear Everyone,

 

I have the following problem.

 

For a school project I have to write a program that counts colored objects shown to the camera, classified by color. I solved this using NI_Vision_Development_Module.lvlib:IMAQ ColorLearn. The 'Color Spectrum' output is wired in parallel to a '1D array (64-bit real)' indicator, an 'Index Array', and an 'Array Max & Min'. I only wanted to handle 6 colors (red, blue, green, yellow, black, white). My idea was to show these colors to the camera and check which element of the 'Color Spectrum' has the maximum value. It worked fine, and the cells followed a fixed pattern (i=0 - red, i=4 - yellow, etc.). With 'Array Max & Min' I get the index of the element with the maximum value in the array, and with a 'Case Structure' I do the counting. It worked perfectly, and I tested it in many environments.
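To make the block-diagram logic easier to discuss in text, here is a rough Python sketch of what my VI does. The spectrum values and the index-to-color mapping below are made-up examples for illustration, not the real IMAQ ColorLearn output:

```python
# Hypothetical 'Color Spectrum' array (a short example; the real array
# length depends on the ColorLearn sensitivity setting).
spectrum = [0.02, 0.81, 0.05, 0.03, 0.04, 0.05]

# Assumed fixed mapping from spectrum index to color name, as in my VI.
INDEX_TO_COLOR = {0: "red", 1: "blue", 2: "green",
                  3: "yellow", 4: "black", 5: "white"}

counts = {c: 0 for c in INDEX_TO_COLOR.values()}

# Equivalent of 'Array Max & Min': index of the largest spectrum value...
max_index = max(range(len(spectrum)), key=lambda i: spectrum[i])

# ...and of the 'Case Structure': increment that color's counter.
counts[INDEX_TO_COLOR[max_index]] += 1
```

The whole classifier therefore stands or falls on the spectrum bins always appearing in the same order.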

Seven days after the last save of the program (the final, working version), I opened it again today, but the counting no longer works: the colors are mixed up. When I show red to the camera, the maximum in the 'Color Spectrum' is now at i=1 instead of i=0 as before, yellow is at i=3, and so on for the other colors. I have loaded several backup copies, and every one shows the same issue. The easiest fix would be to reinitialize my array_index-color_name pairs, but I would like to know what happened. I have never seen anything like this before.
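For what it's worth, one way to make the counting independent of the bin order would be to store one reference spectrum per color (captured once from a known sample) and classify by the nearest match instead of a hard-coded index. A minimal sketch of that idea, with entirely made-up three-element spectra:

```python
import math

# Hypothetical reference spectra, one per color, captured from known samples.
REFERENCE_SPECTRA = {
    "red":   [0.90, 0.05, 0.05],
    "blue":  [0.05, 0.90, 0.05],
    "green": [0.05, 0.05, 0.90],
}

def classify(spectrum):
    """Return the color whose reference spectrum is closest
    (Euclidean distance) to the measured spectrum."""
    return min(
        REFERENCE_SPECTRA,
        key=lambda color: math.dist(REFERENCE_SPECTRA[color], spectrum),
    )
```

With this approach a reordering of the bins only requires re-capturing the reference spectra, not rewiring the Case Structure.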

 

I am using LabVIEW 2016 (64-bit) with a Student License, on Windows 7 SP1 (64-bit).

 

Thanks for your suggestions.

 

Best regards,

Lajos

 

 
