Channel: LabVIEW topics

Decoding ASCII 2-,3- and 4-characters into decimals


My task is to decode ASCII characters into their decimal representation. The number of ASCII characters is 2, 3, or 4, depending on the data size (<= 12 bits, <= 18 bits, or <= 24 bits) sent from the unit.

Following is a 4-character decoding example in practice:

 

1. m2@0 = m 2 @ 0

 

2. ↓ Hexadecimal Equivalent

    6DH 32H 40H 30H

 

3. ↓ Subtract 30H

    3DH 2H 10H 0H

 

4. ↓ Binary Equivalent

    111101₂ 000010₂ 010000₂ 000000₂

 

5. ↓ Merge

    111101000010010000000000₂

 

6. ↓ Decimal Equivalent

    16,000,000
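The six steps above can be sketched in a few lines (Python here for readability; the variable names are my own). Each character contributes 6 bits after the 30H offset is removed, and the groups are concatenated most-significant first:

```python
# Worked example: decode the 4-character ASCII string "m2@0" step by step,
# following the scheme above (subtract 30H, keep 6 bits, merge MSB-first).

s = "m2@0"

# Step 2: ASCII codes in hexadecimal
codes = [ord(c) for c in s]            # [0x6D, 0x32, 0x40, 0x30]

# Step 3: subtract 30H
offsets = [c - 0x30 for c in codes]    # [0x3D, 0x02, 0x10, 0x00]

# Steps 4-5: merge the four 6-bit groups into one 24-bit value
value = 0
for v in offsets:
    value = (value << 6) | v

# Step 6: decimal equivalent
print(value)  # 16000000
```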

 

My problem is how to split the ASCII characters (step 1) and process them separately (basically steps 2-4). I've done basic ASCII-to-hexadecimal conversion, but found this problem to be more complex. Does anyone have any advice? :smileyhappy:
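One way to handle all three field widths with a single routine is to treat every character as a 6-bit group and accumulate with shifts. A minimal sketch (Python; the function name is my own invention) under the assumption that the unit always uses the 30H-offset, 6-bits-per-character scheme shown above:

```python
def decode_ascii_field(field: str) -> int:
    """Decode a 2-, 3-, or 4-character ASCII field into its integer value.

    Assumes each character carries 6 bits of data offset by 30H, so the
    result fits in 12, 18, or 24 bits respectively.
    """
    if not 2 <= len(field) <= 4:
        raise ValueError("field must be 2, 3, or 4 characters")
    value = 0
    for ch in field:
        # subtract 30H and mask to 6 bits, then shift into place MSB-first
        value = (value << 6) | ((ord(ch) - 0x30) & 0x3F)
    return value

print(decode_ascii_field("m2@0"))  # 16000000
```

In LabVIEW the same idea wires up as: String To Byte Array, subtract 30H from the U8 array, then combine the elements in a For Loop with Logical Shift and OR (shift the accumulator left by 6 each iteration). The loop runs for however many characters the field has, so the 2-, 3-, and 4-character cases need no special handling.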

