
ASCII Character Conversion to Number: Code

mmruzek

Experienced Member
Joined
Sep 28, 2011
Messages
227
Location
Michigan, USA
Hi, I am hoping someone can help with this... I seem to recall that there are some coding shortcuts for converting the ASCII character representation of a number to the actual number. For example, the ASCII character Hx39 ("9") represents the decimal number 9, and the ASCII character Hx41 ("A") represents the decimal number 10. Does anyone have a trick or method for doing that conversion? I am coding in an assembly language that I wrote myself called LALU, but some examples in x86, or a description of the method, would be enough to get me going on the approach. Right now I am subtracting Hx30 and then doing a compare and branch depending on the result. Thanks! Michael
 
Maybe you can use this formula:

ASCII to number: ASCII - 30h (or, in decimal, ASCII - 48).
In x86 assembly it would be something like:

mov al,'9' ; Equivalent to mov al,39h, the assembler calculates the ASCII number
sub al,30h ; Now AL holds the number represented by ASCII character 39h

Number to ASCII: Number + 30h (or number + 48).

mov al,9
add al,30h ; Now AL holds the ASCII character for number 9
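For hex digits like the "A" in the original question, the 30h subtraction alone isn't enough, since 'A'..'F' sit 7 positions past '9' in the ASCII table. Here is a small C sketch of the combined approach (the function name is mine, and it assumes the input is already a valid hex-ASCII character):

```c
/* Sketch: value of one hex-ASCII digit.
   '0'..'9' -> 0..9 via the 30h subtraction; 'A'..'F' land at
   11h..16h after that subtraction, so subtract a further 7. */
int hex_digit_value(unsigned char c) {
    int v = c - 0x30;   /* sub al,30h */
    if (v > 9)          /* 'A' (41h) - 30h = 11h, so correct by 7 */
        v -= 7;         /* 11h - 7 = 0Ah */
    return v;
}
```

This is the compare-and-branch method the original post describes, just written out; the bit-twiddling alternatives discussed below trade the branch for flag arithmetic.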
 
The reverse does involve a trick using the DAA instruction, but I don't recall one for going from ASCII->hex. May not matter, because you also want to verify that the input is in the vocabulary for hex-ASCII.

AAM can be used as a quick binary-to-decimal conversion.
 
Thanks for the replies! Here are 2 snippets of related code from the DEBUG source mentioned. I must have been mistaken that there is a bitwise trick for ASCII to HEX.
 

Attachments

  • hexbyte.png
  • getnyb.png
Note the DAA trick that I mentioned in the first routine. This "trick" is very old--I believe it first occurs in the Intellec monitor code, but my memory may be faulty. It works on both x80 and x86.

The AAM trick that I referred to is useful in converting binary to BCD. Suppose, for example, AL contains 0ch. After an AAM, AX will contain 0102h. Add 3030h and you have ASCII "12" (3132h). This works for numbers from 0 to 99 only. Faster than the Chinese remainder method for this small range of numbers.
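The AAM step described above can be modeled in C like this (a sketch of what the instruction computes, not the instruction itself: AAM sets AH = AL/10 and AL = AL%10, and the 3030h add supplies the ASCII bias):

```c
/* Model of the AAM-based binary-to-ASCII-decimal trick (valid for 0..99).
   Returns AX with the tens digit in the high byte, ones digit low. */
unsigned short byte_to_ascii_decimal(unsigned char al) {
    unsigned short ax = (unsigned short)(((al / 10) << 8) | (al % 10)); /* aam */
    return (unsigned short)(ax + 0x3030);                              /* add ax,3030h */
}
```

With AL = 0Ch this yields 3132h, ASCII "12", matching the worked example above.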

Going the other way, suppose you have AX=3132h (ASCII "12"). Subtract or mask out the 3030h to give you AX=0102h. Do an AAD and you get 0ch in AL. Just the reverse of the AAM case above.
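The reverse direction models the same way (again a C sketch of the documented AAD behavior, AL = AH*10 + AL with AH cleared, not real instruction emulation):

```c
/* Model of the AAD-based ASCII-decimal-to-binary trick.
   Input: AX holding two ASCII digits, e.g. 3132h for "12". */
unsigned char ascii_decimal_to_byte(unsigned short ax) {
    ax -= 0x3030;                          /* strip the ASCII bias */
    unsigned char ah = (unsigned char)(ax >> 8);
    unsigned char al = (unsigned char)(ax & 0xFF);
    return (unsigned char)(ah * 10 + al);  /* aad */
}
```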

You'll sometimes read that the AAM (D4 0A) and AAD (D5 0A) can be used to operate on other number bases by changing the second byte of the instruction from 0A (decimal 10) to whatever base you want to work in.
So, for instance, encoding the AAM as (D4 08) with AX = 000Ch gives the result 0104h--add the 3030h bias and you get 3134h, or ASCII "14", which is decimal 12 in octal (base 8). However, not all members of the x86 clan honor this hack--some automatically assume that the second instruction byte is 0A whether or not it really is. So it's not a safe hack, but it does work on the 8086 and 8088.
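On a CPU that does honor the immediate byte, the generalized AAM can be sketched as a simple divide/remainder split by an arbitrary base (my own function name; this models the arithmetic, not the undocumented encoding):

```c
/* Model of "aam imm8" with a non-10 immediate:
   AH = AL / base, AL = AL % base. */
unsigned short aam_base(unsigned char al, unsigned char base) {
    return (unsigned short)(((al / base) << 8) | (al % base));
}
```

For example, aam_base(0x0C, 8) gives 0104h, and adding the 3030h bias gives 3134h, the octal "14" from the paragraph above.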
 
However not all members of the x86 clan honor this hack--some automatically assume that the second instruction byte is 0A whether or not it really is. So not a safe hack, but it does work on 8086 and 8088.
The NEC V20/V30 CPUs are the only ones that do not support this, as far as I know.

BTW, there's a shorter version of that "nibble value to hex ASCII" trick:
Code:
    cmp     al, 10
    sbb     al, 69h
    das
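For readers without an 8086 handy, here is a C model of what that three-instruction sequence computes for a nibble 0..15 in AL. The flag modeling is mine, following the documented behavior: CMP sets CF when AL < 10, SBB subtracts 69h plus the borrow while setting CF and AF, and DAS then decimal-adjusts the result:

```c
/* Model of: cmp al,10 / sbb al,69h / das
   Maps 0..9 -> '0'..'9' and 10..15 -> 'A'..'F'. */
unsigned char nibble_to_hex_ascii(unsigned char n) {
    int cf = (n < 10);                        /* cmp al,10: CF = borrow */
    int af = ((n & 0x0F) - 9 - cf) < 0;       /* borrow out of low nibble */
    int t  = n - 0x69 - cf;                   /* sbb al,69h */
    unsigned char al = (unsigned char)(t & 0xFF);
    int new_cf = (t < 0);
    unsigned char old_al = al;
    /* das, per the documented pseudocode */
    if ((al & 0x0F) > 9 || af) al -= 6;
    if (old_al > 0x99 || new_cf) al -= 0x60;
    return al;
}
```

Tracing one case: n = 10 gives AL = A1h after the SBB with no auxiliary borrow, and DAS subtracts only 60h, leaving 41h, ASCII 'A'.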
 
In retrospect, the only "trick" I've ever used has been the DAA--and only because it's well-documented. Doing the conversion "the hard way" is not much slower and hardly matters on CPUs made in the last 35 years. It is interesting from a bit-twiddling and historical viewpoint, however.
 
After giving it some thought, I ended up creating a lookup table for doing a full byte conversion. I put the ASCII character for the least significant nibble in my B register, and the ASCII character for the most significant nibble in my A register. Then I created an ALU function with the mnemonic HEX. (My ALU is custom built, and uses lookup tables for all operations... hence the name LALU.) Executing HEX returns the value of the byte represented by the 2 ASCII characters and puts it into the A register. I am only fooling around with this because I am doing some speed benchmarking, as described in this recent post...
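In C terms, the two-character HEX operation described above amounts to something like the following (a sketch with my own names; it assumes both characters are valid hex digits, and the `& 0xDF` mask folds lowercase to uppercase):

```c
/* Combine two hex-ASCII characters into one byte.
   a holds the most significant nibble's character, b the least,
   mirroring the A and B registers described in the post. */
unsigned char hex_pair_to_byte(unsigned char a, unsigned char b) {
    unsigned char hi = (a <= '9') ? a - '0' : (a & 0xDF) - 'A' + 10;
    unsigned char lo = (b <= '9') ? b - '0' : (b & 0xDF) - 'A' + 10;
    return (unsigned char)((hi << 4) | lo);
}
```

A table-driven ALU would of course do this as a single indexed fetch rather than the branches shown here; the C just documents the mapping the table implements.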

 