How can I convert a 24-bit decimal to hexadecimal?
For example:
65626 to 1005A?
16767215 to FFD8EF?
Any help (assembly source) would be most welcome.
tip:
A 24-bit value is stored as 3 consecutive bytes.
An 8-bit value (a byte) is composed of two nibbles (high and low); each nibble has a range from 0 to 15 decimal (0-F in hexadecimal).
So to convert a 24-bit value you repeat the conversion of a byte 3 times.
Each byte conversion boils down to this problem:
given a value from 0 to 15, calculate the ASCII code of the character from "0" to "9" for values 0 to 9, and from "A" to "F" for values 10 to 15.
The ASCII code of "0" is 48 decimal.
The ASCII code of "A" is 65 decimal.
So your problem is this:
if value < 10
    asciicode = value + 48
else
    asciicode = 65 + value - 10
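The if/else above can be written out as a small C function (a sketch for reference; the name nibble_to_hex is just illustrative):

```c
/* Map a nibble (0-15) to its ASCII hex character.
   "0" is ASCII 48 and "A" is ASCII 65, as stated above. */
char nibble_to_hex(unsigned value)
{
    if (value < 10)
        return (char)(value + 48);      /* "0".."9" */
    else
        return (char)(65 + value - 10); /* "A".."F" */
}
```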
http://wikiti.brandonw.net/index.php?title=Z80_Routines:Othe...
Convert byte by byte
You need something like this?
; Input: HL = Address of number to convert (3 bytes)
;        DE = Address of hexadecimal ASCII string (6 bytes)
Hex24:
	ld b,3		; 3 bytes
.loop
	ld a,(hl)
	call .Num1	; high nibble first
	ld a,(hl)
	call .Num2	; then low nibble
	inc hl
	djnz .loop
	ret

.Num1	rra		; shift the high nibble down;
	rra		; the bits rotated in from carry
	rra		; end up in the high nibble,
	rra		; which is masked off below
.Num2	or 0F0h		; set high nibble, keep low nibble
	daa		; DAA trick: 0-15 becomes "0"-"9"/"A"-"F"
	add a,0A0h
	adc a,40h
	ld (de),a
	inc de
	ret
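For reference, here is a C sketch of what the Z80 routine above computes: 3 bytes read in memory order, each emitted as two ASCII hex digits, high nibble first (the function name hex24 is just illustrative):

```c
#include <string.h>

/* Convert 3 bytes to a 6-character ASCII hex string plus NUL. */
void hex24(const unsigned char *num, char *out)
{
    static const char digits[] = "0123456789ABCDEF";
    for (int i = 0; i < 3; i++) {
        out[i * 2]     = digits[num[i] >> 4];   /* .Num1: high nibble */
        out[i * 2 + 1] = digits[num[i] & 0x0F]; /* .Num2: low nibble */
    }
    out[6] = '\0';
}
```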
I think he means converting a TEXTUAL DECIMAL input into a 24-bit value (as MSX normally handles only 16-bit signed), which is a problem for large integer inputs.
The one ARTRAG posted is limited by the CPU and assembler to the 16-bit range, but it's the idea to use for 24-bit.
The easiest is probably to keep things in memory instead of registers and use a list like this:
list:
	.db 0x01,0x00,0x00	; 1, binary 24-bit little endian
	.db 0x0a,0x00,0x00	; 10
	....			; 100
	....			; 1000
listend:
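One way such a powers-of-ten table can be used in the binary-to-decimal direction is repeated subtraction: for each table entry, count how often it fits. A C sketch of the idea (plain integers here instead of little-endian byte triples; the name dec24 is illustrative):

```c
#include <string.h>

/* 24-bit powers of ten, largest first (24-bit max is 16777215). */
static const unsigned long powers[] =
    { 10000000UL, 1000000UL, 100000UL, 10000UL, 1000UL, 100UL, 10UL, 1UL };

/* Convert a 24-bit value to a decimal ASCII string, no leading zeros. */
void dec24(unsigned long value, char *out)
{
    value &= 0xFFFFFFUL;                /* keep 24 bits */
    int started = 0, pos = 0;
    for (int i = 0; i < 8; i++) {
        int digit = 0;
        while (value >= powers[i]) {    /* subtract until it no longer fits */
            value -= powers[i];
            digit++;
        }
        if (digit || started || powers[i] == 1) {
            out[pos++] = (char)('0' + digit);
            started = 1;
        }
    }
    out[pos] = '\0';
}
```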
used for conversions in both directions
From the page I posted:
OutHex8:	; Input: C
	ld a,c
	rra
	rra
	rra
	rra
	call Conv
	ld a,c
Conv:
	and $0F
	add a,$90
	daa
	adc a,$40
	daa
	call PUTCHAR	; replace by bcall(_PutC) or similar
	ret
(quite close to NYYRIKKI's solution)
From the original post it is hard to tell what the actual problem is as storage format is not specified, but if I would need to decode ASCII decimal to 24-bit number I would probably do something like:
	LD HL,0		; CHL = 24-bit number
	LD C,0
	LD DE,NUMBER	; as ASCIIZ-string
.LOOP
	LD A,(DE)
	AND A
	RET Z
	LD B,A
	INC DE
	PUSH DE
	; X10
	LD A,C
	ADD HL,HL
	ADC A,A
	LD C,A
	PUSH HL
	ADD HL,HL
	ADC A,A
	ADD HL,HL
	ADC A,A
	POP DE
	ADD HL,DE
	ADC A,C
	LD C,A
	; ADD
	LD A,B
	SUB "0"
	LD E,A
	LD D,0
	LD A,D
	ADD HL,DE
	ADC A,C
	LD C,A
	POP DE
	JP .LOOP

NUMBER:	DB "1234567",0
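What the Z80 loop above does can be modelled in a few lines of C: the "X10" block multiplies the running 24-bit value by 10, then the digit is added, until the terminating zero byte (the name parse24 is illustrative):

```c
/* Decode a decimal ASCIIZ string into a 24-bit value. */
unsigned long parse24(const char *s)
{
    unsigned long v = 0;            /* plays the role of CHL */
    while (*s) {
        v = (v * 10 + (unsigned long)(*s - '0')) & 0xFFFFFFUL;
        s++;
    }
    return v;
}
```

Masking with 0xFFFFFF after each step mirrors the fact that the Z80 version simply wraps around above 16777215.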
If you move the SUB "0" to the test you can get it to stop at any non-decimal character.
Could be useful if the number is in the middle of a longer text but it costs 3 more cycles.
.LOOP
	LD A,(DE)
	SUB "0"
	CP 10
	RET NC
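In C terms, that variant just changes the loop condition so anything outside "0".."9" terminates the parse; a sketch (the name parse24_any is illustrative):

```c
/* Decode a decimal number embedded in text: stop at the first
   character that is not a decimal digit (the SUB "0"/CP 10 test). */
unsigned long parse24_any(const char *s)
{
    unsigned long v = 0;
    while ((unsigned)(*s - '0') < 10) {
        v = (v * 10 + (unsigned long)(*s - '0')) & 0xFFFFFFUL;
        s++;
    }
    return v;
}
```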
Sure this CAN be optimized, but I can't think of any real-life situation where it would make a difference, even if you could save 1000 cycles. If the speed of this routine matters, then you have MUCH bigger problems elsewhere.
Thanks for the response, you solved my problem.
Btw: it was an ASCII decimal that I needed to convert to a 24-bit number.
