
Reading user input as an integer

I wrote an Assembly program (x86_64 Linux NASM) that prints an integer to the console, based on the algorithm suggested by the comments in this post, which is basically this:

divide number x by 10, giving quotient q and remainder r
emit r
if q is not zero, set x = q and repeat
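
For example, with x = 567: 567 / 10 gives q = 56, r = 7; then 56 / 10 gives q = 5, r = 6; then 5 / 10 gives q = 0, r = 5. The remainders come out as 7, 6, 5, i.e. in reverse order, so they have to be buffered and printed backwards to get “567”.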

Everything works just fine with the following program:

section .bss
        integer resb 100        ; it will hold the EOL
        intAddress resb 8       ; the offset

section .text

        global _start

_start:

        mov rax, 567
        call _printProc

        mov rax, 60
        mov rdi, 0
        syscall


_printProc: ; here goes the algorithm described above.

After assembling and running it, the number 567 gets printed to the console.
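
For reference, here is a minimal sketch of how _printProc can implement that algorithm (just one possible version: it uses only the integer buffer, stores the remainder digits back to front since they come out in reverse order, and prints them with a single write syscall):

_printProc:
        mov rbx, 10                     ; divisor
        lea rsi, [integer + 99]         ; start at the end of the buffer
        mov byte [rsi], 10              ; the EOL (newline)
.nextDigit:
        xor rdx, rdx                    ; clear rdx before dividing
        div rbx                         ; rax = quotient, rdx = remainder
        add dl, '0'                     ; remainder -> ASCII digit
        dec rsi
        mov [rsi], dl                   ; store digit in front of the rest
        test rax, rax
        jnz .nextDigit                  ; repeat while the quotient is not zero
        lea rdx, [integer + 100]
        sub rdx, rsi                    ; length = end - start
        mov rax, 1                      ; sys_write
        mov rdi, 1                      ; stdout; rsi already points at the digits
        syscall
        ret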

But if I try to do the same thing while letting the user input the number to be printed, I don’t get what I expected. To do this I made the following changes (the algorithm stays the same):

section .bss
        integer resb 100        ; it will hold the EOL
        intAddress resb 8       ; the offset
        number resb 100

section .text

        global _start

_start:

        ; getting user input
        mov rax, 0              ; sys_read
        mov rdi, 0              ; stdin
        mov rsi, number         ; destination buffer
        mov rdx, 100            ; read at most 100 bytes
        syscall

        mov rax, [number]       ; load 8 bytes from address number into rax
        call _printProc

        mov rax, 60             ; sys_exit
        mov rdi, 0              ; exit status 0
        syscall


_printProc: ; here goes the algorithm described above.

But in this case if I type 567 I get 171390517. In fact, if I type

0, I get 2608
1, I get 2609
2, I get 2610

and so on.

I’d appreciate it if any of you have an idea about what the problem is in the second case and how it could be fixed.


Answer

What happens when you call this

    ; getting user input
    mov rax, 0
    mov rdi, 0
    mov rsi, number
    mov rdx, 100
    syscall

is that your entry (e.g. “1004”) is written to the memory at number, character by character. Now you have the exact opposite of the problem you originally solved: “how to convert an ASCII string into a binary value”. This also explains your output: typing 567 stores the bytes 0x35 0x36 0x37 0x0A (“567” plus the newline) at number, and mov rax, [number] loads them, together with the zero bytes that follow, as the little-endian value 0x0A373635 = 171390517.

The algorithm for this new problem could look like this:

(assuming char_ptr points to the string)
result = 0;
while ( *char_ptr is a digit ) {
    result *= 10;
    result += *char_ptr - '0';
    char_ptr++;
}
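
A minimal NASM sketch of this conversion, assuming the string sits in the number buffer from the question and the result is returned in rax (the routine name _readInt is made up for this example):

    _readInt:
            xor rax, rax            ; result = 0
            mov rsi, number         ; char_ptr = number
    .nextChar:
            movzx rcx, byte [rsi]   ; load one character
            cmp rcx, '0'
            jb .done                ; stop at anything below '0' (e.g. the newline)
            cmp rcx, '9'
            ja .done                ; ... or above '9'
            imul rax, rax, 10       ; result *= 10
            sub rcx, '0'            ; ASCII digit -> value
            add rax, rcx            ; result += digit
            inc rsi                 ; char_ptr++
            jmp .nextChar
    .done:
            ret

With such a routine in place, the second program would replace mov rax, [number] with call _readInt after the read syscall; rax then holds the binary value that _printProc expects.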