Logical error in a C program to calculate the number of digits in an integer


I wrote this C program to count the number of digits in an integer.
It works correctly for inputs of up to 10 digits.
If I enter an integer with more than 10 digits, the behavior of the code changes.

// Program to calculate the number of digits in an integer.

#include <stdio.h>

int main()
{
    int number;
    int count = 0;

    printf("Enter a number: ");
    scanf("%d", &number);
    while (number != 0)
    {
        number = number / 10;
        count++;
    }
    printf("The number of digits in an integer is : %d", count);
}

For example:

Output of program: 
$ gcc digits.c && ./a.out
Enter a number: 1234567890
The number of digits in an integer is : 10

The program produces the expected output. Let’s look at one more example.

Output of program:
$ gcc digits.c && ./a.out
Enter a number: 12345678901234567890
The number of digits in an integer is : 1

Here I entered a 20-digit integer, but the program reports only 1 digit. I don’t understand why this happens.

Can someone please explain the logical mistake I made in my code?

>Solution:

On most platforms, int is 32 bits and can only store values from -2147483648 to 2147483647. A 20-digit number like 12345678901234567890 is far out of that range, so reading it with %d overflows (strictly speaking, this is undefined behavior). If you change the type to unsigned long long, which can hold values up to at least 18446744073709551615, it will work for this input.
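
If you are unsure what the limits are on your platform, you can print them from <limits.h>. This is a minimal sketch to illustrate the ranges involved, not part of the fix itself; the exact values are implementation-defined:

#include <stdio.h>
#include <limits.h>

int main(void)
{
    // Typical values: 32-bit int, 64-bit unsigned long long.
    printf("int range: %d to %d\n", INT_MIN, INT_MAX);
    printf("unsigned long long max: %llu\n", ULLONG_MAX);
}

Here is the corrected program: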

#include <stdio.h>

int main()
{
    unsigned long long number;
    int count = 0;

    printf("Enter a number: ");
    scanf("%llu", &number);  // use %llu here
    while (number != 0)
    {
        number = number / 10;  // drop the last digit
        count++;
    }
    printf("The number of digits in an integer is : %d", count);
}

Output:

Enter a number: 12345678901234567890
The number of digits in an integer is : 20
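
Note that unsigned long long only raises the limit rather than removing it: its maximum is itself a 20-digit number, so a 21-digit input would overflow again. If you need to count the digits of arbitrarily long input, one option is to read the input as a string instead. This is a sketch under the assumptions that the input fits in a fixed 255-character buffer and that only digit characters should be counted:

#include <stdio.h>
#include <ctype.h>

int main(void)
{
    char buf[256];  // assumption: input fits in 255 characters
    int count = 0;

    printf("Enter a number: ");
    if (scanf("%255s", buf) != 1)
        return 1;  // no input read

    // Count digit characters, ignoring any sign or stray characters.
    for (const char *p = buf; *p != '\0'; p++)
        if (isdigit((unsigned char)*p))
            count++;

    printf("The number of digits in an integer is : %d\n", count);
}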
