Why does this C code appear to hang sometimes?

I’m trying to recreate the bitset class from C++ in C, as a small experiment. The idea behind this code is to have a struct with a flexible array member as the container for the bits, which will be represented by unsigned integers.

I feel like I’m probably using flexible array members wrong and that’s what’s causing the issue, but I could be wrong about that.

When I run the code below, it will sometimes print and terminate immediately, and sometimes hang for up to a minute before terminating. While it’s hanging, the terminal won’t respond to any user input.

Compiling with GCC on Windows.

#include <stdio.h>
#include <stdlib.h>

typedef struct bitset {
    int size;  // Number of bits
    unsigned int data[];
} bitset;

bitset* bitset_init(int init_size) {
    bitset* bs =
        malloc(sizeof(bitset) + sizeof((init_size + sizeof(unsigned int) - 1) /
                                       sizeof(unsigned int)));
    bs->size = init_size;

    // Initialise bits to 0
    for (int i = 0; i < (init_size + 7) / 8; i++) {
        bs->data[i] = 0;
    }
    return bs;
};

int main() {
    bitset* x = bitset_init(45);
    printf("%d", x->data[0]);
}

Any help would be appreciated.

>Solution:

You have to allocate the correct number of bytes, and the nested sizeof is what breaks that here. The outer sizeof is applied to the result of the division, and that expression has type size_t (the sizeof operands promote the whole expression to size_t), so the outer sizeof yields sizeof(size_t): a small compile-time constant, typically 8 bytes, no matter what init_size is.
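
A quick way to convince yourself of this is a standalone test like the one below (this snippet is my own, not from your program; the exact numbers assume a typical 64-bit platform):

#include <stdio.h>

int main(void) {
    int init_size = 45;
    // The division has type size_t, so the outer sizeof measures that
    // type, not the computed value: it is a compile-time constant.
    printf("%zu\n", sizeof((init_size + sizeof(unsigned int) - 1) /
                           sizeof(unsigned int)));  // typically prints 8
    printf("%zu\n", sizeof(size_t));                // same value
    return 0;
}

So on such a platform the allocation is sizeof(bitset) + 8 bytes, far too small for 45 bits’ worth of unsigned ints.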

Instead of

malloc(sizeof(bitset) + sizeof((init_size + sizeof(unsigned int) - 1) /
                               sizeof(unsigned int)));

You have to do something like

malloc(sizeof(bitset) + init_size * sizeof(unsigned int));

But the for loop writes (init_size + 7) / 8 elements of data, so whatever element count you settle on, the allocation and the loop have to agree, i.e. something like

malloc(sizeof(bitset) + number_of_elements * sizeof(unsigned int));

where number_of_elements is the number of array slots the loop actually touches.

The underlying problem is that your code writes past the end of the flexible array member (a buffer overflow), which is undefined behavior. That explains what you observed: with undefined behavior anything can happen, including an apparent hang.
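
Putting it all together, a corrected bitset_init could look like the sketch below. This is one possible fix, not the only one: it packs the bits into unsigned ints, derives a single word count, and uses that same count for both the allocation and the zeroing loop (the NULL check and the free are additions of mine):

#include <limits.h>
#include <stdio.h>
#include <stdlib.h>

typedef struct bitset {
    int size;             // Number of bits
    unsigned int data[];  // Words holding the packed bits
} bitset;

bitset* bitset_init(int init_size) {
    // Bits per word, e.g. 32 when unsigned int is 4 bytes
    size_t bits_per_word = CHAR_BIT * sizeof(unsigned int);
    // Round the bit count up to a whole number of words
    size_t words = (init_size + bits_per_word - 1) / bits_per_word;

    bitset* bs = malloc(sizeof(bitset) + words * sizeof(unsigned int));
    if (bs == NULL) {
        return NULL;
    }
    bs->size = init_size;

    // Zero exactly the words we allocated
    for (size_t i = 0; i < words; i++) {
        bs->data[i] = 0;
    }
    return bs;
}

int main(void) {
    bitset* x = bitset_init(45);  // 45 bits fit in 2 words of 32 bits
    if (x != NULL) {
        printf("%u\n", x->data[0]);
        free(x);
    }
    return 0;
}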

Additional Note:

Looking at this

sizeof((init_size + sizeof(unsigned int) - 1) / sizeof(unsigned int))

it looks like what you actually meant to use is

(init_size + sizeof(unsigned int) - 1) / sizeof(unsigned int)

That would still be wrong: it computes a count of unsigned ints, not a count of bytes, and it divides by the bytes per element when a packed bitset needs the bits per element. If you want one element per bit, you have to multiply the number of bits by sizeof(<whatever the element type is>).
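
To make the byte counts concrete, here is a small standalone snippet of my own; on a machine with 4-byte unsigned int it prints 12, 8 and 180 for 45 bits:

#include <limits.h>
#include <stdio.h>

int main(void) {
    size_t bits = 45;
    size_t word = sizeof(unsigned int);       // bytes per word, typically 4
    size_t bits_per_word = CHAR_BIT * word;   // typically 32

    // The inner expression from the question: a count of words, not bytes
    printf("%zu\n", (bits + word - 1) / word);
    // Bytes needed for a packed bitset: round bits up to whole words
    printf("%zu\n", ((bits + bits_per_word - 1) / bits_per_word) * word);
    // Bytes needed with one unsigned int per bit
    printf("%zu\n", bits * word);
    return 0;
}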
