When creating a loader for my hash map, my bucket points back to itself


I am learning C and debugging code that spell-checks a chosen .txt file. My last hurdle is the load function of my hash map. It successfully reads a given word from a dictionary file, stores it, and assigns it to a temp node, but the problem comes when I try to store that node in my struct node *table[] with table[index] = tmp;

Any thoughts? Also, my hash function is working and is not the problem. Thank you all.

Edit: it seems as if table->next doesn’t update to point to the next item in the singly linked list. When I try new_word->next = table[index];, the address doesn’t change and stays 0x00.

bool load(const char *dictionary)
{
    // open up file to read dictionary to
    FILE *dict = fopen(dictionary, "r");
    if (dict == NULL)
    {
        printf("Error opening dictionary file.\n");
        return 1;
    }

    // create variable to store the current word in
    char *current_word = malloc(LENGTH * (sizeof(char)));
    if (current_word == NULL)
    {
        printf("Error in allocating memory.\n");
        return 1;
    }

    // initialize index for finding hash index, and count, to store word count
    int index;
    int count = 0;

    while (fgets(current_word, LENGTH, dict) != NULL)
    {
        //open up a node to store the current new word in, also store current word
        node *new_word = malloc(sizeof(node));
        if (new_word == NULL)
        {
            printf("Error allocating memory.\n");
            return 1;
        }
        // copy the word into the new node, store the next pointer as well,
        // and set the node as the new head of the table's list
        strcpy(new_word->word, current_word);
        index = hash(new_word->word);
        new_word->next = table[index];
        table[index] = new_word;
        free(new_word);

        count++;
    }

Solution:

The problem is simple: you free the node with free(new_word) right after linking it into the hash table, so table[index] is left pointing at freed memory. On the next iteration, malloc is likely to hand back that same freed block, and linking it again makes the bucket appear to point back to itself. Just remove the free(new_word) call.

Note also these remarks:

  • there is no need to allocate the current_word array. Just define it as a local object with automatic storage.

  • remember to close the file in case of error to avoid a resource leak.

  • it is somewhat confusing to return true (i.e. 1) in case of error. A more consistent API would return the number of words successfully read, or -1 in case of error.

  • you should strip the trailing newline stored by fgets() in current_word.

  • do not comment the obvious.

Here is a modified version:

#include <errno.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define LENGTH 80

typedef struct node {
    struct node *next;
    char word[LENGTH];
} node;

extern int hash(const char *s);
extern node *table[];

int load(const char *dictionary)
{
    FILE *dict = fopen(dictionary, "r");
    if (dict == NULL)
    {
        fprintf(stderr, "Cannot open dictionary file %s: %s\n",
                dictionary, strerror(errno));
        return -1;
    }

    char current_word[LENGTH];

    int index;
    int count = 0;

    while (fgets(current_word, LENGTH, dict) != NULL)
    {
        node *new_word = malloc(sizeof(node));
        if (new_word == NULL)
        {
            fprintf(stderr, "Error allocating memory.\n");
            fclose(dict);
            return -1;
        }
        // strip the trailing newline if any
        current_word[strcspn(current_word, "\n")] = '\0';
        strcpy(new_word->word, current_word);
        // link the new word at the head of the hash table slot list
        index = hash(new_word->word);
        new_word->next = table[index];
        table[index] = new_word;
        count++;
    }
    fclose(dict);
    return count;
}
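
Since load() allocates one node per word, the caller is also responsible for releasing that memory when it is done. Below is a minimal sketch of how the function could be used together with an unload() helper; the bucket count N, the unload() name and the "dictionary.txt" filename are assumptions for illustration, not part of the original code:

#define N 26   // assumed bucket count; use your project's actual table size

extern node *table[];

// walk every bucket and free the nodes allocated by load()
void unload(void)
{
    for (int i = 0; i < N; i++)
    {
        node *cursor = table[i];
        while (cursor != NULL)
        {
            node *next = cursor->next;
            free(cursor);
            cursor = next;
        }
        table[i] = NULL;
    }
}

int main(void)
{
    int words = load("dictionary.txt");
    if (words < 0)
        return 1;                      // load() already reported the error
    printf("Loaded %d words\n", words);
    unload();
    return 0;
}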
