PyTorch errors – softmax not giving probabilities

I’m trying to implement a custom neural network model using PyTorch for a classification task.
When I inspect the output probabilities, they don’t sum up to 1. I’ve added a torch.nn.Softmax(dim=1) layer at the end of my model, which should normalize the output to probabilities, but it doesn’t seem to be working.

import torch

def custom_model(input_size, output_size):
    model = torch.nn.Sequential(
        torch.nn.Linear(input_size, 64),
        torch.nn.ReLU(),
        torch.nn.Linear(64, output_size),
        torch.nn.Softmax(dim=1)  # Softmax layer for classification
    )
    return model


input_size = 10
output_size = 5


model = custom_model(input_size, output_size)


criterion = torch.nn.CrossEntropyLoss()
optimiser = torch.optim.SGD(model.parameters(), lr=0.001)


input_data = torch.randn(32, input_size)
target = torch.randint(0, output_size, (32,))

output = model(input_data)

loss = criterion(output, target)


optimiser.zero_grad()
loss.backward()
optimiser.step()

print(f'Loss: {loss.item()}')

Can anyone help?

Solution:

The issue is that the softmax is in the wrong place. `torch.nn.CrossEntropyLoss` combines log-softmax and negative log-likelihood internally, so it expects raw logits: remove the `torch.nn.Softmax(dim=1)` layer from the model. Applying softmax before this loss effectively normalizes twice, which distorts training. Pass the logits straight to the loss during training; at inference time, apply softmax to the logits explicitly, and the resulting probabilities will sum to 1.
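A minimal sketch of the corrected setup, assuming the same architecture and hyperparameters as in the question (64 hidden units, SGD with lr=0.001): the model now ends at the final `Linear` layer, and softmax is applied only when probabilities are needed.

```python
import torch

def custom_model(input_size, output_size):
    # No Softmax here: the model outputs raw logits,
    # which is what CrossEntropyLoss expects.
    return torch.nn.Sequential(
        torch.nn.Linear(input_size, 64),
        torch.nn.ReLU(),
        torch.nn.Linear(64, output_size),
    )

input_size, output_size = 10, 5
model = custom_model(input_size, output_size)

criterion = torch.nn.CrossEntropyLoss()  # applies log-softmax internally
optimiser = torch.optim.SGD(model.parameters(), lr=0.001)

input_data = torch.randn(32, input_size)
target = torch.randint(0, output_size, (32,))

logits = model(input_data)          # raw scores, not probabilities
loss = criterion(logits, target)    # log-softmax happens inside the loss

optimiser.zero_grad()
loss.backward()
optimiser.step()

# At inference time, convert logits to probabilities explicitly:
with torch.no_grad():
    probs = torch.softmax(model(input_data), dim=1)
print(probs.sum(dim=1))  # each row sums to 1
```

The same pattern applies if you use `torch.nn.NLLLoss` instead, except there the model should end in `torch.nn.LogSoftmax(dim=1)`.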
