PyTorch gradient calculation of a tensor

I’m a beginner in PyTorch and I’m probably stuck on a relatively trivial problem, but I can’t figure it out at the moment.

When calculating the gradient of a tensor, I get a constant gradient of 1.
Shouldn’t the gradient of a constant be 0, though?

Here is a minimal example:


import torch

# A scalar leaf tensor holding the value 50 and tracking gradients
x = torch.tensor(50., requires_grad=True)

# y is just x itself (the identity function)
y = x

y.backward()

print(x.grad)
# Output: tensor(1.)

So why is the output 1 and not 0?

Solution:

You are not computing the gradient of a constant, but the gradient of the variable x, which happens to hold the constant value 50. The derivative of x with respect to x is 1, regardless of the value stored in x.
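To make the distinction concrete, here is a minimal sketch (the extra variables z, c, and w are illustrative additions, not from the original question) showing that backward() differentiates the function applied to x, not the value stored in x:

import torch

x = torch.tensor(50., requires_grad=True)

# Identity function: d(x)/dx = 1, no matter what value x holds
y = x
y.backward()
print(x.grad)   # tensor(1.)

# Gradients accumulate across backward() calls, so clear
# the stored gradient before the next example
x.grad = None

# Square function: d(x**2)/dx = 2*x, which is 100 at x = 50
z = x ** 2
z.backward()
print(x.grad)   # tensor(100.)

# A true constant does not track gradients at all
x.grad = None
c = torch.tensor(50.)   # requires_grad defaults to False
w = x + c               # d(x + c)/dx = 1; c contributes nothing
w.backward()
print(x.grad)   # tensor(1.)
print(c.grad)   # None: no gradient is computed for a constant

The gradient of 100 for x ** 2 shows that the result depends on the function, while the constant c receives no gradient at all, which is the behavior the question expected from the value 50.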
