I'm learning neural networks in PyTorch and I came across this:
    import time
    import numpy as np
    import torch.nn as nn
    from torch import optim

    # Loss function
    criterion = nn.MSELoss()

    # Optimizer
    optimizer = optim.Adam(MLP.parameters(), lr=args['lr'],
                           weight_decay=args['weight_decay'])

    def train(train_loader, MLP, epoch):  # MLP is the model
        MLP.train()
        start = time.time()
        epoch_loss = []
        for batch in train_loader:
            sample, label = batch
            optimizer.zero_grad()
            # Forward
            pred = MLP(sample)
            loss = criterion(pred, label)
            epoch_loss.append(loss.item())
            # Backward
            loss.backward()
            optimizer.step()
        epoch_loss = np.asarray(epoch_loss)
        end = time.time()
        print('Epoch: {}, Loss: {:.4f} +/- {:.4f}, Time: {}'.format(
            epoch + 1, epoch_loss.mean(), epoch_loss.std(), end - start))
        return epoch_loss.mean()
Well, "criterion" and "optimizer" are objects that I didn't pass as parameters to my function "train", like I did with the model (MLP), but it still worked. Does this work for any function, or is it just a PyTorch thing?
>Solution :
This is not a PyTorch thing; these are called global (as opposed to local) variables. When Python doesn't find a name in a function's local scope, it looks it up in the enclosing module's global scope, which is how `train` can see `criterion` and `optimizer`. I would advise getting more familiar with the Python language and programming in general if you want to get a grip on PyTorch.
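To illustrate (a minimal sketch, not from the original answer): reading a global name inside a function works automatically, but assigning to one requires an explicit `global` declaration.

    counter = 0  # a global variable, like criterion and optimizer above

    def read_global():
        # 'counter' is never assigned in this function, so Python
        # falls back to the module-level (global) name when reading it
        return counter + 1

    def write_global():
        # assignment needs 'global', otherwise 'counter' would become
        # a brand-new local variable inside this function
        global counter
        counter += 10

    print(read_global())  # -> 1
    write_global()
    print(counter)        # -> 10

This is the same mechanism at work in the training loop: `train` only *reads* (calls) `criterion` and `optimizer`, so no declaration is needed.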