When using Lightning’s built-in LR finder:

    # Create a Tuner
    tuner = Tuner(trainer)

    # finds a learning rate automatically and sets
    # hparams.lr or hparams.learning_rate to that value
    tuner.lr_find(model)

a lot of checkpoints named lr_find_XXX.ckpt are created in the running directory, which creates clutter. How can I make sure that these checkpoints are not created?… Read More Getting rid of the clutter of `.lr_find_` in pytorch lightning?
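One pragmatic workaround, assuming the stray checkpoints follow the `.lr_find_*.ckpt` naming above and land in the working directory, is to delete them after tuning finishes. A minimal stdlib sketch (the directory argument and helper name are illustrative, not part of Lightning's API):

```python
from pathlib import Path


def remove_lr_find_checkpoints(directory="."):
    """Delete leftover .lr_find_*.ckpt files created by tuner.lr_find()."""
    removed = []
    for ckpt in Path(directory).glob(".lr_find_*.ckpt"):
        ckpt.unlink()  # delete the stray checkpoint file
        removed.append(ckpt.name)
    return removed
```

Calling this right after `tuner.lr_find(model)` keeps the run directory clean without touching any real model checkpoints.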
I’m new to PyTorch and am working on a toy example to understand how decay of the learning rate passed into the optimizer works. When I use MultiStepLR , I was expecting the learning rate to decrease at the given epoch numbers, however, it does not work as I intended. What am I doing wrong?… Read More how MultiStepLR works in PyTorch
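For context, `MultiStepLR` multiplies the learning rate by `gamma` once for each milestone epoch that has been reached. A minimal pure-Python sketch of that rule (the real scheduler is `torch.optim.lr_scheduler.MultiStepLR`; this helper is only an illustration of its closed-form behaviour):

```python
import bisect


def multistep_lr(initial_lr, milestones, gamma, epoch):
    """Learning rate at `epoch` under MultiStepLR semantics:
    multiply initial_lr by gamma once per milestone already reached."""
    # count how many milestones are <= epoch
    n_decays = bisect.bisect_right(sorted(milestones), epoch)
    return initial_lr * gamma ** n_decays


# e.g. initial_lr=0.1, milestones=[30, 80], gamma=0.1:
#   epochs 0-29  -> 0.1
#   epochs 30-79 -> 0.01
#   epochs 80+   -> 0.001
```

A common reason the decay appears not to happen is forgetting to call `scheduler.step()` once per epoch; without that call the scheduler's epoch counter never advances.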