When I train a neural network using PyTorch, I get the following warning caused by the torchmetrics library:
/Users/dev/miniconda/envs/pytorch/lib/python3.10/site-packages/torchmetrics/utilities/prints.py:36:
UserWarning: Torchmetrics v0.9 introduced a new argument class
property called full_state_update that has not been set for this
class (SMAPE). The property determines if update by default needs
access to the full metric state. If this is not the case, significant
speedups can be achieved and we recommend setting this to False. We
provide a checking function from torchmetrics.utilities import check_forward_no_full_state that can be used to check if
full_state_update=True (old and potentially slower behaviour, default
for now) or if full_state_update=False can be used safely.
I tried to suppress this warning by using the warnings package in my script:
import warnings

with warnings.catch_warnings():
    warnings.simplefilter("ignore")
    # ... training code ...
However, the warning is still shown, which is probably due to a function in prints.py of torchmetrics:
def _warn(*args: Any, **kwargs: Any) -> None:
    warnings.warn(*args, **kwargs)
Is it possible to get rid of this warning from my script without changing the library code?
>Solution :
Use the -W argument to control how Python deals with warnings. Consider the following simple example: let dowarn.py contain
import warnings
warnings.warn("I am UserWarning", UserWarning)
warnings.warn("I am FutureWarning", FutureWarning)
then
python dowarn.py
gives
dowarn.py:2: UserWarning: I am UserWarning
warnings.warn("I am UserWarning", UserWarning)
dowarn.py:3: FutureWarning: I am FutureWarning
warnings.warn("I am FutureWarning", FutureWarning)
and
python -W ignore dowarn.py
gives empty output and
python -W ignore::UserWarning dowarn.py
gives
dowarn.py:3: FutureWarning: I am FutureWarning
warnings.warn("I am FutureWarning", FutureWarning)
See the python man page for a discussion of -W values.
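If you would rather keep the suppression inside the script itself, the same filter that -W ignore::UserWarning installs can be set up programmatically with warnings.filterwarnings, as long as it runs before the warning is emitted. A minimal sketch (generic, not specific to torchmetrics):

```python
import warnings

# Install the same filter as `python -W ignore::UserWarning`:
# every UserWarning raised after this point is silently dropped,
# while other warning categories are still shown.
warnings.filterwarnings("ignore", category=UserWarning)

warnings.warn("I am UserWarning", UserWarning)      # suppressed
warnings.warn("I am FutureWarning", FutureWarning)  # still shown
```

Unlike the catch_warnings context manager in the question, this filter stays active for the rest of the process, so it also covers warnings raised later from library code.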