Why is the mse as a loss different from the mse as a metric in keras?

I have a regression model built in keras. The loss is mse.
The output during training is as follows:

4/4 [==============================] - 16s 1s/step - loss: 21.4834 - root_mean_squared_error: 4.6350 - full_mse: 23.5336 - mean_squared_error: 23.5336 - val_loss: 32.6890 - val_root_mean_squared_error: 5.7174 - val_full_mse: 32.6890 - val_mean_squared_error: 32.6890

Why does the MSE as a loss differ from the MSE as a metric?
(loss = 21.4834, but mean_squared_error = 23.5336; these ought to be the same value.)


And why is this only the case for the training set, not the validation set?
(val_loss = 32.6890 and val_mean_squared_error = 32.6890; these are equal, as expected.)

Any ideas?

>Solution :

I’m posting this as an answer since it turned out to be the solution to the problem.

The training MSE loss ("loss") is computed as a running average over the batches of the epoch, during which the weights are being updated. The MSE metric ("mean_squared_error") is computed after the epoch, with the weights fixed, so the two values differ.

For validation ("val_loss" and "val_mean_squared_error"), both are computed without weight updates, which is why they match.

Additionally, it’s possible that the reported loss is something like a moving average, in which not all minibatches of the epoch are weighted equally. I don’t think that applies here, since the validation values match; it depends on the implementation.
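The mechanism can be illustrated without Keras. The toy sketch below (a hypothetical 1-D linear regression, not the asker's model) averages per-batch losses while the weight is being updated, then computes the MSE over the same data with the final weight. The two numbers differ, while two evaluation passes with frozen weights agree, mirroring the validation case:

```python
import numpy as np

# Toy data for a hypothetical 1-D linear model y = 2*x.
rng = np.random.default_rng(0)
x = rng.normal(size=(32,))
y = 2.0 * x

w = 0.0   # model weight, deliberately far from the optimum
lr = 0.1
batch_losses = []

# One "epoch" of minibatch SGD. The per-batch loss is recorded with the
# CURRENT weight, before that batch's update, so early batches are scored
# with worse weights than later ones.
for xb, yb in zip(x.reshape(8, 4), y.reshape(8, 4)):
    pred = w * xb
    batch_losses.append(np.mean((pred - yb) ** 2))
    grad = np.mean(2.0 * (pred - yb) * xb)
    w -= lr * grad  # the weight changes mid-epoch

running_loss = float(np.mean(batch_losses))       # analogous to "loss"
final_mse = float(np.mean((w * x - y) ** 2))      # MSE with final weights

# With the weight frozen (as during validation), a second batched pass
# reproduces the full-data MSE exactly (equal-sized batches).
eval_losses = [np.mean((w * xb - yb) ** 2)
               for xb, yb in zip(x.reshape(8, 4), y.reshape(8, 4))]
eval_loss = float(np.mean(eval_losses))

print(running_loss, final_mse, eval_loss)
```

Here `running_loss` and `final_mse` disagree because the weight moved between batches, while `eval_loss` equals `final_mse` because nothing changed between passes.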
