Does the learning curve suggest overfitting, or an acceptable level of model performance? The results are from xgboost. Do I need to re-tune the hyperparameters?
>Solution :
The picture shows a fairly typical overfitting curve. Usually the train and validation losses converge toward the same value until, at some point, they diverge: the train loss keeps decreasing while the validation loss increases or plateaus. So, to answer your question: yes, I would say you are overfitting from roughly iteration ~70 onward.
