If you’re doing something like early stopping, you might save the model with the best loss, let training run for 10 more epochs, and then discard the weights from those extra epochs. I see that for individual metrics I can tell wandb to store the “max” or “min”, but what if I want the entire summary to be taken from the step where a single metric hit its max/min?
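For reference, the per-metric setting I mean is something like this (just a rough sketch, not my actual code; the project and metric names are placeholders and the values are simulated):

```python
import random
import wandb

# Placeholder project; `val_accuracy` / `val_loss` stand in for whatever I actually log.
run = wandb.init(project="summary-metrics-demo")

# Per-metric summaries: keep the max accuracy and min loss seen over the run.
run.define_metric("val_accuracy", summary="max")
run.define_metric("val_loss", summary="min")

for epoch in range(50):
    # Simulated values in place of a real validation pass.
    val_accuracy = random.random()
    run.log({"epoch": epoch, "val_accuracy": val_accuracy, "val_loss": 1.0 - val_accuracy})

run.finish()
```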
Hey Darrick, I am not sure I fully understand your use-case. Could you give me an example?
Sure. Let’s say I’m training a model and, during training, I save only the best-performing weights; for example, whenever the model reaches its best validation accuracy so far, I save the weights. Say I train for 50 epochs, but the model hits its peak accuracy at epoch 40. The weights I have saved therefore reflect the state of the model at epoch 40, while the Weights & Biases summary metrics reflect the state of the model at epoch 50.
Ideally, I’d like a simple setting that lets me do this. I realize I could probably put code in the training loop to check the accuracy manually and set the summary metrics myself (something like the sketch below), but it would be nice if there were a built-in setting for this purpose. I can already tell wandb to log the best accuracy (at epoch 40) instead of the final accuracy (at epoch 50), but I would like to tell wandb to give me all of the metrics from epoch 40, where the best accuracy was achieved.
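Here’s roughly what that manual workaround might look like (again just a sketch under assumptions: the project and metric names are placeholders, the values are simulated, and checkpoint saving is only indicated by a comment):

```python
import random
import wandb

run = wandb.init(project="summary-metrics-demo")  # placeholder project name

best_val_accuracy = float("-inf")

for epoch in range(50):
    # Simulated metrics in place of a real train/validation pass.
    metrics = {
        "epoch": epoch,
        "val_accuracy": random.random(),
        "val_loss": random.random(),
        "train_loss": random.random(),
    }
    run.log(metrics)

    # On a new best accuracy, snapshot *all* metrics into the summary so the
    # summary reflects the same epoch as the saved checkpoint.
    if metrics["val_accuracy"] > best_val_accuracy:
        best_val_accuracy = metrics["val_accuracy"]
        for name, value in metrics.items():
            run.summary[f"best/{name}"] = value
        # torch.save(model.state_dict(), "best.pt")  # checkpoint would be saved here

run.finish()
```

The `best/` prefix is only there so these entries aren’t overwritten by later `run.log` calls, which keep updating the plain summary values to the latest step.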
Hey Darrick,
Apologies for the delay on this one. You should be able to set all of the summary metrics in your code. Here is an example. Doesn’t this cover your use case?
Best,
Arman
Hi Darrick,
We wanted to follow up with you regarding your support request as we have not heard back from you. Please let us know if we can be of further assistance or if your issue has been resolved.
Best,
Weights & Biases
Hey Darrick, since we have not heard back from you, I’ll be closing this ticket. But please let me know if you have further questions.
I don’t see any example along with your reply.