How to log the learning rate with PyTorch Lightning when using a scheduler?

I’ve been trying to find some documentation. I don’t want to save all the hyperparameters each epoch, just the learning rate.
It would be great if you could help me out.

Cheers,

Oli

Hi Oli,

Just double-checking: are you talking about running sweeps?

Cheers,
Artsiom

Hi there, I wanted to follow up on this request. Please let us know if we can be of further assistance or if your issue has been resolved.

Hi Oliver, since we have not heard back from you, we are going to close this request. If you would like to re-open the conversation, please let us know!

I’m also wondering how this is done! Whether within a sweep configuration or not, when using an lr scheduler I am trying to track the lr at each epoch during training, since it is now dynamic. Even within a sweep, you will have some initial lr determined by the sweep, but it will not stay constant for the duration of training.
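One alternative sketch, assuming a single optimizer and a standard LightningModule (the model, metric name, optimizer, and scheduler below are all placeholders, not anything specific from this setup): read the current lr from the optimizer inside the module and log it once per epoch.

import torch
import pytorch_lightning as pl

# Placeholder module: logs the scheduler-driven lr once per epoch via self.log
class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = torch.nn.functional.mse_loss(self.layer(x), y)
        # current lr of the first optimizer's first param group
        lr = self.trainer.optimizers[0].param_groups[0]["lr"]
        self.log("lr", lr, on_step=False, on_epoch=True)
        return loss

    def configure_optimizers(self):
        opt = torch.optim.SGD(self.parameters(), lr=0.1)
        sched = torch.optim.lr_scheduler.StepLR(opt, step_size=1, gamma=0.9)
        return {"optimizer": opt, "lr_scheduler": sched}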

edit:

The example on the Lightning site here worked for me:

from pytorch_lightning import Trainer
from pytorch_lightning.callbacks import LearningRateMonitor
# log the current learning rate at every optimizer step ('epoch' also works)
lr_monitor = LearningRateMonitor(logging_interval='step')
trainer = Trainer(callbacks=[lr_monitor])

Passing the WandbLogger to the trainer, I see my lr logged on the wandb dashboard.
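For reference, a minimal sketch of that wiring (the project name is a placeholder, and you would call trainer.fit with your own LightningModule):

from pytorch_lightning import Trainer
from pytorch_lightning.callbacks import LearningRateMonitor
from pytorch_lightning.loggers import WandbLogger

# WandbLogger forwards everything the Trainer logs, including the lr, to the W&B dashboard
wandb_logger = WandbLogger(project="my-project")  # placeholder project name
lr_monitor = LearningRateMonitor(logging_interval="step")
trainer = Trainer(logger=wandb_logger, callbacks=[lr_monitor])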


Hi there, yes,
sorry for the late reply. I didn’t get any notifications for your answers, but now I got one from Chris’s.

What worked for Chris also worked for me.

