How to log the learning rate with PyTorch Lightning when using a scheduler?

I’m also wondering how this is done! Whether within a sweep configuration or not, when using an lr scheduler I’d like to track the learning rate at each epoch during training, since it is now dynamic. Even within a sweep, the sweep determines some initial lr, but it will not stay constant for the duration of training.
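For context, this is the kind of setup I mean — a rough sketch, assuming a LightningModule whose configure_optimizers returns an optimizer together with a StepLR scheduler (the model, layer sizes, and schedule values are just placeholders):

import torch
from torch import nn
import pytorch_lightning as pl
from torch.optim.lr_scheduler import StepLR

class LitModel(pl.LightningModule):
    def __init__(self, lr=1e-3):
        super().__init__()
        self.layer = nn.Linear(32, 1)
        self.lr = lr  # initial lr, e.g. chosen by a sweep

    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters(), lr=self.lr)
        # The scheduler decays the lr over training, so it no longer stays at self.lr
        scheduler = StepLR(optimizer, step_size=10, gamma=0.1)  # placeholder values
        return [optimizer], [scheduler]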

edit:

The example on the Lightning site here worked for me:

from pytorch_lightning import Trainer
from pytorch_lightning.callbacks import LearningRateMonitor

lr_monitor = LearningRateMonitor(logging_interval='step')
trainer = Trainer(callbacks=[lr_monitor])

Passing the WandbLogger to the trainer, I see my lr logged on the W&B dashboard.
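Putting the pieces together, here is a minimal sketch of the full setup, assuming a LightningModule (LitModel above) that returns a scheduler from configure_optimizers; the project name is a placeholder:

from pytorch_lightning import Trainer
from pytorch_lightning.callbacks import LearningRateMonitor
from pytorch_lightning.loggers import WandbLogger

# Logger that sends metrics, including the monitored lr, to W&B
wandb_logger = WandbLogger(project="my-project")  # placeholder project name

# Callback that logs the current learning rate of each scheduler every step
lr_monitor = LearningRateMonitor(logging_interval="step")

trainer = Trainer(logger=wandb_logger, callbacks=[lr_monitor])
trainer.fit(LitModel())

With logging_interval="step" the lr is recorded every optimizer step; use logging_interval="epoch" if you only want one value per epoch.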
