Hi all,
I am a beginner with wandb and PyTorch Lightning, and I have a question.
I am training my model on an HPC cluster whose compute nodes cannot access the internet directly,
so I am using offline mode by default.
The strange thing is that when I use offline mode with the PyTorch Lightning Trainer,
the run still does not show the config hyperparameters correctly after I sync it.
After some testing on a machine with open network access, trying both online and offline mode,
this only happens when I use offline mode with the WandbLogger.
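Here is roughly what my setup looks like (a minimal sketch, not my real code; ToyModel, the toy data, and the project name are placeholders):

```python
# Minimal sketch of how I use WandbLogger in offline mode with Lightning
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

import pytorch_lightning as pl
from pytorch_lightning.loggers import WandbLogger


class ToyModel(pl.LightningModule):
    def __init__(self, hidden_dim: int = 32, lr: float = 1e-3):
        super().__init__()
        # These hyperparameters are what I expect to see as the run config in the W&B UI
        self.save_hyperparameters()
        self.net = nn.Sequential(nn.Linear(8, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, 1))

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.mse_loss(self.net(x), y)
        self.log("train/loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.hparams.lr)


if __name__ == "__main__":
    # Placeholder data just to make the sketch runnable
    dataset = TensorDataset(torch.randn(256, 8), torch.randn(256, 1))
    loader = DataLoader(dataset, batch_size=32)

    # Offline mode because the compute nodes have no internet access
    wandb_logger = WandbLogger(project="my-project", offline=True)

    trainer = pl.Trainer(max_epochs=2, logger=wandb_logger)
    trainer.fit(ToyModel(), loader)
```

After training finishes I copy the wandb/offline-run-* directory to a machine with internet access and upload it with `wandb sync wandb/offline-run-*`.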
I am confused and have no idea how to deal with this.
Any comments and advice are welcome, thanks!