Wandb.watch not logging parameters

I just started to use W&B to monitor the training of my few-shot learning NNs in PyTorch. I use wandb.watch(model, log='all') but it only logs the gradients. Any idea what could be causing this? Also, is there an easy way to log the activation histograms of the different layers for PyTorch?

Thanks!

Hi Nora, that is an interesting bug. The default is to log gradients. Can you try putting "all" in double quotes instead of single quotes?

Hi Leslie,

Thanks for the reply. Unfortunately, the double quotes didn't change the result, still only gradients. When I put 'parameters', nothing is logged at all.

Can you tell me what version of wandb you’re using?

I recently installed it; the version is wandb 0.12.5.

Are you using wandb.log in your code?

@norav , you might want to make sure you're logging at least one piece of data with wandb.log() first. Also, it's worth mentioning that we only log gradients every 1000 steps by default (log_freq=1000), so short runs may never hit a logging step.

Hi Nora, we wanted to follow up with you regarding your support request as we have not heard back from you. Please let us know if we can be of further assistance or if your issue has been resolved.

Hi Leslie,

The issue has not been resolved. I do use wandb.log() in my code for the loss, and I now use it as well to log the parameters via hooks. I'm afraid this slows down my code, but I see no other solution since wandb.watch() is only logging gradients.

Can you let us know how many steps you are running?

I am running 50-100 steps. I use the command watch(model, log='all', log_freq=1)

Can you send an image of what appears on your dashboard, and a script showing how you call wandb.log in your code?


This is the dashboard where I only get the gradients, not the parameters.

Here I initialize the wandb run and pass it to the train function (wrapper.fit()), where I log the loss with the following code snippet.


Can you use wandb.log once before you use wandb.watch? wandb.watch won't work properly if wandb.log hasn't been called beforehand.

Still no parameters after logging something before the watch:
run.log({"trainidx": trainidx})
run.watch(model, log="all", log_freq=1)

@norav , is there any chance you’re calling model.forward(inputs) instead of model(inputs)? Also, if you could share your script, that would be very helpful to reproduce the issue.
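For anyone hitting the same issue: the reason this matters is that watch() registers forward/backward hooks on the model, and (as in torch.nn.Module) hooks only run inside __call__, which a direct forward() call bypasses. A plain-Python mimic of that dispatch, to illustrate the mechanism (Module and Doubler here are toy stand-ins, not the real torch classes):

```python
class Module:
    """Toy stand-in for torch.nn.Module's hook dispatch."""

    def __init__(self):
        self.hooks = []  # forward hooks, as registered by watch()

    def register_forward_hook(self, fn):
        self.hooks.append(fn)

    def __call__(self, x):
        out = self.forward(x)
        for hook in self.hooks:  # hooks fire only via model(x)
            hook(self, x, out)
        return out


class Doubler(Module):
    def forward(self, x):
        return 2 * x


logged = []
model = Doubler()
model.register_forward_hook(lambda m, inp, out: logged.append(out))

model.forward(3)  # bypasses __call__: nothing is logged
assert logged == []

model(3)          # goes through __call__: hook fires
assert logged == [6]
```

So calling model(inputs) instead of model.forward(inputs) is what lets wandb.watch see the parameters.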


Yes, this seems to be it. Thank you!


This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.