I just started using W&B to monitor the training of my few-shot learning networks in PyTorch. I call wandb.watch(model, log='all'), but it only logs the gradients, not the parameters. Any idea what could be causing this? Also, is there an easy way to log activation histograms for the different layers in PyTorch?
@norav, you might want to make sure you're logging at least one piece of data with wandb.log() first — wandb.watch() only starts writing histograms after the first wandb.log() call. Also, it's worth mentioning that we only log gradients every 1000 steps by default; you can change this with the log_freq argument.
Hi Nora, we wanted to follow up with you regarding your support request as we have not heard back from you. Please let us know if we can be of further assistance or if your issue has been resolved.
The issue has not been resolved. I do call wandb.log() in my code for the loss, and I now also use it to log the parameters via hooks. I'm afraid this slows down my code, but I see no other solution since wandb.watch() is only logging gradients.
Can you send a screenshot of what appears on your dashboard, and share a script showing how you call wandb.log in your code?
@norav, is there any chance you're calling model.forward(inputs) instead of model(inputs)? PyTorch module hooks, which wandb.watch() relies on, only fire when the model is invoked as model(inputs). Also, if you could share your script, that would be very helpful for reproducing the issue.
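To illustrate why this matters, here is a quick standalone check: module hooks are dispatched in nn.Module.__call__, so calling forward() directly skips them.

```python
import torch
import torch.nn as nn

fired = []
model = nn.Linear(4, 2)
# a forward hook of the kind wandb.watch() depends on
model.register_forward_hook(lambda module, inp, out: fired.append(1))

x = torch.randn(1, 4)
model.forward(x)  # bypasses __call__: the hook does NOT fire
assert len(fired) == 0
model(x)          # goes through __call__: the hook fires
assert len(fired) == 1
```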