Difference between wandb.log and torch.utils.tensorboard.SummaryWriter.add_scalar

A quick question: is there any difference between wandb.log and torch.utils.tensorboard.SummaryWriter.add_scalar for logging numbers in wandb?
I’m wondering if one is faster or preferred over the other? Thanks!

Hey @kaiwenw, when it comes to logging numbers to wandb specifically, both wandb.log and torch.utils.tensorboard.SummaryWriter.add_scalar can be used. However, wandb.log is the recommended method for logging data directly to wandb, and it is generally faster because the metrics go straight to the run without an intermediate layer.
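Here is a minimal sketch of the direct route; the project name and metric name are just placeholders:

```python
import wandb

# Direct logging: each wandb.log call sends the metric straight to the run,
# with no TensorBoard event files in between.
# "my-project" and "train/loss" are placeholder names.
run = wandb.init(project="my-project")

for step in range(100):
    loss = 1.0 / (step + 1)  # stand-in for a real training loss
    wandb.log({"train/loss": loss}, step=step)

run.finish()
```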

torch.utils.tensorboard.SummaryWriter.add_scalar is PyTorch's integration with TensorBoard. While it can be used to get data into wandb, it involves an extra step of forwarding the logged data from TensorBoard to wandb. This additional step introduces some overhead and can be slightly slower than using wandb.log directly.
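For comparison, here is a sketch of the TensorBoard route, where wandb picks up the TensorBoard events via sync_tensorboard=True in wandb.init; again, the project and metric names are placeholders:

```python
import wandb
from torch.utils.tensorboard import SummaryWriter

# TensorBoard route: add_scalar writes to TensorBoard event files,
# and wandb forwards those events because sync_tensorboard=True is set.
run = wandb.init(project="my-project", sync_tensorboard=True)
writer = SummaryWriter()

for step in range(100):
    loss = 1.0 / (step + 1)  # stand-in for a real training loss
    writer.add_scalar("train/loss", loss, global_step=step)

writer.close()
run.finish()
```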

Hi @kaiwenw, since we have not heard back from you we are going to close this request. If you would like to re-open the conversation, please let us know!
