I am using Weights & Biases with PyTorch Lightning and want to log progress messages to the Logs tab using wandb.termlog() inside a Trainer callback.
However, the messages only appear in the local Python terminal, and they do not show up in the W&B Logs tab on the server.
Here is a minimal example of my callback:
import wandb
from pytorch_lightning.callbacks import Callback


class WandbProgressLogger(Callback):
    def __init__(self, log_every_n_steps=1):
        self.log_every_n_steps = log_every_n_steps

    def _log_progress(self, trainer, batch_idx, stage: str):
        # Only log from rank zero to avoid duplicate messages under DDP.
        if not getattr(trainer, "is_global_zero", True):
            return
        # Throttle training logs to every n steps; other stages always log.
        if (trainer.global_step + 1) % self.log_every_n_steps != 0 and stage == "train":
            return
        try:
            if stage == "predict":
                num_batches = len(trainer.predict_dataloaders)
            else:
                dataloader = getattr(trainer, f"{stage}_dataloader")
                num_batches = len(dataloader)
            progress_percent = (batch_idx + 1) / num_batches * 100 if num_batches > 0 else 0.0
        except Exception:
            progress_percent = 0.0
        metrics = {
            f"progress/{stage}": progress_percent,
            "progress/epoch": float(trainer.current_epoch),
        }
        if trainer.logger is not None:
            trainer.logger.log_metrics(metrics, step=trainer.global_step)
        # termlog appends its own newline, so no trailing "\n" is needed.
        wandb.termlog(f"[step {batch_idx}] {stage} progress: {progress_percent:.2f}%")

    def on_train_batch_end(self, trainer, pl_module, outputs, batch, batch_idx, dataloader_idx=0):
        self._log_progress(trainer, batch_idx, "train")

    def on_predict_batch_end(self, trainer, pl_module, outputs, batch, batch_idx, dataloader_idx=0):
        self._log_progress(trainer, batch_idx, "predict")
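To rule out the throttling condition as the culprit, I extracted the step-gating logic from _log_progress into a standalone helper (plain Python, no Lightning imports); it behaves as intended in isolation, so the gating is not what suppresses the messages:

```python
def should_log(global_step: int, log_every_n_steps: int, stage: str) -> bool:
    """Mirror of the gating condition in _log_progress: skip only when the
    stage is 'train' and the (1-indexed) step is off the logging cadence."""
    if (global_step + 1) % log_every_n_steps != 0 and stage == "train":
        return False
    return True

# Train steps are throttled to the cadence; other stages always log.
assert should_log(0, 2, "train") is False   # step 1 of cadence 2 -> skipped
assert should_log(1, 2, "train") is True    # step 2 of cadence 2 -> logged
assert should_log(0, 2, "predict") is True  # predict stage is never throttled
```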
Environment / Additional Info:
- PyTorch Lightning with DDP / multi-GPU
- W&B Python SDK (latest version)
- Using wandb.termlog inside a DDP callback
- Expected: messages appear in the W&B Logs tab
- Actual: messages only appear in the local Python console
Could you advise how to make termlog messages from Lightning callbacks appear in the W&B Logs tab when using DDP?