How to log two variables at different timestep intervals?

Hi there! I was wondering how to deal with having multiple variables to log when one of them should only be logged every 100 timesteps. The wandb docs seem to suggest that I need to collect all my metrics into a single log call, but in my scenario below, where I want to track one variable every step and another only every 100 steps, it seems like I would need multiple log calls. I saw the docs for the define_metric function, but I'm not quite sure if that's the way to handle this. How do I approach this in PyTorch? Thanks!

As an example, here is the TensorBoard logging I'm currently trying to convert to wandb:

print(f"global_step={global_step}, episodic_return={info['episode']['r']}")
writer.add_scalar("charts/episodic_return", info["episode"]["r"], global_step)

if global_step % 100 == 0:
    writer.add_scalar(
        "losses/qf1_values", qf1_a_values.mean().item(), global_step
    )

Hi @chulabhaya, happy to help. The approach you are considering is correct: you can put a check in place and log a dictionary with whichever values apply at that step, passing the step value explicitly. For example (the metric names and project name below are just placeholders):

import wandb

wandb.init(project="demo")  # hypothetical project name

for i in range(300):
    if i % 100 == 0:
        # every 100th step: log both metrics in a single call
        wandb.log({"per_step_value": i, "per_100_value": i * 2}, step=i)
    else:
        wandb.log({"per_step_value": i}, step=i)  # otherwise only the per-step metric

The define_metric function gives you more control over how your x-axis is represented and how it is incremented. There are a few examples in the linked doc showing how it works. As a rough sketch of that approach, you could tie both of your metric namespaces to a custom global_step axis (the project name and logged values below are placeholders):
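
import wandb

wandb.init(project="demo")  # hypothetical project name

# use a custom "global_step" metric as the x-axis for both namespaces
wandb.define_metric("global_step")
wandb.define_metric("charts/*", step_metric="global_step")
wandb.define_metric("losses/*", step_metric="global_step")

for global_step in range(300):
    # include global_step in the dict; no explicit step= argument is needed
    log_dict = {"global_step": global_step, "charts/episodic_return": 1.0}  # placeholder return
    if global_step % 100 == 0:
        log_dict["losses/qf1_values"] = 0.5  # placeholder value
    wandb.log(log_dict)

Please let me know if you have any questions.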


Thanks so much for confirming my approach! Much appreciated
