Gradient and Parameter Histograms are static across all timesteps

I am logging Stable-Baselines3 experiments, but my gradient and parameter histograms stay constant across hundreds of timesteps, even though each histogram is built from over 30,000 individual observations. I have verified via model checkpoints that the model's parameters are evolving over time.

I am not sure what might be causing this. I have followed the guide at Stable Baselines 3 | Weights & Biases Documentation.

This should be the relevant part of my code:

    config = {
        "total_timesteps": model_parameters["n_steps"] * num_cpu * 200,
        "log_interval": 1,
    }

    run = wandb.init(
        config=config,
        sync_tensorboard=True,  # auto-upload sb3's tensorboard metrics
        save_code=True,  # optional
        name=run_name,  # optional
    )

    wandbCb = WandbCallback()
    RewardCb = RewardCallback(eval_freq=model_parameters["n_steps"] * num_cpu)
    callbacks = CallbackList([wandbCb, RewardCb])
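For what it's worth, one thing I noticed while debugging: the `WandbCallback` from the SB3 integration takes a `gradient_save_freq` argument that controls how often gradient histograms are refreshed. A minimal sketch of how I understand it should be configured (the value `1000` here is just an illustrative choice, not from my actual run):

    from wandb.integration.sb3 import WandbCallback

    # Sketch: with gradient_save_freq left at its default of 0, the
    # integration does not watch the policy's gradients at all; setting
    # it to a positive value logs gradient histograms every N steps.
    wandbCb = WandbCallback(
        gradient_save_freq=1000,  # hypothetical logging frequency
        verbose=2,
    )

I am not sure whether this interacts with `sync_tensorboard=True`, so I may be misreading the docs here.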

Hi @jjhubbard! Thank you for writing in! Could you please send us a workspace where you expect to see different histograms but they show up the same?

Unfortunately, I can’t share it as it contains proprietary code for my work. Is it possible to share a workspace without the code and file widgets?

Totally! You can add the graphs where you are seeing this behavior to a report and then share the report with me instead of the whole workspace.

Hi JJ,

We wanted to follow up with you regarding your support request as we have not heard back from you. Please let us know if we can be of further assistance or if your issue has been resolved.

Hi, since we have not heard back from you, we are going to close this request. If you would like to reopen the conversation, please let us know! Unfortunately, at the moment, we do not receive notifications if a thread reopens on Discourse. So, please feel free to create a new ticket regarding your concern if you’d like to continue the conversation.