Grouping runs in workspaces API based on config parameter values

Hi, I can't get grouping by config parameters to work when using the workspaces API. Grouping by summary metrics works, but that's not what I want to group runs by.

grouping=[
    internal.Key(name=expr_parsing.to_backend_name(g)) for g in self.groupby
],

I'm assuming this is the line (in interfaces.py) that enables grouping by summary metrics, since the default section for Key in internal.py is 'summary'. I've tried changing it to pass config as the section, but it just doesn't read the config parameter's value and groups all the runs as if the parameter were None ('-').
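For reference, the change I tried looks roughly like this (a sketch only; the section keyword argument is my assumption based on how Key is defined in internal.py):

grouping=[
    internal.Key(section="config", name=expr_parsing.to_backend_name(g))
    for g in self.groupby
],

Even with this change, all runs end up grouped under '-' as if the config value were missing.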

I'd really appreciate more information on how to achieve this programmatically, since it works from the App UI but not from the workspaces API.

Thanks.

Hey @bojana-rankovic, thanks for writing in! You need to use groupby=[ws.Config("param")], where param is the name of your config parameter. Mind letting me know if the following code works for you?

import wandb_workspaces.workspaces as ws
import wandb_workspaces.reports.v2 as wr

entity = ""   # your W&B entity
project = ""  # your W&B project

def grouping_example(entity: str, project: str) -> None:
    # Create a workspace whose run set is grouped by the config parameter "param"
    workspace: ws.Workspace = ws.Workspace(
        name="Grouped Runs Workspace",
        entity=entity,
        project=project,
        sections=[
            ws.Section(
                name="Grouped Runs",
                panels=[
                    wr.LinePlot(x="Step", y=["val_loss"]),
                    wr.LinePlot(x="Step", y=["val_accuracy"]),
                ],
                is_open=True,
            ),
        ],
        runset_settings=ws.RunsetSettings(
            groupby=[ws.Config("param")]
        ),
    )
    workspace.save()
    print("Workspace with grouped runs saved.")

grouping_example(entity, project)

Hi there, I wanted to follow up on this request. Please let us know if we can be of further assistance or if your issue has been resolved.

Hi @bojana-rankovic, since we have not heard back from you we are going to close this request. If you would like to re-open the conversation, please let us know!