I am creating hyperparameter sweeps for a linear regression problem, which has gone without a hitch so far. However, I am working to improve my skills with wandb. In particular, I log dictionaries of dictionaries. Here is an example:
wandb.log({
    'epoch': epoch,
    'train': {'min_loss': t_min_loss, 'min_loss_epoch': t_min_loss_epoch},
    'valid': {'min_loss': v_min_loss, 'min_loss_epoch': v_min_loss_epoch}
}, step=epoch, commit=True)
The metric should be the training loss. How does one specify it when using dictionary hierarchies? Have I done it correctly below?
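For context, my working assumption (which I have not confirmed against the docs) is that wandb flattens nested logged dictionaries into dot-separated metric names, so the dict above would surface metrics like 'train.min_loss'. A minimal sketch of that flattening, using my own illustrative helper rather than anything from the wandb library:

```python
# Sketch of how I assume nested logged dicts become flat, dot-separated
# metric names (e.g. 'train.min_loss'). 'flatten' is my own helper for
# illustration, not wandb's actual implementation.
def flatten(d, prefix=''):
    flat = {}
    for key, value in d.items():
        name = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            # Recurse into nested dicts, extending the dotted prefix.
            flat.update(flatten(value, name))
        else:
            flat[name] = value
    return flat

logged = {
    'epoch': 12,
    'train': {'min_loss': 0.031, 'min_loss_epoch': 9},
    'valid': {'min_loss': 0.045, 'min_loss_epoch': 11},
}
print(flatten(logged))
# e.g. {'epoch': 12, 'train.min_loss': 0.031, 'train.min_loss_epoch': 9, ...}
```

If that assumption is right, the sweep metric name would presumably be the full dotted string rather than a nested dict.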
# docs: https://docs.wandb.ai/guides/sweeps/configuration
sweep_config3 = {
    'name': 'broad_sweep',
    'method': 'random',
    'metric': {
        'name': {'train': 'loss'},
        'goal': 'minimize',
    },
    'parameters': {
        'lr': {
            'distribution': 'log_uniform_values',
            'min': 1.e-3,
            'max': 1.e-1,
        },
        'batch_size': {'value': 32},
        'optim': {'value': 'adamw'},
        'nb_layers': {'values': [0, 2, 4]},
        'pts_layer': {'values': [5, 10, 30]},
        'nb_epochs': {'value': 200},
    }
}
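As an aside, my understanding of the 'log_uniform_values' distribution is that the sweep draws lr uniformly in log space between min and max, so each decade is equally likely. A sketch of that sampling under this assumption (this is my own illustration, not wandb's actual sampler):

```python
import math
import random

# Assumed behaviour of 'log_uniform_values': sample uniformly in log
# space between lo and hi, then exponentiate back. Illustrative only.
def sample_log_uniform(lo, hi, rng=random):
    return math.exp(rng.uniform(math.log(lo), math.log(hi)))

# Draw a few candidate learning rates for the sweep's range.
samples = [sample_log_uniform(1e-3, 1e-1) for _ in range(5)]
print(samples)  # each value lies within [1e-3, 1e-1]
```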