Multi-level nesting in YAML for sweeps

I am trying to start a sweep using this YAML file:

sweep.yaml

method: bayes
metric:
  goal: maximize
  name: val_f1_score
parameters:
  notes:
    value: ""
  seed:
    value: 42
  lr:
    values: [1e-3, 5e-4, 1e-4, 5e-5, 1e-5]
  epochs:
    value: 30
  augmentation:
    value: True
  class_weights:
    value: True
  optimizer:
    value: adam
  loss:
    value: categorical_crossentropy
  metrics:
    value: ["accuracy"]
  batch_size:
    value: 64
  num_classes:
    value: 7
  paths:
    - 
      data:
        value: ${hydra:runtime.cwd}/data/4_tfds_dataset/

wandb:
  -
    use:
      value: True
    project:
      value: Whats-this-rock

dataset:
  -
    id:
      value: [1, 2, 3, 4]
    dir:
      value: data/3_consume/
    image:
      size:
        value: 124
      channels:
        value: 3
    classes:
      value: 10
    sampling:
      value: None

model:
  -
    backbone:
      value: efficientnetv2m
    use_pretrained_weights:
      value: True
    trainable:
      value: True
    preprocess:
      value: True
    dropout_rate:
      value: 0.3

callback:
  -
    monitor:
      value: "val_f1_score"
    earlystopping:
      patience:
        value: 10
    reduce_lr:
      factor:
        values: [.9, .7, .5]
      min_lr: 0.00001
      patience:
        values: [1, 2, 3, 4]
    save_model:
      status:
        value: True
      best_only:
        value: True

program: src/models/train.py

Error: Invalid sweep config: invalid hyperparameter configuration: paths

Here’s the full traceback of the error:

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/dist-packages/wandb/cli/cli.py", line 97, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.7/dist-packages/wandb/cli/cli.py", line 942, in sweep
    launch_scheduler=_launch_scheduler_spec,
  File "/usr/local/lib/python3.7/dist-packages/wandb/apis/internal.py", line 102, in upsert_sweep
    return self.api.upsert_sweep(*args, **kwargs)
  File "/usr/local/lib/python3.7/dist-packages/wandb/apis/normalize.py", line 62, in wrapper
    raise CommError(message, err).with_traceback(sys.exc_info()[2])
  File "/usr/local/lib/python3.7/dist-packages/wandb/apis/normalize.py", line 26, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.7/dist-packages/wandb/sdk/internal/internal_api.py", line 2178, in upsert_sweep
    raise e
  File "/usr/local/lib/python3.7/dist-packages/wandb/sdk/internal/internal_api.py", line 2175, in upsert_sweep
    check_retry_fn=no_retry_4xx,
  File "/usr/local/lib/python3.7/dist-packages/wandb/sdk/lib/retry.py", line 129, in __call__
    retry_timedelta_triggered = check_retry_fn(e)
  File "/usr/local/lib/python3.7/dist-packages/wandb/sdk/internal/internal_api.py", line 2153, in no_retry_4xx
    raise UsageError(body["errors"][0]["message"])
wandb.errors.CommError: Invalid sweep config: invalid hyperparameter configuration: paths
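
I suspect the problem is the leading - under paths (and similarly under wandb, dataset, model, and callback): it turns each of those keys into a YAML list of mappings rather than a mapping with value/values entries, which is presumably why the sweep config parser rejects it. A quick PyYAML check (just to show the parsed shape) seems to confirm this:

import yaml  # PyYAML

snippet = """
paths:
  -
    data:
      value: data/4_tfds_dataset/
"""

parsed = yaml.safe_load(snippet)
# The leading "-" makes paths a list of mappings, not a mapping:
print(type(parsed["paths"]))       # <class 'list'>
print(parsed["paths"][0]["data"])  # {'value': 'data/4_tfds_dataset/'}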

I am using hydra and trying to replicate the following config.yaml as a wandb sweep config.

config.yaml

notes: ""
seed: 42
lr: 0.001
epochs: 30
augmentation: True
class_weights: True
optimizer: adam
loss: categorical_crossentropy
metrics: ["accuracy"]
batch_size: 64
num_classes: 7

paths:
  data: ${hydra:runtime.cwd}/data/4_tfds_dataset/

wandb:
  use: True
  project: Whats-this-rock

dataset:
  id: [1, 2, 3, 4]
  dir: data/3_consume/
  image:
    size: 124
    channels: 3
  classes: 10
  sampling: None

model:
  backbone: efficientnetv2m
  use_pretrained_weights: True
  trainable: True
  preprocess: True
  dropout_rate: 0.3

callback:
  monitor: "val_f1_score"
  earlystopping:
    patience: 10
  reduce_lr:
    factor: 0.4
    min_lr: 0.00001
    patience: 2
  save_model:
    status: True
    best_only: True
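
For context, train.py consumes this config through hydra roughly like this (a simplified sketch; the real script does more, and the config_path/config_name here are illustrative):

import hydra
from omegaconf import DictConfig, OmegaConf


@hydra.main(config_path="configs", config_name="config")
def train(cfg: DictConfig) -> None:
    # Nested keys from config.yaml are available via attribute access:
    print(cfg.model.backbone)             # efficientnetv2m
    print(cfg.callback.reduce_lr.factor)  # 0.4
    print(OmegaConf.to_yaml(cfg))         # full config dumped as YAML


if __name__ == "__main__":
    train()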

Hi @udaylunawat!

Thanks for your request! This is already a planned feature which has received quite a few requests; I’ll go ahead and increase the priority of this feature request for you.

Thanks,
Ramit

Nesting the sub-parameters under a parameters key inside each root-level parameter might do the trick; I’ll try it and let you know.

method: bayes
metric:
  goal: maximize
  name: val_f1_score
parameters:
  notes:
    value: ""
  seed:
    value: 42
  lr:
    values: [1e-3, 5e-4, 1e-4, 5e-5, 1e-5]
  epochs:
    value: 30
  augmentation:
    value: True
  class_weights:
    value: True
  optimizer:
    value: adam
  loss:
    value: categorical_crossentropy
  metrics:
    value: ["accuracy"]
  batch_size:
    value: 64
  num_classes:
    value: 7
  paths:
    parameters:
      data:
        value: data/4_tfds_dataset/
  wandb:
    parameters:
      use:
        value: True
      project:
        value: Whats-this-rock
  dataset:
    parameters:
      id:
        value: [1, 2, 3, 4]
      dir:
        value: data/3_consume/
      image:
        parameters:
          size:
            value: 124
          channels:
            value: 3
      classes:
        value: 10
      sampling:
        value: None
  model:
    parameters:
      backbone:
        value: efficientnetv2m
      use_pretrained_weights:
        value: True
      trainable:
        value: True
      preprocess:
        value: True
      dropout_rate:
        value: 0.3
  callback:
    parameters:
      monitor:
        value: "val_f1_score"
      earlystopping:
        parameters:
          patience:
            value: 10
      reduce_lr:
        parameters:
          factor:
            values: [.9, .7, .5]
          patience:
            values: [1, 2, 3, 4]
          min_lr:
            value: 0.00001
      save_model:
        parameters:
          status:
            value: True
          best_only:
            value: True
program: src/models/train.py
2022-09-16 21:28:51,236 - wandb.wandb_agent - INFO - About to run command: /usr/bin/env python src/models/train.py augmentation=True batch_size=64 "callback={'earlystopping': {'patience': 10}, 'monitor': 'val_f1_score', 'reduce_lr': {'factor': 0.7, 'min_lr': 1e-05, 'patience': 2}, 'save_model': {'best_only': True, 'status': True}}" class_weights=True "dataset={'classes': 7, 'dir': 'data/4_tfds_dataset/', 'id': [1, 2, 3, 4], 'image': {'channels': 3, 'size': 224}, 'sampling': 'oversampling'}" epochs=30 loss=categorical_crossentropy lr=0.0001 metrics=['accuracy'] "model={'backbone': 'resnet', 'dropout_rate': 0.3, 'preprocess': True, 'trainable': True, 'use_pretrained_weights': False}" notes= num_classes=7 optimizer=adamax "paths={'data': 'data/4_tfds_dataset/'}" seed=42 "wandb={'project': 'Whats-this-rock', 'use': True}"
no viable alternative at input '{'earlystopping''
See https://hydra.cc/docs/next/advanced/override_grammar/basic for details

Looks like hydra doesn’t support nested parameters passed as overrides this way.
Is there some way I can fix this?

The solution is to use dot notation instead of nested parameters, since wandb sweeps (v0.13.3) don’t support nested parameters in a way that works with hydra’s override grammar.

sweep.yaml

method: bayes
metric:
  goal: maximize
  name: val_accuracy
parameters:
  notes:
    value: ""
  seed:
    values: [1, 42, 100]
  lr:
    values: [1e-3, 5e-4, 1e-4, 5e-5, 1e-5]
  epochs:
    value: 100
  augmentation:
    value: True
  class_weights:
    value: True
  optimizer:
    values: [adam, adamax]
  loss:
    value: categorical_crossentropy
  metrics:
    value: ["accuracy"]
  batch_size:
    value: 64
  num_classes:
    value: 7
  train_split:
    values:
      - 0.70
      - 0.75
      - 0.80
  data_path:
    value: data/4_tfds_dataset/
  wandb.use:
    value: True
  wandb.mode:
    value: online
  wandb.project:
    value: Whats-this-rockv3
  dataset_id:
    values:
      - [1]
  image_size:
    value: 224
  image_channels:
    value: 3
  sampling:
    values: [None, oversampling, undersampling]
  backbone:
    values:
      [
        efficientnetv2m,
        efficientnetv2,
        resnet,
        mobilenetv2,
        inceptionresnetv2,
        xception,
      ]
  use_pretrained_weights:
    values: [True]
  trainable:
    values: [True, False]
  preprocess:
    value: True
  dropout_rate:
    values: [0.3]
  monitor:
    value: "val_accuracy"
  earlystopping.use:
    value: True
  earlystopping.patience:
    values: [10]
  reduce_lr.use:
    values: [True]
  reduce_lr.factor:
    values: [.9, .7, .5, .3]
  reduce_lr.patience:
    values: [1, 3, 5, 7, 13]
  reduce_lr.min_lr:
    value: 1e-5
  save_model:
    value: False

program: src/models/train.py
command:
  - ${env}
  - python
  - ${program}
  - ${args_no_hyphens}
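
One note on the command block: ${args_no_hyphens} makes the agent pass each parameter as plain key=value (no leading --), which is the form hydra’s override grammar accepts, so the dotted names land back in the nested config. A rough illustration of that merge using OmegaConf directly (the values here are made up):

from omegaconf import OmegaConf

# Base config, i.e. the defaults hydra loads from config.yaml.
base = OmegaConf.create({
    "reduce_lr": {"factor": 0.4, "patience": 2},
    "wandb": {"use": True, "project": "Whats-this-rock"},
})

# Overrides as the sweep agent passes them via ${args_no_hyphens}.
overrides = OmegaConf.from_dotlist([
    "reduce_lr.factor=0.9",
    "reduce_lr.patience=5",
    "wandb.project=Whats-this-rockv3",
])

merged = OmegaConf.merge(base, overrides)
print(OmegaConf.to_yaml(merged))
# reduce_lr:
#   factor: 0.9
#   patience: 5
# wandb:
#   use: true
#   project: Whats-this-rockv3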

Hi Uday,

I just wanted to let you know that conditional sweeps are possible using Launch; check out this colab.

Best,
Luis