Parameter locked by sweep

My code roughly looks like this:

import wandb

sweep_id = wandb.sweep(params)
if params.find_lr:
    wandb.config.update({'learning_rate': lr_finder.run()})

However, I am getting the error "wandb: WARNING Config item 'learning_rate' was locked by 'sweep' (ignored update)."

Why can’t I update my learning rate? Can I somehow unlock the parameter configuration of the sweep?

Hi Tim! Thank you for writing in.

The warning message you’re seeing indicates that the learning_rate parameter is locked because it’s being managed by a W&B sweep. When you initiate a sweep, W&B locks the configuration parameters that are being swept over to ensure consistency across runs. This means that you cannot update the values of these parameters within the run using wandb.config.update.
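To make the lock concrete, here is a minimal sketch of where it shows up (the project name and values are only placeholders):

import wandb

sweep_config = {
    'method': 'random',
    'parameters': {'learning_rate': {'values': [1e-3, 1e-2]}},
}

def train():
    run = wandb.init()  # launched by the sweep agent, so the sweep supplies the config
    # 'learning_rate' was set by the sweep controller, so this update is ignored and
    # prints: Config item 'learning_rate' was locked by 'sweep' (ignored update).
    wandb.config.update({'learning_rate': 0.5})
    run.finish()

sweep_id = wandb.sweep(sweep_config, project='lr-finder-demo')
wandb.agent(sweep_id, function=train, count=1)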

If you are interested, I would love to submit a feature request on your behalf to our engineering team to add an option to update sweep parameters mid-sweep.

I think adding an extra option for this is overkill. Instead, I propose that the error which currently doesn’t allow me to overwrite the parameter should just be a warning. I imagine that overwriting a parameter could get a bit weird when the parameter being altered is one being swept over with Bayesian hyperparameter optimization; the behavior would presumably be undefined there, so a warning is in order. However, if the parameter I want to change isn’t actually swept over (i.e. it only has one value), or if I’m doing a random search, I don’t think overwriting it would cause any problems.

I can imagine that such a change request would end up on the bottom of the pile, though.

For now the workaround for me is to log the lr from the lr_finder under a new config item: lr_from_lr_finder.
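Roughly like this (lr_finder is my own code, not part of wandb):

lr = lr_finder.run()
# store the found value under a fresh key instead of the sweep-locked one
wandb.config.update({'lr_from_lr_finder': lr})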

Hi Tim! You are completely right that overwriting a parameter could become a bit weird and hard to manage when the parameter being updated is one being swept over with Bayesian search, or when it is accidentally updated to a value that was already run.

I’ll be more than happy to submit a request to our SDK team to make it a warning instead of an error, but to clarify: after you get the warning, would you want the code to go ahead and update the parameter, or just emit the warning and ignore the update line entirely?

We’ve definitely seen an increase in people asking for this feature, so I can’t really promise what the priority on this is going to be.

Thank you for the workaround! That makes a lot of sense.

I want the parameter to get overwritten, of course. I don’t see any use case for calling wandb.config.update and it not doing exactly that. Why would your library decide to ignore a call to a function I made?! As a user I should be in charge. If executing some code may have weird side effects, then inform me - don’t block me from doing it.

That’s a great point, Tim.

I have submitted the request ticket to our product team for customers to have the ability to edit their sweep config while the sweep is still active.

We’ll let you know when there are any updates on it; lately we’ve been seeing an influx of users asking for this feature.

+1 for adding this feature. The problem described gets worse with nested configs: if you want to sweep over one parameter inside a nested config, this error prevents you from updating the other fields under that parent parameter.

E.g. suppose your base config is:

lr: 0.1
data:
  source: datasource1
  n_samples: 20

If you want to sweep over data.n_samples in [20, 30] and still have the data.source field in your wandb.config, ideally you could have a sweep config like the following:

method: grid
metric:
  name: val_loss
  goal: minimize
parameters:
  data:
    parameters:
      n_samples:
        values: [20, 30]

and then later update the wandb.config with the full parameters that include the data.source field. But the current behaviour locks the whole data parameter and prevents you from adding other child elements. This forces you to list every child parameter in the sweep config:

method: grid
metric:
  name: val_loss
  goal: minimize
parameters:
  data:
    parameters:
      n_samples:
        values: [20, 30]
      source:
        values: [datasource1]

which can be incredibly verbose and annoying when the base config changes.
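For context, the training-side code I have in mind looks roughly like this (base_config is just the example dict from above):

import wandb

base_config = {
    'lr': 0.1,
    'data': {'source': 'datasource1', 'n_samples': 20},
}

run = wandb.init(config=base_config)
# The sweep already supplies data.n_samples, so the whole 'data' item is locked
# and the data.source field from base_config is dropped with the
# "locked by 'sweep' (ignored update)" warning.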

That’s very odd behavior indeed, given that "W&B will flatten the [nested] names using dots in the W&B backend", according to Configure a Machine Learning Experiment.
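A tiny sketch of the flattening that quote describes (values made up):

import wandb

run = wandb.init(config={'data': {'source': 'datasource1', 'n_samples': 20}})
# In the W&B backend/UI these show up as dotted keys:
#   data.source     -> 'datasource1'
#   data.n_samples  -> 20
run.finish()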

There’s also Define sweep configuration for hyperparameter tuning.
There’s not a lot of documentation on nested config, though…