I am running a hyperparameter sweep, but the `wandb.config` object doesn't support the usual dict method `.copy()`. This is a problem because `xgb.train()` in the xgboost package copies the parameters when boosting. I'm unable to train xgboost in a sweep unless xgb can call `.copy()` on the config. Here is a quick reproduction:
sweep_config = {
    'method': 'random',
    'parameters': {
        'gamma': {
            'values': [0.1, 0.2, 0.3]
        },
        'lambda': {
            'values': [0.4, 0.5, 0.6, 0.7, 0.8]
        },
        'learning_rate': {
            'values': [0.001, 0.01, 0.05, 0.1]
        },
        'objective': {
            'value': 'multi:softprob',
        },
        'num_class': {
            'value': 3
        },
    }
}
import wandb

def train(config=None):
    with wandb.init():
        config = wandb.config
        config.copy()  # raises AttributeError

sweep_id = wandb.sweep(sweep_config, project="my_project_name")
wandb.agent(sweep_id, train, count=10)
The error message is:
Traceback (most recent call last):
File "/path/to/miniconda3/envs/myenv_py311/lib/python3.11/site-packages/wandb/sdk/wandb_config.py", line 162, in __getattr__
return self.__getitem__(key)
^^^^^^^^^^^^^^^^^^^^^
File "/path/to/miniconda3/envs/myenv_py311/lib/python3.11/site-packages/wandb/sdk/wandb_config.py", line 130, in __getitem__
return self._items[key]
~~~~~~~~~~~^^^^^
KeyError: 'copy'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/var/folders/_t/gnglhcjx7bq497x344n42txc0000gq/T/ipykernel_5206/2730897414.py", line 5, in train
config.copy()
^^^^^^^^^^^
File "/path/to/miniconda3/envs/myenv_py311/lib/python3.11/site-packages/wandb/sdk/wandb_config.py", line 164, in __getattr__
raise AttributeError(
AttributeError: <class 'wandb.sdk.wandb_config.Config'> object has no attribute 'copy'
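For context, the traceback shows that `wandb.sdk.wandb_config.Config` routes attribute access through `__getitem__` on an internal `_items` dict, so any name that isn't a config key (like `copy`) raises. A minimal sketch of the usual workaround, using a stand-in `Config` class to mimic that behavior (the real one lives in wandb, which isn't imported here): convert the config to a plain dict before handing it to `xgb.train`, so the dict's own `.copy()` is available.

```python
class Config:
    """Stand-in for wandb's Config: mapping-like, with no dict methods."""
    def __init__(self, items):
        self._items = items

    def __getitem__(self, key):
        return self._items[key]

    def keys(self):
        return self._items.keys()


config = Config({'gamma': 0.1, 'learning_rate': 0.01})

# dict() accepts any object with keys() and __getitem__,
# so this builds a plain dict that does have .copy().
params = dict(config)
booster_params = params.copy()
```

In the sweep's `train` function this would be `params = dict(wandb.config)` (an assumption based on `Config` exposing `keys()` and `__getitem__`, as the traceback suggests), passed to `xgb.train(params, ...)` in place of the config object itself.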