Thank you for reaching out for support.
Weights & Biases sweeps allow you to define a search strategy for hyperparameter optimization. However, currently, you can only specify one search strategy per sweep. This means you can’t perform a grid search on certain hyperparameters and a Bayesian optimization on a different subset of parameters within the same sweep.
Here’s an example of how to set up a sweep with a grid search strategy:
```python
import wandb

sweep_config = {
    "method": "grid",  # grid search
    "metric": {
        "name": "accuracy",
        "goal": "maximize"
    },
    "parameters": {
        "num_layers": {
            "values": [1, 2, 3, 4]
        },
        "optimizer": {
            "values": ["adam", "sgd"]
        }
    }
}

sweep_id = wandb.sweep(sweep_config)
# train should call wandb.init() and log the "accuracy" metric
wandb.agent(sweep_id, function=train)
```
And here’s an example of how to set up a sweep with a Bayesian search strategy:
```python
import wandb

sweep_config = {
    "method": "bayes",  # Bayesian optimization
    "metric": {
        "name": "accuracy",
        "goal": "maximize"
    },
    "parameters": {
        "learning_rate": {
            "min": 0.001,
            "max": 0.1
        },
        "batch_size": {
            "min": 32,
            "max": 256
        }
    }
}

sweep_id = wandb.sweep(sweep_config)
# train should call wandb.init() and log the "accuracy" metric
wandb.agent(sweep_id, function=train)
```
You can run these sweeps separately and compare the results on the Weights & Biases dashboard.
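If you do run the two sweeps side by side, one pattern that helps is building both configs from a shared helper so the metric definition stays in sync. A minimal sketch (the parameter names mirror the examples above; the helper function and project name are illustrative, not part of the W&B API):

```python
def make_sweep_config(method, parameters):
    """Build a sweep config that shares one metric definition across sweeps."""
    return {
        "method": method,
        "metric": {"name": "accuracy", "goal": "maximize"},
        "parameters": parameters,
    }

# Grid sweep over the discrete hyperparameters
grid_config = make_sweep_config(
    "grid",
    {
        "num_layers": {"values": [1, 2, 3, 4]},
        "optimizer": {"values": ["adam", "sgd"]},
    },
)

# Bayesian sweep over the continuous hyperparameters
bayes_config = make_sweep_config(
    "bayes",
    {
        "learning_rate": {"min": 0.001, "max": 0.1},
        "batch_size": {"min": 32, "max": 256},
    },
)

# Launch each sweep separately, e.g.:
# sweep_id = wandb.sweep(grid_config, project="my-project")
# wandb.agent(sweep_id, function=train)
```

Giving both sweeps the same project makes them easy to compare on one dashboard.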
Sources:
- Hyperparameter Optimization with XGBoost using W&B Sweep
- HyperParameter Sweep
- Sweeps: An Overview
- Tune hyperparameters
Regards,
Carlo Argel