Hyperparameter tuning with sweeps and cross-validation

Hey,

I want to do hyperparameter tuning using sweeps for my project. The experiments all involve k-fold cross-validation. Before starting hyperparameter tuning, I was separating the folds into different runs by calling wandb.init() with the same group for all folds and using the fold number as the run name. That worked well.
However, now that I'm using sweeps for hyperparameter tuning, wandb shows only one run per group (fold 1), and all the folds get packed together into that single run.
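For reference, a minimal sketch of the pre-sweep setup described above (the project name, hyperparameters, and metric are placeholders):

```python
import wandb

config = {"lr": 1e-3, "batch_size": 32}  # hypothetical hyperparameters
k = 5

for fold in range(k):
    run = wandb.init(
        project="my-project",      # placeholder project name
        group="cv-experiment",     # same group for every fold
        name=f"fold-{fold}",       # fold number as the run name
        config=config,
        reinit=True,               # allow multiple runs in one process
    )
    # ... train and evaluate on this fold ...
    run.log({"val_accuracy": 0.0})  # placeholder metric
    run.finish()
```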

Is there a best practice for combining hyperparameter search with sweeps and k-fold cross-validation?

Thanks a lot!

Hey there,

Here is an example of how to do k-fold cross-validation with sweeps.
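As a minimal sketch (not the official example, and with placeholder names such as "my-project", "lr", and "mean_val_accuracy"), one common pattern is to run all k folds inside a single sweep run and let the sweep optimize the cross-validated average:

```python
import numpy as np
import wandb

# Hypothetical sweep configuration: tune lr and batch_size,
# optimizing the mean validation accuracy across folds.
sweep_config = {
    "method": "bayes",
    "metric": {"name": "mean_val_accuracy", "goal": "maximize"},
    "parameters": {
        "lr": {"min": 1e-4, "max": 1e-1},
        "batch_size": {"values": [16, 32, 64]},
    },
}

K = 5  # number of folds

def train_one_fold(fold, config):
    # Placeholder: train on k-1 folds, evaluate on the held-out fold,
    # and return the validation score.
    return np.random.rand()

def train():
    run = wandb.init()        # the agent injects the sweep's hyperparameters
    config = wandb.config
    fold_scores = []
    for fold in range(K):
        score = train_one_fold(fold, config)
        run.log({f"fold_{fold}/val_accuracy": score})  # per-fold metric
        fold_scores.append(score)
    # The sweep optimizes the average across folds.
    run.log({"mean_val_accuracy": float(np.mean(fold_scores))})

sweep_id = wandb.sweep(sweep_config, project="my-project")
wandb.agent(sweep_id, function=train, count=20)
```

With this setup each sweep run corresponds to one hyperparameter combination, and the per-fold metrics are still logged under that run. If you prefer one wandb run per fold, you can instead launch a separate run for each fold and group them, for example by a shared identifier derived from the sweep run, so the folds of each configuration appear together in the UI.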

Best,
Arman
