I have several parameters that I intend to grid sweep, e.g. batch size, learning rate, and optimizer. How can I specify the order in which the parameters are changed?
For instance, suppose I think the choice of optimizer will matter least and the learning rate will matter most. I'd like to try every learning rate with one optimizer before moving on to the next optimizer. How do I do this? (A sketch of the kind of sweep config I mean is below.)
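For context, here is a minimal sketch of the grid sweep I have in mind; the parameter values and project name are made up, not my actual config:

```python
import wandb

# Illustrative grid sweep over batch size, learning rate, and optimizer.
sweep_config = {
    "method": "grid",
    "parameters": {
        "batch_size": {"values": [32, 64, 128]},
        "learning_rate": {"values": [1e-4, 1e-3, 1e-2]},
        "optimizer": {"values": ["adam", "sgd"]},
    },
}

# Register the sweep; "my-project" is a placeholder project name.
sweep_id = wandb.sweep(sweep_config, project="my-project")
```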
Based on the plot below, it looks like wandb might alphabetize the parameters and then change the last ones (alphabetically speaking) most frequently. Is that correct? If so, how do I change that?