Are the default parameters still used during training with sweeps?

Hello, I am using Sweeps to tune hyperparameters. Two hyperparameters together control a weight coefficient. Sweeps generates several combinations of these hyperparameters, but during training the weight coefficient is the same in every run, equal to the value produced by the two default hyperparameters. Why is this?

Hello @ssglight , thank you for reaching out and happy to help. Could you share a toy code sample showing how you are setting up your sweep?

Thank you for your reply! I wrote a sweep.yaml file as follows:
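For reference, a minimal sketch of the kind of config I mean (the hyperparameter names `a` and `b` match mine; the program name, method, metric, and values here are placeholders):

```yaml
program: train.py        # placeholder script name
method: grid
metric:
  name: loss
  goal: minimize
parameters:
  a:
    values: [0.1, 0.5, 1.0]   # categorical values (placeholders)
  b:
    values: [1, 2, 4]
```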

Then I open several terminals and type `CUDA_VISIBLE_DEVICES=i wandb agent sweep_ID` in each to start the sweep.
The hyperparameters `a` and `b` together control a weight coefficient. I log the weight coefficient, but every run logs the same value, as follows:
I’m confused.

Hi @ssglight , thank you for providing these. Could you please try another distribution (e.g. `uniform`) instead of `categorical` and check whether the same issue occurs? You can refer here, and let us know the result from your end.
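For example, a parameter block using `uniform` could look like this (the bounds here are placeholders, not recommended values):

```yaml
method: random
parameters:
  a:
    distribution: uniform
    min: 0.0
    max: 1.0
  b:
    distribution: uniform
    min: 0.0
    max: 1.0
```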

Ok, I will try and tell you the result.

I used the `random` method with the `uniform` distribution, and I get the same result:

Hi @ssglight , may we ask for a link to your workspace so we can investigate this further?
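In the meantime, one common cause of this symptom is the training script parsing its own argument defaults and never merging in the sweep-provided values, so every run trains with the defaults. A minimal sketch of the override pattern (the hyperparameter names are from this thread; the dict stands in for `wandb.config` after `wandb.init()`, and all values are placeholders):

```python
import argparse

def get_params(sweep_config=None):
    # These defaults are what every run falls back to if the
    # sweep-chosen values are never merged in.
    parser = argparse.ArgumentParser()
    parser.add_argument("--a", type=float, default=1.0)
    parser.add_argument("--b", type=float, default=2.0)
    args = parser.parse_args([])  # empty argv, so defaults apply here

    # Merge sweep-chosen values over the defaults; in a real run this
    # dict would be wandb.config after wandb.init().
    if sweep_config:
        for key, value in sweep_config.items():
            setattr(args, key, value)
    return args

params = get_params({"a": 0.5, "b": 4.0})
weight = params.a * params.b  # hypothetical weight coefficient
print(weight)  # prints 2.0
```

If the script only ever reads its own defaults, the logged coefficient will be identical across all agents, which matches what you describe.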

Hi @joana-marie, sorry, I'm afraid not right now. Since this problem appeared, I've been tuning hyperparameters manually. Each user's storage space is limited, so I deleted the sweep experiments. But I can run another set of sweeps later and will let you know then. How do I share the workspace link?

Hello @ssglight , I understand and no worries at all~ You may grab it from the URL field when you go to the sweep in question. Don't forget to make the project Public temporarily; you may also email us the link if it should not be seen here in the community.

Hi @ssglight , since we have not heard back from you we are going to close this request. If you would like to re-open the conversation, please let us know!