If not 0.0 <= lr: TypeError: '<=' not supported between instances of 'float' and 'dict'

program: segformer_val.py
method: bayes
metric:
  goal: minimize
  name: valid_loss
parameters:
  lr:
    distribution: log_uniform_values
    min: 0.0000001 # 1e-7
    max: 0.001 # 1e-3
  weight_decay:
    distribution: log_uniform_values
    min: 0.00001 # 1e-5
    max: 0.1 # 1e-1
  eps:
    distribution: log_uniform_values
    min: 0.0001 # 1e-4
    max: 1
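For context, `log_uniform_values` tells the sweep to sample each value uniformly in log space between `min` and `max`. A minimal sketch of that sampling in plain Python (not the W&B implementation, just the idea):

```python
import math
import random

def sample_log_uniform(lo: float, hi: float) -> float:
    """Draw uniformly in log space, then map back with exp."""
    return math.exp(random.uniform(math.log(lo), math.log(hi)))

# e.g. a learning rate between 1e-7 and 1e-3, as in the config above
lr = sample_log_uniform(1e-7, 1e-3)
print(1e-7 <= lr <= 1e-3)  # True
```

This is why log-space distributions are the usual choice for learning rates: values spanning several orders of magnitude get sampled evenly per decade.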

Hi, I am new to Wandb. I am trying to do hyperparameter optimization for my AdamW optimizer, and I tried following the docs for sweeps, but I keep running into the following error:

if not 0.0 <= lr:
TypeError: '<=' not supported between instances of 'float' and 'dict'

at the following lines:

optimizer = AdamW(model.parameters(),
                            lr= config['parameters']['lr'],
                            weight_decay= config['parameters']['weight_decay'],
                            eps= config['parameters']['eps'])

These are the lines before the optimizer:

with open("config.yaml") as f:
    config = yaml.safe_load(f)
run = wandb.init(config=config)

#Model and its configurations

Can anyone tell me how to solve this?
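For reference, the traceback can be reproduced without W&B at all: `yaml.safe_load` turns each entry under `parameters` into a nested dict, so `config['parameters']['lr']` is the dict of distribution bounds rather than a sampled float. A minimal sketch with stand-in values:

```python
# Stand-in for the dict yaml.safe_load produces from the sweep YAML
config = {
    "parameters": {
        "lr": {"distribution": "log_uniform_values", "min": 1e-7, "max": 1e-3},
    }
}

lr = config["parameters"]["lr"]  # a dict, not a float
try:
    0.0 <= lr  # the same sanity check PyTorch's optimizer runs
except TypeError as e:
    print(e)  # '<=' not supported between instances of 'float' and 'dict'
```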

I have resolved this issue. In case anyone encounters the same problem, you have to do the following:

with open("config.yaml") as f:
    config = yaml.safe_load(f)
run = wandb.init(config=config)

optimizer = AdamW(model.parameters(),
                            lr= wandb.config['lr'],
                            weight_decay= wandb.config['weight_decay'],
                            eps= wandb.config['eps'])
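The reason this works: when the script is launched by a sweep agent, `wandb.init()` merges the sampled scalars into `wandb.config`, so each parameter is looked up as a flat float rather than the nested dict from the YAML. With a stand-in mapping for illustration (the values here are hypothetical, not from the thread):

```python
# Stand-in for what wandb.config exposes inside a sweep run
# (hypothetical sampled values, for illustration only)
sweep_config = {"lr": 3.2e-5, "weight_decay": 1e-2, "eps": 1e-4}

lr = sweep_config["lr"]  # a plain float this time
print(0.0 <= lr)         # True: the optimizer's sanity check passes
```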

Hi @pratyush01, glad you've worked out the solution to this issue, and thanks for taking the time to write it up here for future reference! We will close this ticket on our end too; please feel free to reach out here again if you have any other questions or issues.
