If not 0.0 <= lr: TypeError: '<=' not supported between instances of 'float' and 'dict'

program: segformer_val.py
method: bayes
metric:
  goal: minimize
  name: valid_loss

parameters:
  lr:
    distribution: log_uniform_values
    min: 0.0000001  # 1e-7
    max: 0.001      # 1e-3

  weight_decay:
    distribution: log_uniform_values
    min: 0.00001  # 1e-5
    max: 0.1      # 1e-1

  eps:
    distribution: log_uniform_values
    min: 0.0001  # 1e-4
    max: 1

Hi, I am new to Wandb.
I am trying to do hyperparameter optimization for my AdamW optimizer. I followed the docs for sweeps, but I keep running into the following error:

if not 0.0 <= lr:
TypeError: '<=' not supported between instances of 'float' and 'dict'

at the following line:

optimizer = torch.optim.AdamW(model.parameters(), lr=config['parameters']['lr'],
                              weight_decay=config['parameters']['weight_decay'],
                              eps=config['parameters']['eps'])

These are the lines before the optimizer:

with open("config.yaml") as f:
    config = yaml.safe_load(f)
run = wandb.init(config=config)

# Model and its configurations
model = SegformerForSemanticSegmentation.from_pretrained(
    'nvidia/segformer-b1-finetuned-cityscapes-1024-1024',
    num_labels=2, ignore_mismatched_sizes=True)
feature_extractor = SegformerFeatureExtractor.from_pretrained(
    'nvidia/segformer-b1-finetuned-cityscapes-1024-1024')
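
For what it's worth, the error type already hints at the cause. A minimal check like the one below (just re-loading the same config.yaml; nothing here is specific to my training script) shows what is actually being passed as lr:

import yaml

with open("config.yaml") as f:
    config = yaml.safe_load(f)

# config['parameters']['lr'] is the whole search-space definition from the
# YAML, not a sampled learning rate, so AdamW's `0.0 <= lr` check fails.
print(type(config['parameters']['lr']))  # <class 'dict'>
print(config['parameters']['lr'])        # {'distribution': 'log_uniform_values', 'min': 1e-07, 'max': 0.001}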

Can anyone tell me how to solve this?

I have resolved this issue. In case anyone encounters the same problem: the values under config['parameters'] are the sweep's search-space definitions (nested dicts with distribution/min/max), not sampled numbers, so they cannot be passed to the optimizer directly. When the script is run by a sweep agent, the sampled values are placed in wandb.config, so you have to do this instead:

with open("config.yaml") as f:
    config = yaml.safe_load(f)

wandb.init()  # when launched by a sweep agent, wandb.config holds the sampled values
# initialize_model
optimizer = torch.optim.AdamW(model.parameters(), lr=wandb.config['lr'],
                              weight_decay=wandb.config['weight_decay'],
                              eps=wandb.config['eps'])
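
For completeness, here is a minimal sketch of how the same config.yaml can drive a sweep programmatically instead of via the CLI. The function name train, the build_model placeholder, the project name, and the run count are illustrative assumptions, not from the original post; the point is that the agent samples hyperparameters and injects them into wandb.config, which is why reading wandb.config works while indexing into the raw YAML does not:

import yaml
import torch
import wandb

def train():
    # Each run started by the agent gets wandb.config populated with sampled values.
    run = wandb.init()
    model = build_model()  # placeholder for the SegFormer setup shown above
    optimizer = torch.optim.AdamW(
        model.parameters(),
        lr=wandb.config['lr'],
        weight_decay=wandb.config['weight_decay'],
        eps=wandb.config['eps'],
    )
    # ... training loop that logs the metric named in the sweep config,
    # e.g. wandb.log({"valid_loss": valid_loss}) ...
    run.finish()

if __name__ == "__main__":
    with open("config.yaml") as f:
        sweep_config = yaml.safe_load(f)
    sweep_id = wandb.sweep(sweep=sweep_config, project="segformer-sweeps")
    wandb.agent(sweep_id, function=train, count=10)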

Hi @pratyush01, glad you've worked out the solution to this issue, and thanks for taking the time to write it up here for future reference! We will close this ticket on our end too. Please feel free to reach out to us here again if you have any other questions or issues.

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.