program: segformer_val.py
method: bayes
metric:
  goal: minimize
  name: valid_loss
parameters:
  lr:
    distribution: log_uniform_values
    min: 0.0000001  # 1e-7
    max: 0.001  # 1e-3
  weight_decay:
    distribution: log_uniform_values
    min: 0.00001  # 1e-5
    max: 0.1  # 1e-1
  eps:
    distribution: log_uniform_values
    min: 0.0001  # 1e-4
    max: 1
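For context, I am registering and running the sweep with what I understand to be the standard CLI workflow from the docs (the entity, project, and sweep ID below are placeholders):

wandb sweep config.yaml
wandb agent <entity>/<project>/<sweep_id>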
Hi, I am new to Wandb.
I am trying to do hyperparameter optimization for my AdamW optimizer. I followed the docs for sweeps, but I keep running into the following error:
if not 0.0 <= lr:
TypeError: '<=' not supported between instances of 'float' and 'dict'
at the following line:
optimizer = torch.optim.AdamW(model.parameters(), lr=config['parameters']['lr'],
                              weight_decay=config['parameters']['weight_decay'],
                              eps=config['parameters']['eps'])
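If I print the value I am passing as lr, it is the nested spec dict from the YAML rather than a float, which explains the TypeError:

print(config['parameters']['lr'])
# {'distribution': 'log_uniform_values', 'min': 1e-07, 'max': 0.001}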
These are the lines before the optimizer:
with open("config.yaml") as f:
    config = yaml.safe_load(f)

run = wandb.init(config=config)

# Model and its configuration
model = SegformerForSemanticSegmentation.from_pretrained(
    'nvidia/segformer-b1-finetuned-cityscapes-1024-1024',
    num_labels=2, ignore_mismatched_sizes=True)
feature_extractor = SegformerFeatureExtractor.from_pretrained(
    'nvidia/segformer-b1-finetuned-cityscapes-1024-1024')
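From re-reading the sweeps docs, I suspect the sampled values are supposed to come from wandb.config (which the sweep agent fills in with plain floats) rather than from the raw YAML dict I loaded myself. This is a sketch of what I think is intended, assuming the script is launched via wandb agent:

import torch
import wandb

# When launched by `wandb agent`, wandb.init() picks up the sampled
# hyperparameters, so config.yaml does not need to be loaded manually here.
run = wandb.init()

optimizer = torch.optim.AdamW(
    model.parameters(),
    lr=wandb.config.lr,                      # a sampled float, e.g. 3e-5
    weight_decay=wandb.config.weight_decay,
    eps=wandb.config.eps,
)

but I am not sure this is the intended usage.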
Can anyone tell me how to solve this?