How to show "f1_macro" when using Hugging Face Transformers?

I am using Hugging Face Transformers to fine-tune a BERT model. How do I show the F1 score with W&B (wandb)?

training_args = TrainingArguments(
    output_dir='./results_'+folder+'/',     # output directory
    num_train_epochs=3,                     # total number of training epochs
    per_device_train_batch_size=n_batch,    # batch size per device during training
    per_device_eval_batch_size=n_batch,     # batch size for evaluation
    weight_decay=0.01,                      # strength of weight decay
    logging_dir='./logs',                   # directory for storing logs
    learning_rate=lr,
    #warmup_steps=1000,                     # number of warmup steps for learning rate scheduler
    load_best_model_at_end=True,
    evaluation_strategy='steps',
)

This does not work.
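(For what it's worth, W&B logging itself is switched on via the report_to argument of TrainingArguments, and you need to have run wandb login first. A minimal sketch, assuming the rest of your setup stays the same; the run_name value here is just a hypothetical label for the W&B dashboard:)

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir='./results',
    evaluation_strategy='steps',   # evaluate (and log eval metrics) every logging step
    report_to='wandb',             # send training/evaluation metrics to Weights & Biases
    run_name='bert-finetune',      # hypothetical run name shown in the W&B dashboard
)
```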


Hey @lawrencexu, could you share some more information please? Can you share a link to a W&B dashboard or a screenshot? Can you share the code in your "compute_metrics" function? The W&B integration should pick up all metrics defined in there, I think.
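For reference, a compute_metrics that returns macro F1 could look like the sketch below (using scikit-learn; the key name "f1_macro" is just a choice — whatever keys the function returns are what the Trainer logs at each evaluation, so they show up in W&B as eval/f1_macro, etc.):

```python
import numpy as np
from sklearn.metrics import f1_score

def compute_metrics(eval_pred):
    # The Trainer passes an EvalPrediction: (logits, label_ids).
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)  # pick the highest-scoring class per example
    return {"f1_macro": f1_score(labels, preds, average="macro")}
```

Then pass it when constructing the Trainer, e.g. Trainer(model=model, args=training_args, compute_metrics=compute_metrics, ...).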

