Problems logging gradients with WandB and PyTorch Lightning

Hello,
I want to log my training runs, and in particular the gradients, with pytorch-lightning and wandb. Unfortunately, the gradient logging is not working. Does anyone have a suggestion as to what I am doing wrong? Any help would be greatly appreciated. Here is the relevant part of my training script:

# relevant imports (the rest of the script is omitted)
import threading

import wandb
from pytorch_lightning import Trainer
from pytorch_lightning.loggers import WandbLogger

# ...

# create the logger only when wandb is enabled; fall back to None otherwise,
# so the later references to wandb_logger_pt do not raise a NameError
wandb_logger_pt = None
if args.wandb:
    name = remove_fold_suffix(name)
    name = f"{name}_{date_tag}"
    # wandb expects tags as a list of strings
    wandb_logger_pt = WandbLogger(project=args.project_name, name=name, reinit=True, tags=[fold])
monitor_metric = args.pretraining_metric
maximize_minimize = args.pre_maximize_minimize
checkpoint_callback, csv_logger_callback, args = set_log_dirs(logdir, monitor_metric, maximize_minimize, args, wandb_logger=wandb_logger_pt)
model = choose_model(args)
if wandb_logger_pt:
    # watch() registers hooks that log gradients (and, with log='all', parameters)
    wandb_logger_pt.watch(model, log='all')
trainer = Trainer(
    max_epochs=args.training.epochs,
    logger=wandb_logger_pt,
    callbacks=[checkpoint_callback],
    accelerator="cpu",
    devices=1,
    precision='16-mixed',
    strategy="ddp_find_unused_parameters_true",
    log_every_n_steps=1,
)
trainer.fit(model, train_dataloaders=train_loader, val_dataloaders=val_loader)
print(f"Finished training, Threading: {threading.enumerate()}")
clean_gpu_memory()
found_validation_ids_to_validation_df(val_dataset, args, logdir)
if args.wandb:
    wandb.finish()
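
For reference, here is a minimal, self-contained sketch of how I understand gradient logging via WandbLogger.watch is supposed to work. The toy model, the random data, and the project name are placeholders, not from my actual setup:

import pytorch_lightning as pl
import torch
from pytorch_lightning.loggers import WandbLogger
from torch.utils.data import DataLoader, TensorDataset

class ToyModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(4, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = torch.nn.functional.mse_loss(self.layer(x), y)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

# random placeholder data
loader = DataLoader(TensorDataset(torch.randn(64, 4), torch.randn(64, 1)), batch_size=8)

logger = WandbLogger(project="gradient-logging-test")  # placeholder project name
model = ToyModel()
# log='all' should log gradient and parameter histograms every log_freq steps
logger.watch(model, log="all", log_freq=1)

trainer = pl.Trainer(max_epochs=1, logger=logger, log_every_n_steps=1, accelerator="cpu", devices=1)
trainer.fit(model, train_dataloaders=loader)

As far as I understand, this should produce gradient histograms under the gradients/ section of the wandb run page, and I would like my real script above to do the same.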