Problem at: /home/csgrad/mbhosale/anaconda3/envs/pathldm1/lib/python3.8/site-packages/pytorch_lightning/loggers/wandb.py 193 experiment
Traceback (most recent call last):
  File "/home/csgrad/mbhosale/anaconda3/envs/pathldm1/lib/python3.8/site-packages/wandb/sdk/wandb_init.py", line 1170, in init
    run = wi.init()
  File "/home/csgrad/mbhosale/anaconda3/envs/pathldm1/lib/python3.8/site-packages/wandb/sdk/wandb_init.py", line 629, in init
    run = Run(
  File "/home/csgrad/mbhosale/anaconda3/envs/pathldm1/lib/python3.8/site-packages/wandb/sdk/wandb_run.py", line 566, in __init__
    self._init(
  File "/home/csgrad/mbhosale/anaconda3/envs/pathldm1/lib/python3.8/site-packages/wandb/sdk/wandb_run.py", line 698, in _init
    self._config._update(config, ignore_locked=True)
  File "/home/csgrad/mbhosale/anaconda3/envs/pathldm1/lib/python3.8/site-packages/wandb/sdk/wandb_config.py", line 177, in _update
    sanitized = self._sanitize_dict(
  File "/home/csgrad/mbhosale/anaconda3/envs/pathldm1/lib/python3.8/site-packages/wandb/sdk/wandb_config.py", line 237, in _sanitize_dict
    k, v = self._sanitize(k, v, allow_val_change)
  File "/home/csgrad/mbhosale/anaconda3/envs/pathldm1/lib/python3.8/site-packages/wandb/sdk/wandb_config.py", line 255, in _sanitize
    val = json_friendly_val(val)
  File "/home/csgrad/mbhosale/anaconda3/envs/pathldm1/lib/python3.8/site-packages/wandb/util.py", line 671, in json_friendly_val
    converted = asdict(val)
  File "/home/csgrad/mbhosale/anaconda3/envs/pathldm1/lib/python3.8/dataclasses.py", line 1073, in asdict
    return _asdict_inner(obj, dict_factory)
  File "/home/csgrad/mbhosale/anaconda3/envs/pathldm1/lib/python3.8/dataclasses.py", line 1080, in _asdict_inner
    value = _asdict_inner(getattr(obj, f.name), dict_factory)
  File "/home/csgrad/mbhosale/anaconda3/envs/pathldm1/lib/python3.8/dataclasses.py", line 1110, in _asdict_inner
    return type(obj)((_asdict_inner(k, dict_factory),
TypeError: first argument must be callable or None
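To get a feel for where this TypeError can come from, I put together a small standalone snippet outside of the training run. Everything in it (the DummyStats class and its counts field) is my own invention, not anything from PathLDM or wandb, but it raises the same error from the same dataclasses code path: asdict rebuilds dict-valued fields with type(obj)(generator), and a defaultdict treats that generator as its default_factory, which must be callable or None.

from collections import defaultdict
from dataclasses import asdict, dataclass, field

@dataclass
class DummyStats:
    # hypothetical field holding a dict subclass with a non-standard constructor
    counts: defaultdict = field(default_factory=lambda: defaultdict(int))

# asdict -> _asdict_inner rebuilds the dict via type(obj)(generator);
# defaultdict(generator) then fails with the same TypeError as in my run
asdict(DummyStats())

So my guess is that something inside the config handed to wandb is a dataclass holding a dict subclass like this, but I cannot see where such an object would come from.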
I verified that the config being passed to the logger looks fine, as shown below:
{'target': 'pytorch_lightning.loggers.WandbLogger',
 'params': {'project': 'tcga-brca',
            'name': '06-11T11-10_plip_imagenet_finetune_PanNuke',
            'save_dir': 'logs/06-11T11-10_plip_imagenet_finetune_PanNuke',
            'offline': False,
            'id': '06-11T11-10_plip_imagenet_finetune_PanNuke',
            'resume': None,
            'config': {'name': '', 'resume': '', 'base': ['/home/csgrad/mbhosale/phd/Pathdiff/PathLDM/configs/latent-diffusion/mask_cond/plip_imagenet_finetune_PanNuke.yaml'],
                       'train': True, 'no_test': False, 'project': None, 'debug': False, 'seed': 23, 'postfix': '', 'logdir': 'logs', 'scale_lr': False,
                       'wandb_name': None, 'wandb_id': None, 'logger': True, 'checkpoint_callback': True, 'default_root_dir': None,
                       'gradient_clip_val': 0.0, 'gradient_clip_algorithm': 'norm', 'process_position': 0, 'num_nodes': 1, 'num_processes': 1,
                       'devices': None, 'gpus': None, 'auto_select_gpus': False, 'tpu_cores': None, 'ipus': None, 'log_gpu_memory': None,
                       'progress_bar_refresh_rate': None, 'overfit_batches': 0.0, 'track_grad_norm': -1, 'check_val_every_n_epoch': 1,
                       'fast_dev_run': False, 'accumulate_grad_batches': 1, 'max_epochs': None, 'min_epochs': None, 'max_steps': None,
                       'min_steps': None, 'max_time': None, 'limit_train_batches': 1.0, 'limit_val_batches': 1.0, 'limit_test_batches': 1.0,
                       'limit_predict_batches': 1.0, 'val_check_interval': 1.0, 'flush_logs_every_n_steps': 100, 'log_every_n_steps': 50,
                       'accelerator': None, 'sync_batchnorm': False, 'precision': 32, 'weights_summary': 'top', 'weights_save_path': None,
                       'num_sanity_val_steps': 2, 'truncated_bptt_steps': None, 'resume_from_checkpoint': None, 'profiler': None,
                       'benchmark': False, 'deterministic': False, 'reload_dataloaders_every_n_epochs': 0, 'reload_dataloaders_every_epoch': False,
                       'auto_lr_find': False, 'replace_sampler_ddp': True, 'terminate_on_nan': False, 'auto_scale_batch_size': False,
                       'prepare_data_per_node': True, 'plugins': None, 'amp_backend': 'native', 'amp_level': 'O2', 'distributed_backend': None,
                       'move_metrics_to_cpu': False, 'multiple_trainloader_mode': 'max_size_cycle', 'stochastic_weight_avg': False}}}
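For context, this dict goes through the usual target/params instantiation helper from the latent-diffusion codebase that PathLDM builds on, roughly the following (paraphrased from memory, not the exact source), so the 'config' entry ends up as the config= kwarg of WandbLogger, which in turn forwards it to wandb.init:

import importlib

def get_obj_from_str(string):
    module, cls = string.rsplit(".", 1)
    return getattr(importlib.import_module(module), cls)

def instantiate_from_config(config):
    # 'target' names the class to build and 'params' become its keyword arguments,
    # so this call reduces to pytorch_lightning.loggers.WandbLogger(**params)
    return get_obj_from_str(config["target"])(**config.get("params", dict()))

logger = instantiate_from_config(logger_cfg)  # logger_cfg is just my name for the dict printed above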
I am not sure how to debug this further; any pointers would be genuinely appreciated.