About WandbModelCheckpoint: TypeError: ModelCheckpoint.__init__() got an unexpected keyword argument 'options'

history = model.fit(
    normed_train_data,
    train_labels,
    epochs=config.epochs,
    batch_size=config.batch_size,
    verbose=0,
    shuffle=True,
    steps_per_epoch=int(normed_train_data.shape[0] / config.batch_size),
    validation_data=(normed_valid_dataset, valid_labels),
    callbacks=[
        WandbMetricsLogger(),
        WandbModelCheckpoint(filepath="my_model_{epoch:02d}"),
    ])

TypeError                                 Traceback (most recent call last)
File :16

File /opt/conda/lib/python3.10/site-packages/wandb/integration/keras/callbacks/model_checkpoint.py:94, in WandbModelCheckpoint.__init__(self, filepath, monitor, verbose, save_best_only, save_weights_only, mode, save_freq, options, initial_value_threshold, **kwargs)
     81 def __init__(
     82     self,
     83     filepath: StrPath,
   (...)
     92     **kwargs: Any,
     93 ) -> None:
---> 94     super().__init__(
     95         filepath=filepath,
     96         monitor=monitor,
     97         verbose=verbose,
     98         save_best_only=save_best_only,
     99         save_weights_only=save_weights_only,
    100         mode=mode,
    101         save_freq=save_freq,
    102         options=options,
    103         initial_value_threshold=initial_value_threshold,
    104         **kwargs,
    105     )
    106     if wandb.run is None:
    107         raise wandb.Error(
    108             "You must call wandb.init() before WandbModelCheckpoint()"
    109         )

TypeError: ModelCheckpoint.__init__() got an unexpected keyword argument 'options'

Is there an updated version of the wandb callbacks, or a current recommended way to implement them?

Hello, this is a reply from our support bot, designed to assist you with your Weights & Biases related queries. To reach a human, please reply to this message.

The error you're encountering, TypeError: ModelCheckpoint.__init__() got an unexpected keyword argument 'options', means that the underlying Keras ModelCheckpoint is being passed an argument (options) that it does not accept. As the traceback shows, WandbModelCheckpoint forwards options to Keras's ModelCheckpoint internally, so the error points to a mismatch between the version of wandb you are using and the version of TensorFlow/Keras installed alongside it.
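
As a quick sanity check, you can print the installed versions to see which side of the mismatch you are on (a minimal diagnostic sketch; the exact version numbers at which Keras dropped the options argument are not confirmed here):

    import tensorflow as tf
    import wandb

    # The TypeError usually appears when the installed Keras no longer accepts
    # the `options` argument that this version of the wandb callback forwards
    # to ModelCheckpoint, so comparing versions is the first thing to check.
    print("wandb:", wandb.__version__)
    print("tensorflow:", tf.__version__)
    print("keras:", tf.keras.__version__)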

To resolve this issue, ensure that you are using the arguments documented for WandbModelCheckpoint in your installed version, and that your wandb release is compatible with your TensorFlow/Keras release. Here's how WandbModelCheckpoint is typically configured with its documented parameters:

WandbModelCheckpoint(
    filepath="my_model_{epoch:02d}",
    monitor="val_loss",  # or another metric you wish to monitor
    verbose=1,
    save_best_only=False,
    save_weights_only=False,
    mode="auto",
    save_freq="epoch",
    initial_value_threshold=None
)

Note that your snippet does not pass options at all; as the traceback shows, the keyword is forwarded internally by WandbModelCheckpoint to Keras's ModelCheckpoint. If your installed Keras no longer accepts it, aligning your versions, for example by upgrading wandb or pinning TensorFlow/Keras to a release whose ModelCheckpoint still supports options, is what removes the unexpected keyword.

Additionally, make sure wandb.init() is called before constructing WandbModelCheckpoint so that your W&B run is properly initialized. This is crucial for the callback to function correctly, as it relies on the active run to log the model checkpoints.
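
For reference, here is a minimal sketch of that ordering: wandb.init() first, then the callbacks, then fit. The project name and config values are placeholders, and model, normed_train_data, train_labels, normed_valid_dataset, and valid_labels are assumed to already exist as in your snippet:

    import wandb
    from wandb.integration.keras import WandbMetricsLogger, WandbModelCheckpoint

    # Start the W&B run before creating the callbacks so they can attach to it.
    run = wandb.init(project="my-project", config={"epochs": 10, "batch_size": 32})
    config = run.config

    history = model.fit(
        normed_train_data,
        train_labels,
        epochs=config.epochs,
        batch_size=config.batch_size,
        validation_data=(normed_valid_dataset, valid_labels),
        callbacks=[
            WandbMetricsLogger(),
            WandbModelCheckpoint(filepath="my_model_{epoch:02d}"),
        ],
    )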

If you are looking for updated versions of Wandb callbacks or additional features, it’s a good practice to check the official Weights & Biases Documentation for the latest updates and usage guidelines. The documentation is regularly updated to reflect new features and improvements.

In summary, to resolve the TypeError:

  • Check that your installed wandb and TensorFlow/Keras versions are compatible, since WandbModelCheckpoint forwards the options keyword to Keras's ModelCheckpoint internally.
  • Ensure wandb.init() is called before using WandbModelCheckpoint.
  • Check the official documentation for the latest usage guidelines and features.

If you continue to experience issues or if there have been recent updates, consulting the official documentation or reaching out to the Weights & Biases support team may provide additional assistance.

To reach a human, please reply to this message.

-WandBot :robot: