Hi, I am running a model which uses the same embedding layer (and its variables) in several places. During training, I use the standard WandbCallback() with no additional parameters; however, I get this warning from TensorFlow:
WARNING:tensorflow:Found duplicated Variables in Model’s weights. This is usually caused by Variables being shared by Layers in the Model. These Variables will be treated as separate Variables when the Model is restored. To avoid this, please save with save_format="tf".
The code I am running is way too long to post here, but the source of the issue is a Keras Embedding layer being used multiple times in the model. The standard save format used by wandb saves the weights of each use independently, so on loading the model they no longer correspond to the same variable, and during further training backprop will update them independently. So the core issue is that the default save_format in W&B is not functional for models with shared layers, and I cannot find a way to change this or to pass an argument to W&B that specifies save_format.
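For context, here is a minimal sketch of the situation (layer names and sizes are illustrative, not taken from my actual model): a single Embedding layer applied to two inputs, so both uses share one variable, which is exactly the pattern the warning is about.

```python
# Minimal sketch: one Embedding layer shared by two inputs.
import tensorflow as tf

shared_emb = tf.keras.layers.Embedding(input_dim=1000, output_dim=16)

inp_a = tf.keras.Input(shape=(None,), dtype="int32")
inp_b = tf.keras.Input(shape=(None,), dtype="int32")

# Both branches call the same layer object, so there is exactly one
# embedding matrix in model.trainable_weights.
merged = tf.keras.layers.Concatenate(axis=1)(
    [shared_emb(inp_a), shared_emb(inp_b)]
)
model = tf.keras.Model(inputs=[inp_a, inp_b], outputs=merged)

# Saving in the TF SavedModel format keeps both uses pointing at the same
# variable; the legacy HDF5 path serializes the weights per layer use,
# which is what triggers the "duplicated Variables" warning:
# model.save("shared_emb_model", save_format="tf")  # preserves sharing
# model.save("shared_emb_model.h5")                 # warns, splits on reload
```

After loading an HDF5 checkpoint of such a model, the two uses of the embedding become independent weight sets, which is the divergence described above.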
Apologies for the delay in my follow-up response. WandbCallback is currently configured to work with the latest version of TensorFlow (TF 2.x), where the default model save_format is "tf". I agree we should support more customization here and allow the user to set their own format. I'm making a feature request for this internally and will let you know when we make progress on it. In the meantime, please let me know if you have additional questions.