I created a universal sentence encoder layer as an instance of `keras.layers.TFSMLayer` and used it in a binary classifier. Training the model results in the following error:
```
Epoch 1/10
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-7-d7863a64cd24> in <cell line: 0>()
      2 y_train = keras.ops.array([[1], [0], [0]])
      3
----> 4 nn_model_history = nn_model.fit(
      5     x = x_train,
      6     y = y_train,

1 frames
/usr/local/lib/python3.11/dist-packages/keras/src/layers/layer.py in _get_regularization_losses(self)
   1152         weight_regularization_losses = []
   1153         for variable in self.trainable_weights:
-> 1154             if variable.regularizer is None:
   1155                 continue
   1156             if backend.in_stateless_scope() and not in_symbolic_scope():

AttributeError: 'UninitializedVariable' object has no attribute 'regularizer'
```
The Keras layer code fails because the trainable weights restored from the SavedModel (`UninitializedVariable` objects) lack a `regularizer` attribute.
I've reproduced the error in this Colab notebook.
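For context, here is a minimal sketch of the kind of setup that triggers the error. The local path, endpoint name, and toy data are assumptions for illustration; the actual notebook may differ.

```python
import numpy as np
import keras

# Wrap a locally downloaded/extracted USE SavedModel. TFSMLayer takes a
# filesystem path (hypothetical here), not a TF Hub URL.
use_layer = keras.layers.TFSMLayer(
    "universal-sentence-encoder_4",
    call_endpoint="serving_default",
    trainable=True,
)

inputs = keras.Input(shape=(), dtype="string")
# Depending on the endpoint signature, the output may be a dict keyed by
# output name; if so, select the embedding tensor before the Dense layer.
embeddings = use_layer(inputs)
outputs = keras.layers.Dense(1, activation="sigmoid")(embeddings)
nn_model = keras.Model(inputs, outputs)
nn_model.compile(optimizer="adam", loss="binary_crossentropy")

x_train = np.array(["great product", "terrible", "awful quality"])
y_train = keras.ops.array([[1], [0], [0]])

# fit() steps into Layer._get_regularization_losses and raises the
# AttributeError shown above.
nn_model_history = nn_model.fit(x=x_train, y=y_train, epochs=10)
```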
Comment From: mehtamansi29
Hi @rlcauvin -
Thanks for reporting the issue. In your notebook you define the regularizer with `use_hack = False`, so the workaround code guarded by the `use_hack` condition does not run. If you set `use_hack = True`, the model trains properly without the error.
Attached gist here for reference.
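For readers without access to the notebook, the workaround presumably amounts to something like the following sketch, which gives each restored variable the attribute Keras expects before training. This is a guess at the gist's contents, not a quote from it.

```python
use_hack = True

if use_hack:
    # Variables restored from the SavedModel are plain TF variables
    # (UninitializedVariable) without Keras attributes; patch in the one
    # that Layer._get_regularization_losses reads.
    for variable in use_layer.trainable_weights:
        if not hasattr(variable, "regularizer"):
            variable.regularizer = None
```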
Comment From: rlcauvin
Yes, the hack works in this case, but it shouldn't be necessary, right? I.e., there is a bug in `keras.layers.TFSMLayer` or other Keras layers code, right?
Comment From: hu6r1s
In the `_get_regularization_losses` function, if `variable.regularizer` is None, the loop continues to the next iteration. However, since `variable` is an `UninitializedVariable` and has no `regularizer` attribute at all, the attribute access itself raises the AttributeError before the None check can run. I modified the code to skip the variable even when it lacks a `regularizer` attribute.
Thus, your code now works correctly. However, I am not sure whether this is the fundamental solution to the problem.
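For reference, based on the traceback above, the modification is presumably along these lines (a sketch of `_get_regularization_losses` in keras/src/layers/layer.py, not the exact commit):

```python
def _get_regularization_losses(self):
    weight_regularization_losses = []
    for variable in self.trainable_weights:
        # Use getattr so variables restored from a SavedModel, which have
        # no `regularizer` attribute at all, are skipped rather than
        # raising AttributeError.
        if getattr(variable, "regularizer", None) is None:
            continue
        ...
```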
Comment From: rlcauvin
Thank you, @hu6r1s. I also wonder whether it solves the root of the problem. Is the root cause the handling of the case where the `variable.regularizer` attribute doesn't exist, or is it that the code that loads the TensorFlow SavedModel isn't properly setting up the variables and their regularizers?
What is the process for fully understanding what is causing this kind of problem? Who has the expertise?
Comment From: hu6r1s
@rlcauvin I'll have to look into it.
Comment From: hu6r1s
@rlcauvin, I'm not entirely familiar with the Universal Sentence Encoder (USE), but it seems that the issue arises because the `trainable_weights` of the `TFSMLayer` wrapping USE does not contain any regularizable variables. This might be related to how USE handles trainable weights when used with `TFSMLayer`.
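One way to check this (a hypothetical diagnostic, assuming `use_layer` is the `TFSMLayer` instance):

```python
# Inspect what the wrapped SavedModel exposes as trainable weights and
# whether each variable carries the attribute Keras expects.
for v in use_layer.trainable_weights:
    print(type(v).__name__, v.shape, hasattr(v, "regularizer"))
```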
Comment From: rlcauvin
Thanks for looking into it, @hu6r1s. I wonder if there is a bug in `keras.layers.TFSMLayer`. In previous versions of Keras and TensorFlow, using

```python
use_layer = tensorflow_hub.KerasLayer("https://tfhub.dev/google/universal-sentence-encoder/4", trainable = True)
```

worked fine. I would expect `keras.layers.TFSMLayer` to offer equivalent functionality in Keras 3 without these errors.
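For comparison, the closest Keras 3 equivalent would presumably be something like the sketch below; note that `TFSMLayer` takes a local filesystem path to an extracted SavedModel (the path and endpoint name here are assumptions) rather than a TF Hub URL, so the model must be downloaded first.

```python
use_layer = keras.layers.TFSMLayer(
    "universal-sentence-encoder_4",   # local path to the extracted SavedModel
    call_endpoint="serving_default",  # endpoint name may differ
    trainable=True,
)
```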
Comment From: hu6r1s
Then TFSMLayer must be the problem. I hope my commit fixes it.
Comment From: mattdangerw
We might need to make sure all tf.Variables restored from the saved model are wrapped as Keras Variables (so they have the same attrs, etc). I will try this out.
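A rough sketch of that direction (hypothetical; the actual rewrapping mechanics inside `TFSMLayer` would differ):

```python
import tensorflow as tf
import keras

def wrap_as_keras_variable(tf_var: tf.Variable) -> keras.Variable:
    # Rewrap a variable restored from the SavedModel so it carries the
    # standard Keras attributes (regularizer, constraint, etc.).
    return keras.Variable(
        initializer=tf_var.numpy(),  # seed with the restored value
        dtype=tf_var.dtype.name,
        trainable=tf_var.trainable,
    )
```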