Embedding layer after a Dense layer raises UserWarning: Gradients do not exist #826

antipisa commented Jan 30, 2025

System information.

  • Have I written custom code (as opposed to using a stock example script provided in Keras): Yes
  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Linux Red Hat 7
  • TensorFlow installed from (source or binary): source
  • TensorFlow version (use command below): 2.17
  • Python version: 3.11.8
  • GPU model and memory:
  • Exact command to reproduce:

You can collect some of this information using our environment capture script:

https://github.com/tensorflow/tensorflow/tree/master/tools/tf_env_collect.sh

You can obtain the TensorFlow version with:
python -c "import tensorflow as tf; print(tf.version.GIT_VERSION, tf.version.VERSION)"

Describe the problem.

Adding an Embedding layer after a Dense layer does not train correctly: during model.fit, a UserWarning is raised that gradients do not exist for the preceding Dense layer's kernel and bias.


import numpy as np
from keras.layers import Dense, Embedding, Flatten, Input, Rescaling, concatenate
from keras.models import Model


num_indices = 10
embedding_dim = 5
p = 5

input_layer = Input(shape=(p,), name='input_layer')
# tanh outputs lie in [-1, 1]; Rescaling(scale=5, offset=5) maps them into
# [0, 10] so they can act as indices into the embedding table.
embedding_branch = Dense(num_indices, activation='tanh')(input_layer)
embedding_branch = Rescaling(5, 5, name='rescaling')(embedding_branch)
embedding_branch = Embedding(input_dim=num_indices, output_dim=embedding_dim, name='embedding')(embedding_branch)
# Embedding output is 3D (batch, num_indices, embedding_dim); flatten it
# so it can be concatenated with the 2D dense branch.
embedding_branch = Flatten(name='flatten')(embedding_branch)

dense_branch = Dense(20, activation='relu')(input_layer)
dense_branch = Dense(3, activation='relu')(dense_branch)

final_layer = concatenate([dense_branch, embedding_branch], name='concatenate_layer')

final_out = Dense(1, activation='linear')(final_layer)
model = Model(inputs=input_layer, outputs=final_out)
model.compile(optimizer='adam', loss='mean_squared_error')
model.summary()

X = np.random.randn(100, p)
y = np.random.randn(100)

history = model.fit(X, y, epochs=30, verbose=0)



python3.11/site-packages/keras/src/optimizers/base_optimizer.py: UserWarning: Gradients do not exist for variables ['kernel', 'bias'] when minimizing the loss. If using `model.compile()`, did you forget to provide a `loss` argument?
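The warning is most likely expected behavior rather than an optimizer bug: Embedding performs an integer table lookup (effectively a gather), and gradients cannot flow through integer indices, so the Dense and Rescaling layers feeding the Embedding receive no gradient. A minimal sketch of a differentiable alternative, assuming a "soft" lookup is acceptable for this model: treat the Dense outputs as logits over the table rows and blend the rows with softmax weights instead of indexing a single row. The layer names below are illustrative, not part of any Keras API.

import numpy as np
from keras.layers import Dense, Input, Softmax, concatenate
from keras.models import Model

num_indices, embedding_dim, p = 10, 5, 5

input_layer = Input(shape=(p,), name='input_layer')

# Soft lookup: one logit per row of the embedding table, turned into
# differentiable mixing weights by a softmax.
index_logits = Dense(num_indices, name='index_logits')(input_layer)
index_weights = Softmax(name='index_weights')(index_logits)
# A bias-free Dense layer computes `weights @ kernel`; its kernel of shape
# (num_indices, embedding_dim) plays the role of the embedding table.
embedding_branch = Dense(embedding_dim, use_bias=False, name='soft_embedding')(index_weights)

dense_branch = Dense(20, activation='relu')(input_layer)
dense_branch = Dense(3, activation='relu')(dense_branch)

final_out = Dense(1, activation='linear')(
    concatenate([dense_branch, embedding_branch], name='concatenate_layer')
)
model = Model(inputs=input_layer, outputs=final_out)
model.compile(optimizer='adam', loss='mean_squared_error')

X = np.random.randn(100, p)
y = np.random.randn(100)
model.fit(X, y, epochs=2, verbose=0)  # trains with no missing-gradient warning

Because every layer in this variant is differentiable, gradients reach all kernels and biases; the trade-off is that the model mixes embedding rows instead of selecting exactly one.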