Bug in functional model #20719

Open
Surya2k1 opened this issue Jan 3, 2025 · 1 comment

Surya2k1 (Contributor) commented Jan 3, 2025

I think there is a bug in the functional model.

Case 1: The model is constructed with inputs == outputs (an identity mapping), but fit() is given targets with different output shapes: training succeeds.

import keras
from keras import layers
import numpy as np

input_1 = layers.Input(shape=(3,))
input_2 = layers.Input(shape=(5,))

model_1 = keras.models.Model([input_1, input_2], [input_1, input_2])
model_1.summary()
model_1.compile(optimizer='adam', metrics=['accuracy', 'accuracy'], loss=['mse'])

# Note: the targets below have different shapes than the model's outputs,
# yet training still succeeds.
model_1.fit(
    [np.random.normal(size=(10, 3)), np.random.normal(size=(10, 5))],
    [np.random.normal(size=(10, 1)), np.random.normal(size=(10, 2))],
)

print('Training completed')
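For reference, the mismatch is easy to confirm from the model itself (a quick sanity check, assuming the public output_shape property of functional models):

print(model_1.output_shape)  # [(None, 3), (None, 5)]
# ...yet the fit() call above accepted targets of shape (10, 1) and (10, 2).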

Case 2: Same as Case 1, but with a different set of mismatched output shapes for training: an error is raised during loss calculation. I would expect the error to be raised during graph execution itself.

# With output shapes different from those the model was constructed with,
# an error is raised while calculating the loss. Instead, a shape-mismatch
# error should have been raised during graph execution.
model_1.fit(
    [np.random.normal(size=(10, 3)), np.random.normal(size=(10, 5))],
    [np.random.normal(size=(10, 2)), np.random.normal(size=(10, 4))],
)
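As a user-side workaround, targets could be validated against the declared output shapes before calling fit(), so the mismatch fails fast. A minimal sketch; check_target_shapes is a hypothetical helper, not a Keras API:

def check_target_shapes(model, targets):
    # model.output_shape is a list of (None, dim) tuples for a
    # multi-output functional model; compare the non-batch dims.
    for i, (out_shape, y) in enumerate(zip(model.output_shape, targets)):
        if tuple(out_shape[1:]) != tuple(y.shape[1:]):
            raise ValueError(
                f"Output {i}: model produces trailing shape {tuple(out_shape[1:])}, "
                f"but target has trailing shape {tuple(y.shape[1:])}"
            )

check_target_shapes(
    model_1,
    [np.random.normal(size=(10, 2)), np.random.normal(size=(10, 4))],
)  # raises before any training step runs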

Case 3: With unconnected inputs and outputs

input_1 = layers.Input(shape=(3,))
input_2 = layers.Input(shape=(5,))

input_3 = layers.Input(shape=(1,))
input_4 = layers.Input(shape=(2,))

model_2 = keras.models.Model([input_1, input_2], [input_3, input_4])
model_2.compile(optimizer='adam', metrics=['accuracy', 'accuracy'], loss=['mse'])

# Passing correctly shaped inputs and targets still fails because the
# outputs are not connected to the inputs.
model_2.fit(
    [np.random.normal(size=(10, 3)), np.random.normal(size=(10, 5))],
    [np.random.normal(size=(10, 1)), np.random.normal(size=(10, 2))],
)

The error below is raised, which is correct, but it is not useful for end users. Instead, the error should have been raised during graph construction.

    177         output_tensors = []
    178         for x in self.outputs:
--> 179             output_tensors.append(tensor_dict[id(x)])
    180
    181         return tree.pack_sequence_as(self._outputs_struct, output_tensors)

KeyError: Exception encountered when calling Functional.call().

139941182292272

Arguments received by Functional.call():
  • inputs=('tf.Tensor(shape=(None, 3), dtype=float32)', 'tf.Tensor(shape=(None, 5), dtype=float32)')
  • training=True
  • mask=('None', 'None')

I tried to fix an issue similar to Case 3 by raising an error during graph build itself in PR #20705, where I noticed the Case 1 issue (from a failed test case). Please refer to the gist.
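For illustration only (this is not the actual change in PR #20705): a disconnected graph like model_2 could be surfaced right after construction by probing the model once with zero-filled inputs. assert_outputs_connected is a hypothetical helper built on the public input_shape/predict() API:

def assert_outputs_connected(model):
    # Build one zero-filled batch per declared input and try a forward pass.
    dummy = [np.zeros((1,) + tuple(shape[1:])) for shape in model.input_shape]
    try:
        model.predict(dummy, verbose=0)
    except Exception as exc:
        raise ValueError(
            'Model outputs are not computable from its inputs; '
            'every output tensor must be connected to the inputs.'
        ) from exc

assert_outputs_connected(model_2)  # fails fast for the Case 3 model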

harshaljanjani (Contributor) commented:

To my knowledge, Keras builds the computation graph in a JIT/lazy fashion, and I'm not sure addressing Case 2 is in the spirit of that paradigm. Shape validation typically happens when the model actually starts training, which is probably why the Case 2 error comes up during loss calculation and not graph construction. That said, I'd love to hear if there's anything else to consider when trying to bypass it, especially since I've only recently started contributing to the library.
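For context, the lazy behavior described above is easy to observe with a standalone layer: weights, and hence concrete shape checks, only materialize on the first call (a general Keras property, not specific to this issue):

import keras
import numpy as np

layer = keras.layers.Dense(4)
print(layer.built)          # False: no weights, no shape validation yet
layer(np.zeros((2, 3)))     # first call builds the layer for input dim 3
print(layer.built)          # True
print(layer.kernel.shape)   # (3, 4)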
