Bug description

When exporting a Keras model in the tf_saved_model format for use with a TensorFlow Serving server, I encountered the following error:

Traceback (most recent call last):
  File "/home/tom/kml/repos/various/tf_serving_client/tf_serving_client/min_example.py", line 17, in <module>
    model.export('/home/tom/tmp/test_export', format='tf_saved_model',
  File "/home/tom/.virtualenvs/tf_serving_client/lib/python3.12/site-packages/keras/src/models/model.py", line 544, in export
    export_saved_model(
  File "/home/tom/.virtualenvs/tf_serving_client/lib/python3.12/site-packages/keras/src/export/saved_model.py", line 656, in export_saved_model
    export_archive.write_out(filepath, verbose=verbose)
  File "/home/tom/.virtualenvs/tf_serving_client/lib/python3.12/site-packages/keras/src/export/saved_model.py", line 510, in write_out
    self._filter_and_track_resources()
  File "/home/tom/.virtualenvs/tf_serving_client/lib/python3.12/site-packages/keras/src/export/saved_model.py", line 574, in _filter_and_track_resources
    tvs, ntvs = _list_variables_used_by_fns(fns)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/tom/.virtualenvs/tf_serving_client/lib/python3.12/site-packages/keras/src/export/saved_model.py", line 677, in _list_variables_used_by_fns
    concrete_functions = [fn.get_concrete_function()]
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/tom/.virtualenvs/tf_serving_client/lib/python3.12/site-packages/tensorflow/python/eager/polymorphic_function/polymorphic_function.py", line 1256, in get_concrete_function
    concrete = self._get_concrete_function_garbage_collected(*args, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/tom/.virtualenvs/tf_serving_client/lib/python3.12/site-packages/tensorflow/python/eager/polymorphic_function/polymorphic_function.py", line 1226, in _get_concrete_function_garbage_collected
    self._initialize(args, kwargs, add_initializers_to=initializers)
  File "/home/tom/.virtualenvs/tf_serving_client/lib/python3.12/site-packages/tensorflow/python/eager/polymorphic_function/polymorphic_function.py", line 696, in _initialize
    self._concrete_variable_creation_fn = tracing_compilation.trace_function(
                                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/tom/.virtualenvs/tf_serving_client/lib/python3.12/site-packages/tensorflow/python/eager/polymorphic_function/tracing_compilation.py", line 178, in trace_function
    concrete_function = _maybe_define_function(
                        ^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/tom/.virtualenvs/tf_serving_client/lib/python3.12/site-packages/tensorflow/python/eager/polymorphic_function/tracing_compilation.py", line 283, in _maybe_define_function
    concrete_function = _create_concrete_function(
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/tom/.virtualenvs/tf_serving_client/lib/python3.12/site-packages/tensorflow/python/eager/polymorphic_function/tracing_compilation.py", line 310, in _create_concrete_function
    traced_func_graph = func_graph_module.func_graph_from_py_func(
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/tom/.virtualenvs/tf_serving_client/lib/python3.12/site-packages/tensorflow/python/framework/func_graph.py", line 1060, in func_graph_from_py_func
    func_outputs = python_func(*func_args, **func_kwargs)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/tom/.virtualenvs/tf_serving_client/lib/python3.12/site-packages/tensorflow/python/eager/polymorphic_function/polymorphic_function.py", line 599, in wrapped_fn
    out = weak_wrapped_fn().__wrapped__(*args, **kwds)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/tom/.virtualenvs/tf_serving_client/lib/python3.12/site-packages/keras/src/utils/traceback_utils.py", line 122, in error_handler
    raise e.with_traceback(filtered_tb) from None
  File "/home/tom/.virtualenvs/tf_serving_client/lib/python3.12/site-packages/keras/src/layers/input_spec.py", line 160, in assert_input_compatibility
    raise ValueError(
ValueError: Layer "functional" expects 2 input(s), but it received 1 input tensors. Inputs received: [<tf.Tensor 'input_A:0' shape=(1,) dtype=float32>]

Code to reproduce the issue

import tensorflow as tf
import keras
from keras import layers


if __name__ == "__main__":
    input_0 = layers.Input(name='input_0', shape=(1,))
    input_1 = layers.Input(name='input_1', shape=(1,))

    output = input_0 + input_1

    model = keras.Model(inputs=(input_0, input_1), outputs=(output,))

    model.export('/home/tom/tmp/test_export', format='tf_saved_model',
                 input_signature=[tf.TensorSpec((1,), dtype=input_0.dtype, name='input_A'),
                                  tf.TensorSpec((1,), dtype=input_0.dtype, name='input_B')]
                 )

When the input_signature parameter is omitted, everything works as expected. The reason I'm providing input_signature at all is that I want to give the inputs different names.
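For context, here is a sketch of what I would guess is the intended usage, assuming input_signature has to mirror the model's single inputs argument (a nested structure of two specs, with the batch dimension included). This is an assumption, not confirmed behavior:

    # Possible workaround (assumption, not confirmed): nest both specs in one
    # inner list so the signature mirrors the model's single `inputs` argument,
    # and include the batch dimension (None) in the shapes.
    model.export('/home/tom/tmp/test_export', format='tf_saved_model',
                 input_signature=[[tf.TensorSpec((None, 1), dtype=input_0.dtype, name='input_A'),
                                   tf.TensorSpec((None, 1), dtype=input_0.dtype, name='input_B')]]
                 )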

Environment:

Python 3.12, with the following requirements.txt to reproduce the issue:

absl-py==2.2.2
astunparse==1.6.3
build==0.8.0
certifi==2025.4.26
charset-normalizer==3.4.1
click==8.1.8
flatbuffers==25.2.10
gast==0.6.0
google-pasta==0.2.0
grpcio==1.71.0
h5py==3.13.0
idna==3.10
keras==3.9.2
libclang==18.1.1
Markdown==3.8
markdown-it-py==3.0.0
MarkupSafe==3.0.2
mdurl==0.1.2
ml_dtypes==0.5.1
namex==0.0.9
numpy==2.1.3
opt_einsum==3.4.0
optree==0.15.0
packaging==25.0
pep517==0.13.0
pip-tools==6.9.0
protobuf==5.29.4
pydot==4.0.0
Pygments==2.19.1
pyparsing==3.2.3
requests==2.32.3
rich==14.0.0
setuptools==80.3.1
six==1.17.0
tensorboard==2.19.0
tensorboard-data-server==0.7.2
tensorflow==2.19.0
termcolor==3.0.1
typing_extensions==4.13.2
urllib3==2.4.0
Werkzeug==3.1.3
wheel==0.45.1
wrapt==1.17.2

Comment From: sonali-kumari1

Hi @ebnertom -

I have reproduced this issue with the latest version of Keras (3.9.2) and got the same error in this gist. We will look into this issue and update you.

Comment From: sonali-kumari1

Hi @ebnertom - To resolve the error:

ValueError: Layer "functional" expects 2 input(s), but it received 1 input tensors. Inputs received: [<tf.Tensor 'input_A:0' shape=(1,) dtype=float32>]

You can use the input_signature parameter with the @tf.function decorator and define a custom function that maps your original inputs (input_0, input_1) to the renamed inputs (input_A, input_B), like this:

@tf.function(input_signature=[
    tf.TensorSpec((None, 1), dtype=tf.float32, name='input_A'),
    tf.TensorSpec((None, 1), dtype=tf.float32, name='input_B')
])
def serving_function(input_A, input_B):
    return model([input_A, input_B])

You can pass this custom function as the signatures argument when calling model.export. Attaching a gist for your reference. Thanks!
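For completeness, a sketch of one way to wire this up end to end with keras.export.ExportArchive instead of model.export; the endpoint name 'serve' and the export path here are illustrative assumptions, not prescribed by the comment above:

    def serve(input_A, input_B):
        # Map the renamed serving inputs back onto the model's original inputs.
        return model([input_A, input_B])

    export_archive = keras.export.ExportArchive()
    export_archive.track(model)
    export_archive.add_endpoint(
        name='serve',
        fn=serve,
        input_signature=[
            tf.TensorSpec((None, 1), dtype=tf.float32, name='input_A'),
            tf.TensorSpec((None, 1), dtype=tf.float32, name='input_B'),
        ],
    )
    export_archive.write_out('/home/tom/tmp/test_export')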

Comment From: github-actions[bot]

This issue is stale because it has been open for 14 days with no activity. It will be closed if no further activity occurs. Thank you.

Comment From: github-actions[bot]

This issue was closed because it has been inactive for 28 days. Please reopen if you'd like to work on this further.