After quantizing a keras-hub model and saving it, we are unable to load the quantized model back. I've tried the `from_preset()` method for that model, as well as `keras.models.load_model()`; neither works.
I've attached a notebook reproducing the issue: https://colab.research.google.com/gist/pctablet505/b5ef8ab36dceb58527e992b571aefb70/keras-quantized-model-not-loading.ipynb
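For reference, a minimal sketch of the failure path (not the exact notebook code; the `bert_tiny_en_uncased` preset here is just an illustrative stand-in for the model used in the notebook):

```python
import keras
import keras_hub

# Load a backbone from a preset and apply post-training int8 quantization.
backbone = keras_hub.models.Backbone.from_preset("bert_tiny_en_uncased")
backbone.quantize("int8")

# Save the quantized model, then try to load it back.
backbone.save("quantized_model.keras")
reloaded = keras.models.load_model("quantized_model.keras")  # raises ValueError

# Saving as a preset and reloading via from_preset() fails the same way.
backbone.save_to_preset("./quantized_preset")
reloaded = keras_hub.models.Backbone.from_preset("./quantized_preset")
```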
ValueError                                Traceback (most recent call last)
4 frames
/content/keras_hub_repo/keras_hub/src/models/task.py in from_preset(cls, preset, load_weights, **kwargs)
    196         # images, audio).
    197         load_task_weights = "num_classes" not in kwargs
--> 198         return loader.load_task(cls, load_weights, load_task_weights, **kwargs)
    199
    200     def load_task_weights(self, filepath):

/content/keras_hub_repo/keras_hub/src/utils/preset_utils.py in load_task(self, cls, load_weights, load_task_weights, **kwargs)
    701         else:
    702             jax_memory_cleanup(task.backbone)
--> 703         self._load_backbone_weights(task.backbone)
    704         return task
    705

/content/keras_hub_repo/keras_hub/src/utils/preset_utils.py in _load_backbone_weights(self, backbone)
    754         # Download the sharded weights.
    755         _ = get_file(self.preset, sharded_filename)
--> 756         backbone.load_weights(filepath)
    757
    758

/content/keras_repo/keras/src/utils/traceback_utils.py in error_handler(*args, **kwargs)
    120             # To get the full stack trace, call:
    121             # keras.config.disable_traceback_filtering()
--> 122             raise e.with_traceback(filtered_tb) from None
    123         finally:
    124             del filtered_tb

/content/keras_repo/keras/src/saving/saving_lib.py in _raise_loading_failure(error_msgs, warn_only)
    648         warnings.warn(msg)
    649     else:
--> 650         raise ValueError(msg)
    651
    652

ValueError: A total of 183 objects could not be loaded. Example error message for object

Layer 'token_embedding' expected 1 variables, but received 0 variables during loading. Expected: ['embeddings']

List of objects that could not be loaded:
[
Comment From: sonali-kumari1
Closing this issue as a duplicate. You can track progress on this issue in #21378. If you have any other concerns, please feel free to open a new issue. Thanks!