DIETClassifier how to save model as tf-serving format

Hello guys, recently I've been using Rasa to develop a chatbot, but I ran into a technical problem. After reading the DIETClassifier source code, I found that I can't save the model in the TF-Serving format. Does anybody know how to save it as a TF-Serving model?


    def persist(self, file_name: Text, model_dir: Text) -> Dict[Text, Any]:
        """Persist this model into the passed directory.

        Return the metadata necessary to load the model again.
        """
        if self.model is None:
            return {"file": None}

        model_dir = Path(model_dir)
        tf_model_file = model_dir / f"{file_name}.tf_model"

        io_utils.create_directory_for_file(tf_model_file)

        # The weights are written in the TF checkpoint format, not as a
        # SavedModel -- this is why TF Serving cannot load the result directly.
        self.model.save_weights(tf_model_file, save_format="tf")

        io_utils.pickle_dump(
            model_dir / f"{file_name}.data_example.pkl", self._data_example
        )
        io_utils.pickle_dump(
            model_dir / f"{file_name}.label_data.pkl", self._label_data
        )
        io_utils.json_pickle(
            model_dir / f"{file_name}.index_label_id_mapping.pkl",
            self.index_label_id_mapping,
        )

        entity_tag_specs = (
            [tag_spec._asdict() for tag_spec in self._entity_tag_specs]
            if self._entity_tag_specs
            else []
        )
        io_utils.dump_obj_as_json_to_file(
            model_dir / f"{file_name}.entity_tag_specs.json", entity_tag_specs
        )

        return {"file": file_name}

Hi @wangcheng, I’m not sure if that is possible, as we are using a custom model; e.g. we are not using the standard `fit` method of Keras. But I am also not very familiar with TF Serving.
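For what it's worth, a custom subclassed model can still be exported in the SavedModel format that TF Serving expects, as long as you attach an explicit serving signature. Here is a minimal sketch of the technique; `TinyModel` is a made-up stand-in, not DIET itself, and for the real DIETClassifier you would have to wrap `self.model` (and replicate its featurization) yourself:

```python
import os
import tempfile

import tensorflow as tf


class TinyModel(tf.Module):
    """Hypothetical stand-in for a custom model such as DIET."""

    def __init__(self):
        super().__init__()
        self.w = tf.Variable(tf.random.normal([4, 3]), name="w")

    # An explicit input signature gives TF Serving a stable entry point,
    # even though the model has no standard Keras `fit`/`predict`.
    @tf.function(
        input_signature=[tf.TensorSpec([None, 4], tf.float32, name="features")]
    )
    def serve(self, features):
        return {"logits": tf.matmul(features, self.w)}


model = TinyModel()

# TF Serving expects a numeric version subdirectory, hence the "/1".
export_dir = os.path.join(tempfile.mkdtemp(), "1")
tf.saved_model.save(
    model, export_dir, signatures={"serving_default": model.serve}
)

# Reload the SavedModel and call it through the named signature,
# exactly as TF Serving would.
loaded = tf.saved_model.load(export_dir)
out = loaded.signatures["serving_default"](features=tf.ones([2, 4]))
```

Pointing `tensorflow_model_server --model_base_path=...` at the parent of the `/1` directory should then serve the signature over gRPC/REST.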

Did you try something already?

@wangcheng Out of curiosity what is the use-case for this? You’re not able to serve the model using Rasa core?

hi - so in my use case I want to serve different clients with different models and entities (one model per customer). TF Serving can help with that…
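For that multi-tenant setup, TF Serving can load several models side by side from a model config file. A sketch of such a config (the model names and base paths here are made up for illustration):

```
model_config_list {
  config {
    name: "customer_a"
    base_path: "/models/customer_a"
    model_platform: "tensorflow"
  }
  config {
    name: "customer_b"
    base_path: "/models/customer_b"
    model_platform: "tensorflow"
  }
}
```

You would then start the server with `tensorflow_model_server --model_config_file=/path/to/models.config` and address each model by name in the request URL, e.g. `/v1/models/customer_a:predict`.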