Hi All,
We have a few custom components used with the tensorflow_embedding pipeline and a custom Rasa NLU Docker image deployed in a GCP Kubernetes cluster.
We have a requirement where users want to create and store intents and their descriptions in MongoDB so that future models can be trained with the updated data.
We want to expose a REST endpoint so that a user can send a request to read the stored data from MongoDB, create a training data file, and train the model.
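To make this concrete, here is a minimal sketch of turning intent documents from MongoDB into a Rasa NLU JSON training-data file. The Mongo field names (`intent`, `text`) are assumptions about the schema; the `rasa_nlu_data` / `common_examples` structure is the legacy Rasa NLU JSON format used with the tensorflow_embedding pipeline.

```python
import json

def mongo_docs_to_rasa_nlu(docs):
    """Convert intent documents (as stored in MongoDB) into the
    legacy Rasa NLU JSON training-data format. The 'intent' and
    'text' field names are assumptions about the Mongo schema."""
    examples = [
        {"text": d["text"], "intent": d["intent"], "entities": []}
        for d in docs
    ]
    return {"rasa_nlu_data": {"common_examples": examples}}

# Example documents as they might come back from pymongo's find():
docs = [
    {"intent": "greet", "text": "hello there"},
    {"intent": "goodbye", "text": "see you later"},
]

training_data = mongo_docs_to_rasa_nlu(docs)

# Write the training data file that will be fed to the train endpoint.
with open("training_data.json", "w") as f:
    json.dump(training_data, f, indent=2)
```

In the real service the `docs` list would come from a `pymongo` query instead of being hard-coded.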
I am looking for the best design/solution to address this.
Here is my current design/deployment approach:
- Created a custom Rasa NLU Docker image that includes all custom components, deployed in the Kubernetes cluster.
- Created a separate Python web application that exposes a REST endpoint. Users invoke this endpoint when they want to train the model with the updated data stored in MongoDB.
- The custom application reads the data from the database, creates a training data file, and sends a request to the custom Rasa NLU image's train endpoint to train the model with the latest data.
- Trained models are stored in Google Cloud Storage.
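The glue between the web application and the Rasa NLU container could be wired up roughly like this. The in-cluster service URL, the `/train?project=...` endpoint shape, and the exact request body are assumptions based on the legacy Rasa NLU HTTP API; verify them against the API of the Rasa NLU version in your image.

```python
import json
import urllib.request

# Assumed in-cluster service name for the custom Rasa NLU deployment.
RASA_NLU_URL = "http://rasa-nlu-service:5000"

def build_train_payload(training_data, pipeline="tensorflow_embedding", language="en"):
    """Combine the pipeline config with the training data pulled from
    MongoDB into one payload for the train endpoint. The exact body
    shape the endpoint expects depends on the Rasa NLU version --
    verify against your deployment before relying on this."""
    return {
        "pipeline": pipeline,
        "language": language,
        "data": training_data,
    }

def trigger_training(training_data, project="default"):
    """POST the payload to the custom Rasa NLU image's train endpoint."""
    body = json.dumps(build_train_payload(training_data)).encode("utf-8")
    req = urllib.request.Request(
        f"{RASA_NLU_URL}/train?project={project}",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Build (but do not send) a payload from data read out of MongoDB:
payload = build_train_payload(
    {"rasa_nlu_data": {"common_examples": [
        {"text": "hello", "intent": "greet", "entities": []}
    ]}}
)
```

The REST endpoint in the web application would simply query MongoDB, build the training data, and call `trigger_training`, then upload the resulting model archive to Google Cloud Storage.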
Please let me know if you have a better or simpler solution than the one above.
Thanks, Virendra