RASA NLU Custom Docker Image in kubernetes cluster

Hi All,

We have a few custom components used with the tensorflow_embedding pipeline, and a custom RASA NLU Docker image deployed in a GCP Kubernetes cluster.

We have a requirement where the user wants to create and store intents and their descriptions in MongoDB, so that future models can be trained with the updated data.

We want to expose a REST endpoint so that the user can send a request that reads the stored data from MongoDB, creates a training data file, and trains the model.
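The conversion step could look something like the sketch below. It assumes a document shape of `{"intent": ..., "examples": [...]}` in MongoDB (your schema may differ) and emits the Rasa NLU JSON training data format used by the 0.x releases:

```python
import json


def build_training_data(docs):
    """Convert MongoDB documents of the assumed shape
    {"intent": "greet", "examples": ["hi", "hello"]}
    into the Rasa NLU JSON training-data structure."""
    common_examples = [
        {"text": text, "intent": doc["intent"], "entities": []}
        for doc in docs
        for text in doc["examples"]
    ]
    return {"rasa_nlu_data": {"common_examples": common_examples}}


# Example: two stored intents become four training examples.
docs = [
    {"intent": "greet", "examples": ["hi", "hello"]},
    {"intent": "goodbye", "examples": ["bye", "see you"]},
]
training_data = build_training_data(docs)
print(json.dumps(training_data, indent=2))
```

The resulting dict can be written to a file and handed to the train endpoint, or posted directly as the request body.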

I am looking for the best design/solution to address this.

Here is my current design/deployment approach to address this requirement.

  1. Created a custom RASA NLU Docker image that includes all custom components, deployed in the Kubernetes cluster.

  2. Created a separate Python web application to expose a REST endpoint. The user can invoke this endpoint when they want to train the model with the updated data stored in MongoDB.

  3. The custom application reads the data from the database, creates a training data file, and sends a request to the custom RASA NLU Docker image's train endpoint to train the model with the latest data.

  4. We are storing the trained models on Google Cloud Storage.
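Step 3 can be sketched with the standard library alone. The in-cluster service name `rasa-nlu-service`, the port, and the exact request shape are assumptions here; check the HTTP API docs for your Rasa NLU version, since the `/train` endpoint's parameters and accepted content types changed across 0.x releases:

```python
import json
import urllib.request

# Assumed Kubernetes service name for the custom RASA NLU deployment.
RASA_TRAIN_URL = "http://rasa-nlu-service:5000/train"


def build_train_request(payload, project="default"):
    """Prepare a POST to the Rasa NLU server's /train endpoint,
    targeting a project via a query parameter (0.x-style API)."""
    url = f"{RASA_TRAIN_URL}?project={project}"
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# The payload would normally be the training data built from MongoDB,
# merged with the pipeline configuration.
req = build_train_request(
    {
        "pipeline": "tensorflow_embedding",
        "data": {"rasa_nlu_data": {"common_examples": []}},
    }
)
print(req.full_url)  # http://rasa-nlu-service:5000/train?project=default
```

The web application would send this request with `urllib.request.urlopen(req)` (or `requests`), then pick up the trained model and copy it to the Google Cloud Storage bucket.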

Please let me know if you have a better or simpler solution than the one above.

Thanks, Virendra