How to store NLU training data in a database instead of generating a flat training_data.json file

Is it possible to store training data in a database (preferably MongoDB) when training Rasa NLU via rasa_core.run, and then use that same database to verify user intent instead of the generated training_data.json file? Thanks in advance!

Hm, usually we use databases to store conversations. Why do you want to use a DB for NLU data?

From a security and scalability point of view, we would like to store the training data in MongoDB. Is that possible? FYI, I am aware of the MongoTrackerStore for conversations, but I want to store training data. Is it also possible to store intents and stories in the database? Thanks in advance.

So at the moment, it's not possible out of the box to load the data from a DB. However, there is actually a PR open for this – we will be abstracting the data-loading process so that you can write your own data importers for a use case like this. Check out the PR here. Of course, it's still an open PR, so it'll be a little while before it lands in a released version.
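Until that abstraction lands, one workaround is to keep the examples in MongoDB and generate training_data.json from it right before training. Below is a minimal sketch, not an official Rasa API: the database name, collection name, document fields, and the `to_rasa_nlu_json` helper are all hypothetical, but the output follows the Rasa NLU JSON training-data format (`rasa_nlu_data.common_examples`).

```python
import json


def to_rasa_nlu_json(examples):
    """Convert an iterable of example documents into the Rasa NLU
    JSON training-data structure that training_data.json uses.

    Each document is assumed (hypothetically) to look like:
        {"text": "hi there", "intent": "greet", "entities": [...]}
    """
    return {
        "rasa_nlu_data": {
            "common_examples": [
                {
                    "text": doc["text"],
                    "intent": doc["intent"],
                    # entities are optional per example
                    "entities": doc.get("entities", []),
                }
                for doc in examples
            ]
        }
    }


if __name__ == "__main__":
    # Requires pymongo and a running MongoDB instance; the "rasa" DB
    # and "nlu_examples" collection names are assumptions.
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")
    docs = client["rasa"]["nlu_examples"].find()
    with open("training_data.json", "w") as f:
        json.dump(to_rasa_nlu_json(docs), f, indent=2)
```

You would run this as a pre-training step, then point the trainer at the generated file as usual. It keeps the DB as the single source of truth without waiting for the importer PR.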

I hope it will be implemented soon. Thank you for the quick response.
