Rasa NLU and Core separate container setup

Hi Team,

Can you please help us set up Rasa Core and Rasa NLU as separate Docker containers for the latest version of Rasa? I can see in the legacy docs that this was possible for older versions, but is it still possible for the newer version? If yes, what are the steps? Could you please provide a supporting document?

Hi @JoySaha , Core and NLU used to be separate libraries but are now one, as of mid 2019, and the recommended way to run a Rasa Open Source model is as a combined model. Can I ask why you’d like to run the two separately?

You can run Core and NLU servers separately in 2 different containers, here is the documentation around that. However, each container will still need to contain the entire Rasa image, so your images won’t be any smaller.
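To make that concrete, here is a minimal docker-compose sketch of the two-container split, shown on one host for brevity. The service names, ports, volume paths, and model locations are assumptions, not prescribed by the docs; the key point stands that each service still runs the full `rasa/rasa` image.

```yaml
# Sketch only: assumes you have separately trained an NLU-only model
# (e.g. via `rasa train nlu`) and a Core model, stored in local folders.
version: "3.0"
services:
  rasa-nlu:
    image: rasa/rasa:latest          # full Rasa image, even for NLU-only use
    ports:
      - "5005:5005"
    volumes:
      - ./nlu_model:/app/models      # assumed path to the NLU-only model
    command: run --enable-api --model /app/models
  rasa-core:
    image: rasa/rasa:latest          # same full image again
    ports:
      - "5006:5005"
    volumes:
      - ./core_model:/app/models     # assumed path to the Core model
    command: run --enable-api --model /app/models
```

In a two-server setup like yours, each service would instead live in its own compose file (or be started with `docker run`) on its own machine.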

Thanks, Ella, for the supporting docs; they are really helpful.

We want to run Rasa NLU as a container on one server and Rasa Core as a container on another server. Also, can we use Rasa NLU as the NLP engine for other dialogue management tools? Is that possible?

If that’s your goal, then yes, you can run the two servers in separate containers!

If you want to use Rasa NLU with other dialogue management tools, that is also possible. The docs I shared will be helpful for that, as well as this page: Using NLU Only
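For an external dialogue manager, the NLU-only server is just an HTTP endpoint: you POST text to `/model/parse` and get back intents and entities as JSON. Here is a small sketch of building such a request with the Python standard library; the host and port are assumptions matching a default local deployment.

```python
import json
from urllib import request

# Hypothetical address of the NLU-only Rasa server; adjust to your deployment.
NLU_URL = "http://localhost:5005/model/parse"

def build_parse_request(text: str) -> request.Request:
    """Build a POST request for Rasa's /model/parse endpoint."""
    payload = json.dumps({"text": text}).encode("utf-8")
    return request.Request(
        NLU_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_parse_request("book a flight to Berlin")
print(req.get_full_url())
print(req.data.decode("utf-8"))
```

Sending the request with `urllib.request.urlopen(req)` (once the server is running) returns a JSON body whose fields your dialogue manager can consume directly.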