Using two models in parallel on Rasa Server

I see. I think we support this, and the docs for it are found here.

Technically, you could have a proxy on the server that routes traffic internally to the two models you're hosting. You can tell rasa run to serve a specific model:

rasa run --enable-api -m models/<model-a>.tar.gz --port 12345
rasa run --enable-api -m models/<model-b>.tar.gz --port 12346
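
To illustrate how the two servers can be used side by side, here is a minimal sketch, assuming both are running locally on the ports above and that you're calling Rasa's HTTP API parse endpoint. Names like MODEL_SERVERS and parse_with_all_models are just illustrative, not part of Rasa itself:

import requests

# Hypothetical mapping of model names to the two servers started above.
MODEL_SERVERS = {
    "model-a": "http://localhost:12345",
    "model-b": "http://localhost:12346",
}

def parse_with_all_models(text: str) -> dict:
    """Send the same message to each server's /model/parse endpoint."""
    results = {}
    for name, base_url in MODEL_SERVERS.items():
        response = requests.post(f"{base_url}/model/parse", json={"text": text})
        response.raise_for_status()
        results[name] = response.json()
    return results

if __name__ == "__main__":
    # Compare which intent each model predicts for the same input.
    for model, parsed in parse_with_all_models("hello there").items():
        print(model, parsed.get("intent"))

A reverse proxy (nginx, Traefik, or similar) in front of the two ports would let you expose them under one public endpoint and decide the routing there instead of in client code.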