Server for Rasa in production

We are using Rasa (Core + NLU) to build our chatbot. External applications use the Rasa server's HTTP endpoints (APIs) to converse with the bot. In our local environment we just start the Rasa server with the command below.

    rasa run -m models --enable-api --log-file out.log

  1. Is this Rasa server a kind of HTTP server that comes built in with Rasa?
  2. Can we use this Rasa server in a production environment? If not, how do we deploy our Rasa chatbot APIs on other servers?

Hello Srinivasa,

You are correct: the HTTP API comes built into the Rasa server and is enabled via the --enable-api argument in the CLI command. This is documented in more detail at HTTP API if you want to see the different endpoints and what they do.
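For example, once the server is running with --enable-api, other applications can talk to it over plain HTTP. A minimal sketch, assuming the server runs on localhost:5005 and the rest channel is listed in your credentials.yml (the sender id and message text are just placeholders):

    # check that the server is up
    curl http://localhost:5005/status

    # parse a message with the NLU pipeline
    curl -s -X POST http://localhost:5005/model/parse \
      -H "Content-Type: application/json" \
      -d '{"text": "hello"}'

    # send a user message through the REST channel and read the bot's replies
    curl -s -X POST http://localhost:5005/webhooks/rest/webhook \
      -H "Content-Type: application/json" \
      -d '{"sender": "test_user", "message": "hello"}'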

Typically people will put something like NGINX in front of the server and use that to terminate SSL, but yes, you can use this server in production once the SSL part is set up.
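As an alternative to terminating SSL in NGINX, Rasa's own server can also serve HTTPS and require a token on every request, using the built-in --ssl-certificate, --ssl-keyfile and --auth-token options. A rough sketch (the certificate paths, hostname and token value below are placeholders):

    rasa run -m models --enable-api \
      --ssl-certificate /etc/ssl/certs/bot.crt \
      --ssl-keyfile /etc/ssl/private/bot.key \
      --auth-token mysecrettoken \
      --log-file out.log

    # clients then append the token to every request, e.g.
    curl "https://bot.example.com:5005/status?token=mysecrettoken"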

More information on some of this can also be found at Running the Server

You can also use the Docker option, which gives you some flexibility with inter-container communication. For example, you can allow only your web frontend container to talk to the bot container, so that only your own apps communicate with the bot over the local Docker network and you don't have to expose the bot to the outside world.
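To illustrate that setup, here is a rough sketch with plain docker commands; the container names and the frontend image are placeholders, and the same idea can be expressed with docker-compose:

    # private network shared by the bot and the frontend
    docker network create bot-net

    # Rasa server: attached to the network, no ports published to the host
    docker run -d --name rasa-bot --network bot-net \
      -v $(pwd)/models:/app/models \
      rasa/rasa:latest run -m models --enable-api

    # web frontend: the only container that publishes a port;
    # it reaches the bot internally at http://rasa-bot:5005
    docker run -d --name web-frontend --network bot-net \
      -p 443:443 my-web-frontend:latest

Only the frontend is reachable from outside, while the bot's HTTP API stays visible only on the internal Docker network.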

Let me know if this doesn’t answer any of your questions and I can follow up.

Thanks


Hello @btotharye,

I’m planning to set up a web app with a stack that includes Gunicorn, Nginx, Django and Rasa, and I will mostly use Docker. From what I understand from your answer, with the correct configuration of servers such as Gunicorn and Nginx, I can get the Rasa server ready for production? Will it be enough to handle traffic of around 500-600 people talking to the bot?

Also, can you explain more about not exposing the bot to the outside world? I made a web app with Django that handles the conversation between the user and the bot through a chat UI. Is that secure enough, or do I have to take more advanced measures?

Your answer is much appreciated. Thank you.

Hey @fuih, probably the best option for running a bot in production is to look at a Rasa X deployment, since it includes the NGINX setup and everything else you need.

More information on this can be found at Installation and Setup and Deploy to a Server.

Let me know if you have any questions after reviewing this or trying it out. Rasa X will also let you share your bot with others as you build it, get their feedback, have them test it, and use that data to retrain the model.


Probably just like in Telegram, Rasa has its own server through which your bot and the user communicate, so the answer to the first question is yes. As for the second point, you need to host your chatbot on your own server; then everything will be stable and work well. Each service's API is documented in the library, describing what works and how, so the rest depends on you and your ability to set up the server.