Rasa: handling concurrent chatbot questions

Can Rasa handle 15,000 concurrent customers if we deploy it on a server using Docker? The server has 80 cores and 64 GB of RAM. @nik202

@noman579 - Try a load-testing tool such as Locust: swarm your service and see when it breaks. You can configure the number of concurrent users in Locust to run load tests, and the results will also point you to optimizations you can make on the server.

I'm also not sure Docker is useful in a single-server setup beyond packaging, and you could do that with conda instead. Docker takes a fair chunk of memory on your machine, which you wouldn't want given your need for high concurrency on a single machine.

@souvikg10 You mean I should try to overload Docker with the Locust library using conda?

Well, you'd have a Locust Python script running on your local machine, from which you call your service hosted on the server (Docker or not). Your Locust script can set the number of concurrent calls to make to the server in order to test performance and load.

Regarding deployment, I suggested conda instead of Docker for running your Rasa app on the server, since it is simpler and Docker wouldn't make sense unless you scale horizontally.
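A conda-based deployment could be sketched roughly as follows; the environment name, Python version, and port are assumptions, not prescriptions:

```shell
# Create an isolated environment for the bot (name and version are placeholders)
conda create -n rasa-bot python=3.8 -y
conda activate rasa-bot
pip install rasa

# Train the model, then serve it; --enable-api exposes the HTTP endpoints
# (including the REST webhook used by the Locust test) on port 5005
rasa train
rasa run --enable-api --port 5005
```

This gives you the same packaging isolation Docker would on a single machine, without the container layer.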

@souvikg10 I have a Linux CLI server for running Docker, but my local system runs Windows. What should I do? Could I install conda on the Linux CLI server?