How to run multiple Rasa or Rasa X bots on one server

Hey, I have created a chatbot and now I want to distribute it to clients, with all the client bots running on a single server. For example, when two clients register, two projects are created under their names; they log in and access their bots, and both bots run on the same server. Externally and internally, I want something that handles the communication between each client and their chatbot and then returns the response through the main server. I am using Python/Django. I would be very thankful to anyone who can help. :slight_smile:

Welcome to the community!

two projects are created under their names; they log in and access their bots

You mean on the front end, right? As long as you give each user a unique sender_id in the requests you make to your input channel, this shouldn’t be an issue: each person will have their own unique conversation with the bot.
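
For example, a minimal sketch assuming the built-in REST channel is enabled in credentials.yml and the Rasa server runs on its default port:

```python
import requests

# Default URL of the built-in REST input channel.
RASA_URL = "http://localhost:5005/webhooks/rest/webhook"

def send_message(sender_id, text):
    # "sender" is the conversation id: two different sender ids get two
    # completely separate conversations with the same bot.
    response = requests.post(RASA_URL, json={"sender": sender_id, "message": text})
    return response.json()

print(send_message("client-42", "hello"))  # client 42's conversation
print(send_message("client-99", "hello"))  # a completely independent conversation
```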

Or do you mean that each client should be able to create separate bots? In that case, the bots should not be running on the same server.


Hey @erohmensing, thanks for the reply. To elaborate: I have developed a Django project for chatbots. When a user registers on my site, a project is created in the folder with the user's unique name and ID (i.e. if 5 users register, then 5 Rasa projects are created, one per user). Now I want that when those users log in and open their bots (Rasa X, for training), they all run on only one server. On a local server we can log in only one client at a time, but I want to log them in at the same time, each with their own credentials.

Right, this doesn’t really answer my question though. Are these 5 rasa projects totally different bots? Or are you trying to have 5 people collaborate on one bot in rasa x?

You can’t have 5 clients running 5 different bots on the same server, because only one bot can respond to the requests on a single server.

Can you not just host the bots on different ports?

Hi Rizwan,

I would ask you please to review the Rasa X license and FAQ (both linked from Rasa X). Please remember that building a hosted (SaaS) solution with Rasa X is not permitted under the Rasa X Community Edition License.

Yes, each bot will be hosted on a different port internally, but externally each user will access their bot on port 80 or 443, not on Rasa X's own port (e.g. localhost:5002).
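
As a rough sketch of that setup, assuming a Django front end serving port 80/443 and a hypothetical per-user port mapping (neither of which is part of Rasa itself), a small proxy view could forward each logged-in user's messages to their own internal Rasa instance:

```python
import requests
from django.http import HttpResponseForbidden, JsonResponse
from django.views.decorators.http import require_POST

# Hypothetical mapping from an authenticated username to the internal port
# that user's Rasa server listens on.
BOT_PORTS = {"alice": 5005, "bob": 5006}

@require_POST
def bot_webhook(request):
    port = BOT_PORTS.get(request.user.username)
    if port is None:
        return HttpResponseForbidden("No bot registered for this user")
    # Forward the message to that user's Rasa REST channel on localhost and
    # relay the bot's answer back over the public port (80/443).
    resp = requests.post(
        f"http://localhost:{port}/webhooks/rest/webhook",
        json={"sender": request.user.username,
              "message": request.POST.get("message", "")},
        timeout=10,
    )
    return JsonResponse(resp.json(), safe=False)
```

In production you would more likely put a reverse proxy in front of the per-user ports, but the routing idea is the same.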

@PureLogics with regards to what @amn41 said, can I ask your specific use case for this, i.e. who these clients are?

@erohmensing it was just an example to let you understand my question.

Sure, but doesn’t mean it’s any less important :smiley:

You can host it on whatever domain you like if you deploy it to a server: Deploy to a Server

But according to the license of Rasa Core and NLU, we can use them for SaaS. Am I right?

Rasa x is free to use and proprietary. Not to be used as SaaS.

Yes, you can use the open source framework for anything that the Apache 2.0 license allows, which includes SaaS.

Hi @erohmensing, I may have a similar question. Is it possible for Rasa to send different responses based on the user persona? For example, suppose I have 3 age groups of users: can I respond to each age group with different responses? In that case, do I need to create 3 different bots trained on different domain.yml files?

Hi @jhzape, this would be something best done with slots. You can store the user’s age in a slot and use slots to influence conversations. You should do this in one bot, not multiple bots :slight_smile:
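
To make that concrete, here is a hedged sketch of a single custom action that branches on an age_group slot. The slot name and response names are made up for illustration (the responses would live in domain.yml), and the syntax assumes Rasa SDK 2+:

```python
from typing import Any, Dict, List, Text

from rasa_sdk import Action, Tracker
from rasa_sdk.executor import CollectingDispatcher


class ActionGreetByAgeGroup(Action):
    def name(self) -> Text:
        return "action_greet_by_age_group"

    def run(
        self,
        dispatcher: CollectingDispatcher,
        tracker: Tracker,
        domain: Dict[Text, Any],
    ) -> List[Dict[Text, Any]]:
        # Read the persona information stored earlier in the conversation.
        age_group = tracker.get_slot("age_group")
        # One bot, one action: just pick a different response per age group.
        responses = {
            "teen": "utter_greet_teen",
            "adult": "utter_greet_adult",
            "senior": "utter_greet_senior",
        }
        dispatcher.utter_message(response=responses.get(age_group, "utter_greet"))
        return []
```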

Hi @erohmensing, thanks for the reply. Yes, I am now trying to pass the age group as metadata in the Rasa POST request, retrieve the group ID in a custom action, and branch into the corresponding utter template.
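
For reference, one way to wire that up (hedged, since the exact place the metadata lands can differ between Rasa versions): attach the age group to the REST request's metadata field, then read it back from the most recent user event inside the custom action. The age_group field name is illustrative:

```python
import requests

# Front end: send the age group as channel metadata alongside the message.
requests.post(
    "http://localhost:5005/webhooks/rest/webhook",
    json={
        "sender": "client-42",
        "message": "hello",
        "metadata": {"age_group": "senior"},
    },
)


# Inside a custom action: pull the metadata off the latest user event.
def latest_metadata(tracker):
    for event in reversed(tracker.events):
        if event.get("event") == "user":
            return event.get("metadata") or {}
    return {}

# e.g. age_group = latest_metadata(tracker).get("age_group")
```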

Hi @PureLogics, can we access multiple different bots through one Rasa server? Is that possible?

Yes, you can assign a different port to each Rasa server container and its action server, assuming you are using Rasa Open Source.
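
For example, a rough sketch of starting two completely independent bots side by side, assuming two project folders (bot_a/ and bot_b/), each with its own trained model and an endpoints.yml whose action_endpoint points at its own action server port:

```python
import subprocess

# Each bot gets its own Rasa server port and its own action server port.
bots = [
    {"path": "bot_a", "rasa_port": 5005, "actions_port": 5055},
    {"path": "bot_b", "rasa_port": 5006, "actions_port": 5056},
]

processes = []
for bot in bots:
    # One Rasa server per bot...
    processes.append(subprocess.Popen(
        ["rasa", "run", "--enable-api", "--port", str(bot["rasa_port"])],
        cwd=bot["path"],
    ))
    # ...and one action server per bot; this port has to match the
    # action_endpoint URL in that bot's endpoints.yml.
    processes.append(subprocess.Popen(
        ["rasa", "run", "actions", "--port", str(bot["actions_port"])],
        cwd=bot["path"],
    ))

for p in processes:
    p.wait()
```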

I have a somewhat similar question:

There are multiple clients with different stories and training data, and thus a different model for each client ID. I see two options:

  1. One Rasa server is running; based on the client ID, it loads the respective model and configuration and answers accordingly.
  2. A separate Rasa server runs for every client, and one common node server checks the client ID, calls the respective server, gets the response, and returns it to the calling client (see the sketch after this list).
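
A hedged sketch of option 2, written here with Python/Flask rather than a node server (the routing idea is the same); the client-to-server mapping is an assumption:

```python
import requests
from flask import Flask, jsonify, request

app = Flask(__name__)

# One Rasa server per client, each started on its own port.
CLIENT_SERVERS = {
    "client_a": "http://localhost:5005",
    "client_b": "http://localhost:5006",
}

@app.route("/chat/<client_id>", methods=["POST"])
def chat(client_id):
    base_url = CLIENT_SERVERS.get(client_id)
    if base_url is None:
        return jsonify({"error": "unknown client"}), 404
    payload = request.get_json(force=True)
    # Forward to that client's Rasa REST channel and relay the answer.
    resp = requests.post(f"{base_url}/webhooks/rest/webhook", json=payload, timeout=10)
    return jsonify(resp.json())

if __name__ == "__main__":
    app.run(port=8080)
```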

If your different clients have different stories and training data, each of them should have a rasa x setup with its own rasa server.