Inconsistent performance Rasa X server and Rasa X local mode

Hi, I’ve trained a model locally with Rasa. It runs smoothly and meets my expectations when I use it in Rasa X local mode. I then tried to move it to a server: I configured all the necessary settings and uploaded the pre-trained model. But when I tested the identical model on the Rasa X server, the chatbot’s responses were almost all wrong.

It’s worth mentioning that on the Rasa X server the NLU model confidence and action selection confidence are all greater than one, while in local mode both range from 0 to 1. I think this means the server version is skipping the normalization step?

Any kind help is appreciated.

@Zasc What versions of Rasa X and Rasa Open Source are you using in server mode and what versions are you using in local mode?

For local mode:
Rasa X: 0.37.1
Rasa Open Source: 2.4.0

For server mode:
Rasa X: 0.37.1
And for Rasa Open Source in server mode, I’m not sure, since I only followed the docker compose installation of Rasa X in server mode.

So do I need to install Rasa Open Source in server mode by some other method?

@tyd I have built a new model and run some experiments on it. The main problem is that when I input intents that should trigger the rules, the agent does not respond according to the rules; it gives some seemingly random action instead. Another problem is that when I click the “restart” button, the conversation is not restarted, and it also produces random actions.

@Zasc You can figure out what version of Rasa Open Source is running on a Rasa X server by adding /api/version to the URL of your Rasa X instance. I am guessing you are training a model with Rasa Open Source 2.4.0, but your Rasa X 0.37.1 server is running Rasa Open Source 2.3.4, which would cause these issues.
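The version check above can also be scripted, which is handy if you manage several servers. This is a minimal sketch: the exact JSON shape returned by /api/version may differ between Rasa X releases, so inspect your own server’s response and adjust the keys if needed.

```python
import json
from urllib.request import urlopen


def fetch_version_info(rasa_x_url):
    """Return the JSON payload from a Rasa X server's /api/version endpoint."""
    with urlopen(rasa_x_url.rstrip("/") + "/api/version") as resp:
        return json.load(resp)


def versions_match(version_info, local_rasa_version):
    """Compare every Rasa Open Source service the server reports against
    the version used to train the model locally.

    Assumes the payload has a "rasa" section mapping service names to
    version strings -- verify this against your own /api/version output.
    """
    rasa_versions = version_info.get("rasa", {})
    return bool(rasa_versions) and all(
        v == local_rasa_version for v in rasa_versions.values()
    )


# Hypothetical payload, so no live server is needed for the comparison logic:
info = {"rasa": {"production": "2.3.4", "worker": "2.3.4"}, "rasa-x": "0.37.1"}
print(versions_match(info, "2.4.0"))  # a 2.3.4 server vs. a 2.4.0 model -> False
```

A mismatch here (for example, a model trained on 2.4.0 loaded by a 2.3.x server) is exactly the kind of setup that can produce broken confidences and wrong responses.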

@tyd I found that Rasa Open Source 2.3.1 was running on the server, and I have changed it to version 2.4.0. Now the model runs smoothly and its output follows the stories and rules, with model confidences all ranging from 0 to 1. Thank you very much!


Hi @Zasc, I am seeing different behaviour between Rasa X local mode and a server. It is not exactly the problem you had, but just to check I would like to use the same Rasa Open Source versions. How did you change the Rasa Open Source version on your server? Thanks in advance.

Hi @anarucu, as tyd said, you can check your Rasa Open Source version just by adding /api/version to the URL of your Rasa X instance. To change the Rasa Open Source version, since I used the docker compose installation, I just changed the variables defined in the .env file under /etc/rasa. The following link may help you.

Customize Your Deployment
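For reference, the .env edit amounts to pinning a single variable. The docker-compose install of Rasa X uses a variable like RASA_VERSION for this; confirm the exact name in your own /etc/rasa/.env before editing. A small sketch of the rewrite, operating on the file contents as text:

```python
def set_env_var(env_text, key, value):
    """Return .env-style content with KEY pinned to value.

    Replaces an existing KEY=... line, or appends one if missing.
    """
    lines = env_text.splitlines()
    found = False
    for i, line in enumerate(lines):
        if line.startswith(key + "="):
            lines[i] = f"{key}={value}"
            found = True
    if not found:
        lines.append(f"{key}={value}")
    return "\n".join(lines) + "\n"


# Example with a hypothetical .env fragment:
original = "RASA_X_VERSION=0.37.1\nRASA_VERSION=2.3.1\n"
updated = set_env_var(original, "RASA_VERSION", "2.4.0")
print(updated)
```

After writing the updated content back to /etc/rasa/.env on the server, pull and restart the containers (e.g. `docker-compose pull` followed by `docker-compose up -d`) so the new image version takes effect.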

Thank you very much!!