How to use the NLU endpoint with Rasa X?

Hello,

My question is about correctly connecting two separate servers (using NLG and NLU endpoints) to Rasa X, for now in local mode, although I will eventually attempt it with docker-compose.
So far, I have successfully managed this with Rasa Open Source (version 2.8.x).

Versions:

Rasa Version      :         2.8.21
Minimum Compatible Version: 2.8.9
Rasa SDK Version  :         2.8.3
Rasa X Version    :         1.0.1
Python Version    :         3.8.12
Operating System  :         Linux-5.4.0-94-generic-x86_64-with-glibc2.27

Context:
I am currently building a bilingual bot (think code-switching, not 2 separate monolingual bots with the same content).
I have trained:

  • 1 language-agnostic core model (with Rasa)
  • 2 language-specific NLU models (with Rasa)
  • 1 small language identification model (custom)

I have implemented:

  • An NLU server which
    • identifies the language of the input with the language identification model,
    • passes the message to the correct Rasa NLU model,
    • and finally returns the intent + detected language (as an entity)
  • An NLG server which
    • uses the detected language to pick the correct set of responses,
    • then retrieves the individual response based on the action predicted by the core model
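The routing logic of the NLU server boils down to something like the sketch below (the language-ID model and the per-language Rasa models are stubbed out with hypothetical stand-ins; the response shape follows Rasa's /model/parse format, as seen in the shell output further down):

```python
# Sketch of the bilingual NLU routing described above.
# detect_language and parse_with_model are hypothetical stand-ins for
# the custom language-ID model and the language-specific Rasa NLU models.

def detect_language(text: str) -> str:
    # Stand-in for the custom language-identification model.
    return "swh" if "jambo" in text.lower() else "eng"

def parse_with_model(lang: str, text: str) -> dict:
    # Stand-in for forwarding `text` to the language-specific Rasa NLU
    # model (e.g. via its HTTP API or a loaded interpreter).
    return {"intent": {"name": "greet", "confidence": 0.99}, "entities": []}

def handle_parse(text: str) -> dict:
    """What the /model/parse endpoint returns to the core model."""
    lang = detect_language(text)
    result = parse_with_model(lang, text)
    result["text"] = text
    # Surface the detected language as an entity, so the NLG server can
    # later pick the right response set.
    result["entities"].append(
        {"start": 0, "end": 0, "value": lang, "entity": "detected_lang"}
    )
    return result
```

The real server wraps this in a sanic app listening on port 6001, as the access logs below show.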

What works:
After starting my NLU/NLG servers (same port), it works as expected in Rasa Open Source with the endpoints:

nlg:
  url: "http://localhost:6001/nlg"
nlu:
  url: "http://localhost:6001"

and running:

rasa shell --model models/core-model-agnostic-v2.tar.gz --endpoints endpoints.yml
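For context, a minimal endpoints.yml for this setup would look roughly like this (the action_endpoint entry is inferred from the action server on port 5055 visible in the startup logs further down, and uses Rasa's standard /webhook path):

```yaml
action_endpoint:
  url: "http://localhost:5055/webhook"

nlg:
  url: "http://localhost:6001/nlg"

nlu:
  url: "http://localhost:6001"
```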

My NLG/NLU servers are called just fine:

[2022-01-17 15:42:53 +0100] - (sanic.access)[INFO][127.0.0.1:42304]: POST http://localhost:6001/model/parse  200 884
[2022-01-17 15:42:53 +0100] - (sanic.access)[INFO][127.0.0.1:42306]: POST http://localhost:6001/nlg  200 253

The problem:
However, I have not managed to achieve this when trying to do the same with Rasa X:

rasa x --endpoints endpoints.yml -d domain.yml  --model models/core-model-agnostic-v2.tar.gz --config configs/config_core.yml

At this point, Rasa X starts okay, but:

  • The NLU server is not contacted at all (no log visible for my server)
  • The Core model works fine, but is useless without intents from the NLU side
  • The NLG server is contacted, but is useless without sensible actions predicted (I haven’t implemented the fallback yet)
# Story from Rasa X Interactive Learning panel
- story: Story from Conversation ID 8a6ff9ff8c5847e1905db5ba76c6375f
  steps:
  - intent: Hello
  - action: action_default_fallback
# NLG server works but is useless
# NLU server is not contacted at all
[2022-01-17 15:50:56 +0100] [212935] [WARNING] Could not find response `utter_default`
[2022-01-17 15:50:56 +0100] - (sanic.access)[INFO][127.0.0.1:42752]: POST http://localhost:6001/nlg  200 39

The TL;DR question:
How do I connect a custom NLU server to Rasa X?

Thank you for your help!

@E-dC Can you confirm: are you able to run the NLU and NLG servers connected with Rasa X, or are you trying with Docker?

@E-dC Please share the output of rasa --version.

Hi @nik202,

I have been able to connect the NLU and NLG servers to Rasa Open Source, but not with Rasa X at all.
So far I have tried (and failed) with Rasa X in local mode, but haven’t tried with docker yet.

rasa --version:

Rasa Version      :         2.8.21
Minimum Compatible Version: 2.8.9
Rasa SDK Version  :         2.8.3
Rasa X Version    :         1.0.1
Python Version    :         3.8.12
Operating System  :         Linux-5.4.0-94-generic-x86_64-with-glibc2.27

Thank you for your reply

I think my biggest confusion is why the NLG/NLU servers work just fine and as expected in Rasa Open Source, but only the NLG endpoint is picked up by Rasa X.
What does Rasa X do that changes the way endpoints are picked up, compared to Rasa OSS?

If Rasa X does not support an NLU endpoint but does support the NLG and other endpoints, it would be good to update the documentation to mention it, as it is somewhat surprising: had I known I wouldn’t be able to deploy a multilingual bot this way on Rasa X, I probably would have tried to find another setup :confused:
I’ll create an issue on the Rasa repo.

@E-dC Hmm… Are you able to achieve this whilst using Rasa Open Source?

Indeed I am able to do that with Rasa Open Source (I have edited the output to make it more readable):

(venv_rasa2)$ ./run_bot.sh 
⏲️  Starting up servers

⚡ Start Rasa action server
INFO     rasa_sdk.endpoint  - Starting action endpoint server...
INFO     rasa_sdk.executor  - Registered function for 'validate_org'.
INFO     rasa_sdk.endpoint  - Action endpoint is up and running on http://0.0.0.0:5055

⚡ Start rasa_helpers NLG/NLU servers
[INFO] First-time loading responses for eng from responses/responses_eng.yml
[INFO] First-time loading responses for swh from responses/responses_swh.yml
[INFO] First-time loading model for eng from ./models/nlu-model-eng-v2.tar.gz
[INFO] First-time loading model for swh from ./models/nlu-model-swh-v2.tar.gz
[INFO] Goin Fast @ http://0.0.0.0:6001
[INFO] Starting worker [655519]

⚡ Start Rasa shell
INFO     rasa.model  - Loading model models/core-model-agnostic-v2.tar.gz...
INFO     root  - Starting Rasa server on http://localhost:5005
INFO     rasa.model  - Loading model models/core-model-agnostic-v2.tar.gz...
INFO     root  - Rasa server is up and running.
Bot loaded. Type a message and press enter (use '/stop' to exit): 
Your input ->  Hello
{'text': 'Hello',
 'intent': {'name': 'greet', 'confidence': 0.9929808378219604},
 'entities': [{'start': 0, 'end': 0, 'value': 'eng', 'entity': 'detected_lang'}]}
(sanic.access)[INFO][127.0.0.1:49020]: POST http://localhost:6001/model/parse  200 1332
(sanic.access)[INFO][127.0.0.1:49022]: POST http://localhost:6001/nlg  200 17
(sanic.access)[INFO][127.0.0.1:49024]: POST http://localhost:6001/nlg  200 323
Hello there!
(sanic.access)[INFO][127.0.0.1:49026]: POST http://localhost:6001/nlg  200 32
First off, here is my official privacy notice...
Your input ->  Jambo
{'text': 'Jambo',
 'intent': {'name': 'greet', 'confidence': 0.8884895443916321},
 'entities': [{'start': 0, 'end': 0, 'value': 'swh', 'entity': 'detected_lang'}]}
(sanic.access)[INFO][127.0.0.1:49030]: POST http://localhost:6001/model/parse  200 1317
(sanic.access)[INFO][127.0.0.1:49032]: POST http://localhost:6001/nlg  200 18
Jambo!

@E-dC And when you open Rasa X from the terminal and type these messages, you are not getting any response?

The NLG endpoint does work, so I am getting a response (it would be the utter_default of action_default_fallback, if I had implemented it already), but it’s a useless one, since the NLU endpoint is not contacted at all (logs from the original post):

# Story from Rasa X Interactive Learning panel
- story: Story from Conversation ID 8a6ff9ff8c5847e1905db5ba76c6375f
  steps:
  - intent: Hello
  - action: action_default_fallback
# NLG server works but is useless
# NLU server is not contacted at all
[2022-01-17 15:50:56 +0100] [212935] [WARNING] Could not find response `utter_default`
[2022-01-17 15:50:56 +0100] - (sanic.access)[INFO][127.0.0.1:42752]: POST http://localhost:6001/nlg  200 3
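For reference, the language-aware lookup my NLG server performs is roughly the sketch below (the response data here is hypothetical; the real server loads it from the responses_eng.yml / responses_swh.yml files shown in the startup logs):

```python
# Simplified sketch of the language-aware NLG lookup described in this
# thread. RESPONSES is hypothetical sample data; the real server loads
# responses_eng.yml / responses_swh.yml.

RESPONSES = {
    "eng": {"utter_greet": "Hello there!"},
    "swh": {"utter_greet": "Jambo!"},
}

def detected_lang(tracker: dict, default: str = "eng") -> str:
    # The NLU server attaches the language as a `detected_lang` entity,
    # which ends up on the tracker's latest message.
    for ent in tracker.get("latest_message", {}).get("entities", []):
        if ent.get("entity") == "detected_lang":
            return ent["value"]
    return default

def generate(response_name: str, tracker: dict) -> dict:
    """Body of the /nlg endpoint: pick the response set by language,
    then look up the requested response name."""
    lang = detected_lang(tracker)
    text = RESPONSES.get(lang, {}).get(response_name)
    if text is None:
        # This is the case behind the "Could not find response
        # `utter_default`" warning in the logs above.
        return {"text": ""}
    return {"text": text}
```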

@E-dC What if you provide the NLU server path in endpoints.yml? Ref: NLU-Only Server. Will it help?

Oops, I guess you already did that :frowning:

Maybe it will be supported with a server-side installation?

My colleague did try a server-side installation with no success (I haven’t tried it personally, though)…
Surely, if an NLU endpoint is supposed to be supported by Rasa X, then it should be possible to use it with Rasa X locally as well: I wanted to try it on my own machine first in order to fully understand what needed to be done to make it work.

At this point, I think the situation is one of:

  • Rasa X does not support using an NLU endpoint, but this fact is not documented (a “gotcha” of sorts)
  • Rasa X does support an NLU endpoint, but either:
    • There is something specific/unexpected to do, and it is undocumented
    • There is a bug.

Could anyone from the Rasa team shine a light on that issue, by any chance?

I totally agree with you, Etienne!

pinging @koaning for the suggestion and help, please.

@E-dC I’m less of an expert on the Rasa X side of things, but I think you’re correct. My impression is that Rasa X assumes a single NLU model. That’s the impression that I’m getting from reading this architecture diagram. To quote the page:

It should be clear from this description that Rasa Open Source can run completely independently of Rasa X. Rasa X on the other hand depends on the Rasa Open Source service for handling conversation data, model training, and running.

While the approach in the video is a bit “overkill” (they made their own CMS) I think this tech talk from N26 may serve as an inspiration on how to work in multi-lingual settings. I think multi-lingual settings are very much “an open problem area” where there isn’t a clear consensus on what an ideal solution looks like, but the talk mentions a couple of very sensible ideas.

Thank you for your answers @koaning.

Yes, it does look like we’re going to have to do without Rasa X for our bot: my colleague already figured out a deployment without Rasa X, but I wanted to ensure that there was nothing I’d missed which would have made using Rasa X possible.

I do remember watching the N26 talk; this is essentially the approach I’ve taken, creating a single language-agnostic core model and n language-specific NLU models. The main difference (as far as I’m aware) is that in our case a language identification model sits in front of the NLU models.
That leaves more room for error, but the benefits outweigh the issues in our use case.

Content management for multiple languages is difficult, I’m not so surprised that N26 ended up making their own CMS!
