Integrate Rasa NLU with an external conversation manager

All,

What is the best approach to integrate Rasa NLU (the intent interpreter) with an external conversation manager written in Java? Do I need to build a Flask microservice between the two?

When you run Rasa as a service, you also host the NLU model over REST.

rasa run --enable-api

This exposes the NLU model as well. You’re likely interested in the parse endpoint, which takes text and predicts intents/entities. You may want to wrap this service in a Docker container, though.

That said, what conversation manager do you want to use? Is there a reason why you’re not using the Rasa conversation manager? The rasa run command also exposes our conversation tracker over REST.


@koaning thank you for your response. The reason is that my company already has an existing home grown conversation manager that was written in Java.


@koaning thanks again. Could you please provide more detailed, step-by-step instructions for using the parse endpoint? I have the Rasa server up and running, please see the picture. How can I call the parse endpoint to predict intents/entities?

As shown in the docs, the endpoint lives at http://localhost:5005/model/parse, assuming you’re running it locally. You should be able to send a POST request with a JSON body containing a text field to get your predictions.
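For example, something like this should work from the command line (assuming the server is running locally on the default port 5005; the input text is just an example):

```shell
# POST a JSON body with a "text" field to the parse endpoint;
# the response contains the predicted intent and any detected entities.
curl -X POST http://localhost:5005/model/parse \
  -H "Content-Type: application/json" \
  -d '{"text": "hello there"}'
```

Your Java conversation manager would send the same request with whatever HTTP client it already uses.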


What is a Conversation Manager?

The NLU part of the pipeline does intent prediction and entity detection.

The conversation manager/policy part of the pipeline takes this as input, together with the conversation so far, and uses it to predict the next action.

@KheireddineAzzez A Conversation Manager is the equivalent of the dialogue management component of Rasa Open Source. At my company, we have developed one in-house and we wanted to leverage it with the NLU model.

@koaning when wrapping this in a Docker container, is CMD ["start", "--enable-api"] the correct syntax to perform rasa run --enable-api inside the container? Please see my complete Dockerfile below.

FROM rasa/rasa:2.5.0-full
USER root
WORKDIR /app
COPY . /app
RUN pip install --upgrade pip
RUN pip install -r requirements.txt
CMD ["start", "--enable-api"]
USER 1001

If I look at the Dockerfile definition, the standard entrypoint seems to be rasa, and the full command should be rasa run --enable-api, not rasa start --enable-api.

Maybe this works:

FROM rasa/rasa:2.5.0-full
USER root
WORKDIR /app
COPY . /app
RUN pip install --upgrade pip
RUN pip install -r requirements.txt
CMD ["run", "--enable-api"]
USER 1001

@koaning Yessss! This works! Thank you so much! You’re amazing!


@koaning one more question, now that the service is up and running in the container, how do I call this service from outside the container? Our home-grown conversation manager is written in Java so I need to provide the endpoint for the conversation manager.

This depends a bit on how you’re running your container, but assuming the right port is published, you should be able to communicate with it over HTTP.
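For instance, something like the following should work with the Dockerfile above (the image name rasa-nlu is just an example):

```shell
# Build the image from the Dockerfile shown earlier.
docker build -t rasa-nlu .

# Publish the container's port 5005 on the host so external
# services (like the Java conversation manager) can reach it.
docker run -p 5005:5005 rasa-nlu
```

The Java service can then POST to http://<docker-host>:5005/model/parse.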

You should see a port number in the logs whenever this command is run:

rasa run --enable-api

But there’s also a setting to set the port number. You should be able to find more info via:

rasa run --help
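For example, I believe you can change the port with the -p flag, something like:

```shell
# Serve the API on port 8080 instead of the default 5005.
rasa run --enable-api -p 8080
```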