Rasa in Docker

Hi everyone,

I want to take a Rasa instance that I have already trained, and that is already working from the command prompt, and run it in Docker.

How can I do this?

I have only found examples of pulling a pre-built Rasa image, i.e. starting from a fresh image. I want to put MY Rasa instance, the one already running on my machine, into a Docker image.

Thanks in advance!


I’m facing the same issue


Hi @miohana and @KPrado, please take a look at the Running Rasa with Docker page, specifically the Running the Rasa Server section. Hope it helps!

Thanks @fede!

My doubt is whether I should use the pre-built Rasa image or create my own image based on the files in my project folder.

For example:

image: rasa/rasa:latest-full OR, if I have built my own image locally: build: .
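In other words, the choice is between something like the following two docker-compose.yml variants (the service name here is illustrative):

```yaml
services:
  rasa:
    # Option 1: use the pre-built image from Docker Hub
    image: rasa/rasa:latest-full
    # Option 2 (alternative to the line above): build my own
    # image from a Dockerfile in this folder
    # build: .
```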


It is recommended that you use the pre-built image and add your model files using the -v parameter of docker run.
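A minimal sketch of that approach, assuming your trained model lives in a local models/ directory (the paths, port, and image tag here are illustrative):

```shell
# Mount the local models/ directory into the container and start the
# Rasa server; the pre-built image looks for models under /app/models.
docker run -p 5005:5005 \
  -v $(pwd)/models:/app/models \
  rasa/rasa:latest-full \
  run --enable-api
```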

I have deployed my Rasa model behind a custom Flask application: it extracts the data from Rasa's response, restructures it into my own format, and returns that as the response. This service runs in my own Docker image.

The way I did it was to add a Dockerfile to my project that does the following: start from an Ubuntu base, install Python 3.6, install the Python packages from my requirements.txt using pip, and finally run python your_rest_service.py (or whatever command starts your backend service).

Once that is in place, you cd into the project directory and run docker build -t image_name . to build the Docker image, then start it with docker run -d -p 8090:8090 image_name (i.e. host_port:container_port, where the host port is the external port your service is reached on and the container port is the one your service listens on internally).
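The steps above might look roughly like this in a Dockerfile; the base image version, file names, and port are placeholders from my setup, not requirements:

```dockerfile
FROM ubuntu:18.04

# Install Python 3.6 and pip
RUN apt-get update && apt-get install -y python3.6 python3-pip

WORKDIR /app

# Install the Python dependencies first (better layer caching)
COPY requirements.txt .
RUN pip3 install -r requirements.txt

# Copy the service code and start it
COPY . .
EXPOSE 8090
CMD ["python3", "your_rest_service.py"]
```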

In order to customize your model’s responses, you have to load the model into your service. Rasa’s Python library lets me get its default NLU prediction JSON like this:

from rasa.nlu.model import Interpreter

interpreter = Interpreter.load("path_to_untared_nlu_model")

nlu_resp = interpreter.parse("some phrase")
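To illustrate the "restructure the response" step, here is a plain-Python sketch of reshaping the dict that interpreter.parse() returns. The input dict below is a hand-written example resembling the shape of Rasa's NLU output, and reformat_response is a hypothetical helper, not part of Rasa:

```python
def reformat_response(nlu_resp):
    """Reduce Rasa-style NLU output to a flatter custom format (illustrative)."""
    return {
        "intent": nlu_resp.get("intent", {}).get("name"),
        "confidence": nlu_resp.get("intent", {}).get("confidence"),
        # Map each extracted entity name to its value
        "entities": {e["entity"]: e["value"] for e in nlu_resp.get("entities", [])},
    }

# Hand-written example resembling what interpreter.parse() returns
nlu_resp = {
    "intent": {"name": "greet", "confidence": 0.98},
    "entities": [{"entity": "name", "value": "Ana"}],
    "text": "hi Ana",
}

print(reformat_response(nlu_resp))
# {'intent': 'greet', 'confidence': 0.98, 'entities': {'name': 'Ana'}}
```

Your Flask route would then return this reformatted dict as JSON instead of Rasa's raw output.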