Dockerizing my Rasa chatbot application that has Botfront

Hi @mangesh…I am trying to deploy my Rasa chatbot, which is currently running locally; I am using ngrok to connect it to other channels.

Now I want to deploy my bot to Azure Web App Services so that I can use a permanent URL for communication with my bot.

@CAkhil you should probably create a post with this query so that people who have already done a Rasa deployment on Azure can guide you on the way to go with it. If dockerization is part of the process, you can always ping me.

Hi @mangesh, you dockerised your bot and used AWS EC2 to get a URL, right? Could you send those final files and steps?

@mangesh that would be helpful…please share those final files.

Hi @CAkhil, sorry for the late response. I only dockerised; deploying on EC2 was someone else’s job, but I think that should be an easy step if you search it up. Here, I’m sharing with you the content of the three files that you need to place in your project in order to dockerize it: two Dockerfiles and one docker-compose.yml.

After you have made sure your system has Docker and Docker Compose installed, create the first Dockerfile (just a new file named “Dockerfile”) in the root directory of the project, with this content -

FROM rasa/rasa:2.8.0
WORKDIR /app
COPY . /app
USER root
RUN rasa train
VOLUME /app
VOLUME /app/data
VOLUME /app/models
CMD ["run", "-m", "/app/models", "--enable-api", "--cors", "*", "--endpoints", "endpoints.yml", "--log-file", "out.log", "--debug"]

Note that you can change the rasa version here.

Create the second Dockerfile in the actions folder, with this content -

FROM rasa/rasa-sdk:2.8.0
WORKDIR /app
COPY requirements.txt requirements.txt
USER root
RUN pip install --verbose -r requirements.txt
EXPOSE 5055
USER 1001

Note that you have to put a requirements.txt file, containing the Python libraries that you installed and used, in the actions folder.
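For example, a requirements.txt for an action server that calls external APIs might look like this (the packages and versions here are purely illustrative, not from the thread — list whatever your actions actually import):

```text
requests==2.26.0
python-dotenv==0.19.0
```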

So, that was all the Dockerfiles we needed. Now, in the root directory you can see a file called endpoints.yml. In the action_endpoint, change the hostname localhost to action_server (we’ll register this name in the docker-compose file as the action server’s service name), so it looks like -

action_endpoint:
  url: "http://action_server:5055/webhook"

Finally, create a docker-compose.yml file in the root directory and place this content -

version: '3'
services:
  rasa:
    container_name: "rasa_server"
    user: root
    build:
      context: .
    volumes:
      - ./:/app
    ports:
      - "5005:5005"
  action_server:
    container_name: "action_server"
    build:
      context: actions
    volumes:
      - ./actions:/app/actions
      - ./data:/app/data
    ports:
      - "5055:5055"

After all this, you can just run the command

docker-compose up --build

That is all that is required to deploy your bot on Docker. You can also refer to nik’s answer for anything extra he has mentioned, but this much alone is enough to run the bot successfully.
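Once both containers are up, a quick way to sanity-check them (assuming the container names and ports from the compose file above) is:

```shell
# list running containers; you should see rasa_server and action_server
docker ps

# the Rasa server answers on 5005 when started with --enable-api
curl http://localhost:5005/

# the action server exposes a health endpoint on 5055
curl http://localhost:5055/health
```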


Hi @nik202 ,

I am getting an error while executing the above code:

@rajas.black can you share the process you are using? The docker-related files and commands you used?

I have followed the exact steps as you have described.

@rajas.black I trust you, but developers do make mistakes while copying — this is my personal experience. If you can share them, that will help me answer your issue. Thanks.

I just followed the steps mentioned by @mangesh and it worked fine. I am really not sure what the issue could have been. I am pretty sure I copied the contents correctly, since I have been at it since morning, multiple times.

The only difference was the --verbose flag when pip installing requirements.txt.

Anyways, thanks a lot both of you!

@rajas.black Sure, happy to help and suggest. One last suggestion: delete the created images and spin up the docker containers again. Good luck!
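Assuming the compose setup from earlier in the thread, that cleanup-and-rebuild cycle can be sketched roughly like this (adjust to your own image names if you built them manually):

```shell
# stop and remove the containers, plus the images compose built
docker-compose down --rmi all

# optionally clear dangling layers and build cache
docker system prune -f

# rebuild the images and recreate the containers from scratch
docker-compose up --build --force-recreate
```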

Thank you guys (Mangesh and @nik202) for providing the steps to dockerize a Rasa chatbot application that has Botfront. After following these steps, I am able to launch my application in a docker container where the Rasa server is running. However, the action server is still failing, with the error message below:

“ERROR: for action_server Cannot start service action_server: failed to create shim: OCI runtime create failed: container_linux.go:380: starting container process caused: exec: “./entrypoint.sh”: stat ./entrypoint.sh: no such file or directory: unknown ERROR: Encountered errors while bringing up the project.”

I am wondering why we have not started the action server using entrypoint.sh in the Dockerfile (inside the actions directory). Also, I have tried clearing unused docker containers/images by running “docker system prune”, but I am still getting the above error.

Any help/pointers in this direction will be appreciated.

@Rijul please share all the docker and docker-compose files.

@nik202 - Thank you for your response. Please find below the directory tree and content of files.


Directory Tree



CONTENT: Dockerfile: Inside actions folder



CONTENT: Dockerfile: Inside rasa folder


CONTENT: docker-compose.yml: Inside Project folder


@Rijul I guess this is not the exact code as mentioned in the thread. I’d recommend sticking with the basics. Try deleting the images and re-running the build.

@nik202 - The code is almost the same. I only changed some paths and the rasa/rasa-sdk versions to meet the requirements of my project. The Rasa server is already up and running with this code; I am getting the above error only for the action server, so I need to resolve it to get the action server running.

I found the root cause of the error “ERROR: for action_server Cannot start service action_server: failed to create shim: OCI runtime create failed: container_linux.go:380: starting container process caused: exec: “./entrypoint.sh”: stat ./entrypoint.sh: no such file or directory: unknown ERROR: Encountered errors while bringing up the project.”

Earlier, while building the docker container, I was directly copying in models that had been pre-trained locally. The solution is to keep the command “RUN rasa train” in the Dockerfile. I made some other changes too, to make it work for my customized project setup, but the fix for the above error is to train the model while building the container.
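To make the fix concrete, here is a sketch of the relevant part of the Rasa server Dockerfile, assuming the same layout as earlier in the thread — the models directory is produced inside the image by the build step rather than copied from the host:

```dockerfile
FROM rasa/rasa:2.8.0
WORKDIR /app
COPY . /app
USER root
# Do NOT copy locally pre-trained models into the image;
# train during the build so the model matches this image's Rasa version:
RUN rasa train
```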

Thanks everyone for your discussions and support.