Training an assistant in Portuguese, Rasa NLU not running with Docker

I am working with Rasa NLU. I want to train a language model in Portuguese and have it running inside a container. I can train the model on my dataset, but I am not able to get it to run.

I’ve created an image from the official rasa_nlu one, with the spaCy Portuguese pipeline, and I run it in a Docker container.

I am able to use the rasa_nlu.train command to train the language model without problems, or at least that is what it seems. When I try to run the server using the data that I trained, I get an error message complaining about missing parameters in the command that I used.

Here is the docker-compose service I use to run the container:

rasa_nlu:
    image: rasa_nlu_pt
    volumes:
      - ./models/rasa_nlu:/app/models
    command:
      - start
      - --path
      - /app/models

and it gives the following error message:

usage: run.py [-h] -d CORE [-u NLU] [-v] [-vv] [--quiet] [-p PORT]
              [--auth_token AUTH_TOKEN] [--cors [CORS [CORS ...]]]
              [--enable_api] [-o LOG_FILE] [--credentials CREDENTIALS]
              [-c CONNECTOR] [--endpoints ENDPOINTS] [--jwt_secret JWT_SECRET]
              [--jwt_method JWT_METHOD]
run.py: error: the following arguments are required: -d/--core

The same happens if I run it standalone, without the other containers:

$ docker run -v $(pwd):/app/project -v $(pwd)/models/rasa_nlu:/app/models \
    -p 5000:5000 rasa_nlu_pt start --path app/models
usage: run.py [-h] -d CORE [-u NLU] [-v] [-vv] [--quiet] [-p PORT]
              [--auth_token AUTH_TOKEN] [--cors [CORS [CORS ...]]]
              [--enable_api] [-o LOG_FILE] [--credentials CREDENTIALS]
              [-c CONNECTOR] [--endpoints ENDPOINTS] [--jwt_secret JWT_SECRET]
              [--jwt_method JWT_METHOD]
run.py: error: the following arguments are required: -d/--core

I used the same command to run the service with the English spaCy pipeline provided by Rasa and it worked as it should, but now it is giving this error message. What other information am I missing?

Which image are you running?

Is it the official Docker image, or have you built your own Dockerfile? From the looks of it, it seems like you are running the Rasa Core image instead.

I am running my own Dockerfile, with an image based on the official one plus the Portuguese spaCy model.

Here is the Dockerfile:

FROM rasa/rasa_nlu:latest

RUN python -m spacy download pt_core_news_sm

RUN python -m spacy link pt_core_news_sm pt

COPY nlu_config.yml config.yml
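
The nlu_config.yml it copies is just a spaCy pipeline pointed at Portuguese; roughly something like this (an illustrative sketch, not necessarily my exact file):

# Minimal spaCy-based NLU config for Portuguese (0.x-style pipeline template)
language: "pt"
pipeline: "spacy_sklearn"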

I did a bit of digging into the rasa_nlu Docker image, and indeed I see this in the entrypoint.sh:

case ${1} in
    start)
        exec python3 -m rasa.core.run --enable_api "${@:2}"
        ;;
    run)
        exec "${@:2}"
        ;;
    train)
        exec python3 -m rasa.core.train -s project/stories.md -d project/domain.yml -o ./model "${@:2}"
        ;;
    download)
        download_package ${@:2}
        ;;
    *)
        print_help
        ;;
esac

When you pass the start command to docker run, the entrypoint executes exec python3 -m rasa.core.run --enable_api "${@:2}".
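
So with your start --path /app/models command, the container effectively ends up running something like this:

# "${@:2}" drops the leading "start" and forwards the remaining arguments,
# so this launches Rasa Core's server (run.py), which requires -d/--core:
python3 -m rasa.core.run --enable_api --path /app/models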

So the start subcommand is actually launching Rasa Core, which is odd. @tmbo, is it normal that the official Rasa NLU image runs Rasa Core by default in the entrypoint?

@CaioTsubake - In my opinion, the default command of the NLU image shouldn’t be executing Rasa Core.

I opened an issue for it - Dockerfile executes rasa core by default · Issue #3229 · RasaHQ/rasa_nlu · GitHub

Thank you @souvikg10. I have a question: how can I run Rasa NLU using a spaCy model other than the “en” one while this issue is being resolved?

I think that instead of running the start command you could try this:

python3 -m rasa.nlu.run --model "your model"

P.S. I am looking through GitHub for this, for the latest version. I think there are some ongoing changes in Rasa Core; I don’t know much about the latest version because the last one I used was NLU 0.14.0.


Thanks for the help @souvikg10. I managed to sidestep the problem by using version 0.14.0 of the NLU library instead of the latest one in my Docker build. It seems to be equivalent in all the parts that I needed.

I should try to look into the new version when I have time. Thanks for all the help.
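
For anyone running into the same issue, the only change was pinning the base image in the Dockerfile to a 0.14.x tag, roughly like this (the tag below is just an example; check Docker Hub for the exact 0.14.x variant you want):

# Example pin only - confirm the exact 0.14.x tag on Docker Hub
FROM rasa/rasa_nlu:0.14.6-spacy

RUN python -m spacy download pt_core_news_sm

RUN python -m spacy link pt_core_news_sm pt

COPY nlu_config.yml config.yml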
