How to store model loaded from AWS Bucket on a persistent volume with docker-compose?

Hi all,

I’m deploying my Rasa bot by loading a model from an AWS bucket. Since the containers get recreated from time to time, the model is lost and has to be fetched again whenever a container is recreated. To avoid this, I would like to store the model on a volume, but I don’t know how to configure this in my docker-compose file, because I do not have an initial models folder that I could define as a persistent volume. This is part of my docker-compose.yml:

  rasa-server:
    image: rasa-bot:latest
    working_dir: /app
    build:
      context: ./
      dockerfile: Dockerfile
    volumes:
      # - ./models:/app/models
    command: bash -c "rasa run --model model.tar.gz --remote-storage aws --endpoints endpoints.yml --credentials credentials.yml --enable-api --cors \"*\" --debug --port 5006" 
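
To make it concrete, this is roughly the direction I’m thinking of: a named volume instead of a bind mount, since I don’t have a local models folder to start from. The volume name rasa-models is just a placeholder I made up:

  rasa-server:
    image: rasa-bot:latest
    working_dir: /app
    volumes:
      # named volume, so the model survives container recreation
      - rasa-models:/app/models

volumes:
  rasa-models: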

And this is my Dockerfile:

FROM python:3.7.7-stretch AS BASE

RUN apt-get update \
    && apt-get --assume-yes --no-install-recommends install \
        build-essential \
        curl \
        git \
        jq \
        libgomp1 \
        vim

WORKDIR /app

RUN pip install --no-cache-dir --upgrade pip

RUN pip install rasa==3.1
RUN pip3 install boto3

ADD . .

What I’m not sure about is this part in my docker-compose.yml:

    volumes:
      # - ./models:/app/models

Will the model fetched from AWS storage be saved in /app/models? And since my command still contains

--model model.tar.gz --remote-storage aws

it will fetch the model from AWS again instead of looking in the local folder, is that correct? One solution could be to somehow check in Docker whether the volume already contains a model and then run a second docker-compose.yml that uses the model from the volume, but how would I do that?
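
Something like this is the kind of check I have in mind, although I haven’t verified that a model fetched via --remote-storage is actually written to /app/models, so please treat it as a sketch only (it assumes the rasa-models named volume from above mounted at /app/models):

  rasa-server:
    image: rasa-bot:latest
    working_dir: /app
    volumes:
      - rasa-models:/app/models
    # use the local copy if it already exists, otherwise pull from AWS again
    command: bash -c "if [ -f /app/models/model.tar.gz ]; then rasa run --model /app/models/model.tar.gz --endpoints endpoints.yml --credentials credentials.yml --enable-api --cors '*' --debug --port 5006; else rasa run --model model.tar.gz --remote-storage aws --endpoints endpoints.yml --credentials credentials.yml --enable-api --cors '*' --debug --port 5006; fi"

volumes:
  rasa-models: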

@stephens you seem to be an expert with that :slight_smile:

I’ve been working mostly with Kubernetes but take a look at the setup for sharing a volume across containers. There’s a post here.
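
Roughly, the pattern is a named volume declared at the top level and mounted into every service that needs the models. The second service below is only a placeholder to illustrate the sharing; how the model actually gets into the volume (and the AWS credentials setup) is up to you:

services:
  rasa-server:
    image: rasa-bot:latest
    volumes:
      - rasa-models:/app/models

  model-fetcher:
    # placeholder: a one-shot container that copies the model into the shared volume
    # (credentials/region configuration omitted)
    image: amazon/aws-cli
    command: s3 cp s3://<your-bucket>/model.tar.gz /app/models/model.tar.gz
    volumes:
      - rasa-models:/app/models

volumes:
  rasa-models: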