My Rasa chatbot works fine on my local machine. My version info is:

```
Rasa Version              : 3.6.15
Minimum Compatible Version: 3.5.0
Rasa SDK Version          : 3.6.2
Python Version            : 3.9.18
Operating System          : macOS-14.1.1-arm64-arm-64bit
```
My directory structure is:

```
.
├── Dockerfile
├── config.yml
├── credentials.yml
├── data
│   ├── nlu.yml
│   ├── rules.yml
│   └── stories.yml
├── domain.yml
├── endpoints.yml
└── models
    └── 20231219-234027-gray-entropy.tar.gz
```
My Dockerfile is:

```dockerfile
# Use an official Python image as the base image
FROM python:3.9.18
# Set the working directory
WORKDIR /app/rasa
# Copy the contents of the current directory into the container
COPY . .
# Install the same Rasa version used for training
RUN pip install rasa==3.6.15
# Specify the command to run when the container starts
CMD ["rasa", "run", "--enable-api", "--cors", "*", "--debug"]
```
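For comparison, here is a sketch of the same setup built on the official Rasa image instead of plain Python (this assumes the `rasa/rasa:3.6.15` tag is available on Docker Hub for this release line; that image's entrypoint is already `rasa`, so `CMD` carries only the arguments):

```dockerfile
# Sketch, not my current setup: base on the official Rasa image
FROM rasa/rasa:3.6.15
WORKDIR /app
# Copy the project (config, data, and the trained model archive)
COPY . .
# Entrypoint is already "rasa", so pass only the subcommand and flags
CMD ["run", "--enable-api", "--cors", "*", "--debug"]
```

If the problem were a mismatch in my hand-rolled Python image, this variant should behave differently.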
But whenever I run the app in Docker, the logs show:
```
DEBUG rasa.core.policies.ted_policy - Failed to load ABCMeta from model storage. Resource 'train_TEDPolicy3' doesn't exist.
DEBUG root - No model server endpoint found after server start.
INFO root - Enabling coroutine debugging. Loop id 187650336839136.
```
(just a few of the logs I found confusing)
Mind you, the shell works, but it only gives default fallback responses.
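One thing I wanted to rule out (my own debugging sketch, not something from the Rasa docs) is that the model archive got truncated or corrupted on its way into the image, since a model that cannot be read could plausibly leave TEDPolicy without its stored resource. A quick check that a `.tar.gz` is fully readable:

```python
import tarfile

def model_archive_ok(path: str) -> bool:
    """Return True if the tar.gz at `path` can be read end-to-end."""
    try:
        with tarfile.open(path, "r:gz") as tar:
            # Reading every file member catches truncation/corruption
            for member in tar.getmembers():
                if member.isfile():
                    f = tar.extractfile(member)
                    if f is not None:
                        f.read()
        return True
    except (tarfile.TarError, EOFError, OSError):
        return False

# Run inside the container against the copied model, e.g. the path
# models/20231219-234027-gray-entropy.tar.gz from my project tree.
```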
Edit 1:
When I run the same configuration locally:

```shell
conda create -n rasa3 python=3.9.18 ipython
conda activate rasa3
pip install rasa==3.6.15
```

everything works fine. The Rasa shell behaves as expected, and the default fallback response is not triggered. Is there a problem with my Docker Python image?
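To see which model the containerized server actually loaded (the server is already started with `--enable-api`), I've been looking at the HTTP API's `GET /status` endpoint, which as far as I can tell reports the loaded model file. A small helper to interpret that payload (the sample response below is illustrative, not captured from my server):

```python
from typing import Optional

def loaded_model(status_payload: dict) -> Optional[str]:
    """Extract the loaded model path from a Rasa GET /status response."""
    return status_payload.get("model_file") or None

# Live usage against a running container (host/port are my local defaults):
#   import json
#   from urllib.request import urlopen
#   status = json.load(urlopen("http://localhost:5005/status"))
#   print(loaded_model(status))

# Illustrative payload shape:
sample = {
    "model_file": "models/20231219-234027-gray-entropy.tar.gz",
    "num_active_training_jobs": 0,
}
print(loaded_model(sample))
```

An empty or missing `model_file` here would point at the server failing to load the archive rather than at the NLU pipeline itself.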