We’re using the rasa_core and rasa_nlu Docker images to run an English bot on a virtual machine hosted on Azure. We are adding an Arabic bot soon.
The problem: when I run `docker-compose up` for the Arabic bot, rasa_core, action_server and rasa_nlu all bind to the same ports as the English bot (5005 for rasa_core instead of 6005, for example), even though I specified different ports in the docker-compose.yml file.
Here is the docker-compose.yml file:
```yaml
version: '3.0'
services:
  rasa_core:
    image: rasa/rasa_core:latest
    ports:
      - 6005:6005
    volumes:
      - ./models/rasa_core:/app/models
      - ./config:/app/config
    command:
      - start
      - --core
      - models
      - -c
      - rest
      - --endpoints
      - config/endpoints.yml
      - -u
      - current/
  rasa_nlu:
    image: rasa/rasa_nlu:latest-tensorflow
    volumes:
      - ./models/rasa_nlu:/app/models
    command:
      - start
      - --path
      - models
  action_server:
    image: custom/rasa_core_sdk:latest
    volumes:
      - ./actions:/app/actions
```
I also edited the config/endpoints.yml and changed the ports.
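For context, the relevant part of my edited config/endpoints.yml looks roughly like this (the service name and port shown here are just the values I chose for the Arabic bot; the `action_endpoint` key follows the standard Rasa endpoints format):

```yaml
# config/endpoints.yml (sketch of my edit, not the full file)
action_endpoint:
  # action_server is the service name from docker-compose.yml;
  # 6055 is the new port I picked instead of the default 5055
  url: "http://action_server:6055/webhook"
```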
I think something is missing in my configuration and that I need to change the ports somewhere else as well, but I can’t figure out where.
Could you please help?
Many thanks in advance.