Connect Rasa X to an existing Rasa Docker setup

I have a chatbot running in a Rasa Docker container under docker-compose. Is there a way to connect Rasa X to it without installing Rasa separately with pip just to run Rasa X?

I have the same situation. Can someone help?

Does the Deploy to a Server guide help you?

No, it doesn’t include this information. Can someone please assist?

I think you would need to migrate your old conversations to Rasa X: https://rasa.com/docs/rasa-x/import-conversations/

without installing rasa with pip separately for running rasa X?

What do you mean by that? It’s running in Docker - no need to install anything with pip (besides docker-compose :smile:)

Hi, I am having the same problem. Does anybody know anything about it?

@Tobias_Wochinger, maybe I misunderstand the Importing Conversations page, but it sounds like a good way to try out Rasa X by importing conversations.

I’ve completed migrating a couple of bots to Rasa 1.0 and would like to set up Rasa X for ongoing use. Like @damao, I’m looking to implement Rasa X with these bots.

In my case, I don’t care so much about the existing conversations, but I have an existing production setup with two servers running the same bot for redundancy and load balancing. I’d like a docker-compose configuration that will use my existing bot, if possible without significant changes to the bot.

Is there documentation on using Rasa X with an existing Rasa bot?

The architecture graphic shows Rasa X serving up the model to the Rasa bot. Is this the only approach to using Rasa X? Do I have to change the architecture of my existing bots?

Is there documentation on using Rasa X with an existing Rasa bot?

I mean, currently you are probably running your bot with rasa run, aren’t you? Using Rasa X would then just mean running rasa run x, uploading your trained model, and you are good to go.
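For the "upload your trained model" step, Rasa X exposes an HTTP API you can post the model archive to. A sketch under stated assumptions: the host, port, project name, token, and model filename below are all placeholders, and the exact endpoint path and auth parameter may differ between Rasa X versions, so check the API reference for your version:

```shell
# Hypothetical model upload to a Rasa X instance on localhost:5002.
# <your-api-token> and the model filename are placeholders.
curl --request POST \
  --url "http://localhost:5002/api/projects/default/models?api_token=<your-api-token>" \
  -F "model=@models/my-model.tar.gz"
```

Once uploaded, the model should appear under the Models screen in the Rasa X UI, where it can be tagged as the production model.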

Yes, I’m using rasa run and I want to add Rasa X to my existing docker compose setup. Here’s the new Rasa X service I’ve added to my docker-compose.yml:

  rasa-x:
    image: "rasa/rasa-x:${RASA_X_VERSION}"
    networks: ["rasa-network"]
    expose:
      - "5002"
    volumes:
      - ./models:/app/models
      - ./environments.yml:/app/environments.yml
      - ./data/logs:/logs
      - ./data/auth:/app/auth
    environment:
      SELF_PORT: "5002"
      RASA_MODEL_DIR: "/app/models"
      PASSWORD_SALT: ${PASSWORD_SALT}
      RASA_X_USER_ANALYTICS: "0"
      SANIC_RESPONSE_TIMEOUT: "3600"
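Since this service mounts an environments.yml, Rasa X will expect that file to describe the Rasa services it should talk to. A minimal sketch, assuming the Rasa container from the compose file above is reachable as claimzen-rasa on port 5005 (the environment names and RASA_TOKEN variable are placeholders to adjust for your setup and Rasa X version):

```yaml
# environments.yml sketch -- service hostnames and token are assumptions,
# not verified against this particular deployment.
rasa:
  production:
    url: http://claimzen-rasa:5005
    token: ${RASA_TOKEN}
  worker:
    url: http://claimzen-rasa:5005
    token: ${RASA_TOKEN}
```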

What are the other services in your docker-compose.yml? I think the easiest approach would be to follow this guide, Deploy to a Server, and then upload the model.

I’m pasting my docker-compose.yml below and the associated nginx.conf.

I’ve spent hours trying to translate my setup to the Deploy to a Server approach. I’m coming to the conclusion that I have to throw out my existing Rasa production container architecture.

If you want to use Rasa X, you must start over and deploy exactly as described in Deploy to a Server.

I wish I could add a Rasa X container to my current architecture but that doesn’t seem possible.

docker-compose.yml:

version: "3.4"

services:
  action:
    image: 'action'
    networks: ["nginx_zbox-network"]
    container_name: "claimzen-action"
    depends_on:
      - mongo
    ports:
      - "5055:5055"
    volumes:
      - "./src:/app"
      - "./data:/data"
      - "./src/actions:/app/actions"
    command: python -m rasa_core_sdk.endpoint --actions actions

  rasa:
    image: 'rasa/rasa:latest-full'
    networks: ["nginx_zbox-network"]
    container_name: "claimzen-rasa"
    ports:
      - "5005:5005"
    depends_on:
      - action
      - mongo
    environment:
      - TFHUB_CACHE_DIR=/app/tfcache
      - DATA_DIR=/app/data
    volumes:
      - "./:/app"
    command: run --enable-api --model /app/models --endpoints /app/endpoints.yml --credentials /app/credentials.yml --port 5005 --log-file /app/data/logs/rasa_core.log --debug

  mongo:
    image: "mongo:3.7-jessie"
    networks: ["nginx_zbox-network"]
    container_name: "claimzen-mongo"
    ports:
      - "27017:27017"
    volumes:
      - ./data/mongo:/data/db

  ui:
    image: 'semantic'
    networks: ["nginx_zbox-network"]
    container_name: "claimzen-ui"
    ports:
      - 81:80
    environment:
      PORT: 80

networks:
  nginx_zbox-network:
    external: true

nginx.conf:

server {
    listen       80;
    server_name testbot.us;

    location / {
        return 301 https://$host$request_uri;
#        proxy_pass http://ui:80/;
    }

    location /webhooks {
      proxy_pass  http://claimzen-rasa:5005/webhooks;
    }

    location /.well-known/acme-challenge/ {
     root /var/www/certbot;
    }
}

server {
    listen 443 ssl;
    server_name testbot.us;

    ssl_certificate /etc/letsencrypt/live/testbot.us/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/testbot.us/privkey.pem;
    include /etc/letsencrypt/options-ssl-nginx.conf;
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem;

    location / {
        proxy_pass http://ui;
    }

    location /webhooks {
      proxy_pass  http://claimzen-rasa:5005/webhooks;
    }
}
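If you do experiment with adding a Rasa X container behind this existing nginx, a location block might look like the sketch below. This is an assumption, not a tested configuration: the service name rasa-x and port 5002 come from the compose snippet earlier in the thread, and whether the Rasa X UI tolerates being served under a subpath depends on the Rasa X version.

```nginx
# Hypothetical: proxy a Rasa X container (service "rasa-x", port 5002)
# through the existing HTTPS server block. Untested sketch.
location /rasa-x/ {
    proxy_pass http://rasa-x:5002/;
}
```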
Does this architecture diagram only apply to Enterprise? What about Community Edition (CE)?

Any updates on this? I would also like to add rasa X to my current docker-compose setup and use it to refine my chatbot.

In my opinion, the Deploy to a Server guide doesn’t really help with the docker-compose configuration. In my case I’m running docker-compose locally for testing and then deploying the same images (with minor changes) to my server, so the guide doesn’t really apply.

Any chance it would be possible to have rasa-x as a service in the docker-compose file, like we have with rasa and action server?

@jacek Top of my list is for Rasa X to work with an existing Rasa implementation, but this isn’t supported. After spending quite a bit of time trying to figure out how to merge my docker-compose Rasa implementation with Rasa X CE, I realized I would have to start over.

I wrote a CLI tool called rasacli to help with importing data along with a brief blog post here.