@dima Right! So this is the issue: Rasa X 1.x is not yet compatible with Rasa 3.x, and that is why you are not able to install Rasa X with Rasa 3.x. Ref: Compatibility Matrix
@dima As far as I have worked with these combinations, the important point is that Rasa X is designed for a server installation (it shows the Git/GitHub link only on a server installation). Please read the documentation carefully. If you want to install Rasa X on a server, you can use the following combination, which I am also using: Rasa 2.8.11 with Rasa X 0.42.5 (I am using a Docker installation). Ref: Docker Compose Installation
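For the Docker Compose route, the versions are typically pinned in the .env file that sits next to docker-compose.yml; here is a sketch with the combination recommended above (the variable names follow the Rasa X Docker Compose docs, but double-check them against your own docker-compose.yml):

```
RASA_X_VERSION=0.42.5
RASA_VERSION=2.8.11
```

After changing these values, re-run docker-compose up -d so the containers are recreated with the pinned images.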
I have run the helm install command on my local Ubuntu installation and it seems to have worked:
d@d-Virtual-Machine:~/670rasa$ helm upgrade --install --namespace 670rasa --values rasa-values.yml --version 1.14.0 rasabot rasa/rasa
Release "rasabot" has been upgraded. Happy Helming!
NAME: rasabot
LAST DEPLOYED: Sat Jan 22 20:27:51 2022
NAMESPACE: 670rasa
STATUS: deployed
REVISION: 2
TEST SUITE: None
NOTES:
rasa 2.8.11 has been deployed!
##############################################################################
#### NOTICE: Telemetry is enabled ####
##############################################################################
Telemetry is enabled. Visit our website to learn more: https://rasa.com/docs/rasa/telemetry/telemetry/
##############################################################################
#### The deployment is running with the following configuration ####
##############################################################################
Endpoints:
Lock Store enabled: false
Event Broker enabled: false
Tracker Store enabled: true
Model Server enabled: false
Additional components:
NGINX:
Enabled: true
TLS: false
Redis:
Installed: false
External: false
RabbitMQ:
Installed: false
External: false
PostgreSQL:
Installed: true
External: false
Loaded model: https://github.com/RasaHQ/rasa-x-demo/blob/master/models/model.tar.gz?raw=true
To access Rasa from outside of the cluster, follow the steps below:
1. Get the Rasa URL by running these commands:
export SERVICE_PORT=$(kubectl get --namespace 670rasa -o jsonpath="{.spec.ports[0].port}" services rasabot)
kubectl port-forward --namespace 670rasa svc/rasabot ${SERVICE_PORT}:${SERVICE_PORT} &
echo "http://127.0.0.1:${SERVICE_PORT}"
NGINX is enabled, in order to send a request that goes through NGINX you can use port: 80
What am I missing?
The URL /api/version still returns {} for the rasa version.
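That empty rasa field is the thing to check for; a minimal Python sketch of the check (the /api/version payload shapes below are assumptions based on the symptom described here, not captured from a live deployment):

```python
import json

def rasa_connected(version_payload: str) -> bool:
    """Return True if a Rasa X /api/version response reports a
    connected Rasa Open Source instance (i.e. "rasa" is non-empty)."""
    info = json.loads(version_payload)
    return bool(info.get("rasa"))

# The symptom from this thread: "rasa" comes back empty.
print(rasa_connected('{"rasa": {}, "rasa-x": "0.42.5"}'))  # False
# A healthy deployment would report versions (illustrative values).
print(rasa_connected('{"rasa": {"production": "2.8.11"}, "rasa-x": "0.42.5"}'))  # True
```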
rasa-x is the Rasa X deployment and rasabot is the Rasa Open Source deployment. Both appear to be deployed successfully on the same cluster. Is there configuration needed to get rasa-x to see rasabot on the same cluster?
@nik202 thank you so much for the help I definitely appreciate it!
I am very new to Machine Learning and went the Helm Chart route because I have an AWS account. This project is for a class where we have to collaborate with a partner on creating a Rasa chatbot so I am trying to set up a Rasa X instance somewhere my partner and I can both access it. I think the issue with my current attempt may be that the connection between the event broker of Rasa Open Source and Rasa X is not established.
What would be the easiest way to set up an instance of Rasa X/Open Source? Thank you in advance for any help!
Best,
Dima
@dima Well, I'd encourage installing Rasa X using whichever method you like and are able to install with; my own expertise is in Docker.
To install Rasa X you need Rasa Open Source, which is a basic requirement.
But if you already have a project running in Rasa Open Source and you want to link it with Rasa X, that is a separate concept, and that is where the event broker becomes important.
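Concretely, the link is made by pointing Rasa Open Source at an event broker that Rasa X consumes from, via endpoints.yml. A sketch assuming a RabbitMQ broker of type pika (the hostname, credentials, and queue name are placeholders, not values from this thread):

```yaml
# endpoints.yml (Rasa Open Source side) -- placeholder values
event_broker:
  type: pika
  url: rabbitmq          # hostname of the RabbitMQ service Rasa X also uses
  username: user         # placeholder credentials
  password: CHANGE_ME
  queues:
    - rasa_production_events
```

With this in place, Rasa Open Source publishes conversation events to the queue, and Rasa X reads them from the same broker.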
For me, it’s really hard to explain.
If two teams are working on the same project, here are a few steps that can help you:
Install Rasa X on a server as a complete installation, i.e. including the action server image.
Connect GitHub with Rasa X, and configure GitHub in your editor on both systems, so that every time you push, the code is updated and you can train it on Rasa X.