Hi, I have a Rasa bot deployed in a Kubernetes cluster with the official Rasa X Helm chart. Everything is working fine. I use the Rasa Open Source instance/pod running inside the Rasa X environment to serve production traffic, and I can see the conversations in the Rasa X interface, as intended. Now I want to consume the events happening in my bot and pass them to an external Kafka cluster to generate analytics.
I see that there is a RabbitMQ instance running inside the Rasa X deployment. I think it's being used by the Rasa X interface, but I didn't explicitly configure any event broker in my endpoints.yml.
To achieve my goal, which is to forward events to Kafka, I have two options:
- Create a consumer connected to the RabbitMQ instance running inside the Rasa X deployment and consume events from the rasa_events queue. To go with this option, I need a separate service that consumes the events and forwards them to my Kafka cluster. Is that doable? Will it interfere with the Rasa X UI?
- Directly forward/produce events to the Kafka cluster that is already running. I see that Rasa can also produce events to a Kafka cluster if I add it in the endpoints.yml. This option is preferred because it doesn't require running a new consumer service. Can I add multiple event brokers in the endpoints.yml?
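For option 2, this is roughly the endpoints.yml entry I'm considering (a sketch based on the Kafka event broker settings in the Rasa docs; the URL, topic, and client ID are placeholders for my setup):

```yaml
# endpoints.yml -- url, topic, and client_id below are placeholders
event_broker:
  type: kafka
  url: kafka-broker-1.analytics.svc:9092
  topic: rasa_core_events
  client_id: rasa-prod
  security_protocol: PLAINTEXT
```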
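For option 1, the bridge service I have in mind would look roughly like this (a minimal sketch, assuming pika and kafka-python; the queue name, hostnames, and topic are guesses for my cluster, not confirmed values from the deployment):

```python
import json

# Placeholders (assumptions), not values taken from the actual Rasa X chart:
RABBITMQ_HOST = "rasa-x-rabbit"          # assumed RabbitMQ service name
QUEUE_NAME = "rasa_events"               # assumed queue name; check the broker
KAFKA_BOOTSTRAP = "kafka-broker-1:9092"  # external Kafka cluster
KAFKA_TOPIC = "rasa_core_events"


def to_kafka_record(body: bytes):
    """Decode a RabbitMQ message body into a (key, value) pair for Kafka.

    Keying by sender_id keeps each conversation's events ordered within
    a single Kafka partition.
    """
    event = json.loads(body)
    key = (event.get("sender_id") or "").encode("utf-8")
    return key, json.dumps(event).encode("utf-8")


def main():
    # Third-party deps imported here so the helper above stays importable
    # without them: pip install pika kafka-python
    import pika
    from kafka import KafkaProducer

    producer = KafkaProducer(bootstrap_servers=KAFKA_BOOTSTRAP)

    def on_message(channel, method, properties, body):
        key, value = to_kafka_record(body)
        producer.send(KAFKA_TOPIC, key=key, value=value)
        # Ack only after handing the event to the Kafka producer
        channel.basic_ack(delivery_tag=method.delivery_tag)

    connection = pika.BlockingConnection(
        pika.ConnectionParameters(host=RABBITMQ_HOST)
    )
    channel = connection.channel()
    channel.basic_consume(queue=QUEUE_NAME, on_message_callback=on_message)
    channel.start_consuming()


if __name__ == "__main__":
    main()
```

Since this consumer would compete with Rasa X for messages on the same queue, I suspect it would need its own queue bound to the exchange rather than reading rasa_events directly; that's part of what I'm unsure about.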
So from the above two options, which one should I go with? I'm hesitant to make any breaking changes in the test cluster (currently used by test users) or the production cluster (currently used by customers), to avoid any downtime. I'm using Rasa 2.8.0 and Rasa X 0.43. Thanks in advance.