I’d like to log all conversations to my own logfile, which would be consumed by Filebeat and shipped on for viewing in Kibana.
A custom event broker seems like a good way to go, but Rasa X already runs a RabbitMQ broker for its own purposes, and it doesn’t look like I can have a second event broker: when I add a second event_broker: section to endpoints.yml I get an error.
Other options would be to:
Create a custom tracker store that writes a copy of the tracker information to a logfile.
Get the events from RabbitMQ directly, but I don’t see a straightforward way to tap into the RabbitMQ messages; the RabbitMQ logfiles don’t include the messages that were sent.
Periodically poll the Rasa HTTP getTrackerConversation endpoint to collect conversations (rough sketch below).
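For illustration, here’s roughly what I have in mind for the polling option. The server URL, token, and sender IDs are placeholders, and the exact endpoint path should be checked against the HTTP API docs for your Rasa version:

```python
# poll_trackers.py -- rough sketch of the polling option above.
# RASA_URL, TOKEN, and SENDER_IDS are placeholders for illustration.
import json
import time

import requests

RASA_URL = "http://localhost:5005"  # assumed Rasa server address
TOKEN = "my_token"                  # only needed if the API is token-protected
SENDER_IDS = ["default"]            # conversation IDs to export

while True:
    for sender_id in SENDER_IDS:
        resp = requests.get(
            f"{RASA_URL}/conversations/{sender_id}/tracker",
            params={"token": TOKEN},
        )
        resp.raise_for_status()
        # Append each tracker snapshot as one JSON line for Filebeat.
        with open("conversations.log", "a") as f:
            f.write(json.dumps(resp.json()) + "\n")
    time.sleep(60)  # poll once a minute
```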
Hi @stephens, you’re right that multiple event brokers aren’t currently supported, although I don’t see a reason why we shouldn’t support this in principle. Is this something you’d be interested in contributing? For your case, we’d need something like a FileEventProducer, which could then be invoked in addition to the Pika one.
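A rough sketch of what such a broker could look like; the base class name and its import path differ between Rasa versions, and the endpoints.yml entry at the bottom is hypothetical, so treat this as illustrative rather than a ready-made implementation:

```python
# file_event_broker.py -- illustrative sketch of a file-based event broker.
# The broker base class and its import path vary between Rasa versions;
# check the brokers module of the version you're running.
import json

from rasa.core.brokers.broker import EventBroker


class FileEventBroker(EventBroker):
    """Appends every published event as one JSON line to a logfile."""

    def __init__(self, path: str = "rasa_events.log") -> None:
        self.path = path

    @classmethod
    def from_endpoint_config(cls, broker_config):
        # Build the broker from the endpoints.yml section.
        if broker_config is None:
            return None
        return cls(**broker_config.kwargs)

    def publish(self, event: dict) -> None:
        # JSON Lines output is easy for Filebeat to pick up.
        with open(self.path, "a") as f:
            f.write(json.dumps(event) + "\n")

# Hypothetical endpoints.yml entry for a custom broker (check the docs
# for your version on how custom brokers are registered):
#   event_broker:
#     type: file_event_broker.FileEventBroker
#     path: rasa_events.log
```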
In the short term, I think your custom tracker store solution sounds good. The easiest way of doing this would probably be to write/append to a file in the tracker store’s save() method, exactly where event publishing happens via stream_events().
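Something along these lines, as an untested sketch: it subclasses InMemoryTrackerStore purely for illustration (you’d wrap whichever store you actually use), and the naive version re-appends the full event list on every save:

```python
# logging_tracker_store.py -- untested sketch of the save() idea above.
import json

from rasa.core.tracker_store import InMemoryTrackerStore


class FileLoggingTrackerStore(InMemoryTrackerStore):
    """Tracker store that also appends conversation events to a logfile."""

    def __init__(self, domain, log_path: str = "conversations.log", **kwargs):
        self.log_path = log_path
        super().__init__(domain, **kwargs)

    def save(self, tracker):
        # Naive version: re-appends every event on each save. To avoid
        # duplicates, track an offset per sender_id the way stream_events()
        # does before publishing to the event broker.
        with open(self.log_path, "a") as f:
            for event in tracker.events:
                body = {"sender_id": tracker.sender_id}
                body.update(event.as_dict())
                f.write(json.dumps(body) + "\n")
        return super().save(tracker)
```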
Hello @stephens, yes I looked! Thanks…
So my solution was to use endpoints.yml to create some RabbitMQ (Pika) queues, and then I created a consumer to get the events from a queue and finally send them on to my other application. In my case I sent my events to dashbot.io, and to Kafka via another queue as well (rough consumer sketch below).
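For anyone curious, the consumer looked roughly like this (Pika 1.x API; the queue name, host, and the forwarding step are placeholders from my setup):

```python
# consume_events.py -- rough shape of my consumer. Queue name, host,
# and the forwarding step are placeholders; adapt them to your setup.
import json

import pika


def on_event(channel, method, properties, body):
    event = json.loads(body)
    # Forward the event to the other application here
    # (in my case dashbot.io, or a Kafka producer).
    print(event.get("event"), event.get("sender_id"))
    channel.basic_ack(delivery_tag=method.delivery_tag)


connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.queue_declare(queue="rasa_events", durable=True)  # assumed queue name
channel.basic_consume(queue="rasa_events", on_message_callback=on_event)
channel.start_consuming()
```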
I suggest that endpoints.yml accept more than one broker… That way we could easily connect to other existing integrations via brokers.