Hey everyone,
The Rasa 3.16 Release is here. This one covers a lot of ground: new IDE tooling, faster, more accurate models with updated command generator prompts, and a built-in CSAT pattern. Here’s what’s new in 3.16:
Rasa MCP Tools: your IDE now speaks Rasa.
Your AI coding assistant is great at Python. But it doesn’t natively know how to build conversational experiences with Rasa. Rasa Tools fixes that. It’s a local MCP server that connects to Cursor, VS Code Agent Mode, Claude Code, and JetBrains AI. Once connected, your copilot can read your project structure, write and validate what you have built, train your model, and debug from logs and conversation context. One setup command:
rasa tools init
Offline mode ships a local llms.txt bundle for teams in regulated environments who can’t hit the docs site.
New command generator templates for GPT-5.1, GPT-5.2, and Claude Sonnet 4.5.
GPT-5.2 and the latest Claude models now outperform GPT-4o on command generation accuracy in Rasa. Set the model in endpoints.yml, and the new command generator templates load automatically.
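As a sketch, pointing the command generator at one of the newer models might look something like this in endpoints.yml (the model_groups layout and the group id below are assumptions based on typical Rasa Pro configuration, not taken from this release; check the docs for the exact schema your version expects):

```yaml
# endpoints.yml -- hypothetical sketch; exact keys may differ in your Rasa version
model_groups:
  - id: command_generator_llm   # assumed group name, referenced from config.yml
    models:
      - provider: openai
        model: gpt-5.2          # the matching command generator templates load automatically
```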
Built-in CSAT.
pattern_customer_satisfaction triggers automatically after a conversation ends, collects a csat_score, and fires only once per session. The pattern, its responses, and its activation conditions are all customizable.
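Since patterns are customized by redefining them in your own flows, an override could look roughly like this (a sketch: the description, the extra utterance name, and the step layout are illustrative assumptions, only the pattern name and the csat_score slot come from the release):

```yaml
# flows.yml -- hypothetical override; adjust to the documented pattern schema
flows:
  pattern_customer_satisfaction:
    description: Ask the user to rate the conversation.   # replaces the default pattern
    steps:
      - collect: csat_score          # built-in slot from this release
      - action: utter_csat_thanks    # assumed custom response name
```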
Additional fixes and improvements
- Voice agents now send a filler message (think, "let me look that up for you") before executing tool calls. No more silence while the agent works. The acknowledgment streams immediately, and the full response follows when the tools complete.
- Sessions now carry a UUID on all events and end explicitly as ConversationInactive or SessionEnded.
- MCP tool calls can forward user context in _meta without touching the LLM context.
- Multilingual voice no longer requires separate deployments: ASR and TTS reconfigure at runtime from the language slot.
- The deprecated HumanHandoff pattern has been removed from the codebase and the default prompt templates.
More in the full release notes here. Drop your questions in the thread.