Is there any way to use a trained Rasa model outside rasa shell?

I am working on a Rasa chatbot. For training, I added all the training data to the nlu.md file and the stories to the stories.md file. I configured the domain.yml file and also created a few custom actions that the bot should run when the user asks a particular question. I then trained the chatbot with the rasa train command, which created a zipped model file inside the models folder.

I am now trying to add speech recognition to the chatbot. For this I am using the SpeechRecognition library in Python and Google's Speech-to-Text API.

I want to pass the converted text from Google STT as user input to the trained Rasa model. As of now, I am only able to use the trained model inside rasa shell or Rasa X. However, I am creating a web UI for the chatbot with Django, and along with the speech recognition part of the code, I would like to call the trained Rasa model from my own custom code.

How do I do that?

What happens to the action server when I call the model from my own custom code, and will the bot still be able to follow the flow specified in the stories I trained it with?

Hey @sashaank.

It seems like you're trying to run the bot from inside a script. Rasa provides "Agents" that you can use to achieve this.

You can use methods like handle_text, which collects the bot's replies via a CollectingOutputChannel. For example:

>>> from rasa.core.agent import Agent
>>> agent = Agent.load("examples/restaurantbot/models/current")
>>> await agent.handle_text("hello")
[u'how can I help you?']
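Since you want to drive this from your own code alongside the SpeechRecognition part, here is a minimal sketch of how that could look in a plain script. It assumes a Rasa 1.x install where Agent.load and handle_text are available; the model path and the "one utterance at a time" loop are placeholders for your own setup:

import asyncio
import speech_recognition as sr
from rasa.core.agent import Agent

# Placeholder path -- point it at the .tar.gz that `rasa train`
# wrote into your models folder (or an unpacked model directory).
agent = Agent.load("models/your-trained-model.tar.gz")

recognizer = sr.Recognizer()

def listen_once():
    # Capture one utterance from the microphone and convert it to text
    # with Google's Speech-to-Text via the SpeechRecognition library.
    with sr.Microphone() as source:
        audio = recognizer.listen(source)
    try:
        return recognizer.recognize_google(audio)
    except sr.UnknownValueError:
        return None  # speech was unintelligible

async def chat():
    while True:
        user_text = listen_once()
        if not user_text:
            continue
        # handle_text runs NLU and the dialogue policies on the message and
        # returns the bot responses collected by a CollectingOutputChannel
        # (the exact response format depends on your Rasa version).
        responses = await agent.handle_text(user_text)
        for response in responses:
            print(response)

if __name__ == "__main__":
    asyncio.run(chat())

In a Django view you would do the same thing, just with the recognized text coming from the request instead of the microphone loop.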

More information is available in the Agent docs.
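Regarding the action server part of your question: as far as I know you still start it separately with rasa run actions, and then tell the Agent where to find it when loading the model. A sketch, assuming Rasa 1.x and the default action server port 5055:

from rasa.core.agent import Agent
from rasa.utils.endpoints import EndpointConfig

# The action server must be running separately (`rasa run actions`);
# this is its default webhook address.
action_endpoint = EndpointConfig("http://localhost:5055/webhook")
agent = Agent.load("models/your-trained-model.tar.gz",
                   action_endpoint=action_endpoint)

The stories and policies are baked into the trained model, so the dialogue flow should be followed just as it is in rasa shell; the endpoint is only needed so your custom actions can actually be executed.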


Thanks mate! It works perfectly! The URL of the Agent docs doesn't seem to be correct anymore, but it doesn't matter, it's easy to google Rasa Agent!