Parse message using trained model

I have trained my model using a rasa init project. It works as expected when I chat with it using rasa shell. I want to build my own Python API where I load the model and parse the user's message.

I am using the following code:

```python
from rasa.core.agent import Agent

agent = Agent.load(model_path=model_path)
response = await agent.parse_message(user_message)
```

It parses the correct intent, but it does not return the next question it is supposed to ask, so I tried agent.handle_text(user_message), but that didn't work as expected either.

Here's the scenario:

user: I have some issue with the abcd system
bot: Could you please describe your issue?
user: (types the issue)
bot: (should return some answer)

I want to return the bot's response from my Python API.
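For reference, here is a minimal sketch of what I am trying (the model path and sender_id value are placeholders; I am assuming that reusing the same sender_id keeps the form's conversation state in the agent's tracker between calls):

```python
import asyncio

from rasa.core.agent import Agent

model_path = "models/my_model.tar.gz"  # placeholder: path to the trained model


async def main() -> None:
    agent = Agent.load(model_path=model_path)

    # handle_text runs the dialogue policies (including the form), not just NLU,
    # and returns the bot's replies as a list of dicts with a "text" key.
    first = await agent.handle_text(
        "I have some issue with the abcd system", sender_id="user_1"
    )
    print([m.get("text") for m in first or []])  # expecting utter_ask_test_issue

    # Same sender_id, so the active form / requested slot should stay in the tracker.
    second = await agent.handle_text("the system keeps crashing", sender_id="user_1")
    print([m.get("text") for m in second or []])


asyncio.run(main())
```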

When I trained my model, I used the following configuration in my domain file:

```yaml
version: "3.1"

intents:
  - test_help

forms:
  test_form:
    required_slots:
      - test_issue

slots:
  test_issue:
    type: text
    influence_conversation: true
    mappings:
      - type: from_text
        conditions:
          - active_loop: test_form
            requested_slot: test_issue

responses:
  utter_ask_test_issue:
    - text: Could you please describe your issue?
```

It works fine in rasa shell. I also noticed that handle_text works fine when everything is defined in stories.yml, but it does not work properly with forms.

Please provide a solution.

  • Start Rasa using a deployment method (discussed here) or the rasa run command with --enable-api. The CLI options for run are described here.

Your model should now be loaded, and you can send messages and receive responses via the REST channel.
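For example, something along these lines should work once the server is running (a sketch assuming the server is on the default port 5005 and the rest channel is enabled in credentials.yml, which rasa init sets up by default):

```python
import requests

RASA_URL = "http://localhost:5005/webhooks/rest/webhook"


def send_message(sender_id: str, message: str) -> list:
    # The REST channel keeps conversation state per sender, so the form's
    # follow-up question comes back when you reuse the same sender id.
    response = requests.post(RASA_URL, json={"sender": sender_id, "message": message})
    response.raise_for_status()
    return [m.get("text") for m in response.json()]


print(send_message("user_1", "I have some issue with the abcd system"))
# expected: the form asks "Could you please describe your issue?"
print(send_message("user_1", "the system keeps crashing"))
```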