Automated testing of stories with modularization

Hi all,

I would like to check if there is a way for me to test stories that take fixed user inputs. I understand that I can use rasa test, but I don’t think I can use checkpoints with it. The goal is that if I want to write, say, a “weather” module, I can keep its nlu, stories, rules, domain and tests together in one place.
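
For context, this is roughly the kind of test I have in mind (a minimal sketch in the Rasa 2.0 test story format; ask_weather and utter_weather are made-up names for the weather module):

stories:
- story: weather module happy path
  steps:
  # fixed user input, plus the intent the NLU model is expected to predict
  - user: |
      what is the weather in Berlin?
    intent: ask_weather
  # the action the dialogue policies are expected to predict next
  - action: utter_weather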

Another challenge I am facing is when the user fills in a form whose slot mapping is from_text: the test story requires me to include an intent, but the user can type anything they want.
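
For reference, the form looks roughly like this in the domain (a sketch only; feedback_form and the feedback slot are placeholder names):

forms:
  feedback_form:
    # from_text fills the slot with whatever the user typed,
    # no matter which intent gets classified
    feedback:
    - type: from_text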

Any suggestions to achieve the above will be greatly appreciated!

Thank you

rasa test will test entire conversations. It’s like an integration test which tests your assistant from a user’s point of view. What different use case would you want to solve with these modular tests?

Hi @Tobias_Wochinger ,

Yup, I would like to test the entire conversation. I managed to get checkpoints working (I had some indentation issues). Is there a way to find out where a test failed (currently I only get a JSON file with the metrics) so that I can narrow down the failed action? Also, I have cases with a form that has a text slot where the user can type anything they want (for example, a feedback form). In those cases I would like the test to skip intent classification. How do I go about doing that? It currently requires me to state the intent after the user’s input.

Additionally, are there any tutorials for writing tests in Rasa 2.0?

Thank you.

I don’t think I would use checkpoints in my test stories :thinking: Coming from a software engineering background, I’d argue that every test should be independent of the others and stand on its own. If you provide an output directory to rasa test, it should write a file called failed_stories.yml (I believe results/ is the default directory) pointing you to the wrongly predicted action.
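
Something like this (treat the exact output filename as version-dependent):

# run the test stories and write the evaluation output to results/
rasa test --out results/

# the wrongly predicted steps are then listed in
# results/failed_stories.yml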

This is still for the old training data format but should be fine: Write Tests! How to Make Automated Testing Part of Your Rasa Dev Workflow

Hi @Tobias_Wochinger, can I check with you whether it is possible to omit the “intent” part when writing test stories for a form whose slots use the from_text mapping?

For example:

- action: feedback_form
- user: |
      This chatbot is up to my standard
  intent: inform_feedback

Can we omit that last line, intent: inform_feedback? This is to avoid creating redundant intents, which may unnecessarily complicate the model.

Thank you.

Test stories also verify the intent classification, so the intent part is strictly required: we need it to compare the actual intent with the expected one.

Thanks for the reply. I am asking because my form has multiple questions that take a number (quantity of items, frequency, etc.), two questions that take a date, and multiple questions with free-text input (e.g. for feedback, the user can write “good”, “no comments”, “I love it”, etc.). But I think I’ll do my best to lump them together as a single intent for form filling.
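
The training data would then contain a single catch-all intent along these lines (a sketch; inform_feedback and its examples are just illustrative):

nlu:
# one intent covering all free-text answers given while the form is active
- intent: inform_feedback
  examples: |
    - good
    - no comments
    - i love it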

Also, can I check: for testing, can I include multiple utterances in a single user step?

- user: |
      Hi, hello
  intent: greet
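
That is, would the above count as a single user turn, or would it need to be written as separate steps, like this?

- user: |
    Hi
  intent: greet
- user: |
    hello
  intent: greet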