Performance decline on implementing more intents

I am new to Rasa and have run into the following problems: as my bot grows (more intents), its performance starts to drop.

  1. Implementing newer intents affects the performance of previously working intents.
  2. On top of that, when I implement a new intent (or several), I first test it on the same nlu data it was trained on, but sometimes I still get the response I set for nlu_fallback.

Why is this happening? My bot worked well previously but, as I mentioned, its performance declines as I add more intents.


Hello Charles,

Thank you for your question, and welcome to Rasa! :slight_smile: It looks like your new intent is not being recognised by Rasa NLU. If you could share some examples from your training data (nlu.yml, stories.yml and domain.yml), that would be helpful. Also, what language are you using?

Some first tips for debugging would be the following:

  • Run with the --debug flag to see what the NLU is doing (as already mentioned here)
  • Add enough examples (in nlu.yml and stories.yml) so that the model can generalise correctly
  • The pipeline you use also affects classification, so be careful with that too (an example can be found here)
  • If your intents are similar, you can also use entities to distinguish them, as noted here
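To illustrate the second tip, here is a minimal sketch of what a few training examples per intent can look like in nlu.yml (the intent name and sentences below are made up for illustration, loosely based on the covid bot discussed in this thread):

```yaml
nlu:
- intent: testing_costs
  examples: |
    - how much does a covid test cost
    - what is the price of a covid test
    - covid testing fees
    - is the test expensive
```

As a rough rule of thumb, a handful of varied examples per intent is a bare minimum; more variation in wording helps the classifier generalise and keeps confidence above the fallback threshold.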

Since you are new to Rasa, a very insightful blog post about intent classification is this one, and for a better understanding of the Rasa pipeline you could also check this one.

Thank you for your response. I understand adding more examples to nlu.yml, but not to stories.yml. Basically, for each intent I normally include just one scenario, something like:

```yaml
- story: show the covid testing cost
  steps:
  - intent: testing_costs
  - action: utter_testing_costs
- story: show fines and penalties
  steps:
  - intent: fines_penalties
  - action: utter_fines_penalties
- story: show covid prevention measures
  steps:
  - intent: prevention
  - action: utter_prevention
```

Is this the best way to add story examples?

Hello Charles,

Thank you for sharing an example of your stories.yml.

You are right, you do not need to keep adding stories (I meant having some stories), as long as the stories you have cover the different conversation turns you want your bot to handle. Each story should represent a complete dialogue between the user and the bot, so if the conversations your bot can handle only have 2 steps, like in the examples you show, you are good.

If you get the nlu_fallback response, that means the intent classification confidence was lower than the fallback threshold (set in config.yml). Some more ideas to improve that would be:

  • make sure that your intents are not too specific, and that each one represents a broad goal the user is trying to achieve
  • align your intent classifier with the rest of your pipeline (config.yml)

These two are also mentioned in this video from Rachael, in the section on common errors (min 4:29).

  • also this explanatory post might be helpful to give you some more insights.
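For reference, the fallback threshold mentioned above is configured on the FallbackClassifier component in your pipeline. A minimal sketch (the threshold values below are only examples; tune them for your own data):

```yaml
pipeline:
  # ... your tokenizer, featurizers and intent classifier ...
  - name: FallbackClassifier
    threshold: 0.6              # nlu_fallback triggers below this confidence
    ambiguity_threshold: 0.1    # also triggers if the top two intents are too close
```

Lowering the threshold makes fallback rarer, but increases the risk of the bot acting on a wrongly classified intent, so it is usually better to improve the training data first.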

Thank you for your response. I followed your last post, in addition to some tricks, and my understanding of how to overcome challenges like the ones I described is improving. I have been able to reduce the problems.

However, do you happen to know why the following is happening?

In test_stories.yml I have something like

```yaml
  - user: |
      the [covid 19]{"entity":"disease"} cases in [rwanda]{"entity":"GPE"}
    intent: statistics
  - action: action_check_statistics
```

But during testing it fails, and in failed_test_stories.yml I see the report as:

```yaml
  - intent: statistics  # predicted: statistics: the [covid 19](disease) cases in [rwanda](GPE)[rwanda](GPE)
    entities:
    - disease: covid 19
    - GPE: rwanda
  - action: action_check_statistics
```

From the comment, you can see it is almost correct, except my model is predicting duplicate entities.

NOTE: This also seems to be happening for 3 other intents. So, do you by any chance know why this is happening?

Hello Charles,

To be honest I am not sure about the reason behind the duplicate entities in the comment. I am currently working on an issue related to test_stories.yml and failed_test_stories.yml and I have not encountered this case in any of my tests so far. :thinking:

Could you also please share:

  1. how many examples you have per intent in your nlu.yml file?
  2. the contents of your config.yml file, so I can check your NLU pipeline, classifier and your configuration in general?

Thank you in advance!

And great job on using “some tricks” and improving on your bot debugging process!

[SOLVED] Basically, I was training my DIETClassifier to recognize some words in my nlu examples as entities that spaCy already recognizes. So both spaCy's entity extractor and my DIETClassifier were extracting those words, producing the duplicates.

Running rasa shell nlu helped me debug this.
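For anyone hitting the same issue: the overlap described above can arise from a pipeline like the following sketch, where both spaCy and DIET extract entities. The component names are real Rasa components, but the exact pipeline and values here are illustrative, not the original poster's config:

```yaml
pipeline:
  - name: SpacyNLP
    model: en_core_web_md
  - name: SpacyTokenizer
  - name: SpacyFeaturizer
  - name: CountVectorsFeaturizer
  - name: DIETClassifier
    epochs: 100
    # If spaCy already extracts the entity types you need (e.g. GPE),
    # you can turn off DIET's entity recognition to avoid duplicates:
    entity_recognition: False
  - name: SpacyEntityExtractor
    dimensions: ["GPE"]
```

Alternatively, keep DIET's entity recognition on and remove the overlapping entity annotations from nlu.yml, so that exactly one component is responsible for each entity type.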
