How to let users use the latest intent without explicitly asking for it

For example, in a banking bot, a user can trigger the transfer_money intent by saying: “Please transfer $200 to James Bond.”

This utterance triggers the transfer_money intent. The user then follows up with “$100 to Eva.”, but the bot doesn’t understand that this is also a transfer_money intent.

It is common in the real world for users to want to continue with the latest intent. The question is: how do we let users reuse the latest intent without explicitly asking for it?


Good question!

But in this case, why not just add “$100 to Eva” to the intent examples?

- intent: transfer_money
  examples: |
    - Please transfer [$200](amount) to [James Bond](name)
    - Transfer [50$](amount) to [Chris](name)
    - Can you send [100$](amount) to [Alina](name) please?
    - [$100](amount) to [Eva](name)
...

@alinalizhang Train the bot with training examples for the specific intent, in your case transfer_money.

As long as the follow-up is still the same question, i.e. transfer_money, this works fine.

For example:

Please transfer $200 to [James Bond](person)
Can you transfer $200 to [James](person)
$100 to [Eva](person)
100$ to [Eva](person) account
Please send $100 to [Eva](person)
Transfer 100 to [Rocky](person)
etc etc
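
For reference, in Rasa’s YAML training-data format those examples would look roughly like this (just a sketch: the amount annotations are added by analogy with the earlier snippet, and the entity names are placeholders):

- intent: transfer_money
  examples: |
    - Please transfer [$200](amount) to [James Bond](person)
    - Can you transfer [$200](amount) to [James](person)
    - [$100](amount) to [Eva](person)
    - [100$](amount) to [Eva](person) account
    - Please send [$100](amount) to [Eva](person)
    - Transfer [100](amount) to [Rocky](person)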

I hope that makes sense?


Good point. The example I provided doesn’t describe the problem accurately enough. Let’s look at the following example: the banking bot can handle two main features:

  • intent_statement: check the statement of an account
  • intent_spend: check how much the user spent at a specific store

For example, case A:

  • users: “show me my statement for June 2021”
  • bots: show statement for June 2021
  • users: “how about May 2021”
  • bots: doesn’t understand it is for intent_statement

case B:

  • users: “how much did I spend at Starbucks in June 2021”
  • bots: “You spent $200 on Starbucks in June 2021.”
  • users: “how about May 2021”
  • bots: doesn’t understand it is for intent_spend

The same utterance “how about May 2021” can be used for different intents. How can we let users continue the previous intent without explicitly asking for it?
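
One possible way to handle this in Rasa (just a sketch, not a solution confirmed in this thread) is to classify “how about May 2021” as a generic follow-up intent and let the dialogue policies pick the right action from the conversation history, instead of asking NLU to guess the topic. The intent, entity and action names below (ask_followup_period, period, action_show_statement, action_report_spend) are hypothetical:

nlu:
- intent: ask_followup_period
  examples: |
    - how about [May 2021](period)
    - and [April 2021](period)?
    - what about [last month](period)

stories:
- story: follow-up after a statement request
  steps:
  - intent: intent_statement
  - action: action_show_statement
  - intent: ask_followup_period      # "how about May 2021"
  - action: action_show_statement    # context decides: show the statement again
- story: follow-up after a spend request
  steps:
  - intent: intent_spend
  - action: action_report_spend
  - intent: ask_followup_period      # the exact same utterance
  - action: action_report_spend      # but a different action, chosen from context

The custom actions would then read the new period entity and reuse whatever account or merchant is still stored in slots from the previous turn.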


@alinalizhang Well, there are certain benchmarks and main trigger words we need to consider when training the data. So, in both your cases A and B: for A, the first utterance is fine because it carries its full meaning (show the statement for the month of June 2021), but the second one confuses the bot: “how about” what? “May 2021” what? It will almost always end up in fallback or out-of-scope handling, and I expect you are also getting some random response with a low confidence value.

If you train your bot on enough examples and annotations it will answer whatever you ask, but in NLU and NLP we need to maintain a certain benchmark; that’s why there are options like fallback, out-of-scope and default messages. Your point is valid, we cannot restrict the user’s query, but the query should also be logical and specific about what the user is looking for. Otherwise, on a banking site, I could just as well ask “Can you recommend some movie on Netflix” or “I want to order a pizza” :stuck_out_tongue:

@alinalizhang You can agree with me or even disagree; this is just my personal experience from designing bot/user conversations. At the end of the day it’s only AI at the backend, not a human :stuck_out_tongue:

@alinalizhang “The same utterance “how about May 2021” can be used for different intents. How can we let users continue the previous intent without explicitly asking for it?” Sorry, I missed that part :frowning:


If the users were talking to a human banker rather than a bot, the human agent would be able to understand the context. For example:

case A:

  • users: “show me my statement for June 2021”
  • bots: show statement for June 2021
  • users: “how about May 2021”
  • bots: doesn’t understand it is for intent_statement
  • HUMAN: “sure, this is your statement in May 2021.”

case B:

  • users: “how much did I spend at Starbucks in June 2021”
  • bots: “You spent $200 on Starbucks in June 2021.”
  • users: “how about May 2021”
  • bots: doesn’t understand it is for intent_spend
  • HUMAN: “You spent $300 on Starbucks in May 2021”

A contextual bot should be able to understand such user requests the way a human agent does.


@alinalizhang Yes, you are right, but you need to train the bot for that. In your Case A and Case B a human agent would indeed reply, but keep in mind we are talking about “statement”, which is a valid trigger word for a human agent as well. If you suddenly asked a human agent “can you tell me the statement of my groceries”, the agent would say “what??”. If the request is explicit the agent will answer; otherwise the agent will say, “sorry, I didn’t get you.”

This is a very interesting point. To be honest, I don’t know if it’s possible with Rasa, but maybe it is.

Maybe there’s a way to build a Custom Pipeline Component that sort of prioritizes the intent that was previously detected, or maybe End-to-End training offers this possibility, or something else, but I’m not sure!

@nik202 has a point though. The user should still know they’re talking to a Level 3 bot and be fine with being asked by the bot to clarify the intent (two-stage fallback).
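
For reference, a two-stage fallback setup looks roughly like this in Rasa 2.x (a sketch based on the standard FallbackClassifier plus fallback rule; the threshold values are placeholders to tune):

# config.yml
pipeline:
  # ... your existing NLU components ...
  - name: FallbackClassifier
    threshold: 0.4
    ambiguity_threshold: 0.1

# rules.yml
rules:
- rule: Ask the user to clarify when NLU confidence is low
  steps:
  - intent: nlu_fallback
  - action: action_two_stage_fallback
  - active_loop: action_two_stage_fallback

With this in place, a vague follow-up like “how about May 2021” would at least trigger a clarification question instead of a random low-confidence guess.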

Of course, it would be better if the bot could understand from context as you want it to, and again, there might be a way to do it. This is just a discussion and I’m in no way telling you not to pursue your goal :slight_smile: Hopefully someone will read this thread and help us!


@ChrisRahme thank you for sharing the End-to-End training article. An example in the article is very similar to the problem we discussed:

“mortgages and savings accounts both have interest rates associated with them. And people also talk about ‘saving up’ for a mortgage deposit. To a bank, these are separate products, but to a customer these are just means to an end”

As Alan N pointed out in the article, “As developers, we love splitting larger problems up into separate components. But to achieve fluid conversation, we have to accommodate that users don’t respect the boundaries we draw.”
