Supervised Response Selector [Experimental]

I’m about to test it …

In the docs it says it supports multiple questions per intent.

```
## intent: chitchat/ask_name
- what's your name
- who are you?
- what are you called?
```

The question is, does it support multiple responses too? Something like:

```
## ask name
* chitchat/ask_name
    - my name is Sara, Rasa's documentation bot!
    - my name is Sara
    - Sara
```

Is there an example project?

In your responses.md file, are you able to have multi-line text responses, including bullet points? What about buttons?


@sten No, multiple responses are not supported as of now because the current training regime doesn’t support multiple ground truth responses.

There isn’t an example project as of now, but if you follow the blog and make changes to files as suggested, you’ll be able to build one.


@mjspeck No, currently we only support plain text.


Is there any way to use the fallback policy with the supervised response selector, similar to how it’s used with a normal intent classifier? It would be useful when the chatbot is not confident about which FAQ is being asked. If not, are there any plans to develop such a feature soon?
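For reference, this is the standard fallback the question is comparing against, as configured in `config.yml` for the regular intent classifier (threshold values are illustrative); as of this thread it does not extend to the response selector:

```yaml
policies:
  - name: FallbackPolicy
    nlu_threshold: 0.4        # fall back if intent confidence is below this
    core_threshold: 0.3       # fall back if action confidence is below this
    fallback_action_name: "action_default_fallback"
```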


@mjspeck That could be a good addition, although we haven’t thought of developing it yet.

I encountered the same error: “Can not train a classifier. Need at least 2 different classes”.

I found that the error appears when I train from Rasa X, but if I train from the console with `rasa train --debug` it works fine.

Also, I can’t use response selection with `default_ask_affirmation`.

@Nick_Kazakov Yes, integration of response selector with Rasa X is in progress. We’ll let you know once it is fully integrated.

On your next question, what do you mean by default_ask_affirmation?


I meant this: Fallback Actions. It is also implemented here: rasa-demo/actions.py at master · RasaHQ/rasa-demo · GitHub. Will I have the opportunity to use the response selector to choose an answer when NLU confidence is low?

@Nick_Kazakov Not yet. But that’s a good point. We’ll give this a thought. Thanks

Hello! What if I need to run a story depending on which specific intent was identified? Is it possible to express these two scenarios with retrieval actions?

```
* greet
  - utter_greet
  - utter_introduce_1
  - utter_possibilities
* goodbye
  - utter_goodbye
  - utter_always_here
```

It would be great to be able to use both aggregate intent and specific intent.

Classic stories:

```
* chitchat/greet
  - utter_greet
  - utter_introduce_1
  - utter_possibilities
* chitchat/goodbye
  - utter_goodbye
  - utter_always_here
```

Form stories:

```
* contact_sales
    - utter_moreinformation
    - sales_form
    - form{"name": "sales_form"}
* chitchat
    - respond_chitchat
* contact_sales
    - utter_moreinformation
    - sales_form
    - form{"name": "sales_form"}
* chitchat/goodbye
    - utter_goodbye
    - action_deactivate_form
    - form{"name": null}
```

Is there a way to execute `respond_chitchat`, for example, from a custom action? I’ve tried `dispatcher.utter_template("respond_chitchat")`, which doesn’t work. Any ideas?

@tocosastalo It isn’t possible as of now, but we’ll give more thought to how it should be incorporated. Thanks!

This is a great step towards handling single-turn interactions separately. However, I have a few questions based on how I was using the traditional approach for single turn interactions along with rasa core and how using response retrieval models would change it.

Let me explain with an example. I have 4 intents and 1 entity, carModel:

```yaml
intents:
  - cost
  - specification
  - downPayment
  - expectedDelivery

entities:
  - carModel

slots:
  carModel:
    type: categorical
    values:
      - modelX
      - cybertruck

actions:
  - cost
  - specification
  - downPayment
  - expectedDelivery
  - cost_modelX
  - specification_modelX
  - downPayment_modelX
  - expectedDelivery_modelX
  - cost_cybertruck
  - specification_cybertruck
  - downPayment_cybertruck
  - expectedDelivery_cybertruck
```

The obvious way of making this work is when both the intent and entity are specified in the utterance, for example, “what are the specs of cybertruck?” or “when can I expect the delivery of Model X?”

But for a much better experience, you can’t force the user to mention a car model every time, as in this conversation:

> What are the specs of Cybertruck?
> Whats the cost of Cybertruck?
> What about the down payment of Cybertruck?
> when can i expect delivery of Cybertruck?

Instead, you’d like the flow of conversation from the user to be:

> What are the specs of Cybertruck?
> How about the cost?
> What about the down payment?
> when can i expect its delivery?

So I want to use the slot set in the first utterance to influence the next predictions (to predict cost_cybertruck without the user mentioning it), and when that same slot is not set (implying that the user never mentioned an entity in previous turns) to provide a generic response (thus predicting only the cost action).

I was writing stories as follows:

Generic cost response story:

```
* cost
    - cost
```

Generic specification response story:

```
* specification
    - specification
```

Generic downPayment response story:

```
* downPayment
    - downPayment
```

Generic expectedDelivery response story:

```
* expectedDelivery
    - expectedDelivery
```

Intent + entity story (similar stories exist for all intent + entity combinations):

```
* downPayment{"carModel": "modelX"}
    - slot{"carModel": "modelX"}
    - downPayment_modelX
```

Story to remember the slot and influence next-turn predictions:

```
* specification{"carModel": "cybertruck"}
    - slot{"carModel": "cybertruck"}
    - specification_cybertruck
* cost
    - cost_cybertruck
* downPayment
    - downPayment_cybertruck
* expectedDelivery
    - expectedDelivery_cybertruck
```

So if I were to use response retrieval models, how would that affect the stories, since all of the responses would be mapped to something like faq? How can I mimic the above behaviour with response retrieval models?
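The slot-conditioned behaviour described above can be sketched in plain Python (no Rasa dependency; all names and response strings are illustrative): one generic lookup per intent that branches on a remembered carModel slot, instead of one action per intent/model combination.

```python
# Illustrative sketch of slot-conditioned response selection.
# Keys are (intent, carModel) pairs; None marks the generic answer.
RESPONSES = {
    ("cost", None): "Here is our general pricing information.",
    ("cost", "cybertruck"): "The Cybertruck pricing details are ...",
    ("cost", "modelX"): "The Model X pricing details are ...",
}


def respond(intent, slots):
    """Pick a response for `intent`, preferring the model-specific one."""
    model = slots.get("carModel")
    # Fall back to the generic answer when the slot was never set
    # or no model-specific response exists for it.
    return RESPONSES.get((intent, model), RESPONSES[(intent, None)])
```

Inside a Rasa custom action, the same branching would read the slot from the tracker rather than from a plain dict.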


I think a fallback policy for the response selector would be a really nice feature.


Hi @dakshvar22! Are you still working on the Rasa X and response retrieval integration? Thanks!

Hi @madstuntman, it is definitely on the roadmap but no release date has been set yet.


Has anyone developed an FAQ bot with this option that is actually being used on their website or somewhere else?

If someone has, can you say whether it makes sense to use this if I’m planning to implement around 100 FAQ questions about a restaurant? :smiley:

@kavan

Did you manage to do what you asked?