I am well aware of using action_listen in the stories.md file, as well as uttering a string from there. But now I am trying to listen to the user inside an action, i.e., after I get the last input from the tracker I dispatch a message in the action, but then I want to listen for user input again, get that input, and do something based on it by calling another action inside this action. Is that possible? I haven’t been able to get it working.
Basically, I want to do action_listen inside a custom action, get the updated user input from the tracker, and do something based on that input. Please let me know if anybody has done something similar or is aware of it. I would be happy to provide more details/explanations.
Okay, I don’t think I understand how that would work here. Can you help me figure it out for the following custom action?
```python
from rasa_core.actions.action import Action
from rasa_core.interpreter import RasaNLUInterpreter


class Emotion(Action):
    def name(self):
        return 'emotion'

    def run(self, dispatcher, tracker, domain):
        user_input = tracker.latest_message["text"]
        print(user_input)
        interpreter = RasaNLUInterpreter('./models_full_em/nlu/default/current')
        out = interpreter.parse(user_input)
        print(out)
        emotion_intent = out['intent']['name']
        print(emotion_intent)
        response_sad = "Are you expressing sadness"
        response_joy = "Are you expressing happiness"
        if emotion_intent == "joy":
            dispatcher.utter_message(response_joy)
        elif emotion_intent == "sadness":
            dispatcher.utter_message(response_sad)
        # LISTEN FOR USER INPUT
        # TAKE USER INPUT
        # GET THAT INPUT AND PASS IT THROUGH THE TRACKER CALLING ANOTHER ACTION
        # THEN I JUST CALL ANOTHER ACTION
        Sadness.run(self, dispatcher, tracker, domain)
        return
```
How exactly would I go about the commented steps in the above action? The call to another action is working, but I want to pass the new user input through to it.
I have tried using `dispatcher.utter_template("utter_sad", tracker)` and `FollowupAction("action_listen")` in there.
Let me know if you need more information. Thank you!
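For what it’s worth, the usual pattern is to return events from `run()` rather than calling another action’s `run()` directly. Here is a minimal sketch of that idea with stand-in classes (these are not the real rasa_core classes, and `action_sadness` is a made-up action name):

```python
# Sketch only: a custom action cannot pause mid-run() to listen. Instead it
# RETURNS events; Core executes them after run() finishes, so the next user
# message is already on the tracker by the time the follow-up action runs.

class FollowupAction:
    """Stand-in for the real FollowupAction event."""
    def __init__(self, name):
        self.name = name

class Emotion:
    def name(self):
        return "emotion"

    def run(self, dispatcher, tracker, domain):
        # ... classify the emotion and utter the question here ...
        # Instead of calling Sadness.run() directly, schedule the next
        # action as an event; when it runs, tracker.latest_message will
        # hold the user's new reply.
        return [FollowupAction("action_sadness")]

events = Emotion().run(None, None, None)
print(events[0].name)  # -> action_sadness
```

The key point is that the "listen" step happens between actions, driven by Core, not inside your `run()` method.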
I am not sure what those functions do; can you elaborate on what they would return? Here is another thing I am trying to figure out, in case you have any comments about it:
```python
from rasa_core.actions.action import Action
from rasa_core.events import SlotSet
from rasa_core.interpreter import RasaNLUInterpreter


class Emotion(Action):
    def name(self):
        return 'emotion'

    def run(self, dispatcher, tracker, domain):
        user_input = tracker.latest_message["text"]
        buttons = [
            {'title': 'yes', 'payload': '/affirm_yes{"Affirm": "Yes"}'},
            {'title': 'no', 'payload': '/affirm_no{"Affirm": "No"}'},
        ]
        print(user_input)
        yes_no = tracker.get_slot('Affirm')
        print("Before:", yes_no)
        interpreter = RasaNLUInterpreter('./models_full_em/nlu/default/current')
        out = interpreter.parse(user_input)
        print(out)
        emotion_intent = out['intent']['name']
        print(emotion_intent)
        response_sad = "Are you expressing sadness"
        response_joy = "Are you expressing happiness"
        if emotion_intent == "joy":
            dispatcher.utter_message(response_joy)
        elif emotion_intent == "sadness":
            dispatcher.utter_message(response_sad)
            # TRY TO GET THE USER TO SAY YES/NO
            dispatcher.utter_button_message("Are you feeling sad?", buttons)
            # TRY TO SET THE SLOT "AFFIRM" WITH A YES/NO VALUE
            SlotSet('Affirm', yes_no)
            yes_no = tracker.get_slot('Affirm')
            print("Look here:", yes_no)
            if yes_no == "No":
                # FollowupAction("action_listen")
                Sadness.run(self, dispatcher, tracker, domain)
        return
```
It is a bit of a mess, but basically: once the conversation goes into the sadness branch (which it does), I want the user to answer yes/no via the buttons defined above, and based on that answer set the slot Affirm to a yes/no value. Then, depending on whether the slot is yes or no, I want to do different things. What I am struggling with is setting the slot after the buttons: once the buttons are displayed and the user picks yes/no, I need to store that value in the slot, which is not happening. What is the correct way of updating slot values, or of capturing the user input and saving it somehow?
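One detail worth noting about the buttons: a payload like `/affirm_yes{"Affirm": "Yes"}` is only parsed after the current action returns and the user actually clicks, which is why the slot cannot be read later in the same `run()`. A rough stand-in for how such a payload is split into an intent and entities (`parse_button_payload` is a hypothetical helper for illustration, not a Rasa API):

```python
import json
import re

def parse_button_payload(payload):
    """Split a button payload '/intent{...}' into (intent, entities),
    roughly the way Core interprets a clicked button."""
    match = re.match(r"^/(\w+)(\{.*\})?$", payload)
    intent = match.group(1)
    entities = json.loads(match.group(2)) if match.group(2) else {}
    return intent, entities

intent, entities = parse_button_payload('/affirm_yes{"Affirm": "Yes"}')
print(intent, entities["Affirm"])  # -> affirm_yes Yes
```

So the entity in the payload can fill the slot on the next turn, but only a subsequent action (not the one that displayed the buttons) will see it.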
@akelad @MetcalfeTom could you help figure out whether this is something that can be done, or whether I am headed in the right direction? Would really appreciate it, thank you!
So I believe your slot isn’t getting set because the SlotSet event is not being returned. It would work if you used `return [SlotSet('Affirm', yes_no)]`, but then you would be out of your function, and I guess you don’t want that to happen.
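To illustrate the timing issue with pure stand-in classes (not the real rasa_core `Tracker`/`SlotSet`): events returned from an action are applied by Core between actions, so the tracker inside the same `run()` never sees the new value.

```python
class SlotSet:
    """Stand-in for the real SlotSet event."""
    def __init__(self, key, value):
        self.key, self.value = key, value

class Tracker:
    """Stand-in tracker holding slot values."""
    def __init__(self):
        self.slots = {"Affirm": None}

    def get_slot(self, key):
        return self.slots.get(key)

    def apply(self, events):
        # Roughly what Core does with returned events, between actions.
        for e in events:
            self.slots[e.key] = e.value

tracker = Tracker()
events = [SlotSet("Affirm", "Yes")]   # what the action returns
print(tracker.get_slot("Affirm"))     # -> None (inside the same run)
tracker.apply(events)
print(tracker.get_slot("Affirm"))     # -> Yes (the next action sees it)
```

That is why reading the slot right after constructing `SlotSet(...)` inside the same `run()` cannot work.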
About the functions I mentioned above: they do something different. Maybe you can take a look at Form Actions and see if they would be useful here?
Yes, that is correct. I am trying to set slots within the custom action and reuse them within the same function, rather than only at the end of it. Also, I am trying to do this in a nested fashion: as you might have noticed, I am calling another NLU model inside my custom action because I want to run the user input through a different classifier.
Is anybody aware of whether we can set those slots through the buttons and then pick them up in the same custom action, like I am trying to do? I am not sure if that is possible, but I would like to confirm.
@srikar_1996 The reason I am using a different intent classification model is that I am trying to develop a system with a lot of intents which, at the lower levels, are very similar to each other and would essentially run off very specific examples. So I cannot use a single intent classifier when I am splitting my functionality across various levels; it would go wrong far more often than right.
Yes, that was kind of my question. I am just trying to explore Rasa as much as I can while getting the best out of it for my use case. I haven’t found a way to do that.
@akelad So, as you can see, there are two if/else statements, and I would like to have a lot more. Would you recommend having different stories for each of them and a different action to handle each case? Or do you think there is a more abstract way to handle this?
@akelad My biggest roadblock is that I am trying to call another NLU model inside an existing Rasa Core bot. For example, I have a bot with 5 intents: greet, function, information, emotion, bye. Now I have 7 emotions under the emotion intent, so whenever the bot hits the emotion intent it calls a custom action to run another NLU model on top, to identify which kind of emotion it is. For each emotion I want to have 5 ways of handling it, which means I would like to run yet another NLU model through another action to model conversations around those 5 different ways. So, if you get the idea, it splits like a tree based on the intent identified at each level.
Now I am writing stories for all these interactions in the main bot’s stories, and it works pretty well. But I am also trying to implement a yes/no fallback system which will basically try to recover when a wrong intent is encountered, by allowing the user to indicate what the right intent is. That is where I am having trouble: setting these different slots and transferring information from one slot to another.
What would be your take on this? Am I going about it right? From what I understand I should be using more custom actions to handle these cases, but my question would be: can I link stories from one bot and then, instead of calling an NLU model, call another Core bot inside the custom action and hand the conversation over to that bot?
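If it helps, the tree-of-classifiers routing can be sketched independently of Rasa. In the sketch below, `classify_top` and `classify_emotion` are trivial keyword stubs standing in for `interpreter.parse()` calls on the two real models, and the intent names are illustrative:

```python
def classify_top(text):
    """Stub for the first-stage (coarse) intent classifier."""
    return "emotion" if "sad" in text or "happy" in text else "function"

def classify_emotion(text):
    """Stub for the second-stage emotion classifier."""
    return "sadness" if "sad" in text else "joy"

# Map each coarse intent to its refinement model, if any.
SECOND_STAGE = {"emotion": classify_emotion}

def route(text):
    intent = classify_top(text)
    refiner = SECOND_STAGE.get(intent)
    return refiner(text) if refiner else intent

print(route("I feel sad today"))  # -> sadness
print(route("book a table"))      # -> function
```

Each level of the tree is just another entry in the dispatch table, which keeps the branching logic in one place instead of spread across nested if/else blocks.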
So basically you want to move the user and the conversation from one bot to another while maintaining a certain subset of bot state during the transfer?
Correct, but I am not sure whether that can be achieved in Rasa. I am just trying different things to explore different directions that exactly fit my use case.
I can see the sense in multiple NLU models. I intend something similar with my dialog manager Hermod where both the triggering hotword and core NLU vocab can cause NLU model switching so that commands like “Talk to the calculator” cause subsequent http calls to RASA NLU to be directed at a model optimised for mathematics. Considerably eases the problem of overlap of intent examples. RASA server neatly supports multiple models in one process.
I have wondered about running multiple core models in the context of an Agent with multiple Skills that each have NLU and Core training data. I guess overlap between stories is much less of a problem and that the best solution is a single core process with all stories.
Your case for nested emotional NLU requests doesn’t seem too tricky.
A main discriminator model, then however many emotional refinement models you need, plus one extra async request before handing off to the bot.
`NLU parse -> CORE message -> CORE predict`
becomes
`NLU parse -> NLU parse -> CORE message -> CORE predict`
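In pure-function terms the modified turn looks like the chain below; all three functions are stubs, where the two parse steps would really be requests to two NLU models and the last step is Core consuming the refined result:

```python
def nlu_parse(text):
    # Stub for the first-stage NLU model: coarse intent.
    return {"text": text, "intent": "emotion"}

def nlu_refine(parsed):
    # Stub for the second-stage NLU model: refine the coarse intent.
    parsed["intent"] = "emotion/sadness"
    return parsed

def core_handle(parsed):
    # Stub for Core message + predict on the refined parse result.
    return "next action for intent: " + parsed["intent"]

print(core_handle(nlu_refine(nlu_parse("I feel down"))))
# -> next action for intent: emotion/sadness
```

The extra `nlu_refine` step is the only change to the turn; everything downstream of it is the normal Core flow.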