Rasa 3.7 CALM: Korean language support

How can I support Korean in a Rasa chatbot (3.7.0)? I want my Rasa chatbot to reply in Korean, but I couldn't get it to. I'm using the LLMCommandGenerator (which uses OpenAI) together with FlowPolicy. Setting language: "ko" in config.yml doesn't work. HELP ME!

Hi @Rrrriaa. Help me understand your use case a little bit before we think about solutions 🙂

Do you want to develop your chatbot in English and then translate the responses to Korean using LLMs, or will your chatbot be built in Korean?

Hi @Balowen! I want my chatbot to be built in Korean! I found that changing "utter_free_chitchat_response" helps with building the chatbot in Korean. The default "utter_free_chitchat_response" was set in English, so the chatbot spoke English only.

  utter_free_chitchat_response:
    - text: "안녕하세요! 무엇을 도와드릴까요?"
      metadata:
        rephrase: True
        rephrase_prompt: |
          You are an incredibly friendly assistant. Generate a short 
          response to the user's comment in simple Korean.

          User: {{current_input}}
          Response:

So the problem is solved! By the way, thanks for replying!

Hi @Rrrriaa, happy to hear that you solved the problem! In case you want to rephrase all responses in Korean, that is possible too. Here is one way you could achieve it:

  1. Enable the ContextualResponseRephraser in your endpoints.yml file for all responses: docs
  2. Configure the rephraser to use a customized prompt like so:
nlg:
  type: rasa_plus.ml.ContextualResponseRephraser
  rephrase_all: True
  prompt: prompts/response-rephraser-template.jinja2
  3. Create a response-rephraser-template.jinja2 file in the prompts folder.
  4. Customize the prompt in that file, e.g.:
The following is a conversation with
an AI assistant. The assistant is helpful, creative, clever, and very friendly.
Rephrase the suggested AI response staying close to the original message and retaining
its meaning. Use simple Korean.
Context / previous conversation with the user:
{{history}}
{{current_input}}
Suggested AI Response: {{suggested_response}}
Rephrased AI Response:

Please check the Security Considerations page to learn about possible threats and limitations of using LLMs to generate rephrased responses.

For this to work, does the response need to be generated in the target language? I tried with Hindi, but it did not convert the entire text into Hindi. Here is the prompt I used:

The following is a conversation with an AI assistant.
The assistant is helpful, creative, clever, and very friendly.
Rephrase the suggest AI response staying close to the original message and retaining its meaning.
Use simple Hindi language as target language.
Context / previous conversation with the user:
{{history}}
{{current_input}}
Suggested AI Response: {{suggested_response}}
Rephrased AI Response:

Currently I am using Google Translate to translate each response, but the response time is very slow.

Your results will depend on the LLM's ability to translate into Hindi. By default the rephraser is set to use gpt-3.5, and you can configure the model here: Contextual Response Rephraser.
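
For example, here is a rough sketch of the relevant endpoints.yml section for swapping in a different OpenAI model; please double-check the exact keys against the linked docs for your Rasa Pro version:

nlg:
  type: rasa_plus.ml.ContextualResponseRephraser
  rephrase_all: true
  prompt: prompts/response-rephraser-template.jinja2
  llm:
    model_name: gpt-3.5-turbo   # smaller/faster; switch to gpt-4 for higher quality at higher latency
    temperature: 0.3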

I have tried with both GPT-3.5 and GPT-4. Here is the log:

Conversation:
USER: हवामान
AI: Weather updates: overcast clouds with temperature 308.04 Kelvin

Summary: summarization=The user asked for the weather update and the AI responded that there are overcast clouds with a temperature of 308.04 Kelvin.

2024-04-17 15:21:32 DEBUG rasa_plus.ml.contextual_response_rephraser - [debug ] nlg.rephrase.prompt prompt=The following is a conversation with
an AI assistant, this assistant. The assistant is helpful, creative, clever, and very friendly. This is an assistant for elderly.
Rephrase the suggest AI response staying close to the original message and retaining
its meaning. Use simple Hindi language as target language.
Context / previous conversation with the user:
The user asked for the weather update and the AI responded that there are overcast clouds with a temperature of 308.04 Kelvin.

Suggested AI Response: What else I can help you with?
Rephrased AI Response:
Translate entire response in Hindi
2024-04-17 15:21:32 DEBUG urllib3.connectionpool - Starting new HTTPS connection (1): api.segment.io:443
2024-04-17 15:21:34 DEBUG urllib3.connectionpool - https://api.segment.io:443 "POST /v1/track HTTP/1.1" 200 21
2024-04-17 15:21:34 DEBUG rasa.shared.utils.llm - [debug ] llmfactory.create.llm config={'_type': 'openai', 'request_timeout': 5, 'temperature': 0.3, 'model_name': 'gpt-4', 'max_tokens': 256}
2024-04-17 15:21:34 DEBUG openai - message='Request to OpenAI API' method=post path=https://api.openai.com/v1/chat/completions
2024-04-17 15:21:34 DEBUG openai - api_version=None data='{"messages": [{"role": "user", "content": "The following is a conversation with\nan AI assistant, this assistant. The assistant is helpful, creative, clever, and very friendly. This is an assistant for elderly.\nRephrase the suggest AI response staying close to the original message and retaining\nits meaning. Use simple Hindi language as target language.\nContext / previous conversation with the user:\nThe user asked for the weather update and the AI responded that there are overcast clouds with a temperature of 308.04 Kelvin.\n\nSuggested AI Response: What else I can help you with?\nRephrased AI Response:\nTranslate entire response in Hindi"}], "model": "gpt-4", "temperature": 0.3, "max_tokens": 256}' message='Post details'
2024-04-17 15:21:37 DEBUG urllib3.connectionpool - https://api.openai.com:443 "POST /v1/chat/completions HTTP/1.1" 200 None
2024-04-17 15:21:37 DEBUG openai - message='OpenAI API response' path=https://api.openai.com/v1/chat/completions processing_ms=2727 request_id=req_76cc612c922a4c3af11491c2de1edf65 response_code=200
2024-04-17 15:21:37 DEBUG rasa_plus.ml.contextual_response_rephraser - [debug ] nlg.rewrite.complete response_text=What else I can help you with? updated_text=मैं आपकी और किस प्रकार सहायता कर सकता हूँ?
2024-04-17 15:21:37 DEBUG rasa.core.processor - [debug ] processor.actions.policy_prediction action_name=utter_can_do_something_else policy_name=FlowPolicy prediction_events=[DialogueStackUpdate("""[{"op": "replace", "path": "/0/step_id", "value": "END"}]"""), DialogueStackUpdate("""[{"op": "remove", "path": "/0/frame_type"}, {"op": "add", "path": "/0/previous_flow_name", "value": "weather"}, {"op": "replace", "path": "/0/type", "value": "pattern_completed"}, {"op": "replace", "path": "/0/flow_id", "value": "pattern_completed"}, {"op": "replace", "path": "/0/frame_id", "value": "IWGBGX17"}, {"op": "replace", "path": "/0/step_id", "value": "START"}]"""), FlowCompleted(flow: read_weather_updates, step_id: 0_action_fetch_weather), DialogueStackUpdate("""[{"op": "replace", "path": "/0/step_id", "value": "0_utter_can_do_something_else"}]"""), FlowStarted(flow: pattern_completed)]
2024-04-17 15:21:37 DEBUG rasa.core.processor - [debug ] processor.actions.log action_name=utter_can_do_something_else rasa_events=[BotUttered('मैं आपकी और किस प्रकार सहायता कर सकता हूँ?', {"elements": null, "quick_replies": null, "buttons": null, "attachment": null, "image": null, "custom": null}, {"metadata": {"rephrase": true}, "utter_action": "utter_can_do_something_else"}, 1713347497.4638562)]

Here is the rephrase jinja template:

The following is a conversation with an AI assistant, this assistant. The assistant is helpful, creative, clever, and very friendly. This is an assistant for elderly.
Rephrase the suggest AI response staying close to the original message and retaining its meaning.
Use simple Hindi language as target language.
Context / previous conversation with the user:
{{history}}
{{current_input}}
Suggested AI Response: {{suggested_response}}
Rephrased AI Response:
Translate entire response in Hindi

It looks like your prompt is a bit wrong – you shouldn’t add anything to the prompt after the Rephrased AI Response:. Could you please try with the following prompt?

The following is a conversation with an AI assistant. 
The assistant is helpful, creative, clever, and very friendly.
Rephrase the suggested AI response staying close to the original message and retaining its meaning. 
Use simple Hindi.
Context / previous conversation with the user:
{{history}}
{{current_input}}
Suggested AI Response: {{suggested_response}}
Rephrased AI Response:

Please also make sure you enable the rephrase_all parameter in your endpoints if you wish to translate all responses:

  • for Rasa Pro >=3.8.x:

    nlg:
      type: rephrase
      rephrase_all: true
      prompt: prompts/response-rephraser-template.jinja2
    
  • for Rasa Pro <=3.7.x:

    nlg:
      type: rasa_plus.ml.ContextualResponseRephraser
      rephrase_all: true
      prompt: prompts/response-rephraser-template.jinja2
    

Thanks. I have figured out that setting rephrase to False as below

utter_can_do_something_else: 
  - text: "Is there anything else I can assist you with?"
    metadata:
      rephrase: False

reduces my Rasa response times significantly (I have a voice bot, so I am sending requests to the Rasa REST API).

INFO:root:Conversion took -1(STT) -->: 1.13 seconds
INFO:root:Recognized  हेलो
INFO:root:Conversion took -2(Rasa Response) -->: 0.05 seconds
INFO:root:response_content -->  [{'recipient_id': 'cG9SWHts9cqxb-19AAAL', 'text': "Warm greetings from MitramCares! I'm your friendly helper. How can I make things easier for you today?"}, {'recipient_id': 'cG9SWHts9cqxb-19AAAL', 'text': 'Is there anything else I can assist you with?'}]
INFO:root:Conversion took -3 (Translation)-->: 0.60 seconds
INFO:root:Conversion took -4(TTS) -->: 0.74 seconds

For now I am better off staying with Google translation. Can we have a further discussion on a call? Please let me know.

Thanks,
Geeta

As you’ve noticed, adding a step to rephrase responses with a large language model (LLM) increases response time, as each rephrasing requires a separate call to your LLM provider. Considering the performance needs of your voicebot, continuously using an LLM for rephrasing might not be practical. It’s good to hear you found an alternative solution.

For optimizing the rephraser component, I recommend trying a smaller and faster model like GPT-3.5. This should help reduce latency without significantly sacrificing quality. Using a more powerful and resource-intensive model like GPT-4 might be overkill for this application.

Thanks for trying out Rasa Pro!

I am planning to use the rephraser only for the responses where it's needed, even if it has to be a small model like GPT-3.5.

Because earlier, when I had GPT-3.5 in the rephraser, I also had performance issues.

I want to understand your view on this strategy of selectively enabling the rephraser and avoiding it for responses like utter_can_do_something_else.

Is this an okay pattern?

Yes, I believe it’s very sensible to selectively choose which responses to rephrase. Being explicit about when to employ rephrasing can help manage performance while maintaining quality where it’s most needed.
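
For reference, here is a minimal domain.yml sketch of that selective pattern, reusing response names from this thread (keep rephrase_all off in endpoints.yml and opt individual responses in or out via their metadata):

responses:
  utter_can_do_something_else:
    - text: "Is there anything else I can assist you with?"
      metadata:
        rephrase: False          # latency-sensitive, keep the static text
  utter_free_chitchat_response:
    - text: "Warm greetings! How can I make things easier for you today?"
      metadata:
        rephrase: True           # conversational, worth the extra LLM call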