Rasa Pro Tutorial with Azure OpenAI Service not working

Greetings!

I am having problems running the Rasa Pro tutorial locally. My config.yml looks like this:

recipe: default.v1
language: en
pipeline:
- name: LLMCommandGenerator
  llm:
    model_name: gpt-3.5-turbo
    api_type: azure
    api_base: https://[CUSTOM].openai.azure.com/
    api_version: 2023-12-01-preview
    engine: [CUSTOM]

policies:
- name: FlowPolicy

In addition, I have stored the OpenAI API key as an environment variable (OPENAI_API_KEY) within my terminal. If I execute rasa train in my terminal I get the following error:

(.venv) ~/PycharmProjects/rasa-pro-calm-demo
rasa train
/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/pydantic/_migration.py:283: UserWarning: `pydantic.error_wrappers:ValidationError` has been moved to `pydantic:ValidationError`.
  warnings.warn(f'`{import_path}` has been moved to `{new_location}`.')
/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/opentelemetry/context/__init__.py:22: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
  from pkg_resources import iter_entry_points
/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/pkg_resources/__init__.py:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
  declare_namespace(pkg)
/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/pkg_resources/__init__.py:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.logging')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
  declare_namespace(pkg)
/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/pkg_resources/__init__.py:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('mpl_toolkits')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
  declare_namespace(pkg)
/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/pkg_resources/__init__.py:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('ruamel')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
  declare_namespace(pkg)
/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/google/rpc/__init__.py:20: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.rpc')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
  pkg_resources.declare_namespace(__name__)
2024-04-30 16:39:53 INFO     rasa.tracing.config  - No endpoint for tracing type available in endpoints.yml,tracing will not be configured.
2024-04-30 16:39:59 INFO     rasa.cli.train  - [info     ] Started validating domain and training data... event_key=cli.train.run_training
2024-04-30 16:40:17 INFO     rasa.validator  - [info     ] Validating intents...          event_key=validator.verify_intents_in_stories.start
2024-04-30 16:40:17 INFO     rasa.validator  - [info     ] Validating uniqueness of intents and stories... event_key=validator.verify_example_repetition_in_intents.start
2024-04-30 16:40:17 INFO     rasa.validator  - [info     ] Validating utterances...       event_key=validator.verify_utterances_in_dialogues.start
2024-04-30 16:40:17 WARNING  rasa.validator  - [warning  ] The utterance 'utter_float_slot_rejection' is not used in any story, rule or flow. event_key=validator.verify_utterances_in_dialogues.not_used utterance=utter_float_slot_rejection
2024-04-30 16:40:17 WARNING  rasa.validator  - [warning  ] The utterance 'utter_categorical_slot_rejection' is not used in any story, rule or flow. event_key=validator.verify_utterances_in_dialogues.not_used utterance=utter_categorical_slot_rejection
2024-04-30 16:40:17 WARNING  rasa.validator  - [warning  ] The utterance 'utter_free_chitchat_response' is not used in any story, rule or flow. event_key=validator.verify_utterances_in_dialogues.not_used utterance=utter_free_chitchat_response
2024-04-30 16:40:17 WARNING  rasa.validator  - [warning  ] The utterance 'utter_boolean_slot_rejection' is not used in any story, rule or flow. event_key=validator.verify_utterances_in_dialogues.not_used utterance=utter_boolean_slot_rejection
2024-04-30 16:40:17 INFO     rasa.validator  - [info     ] Story structure validation...  event_key=validator.verify_story_structure.start
2024-04-30 16:40:17 INFO     rasa.core.training.story_conflict  - Considering all preceding turns for conflict analysis.
2024-04-30 16:40:17 INFO     rasa.validator  - [info     ] No story structure conflicts found. event_key=validator.verify_story_structure.no_conflicts
2024-04-30 16:40:17 INFO     rasa.validator  - [info     ] validation.flows.started
2024-04-30 16:40:17 INFO     rasa.validator  - [info     ] validation.flows.ended
2024-04-30 16:40:22 WARNING  rasa.engine.validation  - [warning  ] `pattern_chitchat` has an action step with `action_trigger_chitchat`, but `IntentlessPolicy` is not configured. event_key=flow_component_dependencies.pattern_chitchat.intentless_policy_not_configured
2024-04-30 16:40:23 INFO     rasa.engine.training.hooks  - Starting to train component 'LLMCommandGenerator'.
2024-04-30 16:40:23 INFO     rasa.dialogue_understanding.generator.llm_command_generator  - [info     ] llm_command_generator.flow_retrieval.enabled
2024-04-30 16:40:24 INFO     openai  - error_code=invalid_api_key error_message='Incorrect API key provided: *******************. You can find your API key at https://platform.openai.com/account/api-keys.' error_param=None error_type=invalid_request_error message='OpenAI API error received' stream_error=False
2024-04-30 16:40:24 ERROR    rasa.dialogue_understanding.generator.flow_retrieval  - [error    ] Failed to populate the FAISS store with the provided flows. error=AuthenticationError(message='Incorrect API key provided: *******************. You can find your API key at https://platform.openai.com/account/api-keys.', http_status=401, request_id=None) error_type=AuthenticationError event_key=flow_retrieval.populate_vector_store.not_populated
2024-04-30 16:40:24 ERROR    rasa.dialogue_understanding.generator.llm_command_generator  - [error    ] Flow retrieval store isinaccessible. error=AuthenticationError(message='Incorrect API key provided: *******************. You can find your API key at https://platform.openai.com/account/api-keys.', http_status=401, request_id=None) event_key=llm_command_generator.train.failed
Traceback (most recent call last):
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/rasa/engine/graph.py", line 526, in __call__
    output = self._fn(self._component, **run_kwargs)
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/rasa/dialogue_understanding/generator/llm_command_generator.py", line 213, in train
    self.flow_retrieval.populate(flows.user_flows, domain)
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/rasa/dialogue_understanding/generator/flow_retrieval.py", line 204, in populate
    self.vector_store = FAISS.from_documents(
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/langchain/schema/vectorstore.py", line 510, in from_documents
    return cls.from_texts(texts, embedding, metadatas=metadatas, **kwargs)
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/langchain/vectorstores/faiss.py", line 911, in from_texts
    embeddings = embedding.embed_documents(texts)
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/langchain/embeddings/openai.py", line 488, in embed_documents
    return self._get_len_safe_embeddings(texts, engine=self.deployment)
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/langchain/embeddings/openai.py", line 374, in _get_len_safe_embeddings
    response = embed_with_retry(
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/langchain/embeddings/openai.py", line 107, in embed_with_retry
    return _embed_with_retry(**kwargs)
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/tenacity/__init__.py", line 289, in wrapped_f
    return self(f, *args, **kw)
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/tenacity/__init__.py", line 379, in __call__
    do = self.iter(retry_state=retry_state)
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/tenacity/__init__.py", line 314, in iter
    return fut.result()
  File "/Users/User/.pyenv/versions/3.10.14/lib/python3.10/concurrent/futures/_base.py", line 451, in result
    return self.__get_result()
  File "/Users/User/.pyenv/versions/3.10.14/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
    raise self._exception
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/tenacity/__init__.py", line 382, in __call__
    result = fn(*args, **kwargs)
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/langchain/embeddings/openai.py", line 104, in _embed_with_retry
    response = embeddings.client.create(**kwargs)
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/openai/api_resources/embedding.py", line 33, in create
    response = super().create(*args, **kwargs)
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 155, in create
    response, _, api_key = requestor.request(
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/openai/api_requestor.py", line 299, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/openai/api_requestor.py", line 710, in _interpret_response
    self._interpret_response_line(
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/openai/api_requestor.py", line 775, in _interpret_response_line
    raise self.handle_error_response(
openai.error.AuthenticationError: Incorrect API key provided: *******************. You can find your API key at https://platform.openai.com/account/api-keys.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/bin/rasa", line 8, in <module>
    sys.exit(main())
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/rasa/__main__.py", line 133, in main
    cmdline_arguments.func(cmdline_arguments)
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/rasa/cli/train.py", line 61, in <lambda>
    train_parser.set_defaults(func=lambda args: run_training(args, can_exit=True))
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/rasa/cli/train.py", line 104, in run_training
    training_result = train_all(
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/rasa/api.py", line 105, in train
    return asyncio.run(
  File "/Users/User/.pyenv/versions/3.10.14/lib/python3.10/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "uvloop/loop.pyx", line 1517, in uvloop.loop.Loop.run_until_complete
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/rasa/model_training.py", line 246, in train
    return await _train_graph(
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/rasa/model_training.py", line 336, in _train_graph
    await trainer.train(
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/rasa/engine/training/graph_trainer.py", line 106, in train
    await graph_runner.run(inputs={PLACEHOLDER_IMPORTER: importer})
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/rasa/engine/runner/dask.py", line 103, in run
    dask_result = await execute_dask_graph(run_graph, run_targets)
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/rasa/engine/runner/dask.py", line 254, in execute_dask_graph
    await fire_task()
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/rasa/engine/runner/dask.py", line 246, in fire_task
    task_result = await _execute_task(dsk[key], data)
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/rasa/engine/runner/dask.py", line 176, in _execute_task
    return await func(*awaited_args)
  File "/Users/User/PycharmProjects/rasa-pro-calm-demo/.venv/lib/python3.10/site-packages/rasa/engine/graph.py", line 533, in __call__
    raise GraphComponentException(
rasa.engine.exceptions.GraphComponentException: Error running graph component for node train_LLMCommandGenerator0.

It seems that the LLM configuration in config.yml is not being picked up.

Thanks for your help!

The error message clearly shows that the API key provided is incorrect. You should double check that you’ve correctly set up the environment variable OPENAI_API_KEY in your terminal.

Make sure that you’ve copied the API key accurately, with no extra spaces or characters, and verify that the key hasn’t expired and has the necessary permissions to access the OpenAI service.

Once you’ve confirmed that the API key is correct, try running the rasa train command again. If the issue persists, you may need to troubleshoot further by checking the API key’s permissions and verifying the endpoint configuration in your config.yml file.

Hi, this is due to the flow_retrieval being enabled by default. Note this line from your debug output:

2024-04-30 16:40:23 INFO rasa.dialogue_understanding.generator.llm_command_generator - [info ] llm_command_generator.flow_retrieval.enabled

This feature is enabled by default and uses an embeddings model, which defaults to the OpenAI endpoints if not configured otherwise. If your assistant has a large number of flows, you may find it helpful.

To disable the feature, add the following to your LLMCommandGenerator configuration:

flow_retrieval:
  active: false
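
For context, here is a minimal sketch of where this sits in the pipeline, reusing the Azure llm settings from the original post (the bracketed values are placeholders):

recipe: default.v1
language: en
pipeline:
- name: LLMCommandGenerator
  # flow_retrieval sits next to the llm settings under LLMCommandGenerator;
  # active: false disables the embeddings-based flow retrieval entirely.
  flow_retrieval:
    active: false
  llm:
    model_name: gpt-3.5-turbo
    api_type: azure
    api_base: https://[CUSTOM].openai.azure.com/
    api_version: 2023-12-01-preview
    engine: [CUSTOM]

policies:
- name: FlowPolicy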

If you’ve deployed an embeddings model in your Azure environment and want to experiment with the flow_retrieval feature, your configuration would look similar to:

  flow_retrieval:
    active: true
    embeddings:
      engine: [name of your text-embedding-ada-002 model]
      api_base: https://[your endpoint].openai.azure.com
      api_type: azure
      api_version: "2023-05-15"

You can find more information on configuring the flow retrieval in our docs here.


Thank you, Emily, for your advice. In my config.yml the API type is explicitly set to azure, so I find it confusing that Rasa tries to authenticate with the OpenAI API and not with Azure. Also, I use those same credentials on a daily basis, so I know they are definitely correct. → I was wrong!

EDIT: No, Emily, you were right. I had a small typo in my API key. After fixing that I still got an error regarding the flow retrieval feature, but with Chris’s hint I got that fixed as well. So now everything is running. Thanks!

This solved my issue and I was able to train the model. Thank you so much, you are great, Chris! Maybe a little hint in the documentation would have been helpful, since I have just started experimenting with Rasa. Because of that it was not clear to me, and I mistakenly focused only on the explicit error message. Even then, in my opinion, it is still hard for a beginner to decipher that the problem is rooted in the flow_retrieval feature.

With the help of Emily and Chris I was able to run the tutorial locally on my machine with the Azure OpenAI Service, using the following config.yml:

recipe: default.v1
language: en
pipeline:
- name: LLMCommandGenerator
  flow_retrieval:
    embeddings:
      model: "text-embedding-ada-002"
      openai_api_type: "azure"
      openai_api_base: "MY_API_BASE"
      openai_api_version: "2023-12-01-preview"
      openai_api_key: "MY_API_KEY"
      deployment: "MY_DEPLOYMENT"
  llm:
    model_name: "gpt-3.5-turbo"
    openai_api_type: "azure"
    openai_api_base: "MY_API_BASE"
    openai_api_version: "2023-12-01-preview"
    openai_api_key: "MY_API_KEY"
    deployment: "MY_DEPLOYMENT"

policies:
- name: FlowPolicy
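
A side note for anyone copying this: hard-coding openai_api_key in config.yml means the key can easily end up in version control. An untested variant (assuming the key is exported as OPENAI_API_KEY in the shell, as in my original post, so the client can pick it up from the environment) would be to simply drop the openai_api_key lines:

pipeline:
- name: LLMCommandGenerator
  # openai_api_key omitted on purpose in both blocks; this assumes the key is
  # read from the OPENAI_API_KEY environment variable instead.
  flow_retrieval:
    embeddings:
      model: "text-embedding-ada-002"
      openai_api_type: "azure"
      openai_api_base: "MY_API_BASE"
      openai_api_version: "2023-12-01-preview"
      deployment: "MY_DEPLOYMENT"
  llm:
    model_name: "gpt-3.5-turbo"
    openai_api_type: "azure"
    openai_api_base: "MY_API_BASE"
    openai_api_version: "2023-12-01-preview"
    deployment: "MY_DEPLOYMENT"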

Thank you for the feedback and we will look to improve the documentation around this feature!

Hi, I have used the same config file as above (with my credentials), but I am still getting an error and am not able to train the model locally. Can anyone please help?

I am getting this error:

ProviderClientAPIException: ProviderClientAPIException: Failed to embed documents Original error: litellm.APIError: AzureException APIError - Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}}