I’m trying to use Azure OpenAI with a local Rasa CALM environment installed on my laptop:
Rasa Version : 3.10.7
Minimum Compatible Version: 3.10.0rc1
Rasa SDK Version : 3.10.0
Python Version : 3.9.13
Operating System : Windows-10-10.0.19045-SP0
Python Path : C:\Projects\rasa\Scripts\python.exe
I’ve created a demo project via “rasa init” and edited config.yml as per the documentation:
recipe: default.v1
language: en
pipeline:
  - name: SingleStepLLMCommandGenerator
    llm:
      model: gpt-4
      provider: azure
      api_type: azure
      api_base: https://xyz.openai.azure.com/
      api_version: 2024-02-15-preview
      deployment: gpt-4
policies:
  - name: FlowPolicy
#  - name: EnterpriseSearchPolicy
#  - name: RulePolicy
assistant_id: 20241020-144759-scared-pocket
I’ve also set the RASA_PRO_LICENSE and AZURE_API_KEY environment variables.
However, when I run “rasa train”, I get the following error:
(rasa) PS C:\Projects\rasa-examples\transfer_money> rasa train
2024-10-20 19:08:59 INFO rasa.tracing.config - No endpoint for tracing type available in endpoints.yml,tracing will not be configured.
2024-10-20 19:09:01 ERROR rasa.shared.providers.llm._base_litellm_client - [error ] Environment variables: ['OPENAI_API_KEY'] not set. Required for API calls. event_key=base_litellm_client.validate_environment_variables missing_environment_variables=['OPENAI_API_KEY']
2024-10-20 19:09:01 ERROR rasa.shared.utils.llm - [error ] contextual_response_rephraser.init.llm_instantiation_failed error=ProviderClientValidationError("Environment variables: ['OPENAI_API_KEY'] not set. Required for API calls.") message=Unable to instantiate LLM client.
Unable to create the LLM client for component - ContextualResponseRephraser. Please make sure you specified the required environment variables. Error: Environment variables: ['OPENAI_API_KEY'] not set.
Required for API calls.
So I set OPENAI_API_KEY and ran “rasa train” again. This time it proceeded further, but it still failed with the following error:
2024-10-20 19:11:02 ERROR rasa.dialogue_understanding.generator.flow_retrieval - [error ] Failed to populate the FAISS store with the provided flows. error=ProviderClientAPIException("Failed to embed documents\nOriginal error: litellm.AuthenticationError: AuthenticationError: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: 5df41196********************ad37. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}})") error_type=ProviderClientAPIException event_key=flow_retrieval.populate_vector_store.not_populated
2024-10-20 19:11:02 ERROR rasa.dialogue_understanding.generator.llm_based_command_generator - [error ] Flow retrieval store isinaccessible. error=ProviderClientAPIException("Failed to embed documents\nOriginal error: litellm.AuthenticationError: AuthenticationError: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: 5df41196********************ad37. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}})") event_key=llm_based_command_generator.train.failed
2024-10-20 19:11:02 ERROR rasa.engine.graph - [error ] graph.node.error_running_component node_name=train_SingleStepLLMCommandGenerator0
ProviderClientAPIException: ProviderClientAPIException:
Failed to embed documents
Original error: litellm.AuthenticationError: AuthenticationError: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: 5df41196********************ad37. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
Then I found a similar thread and, based on it, added the “flow_retrieval” property to SingleStepLLMCommandGenerator in my config.yml:
flow_retrieval:
  embeddings:
    model: text-embedding-ada-002
    provider: azure
    api_type: azure
    api_base: https://xyz.openai.azure.com/
    api_version: 2024-02-15-preview
    deployment: text-embedding-ada-002
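For clarity, this is how the component ends up nested in my config.yml after that change (flow_retrieval sits alongside llm under SingleStepLLMCommandGenerator; the api_base is the same placeholder endpoint as above):

pipeline:
  - name: SingleStepLLMCommandGenerator
    llm:
      model: gpt-4
      provider: azure
      api_type: azure
      api_base: https://xyz.openai.azure.com/
      api_version: 2024-02-15-preview
      deployment: gpt-4
    flow_retrieval:
      embeddings:
        model: text-embedding-ada-002
        provider: azure
        api_type: azure
        api_base: https://xyz.openai.azure.com/
        api_version: 2024-02-15-preview
        deployment: text-embedding-ada-002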
Finally “rasa train” completed successfully.
However, when I run “rasa shell” or “rasa inspect” and type any utterance, I get the following error:
2024-10-20 19:16:55 ERROR rasa.core.nlg.summarize - [error ] summarization.error error=ProviderClientAPIException("\nOriginal error: litellm.AuthenticationError: AuthenticationError: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: 5df41196********************ad37. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}})")
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.
2024-10-20 19:16:57 ERROR rasa.utils.log_utils - [error ] nlg.llm.error error=ProviderClientAPIException("\nOriginal error: litellm.AuthenticationError: AuthenticationError: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: 5df41196********************ad37. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}})")
What am I missing now? If I’m reading the logs correctly, it still seems to be trying to reach OpenAI endpoints instead of the Azure ones…
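My guess is that the ContextualResponseRephraser (the same component that complained about OPENAI_API_KEY during training) and the response summarizer are still falling back to the OpenAI defaults. Do I also need to point the rephraser at Azure, e.g. via an llm block under nlg in endpoints.yml? Something like the sketch below (mirroring the llm settings from my config.yml) is what I’d try next, but I’m not sure these are the right keys for 3.10:

nlg:
  type: rephrase
  llm:
    model: gpt-4
    provider: azure
    api_type: azure
    api_base: https://xyz.openai.azure.com/
    api_version: 2024-02-15-preview
    deployment: gpt-4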