Local LLM error with environment variable OPENAI_API_KEY

Hello everyone, I have installed the latest Rasa version and I am trying to use a local LLM, but I get this error:

Unable to create the LLM client for component - ContextualResponseRephraser. 
Please make sure you specified the required environment variables. 
Error: Environment variables: ['OPENAI_API_KEY'] not set. Required for API calls.

This is my config file:

recipe: default.v1
language: en
pipeline:
- name: LLMCommandGenerator
  llm:
    provider: self-hosted
    model: 'merged_model_gemma_27b_it'
    type: "openai"
    openai_api_key: foobar
    api_base: http://127.0.0.1:5000/v1 
    timeout: 800
    temperature: 0.7

If I add the OpenAI API key to the environment variables as NULL, blank, or some other placeholder, it returns a connection error from OpenAI.

Thank you very much

Can you try it without specifying the type as openai, and also without the openai_api_key? Like it is done here:
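
A minimal sketch of that suggestion, reusing the model name and api_base from the config above (the linked example is not reproduced here, so this is an assumption about what it shows):

pipeline:
- name: LLMCommandGenerator
  llm:
    provider: self-hosted
    model: merged_model_gemma_27b_it
    api_base: http://127.0.0.1:5000/v1
    timeout: 800
    temperature: 0.7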

Yes, I have tried it like this, but it is not working.
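
One detail that may matter here: the error message names ContextualResponseRephraser, which reads its LLM settings from endpoints.yml rather than config.yml, so even with the command generator pointed at the local server, the rephraser can still fall back to the OpenAI defaults. A sketch of an endpoints.yml override, assuming a recent Rasa Pro 3.x and that the local server exposes an OpenAI-compatible API (model name and URL taken from the config above):

nlg:
  type: rephrase
  llm:
    provider: self-hosted
    model: merged_model_gemma_27b_it
    api_base: http://127.0.0.1:5000/v1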

Hi all,

it looks like we have the same problem using a Mistral model. First we got an error saying the command generator needs an OPENAI_API_KEY. After disabling flow retrieval as in this post (Rasa Pro Tutorial with Azure OpenAI Service not working - #2 by emilymoore04), we can train successfully, but when chatting with the bot another component asks for the OPENAI_API_KEY.

We are just trying to get some experience with CALM using our Mistral model, and to get the default tutorial running with it (Tutorial).

Our config looks like this:

recipe: default.v1
language: en
pipeline:
  - name: SingleStepLLMCommandGenerator
    llm:
      provider: mistral
      model: mistral-large-latest
    flow_retrieval:
      active: false

policies:
  - name: FlowPolicy
#  - name: EnterpriseSearchPolicy
#  - name: RulePolicy
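
If disabling flow retrieval is only meant as a workaround, a possible alternative is to point it at Mistral embeddings instead of switching it off; a sketch, assuming mistral-embed is available for the API key (untested against this setup):

pipeline:
  - name: SingleStepLLMCommandGenerator
    llm:
      provider: mistral
      model: mistral-large-latest
    flow_retrieval:
      embeddings:
        provider: mistral
        model: mistral-embed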

And our .env file looks like:

RASA_PRO_LICENSE='rasa_pro_license_key'
MISTRAL_API_KEY='mistral_api_key'

Thanks in advance for your help!
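
As in the self-hosted case above, the component asking for OPENAI_API_KEY at chat time may be the response rephraser, which is configured in endpoints.yml rather than config.yml and defaults to OpenAI when it has no llm block of its own. A sketch of pointing it at Mistral instead, assuming the rephraser accepts the same provider settings as the command generator:

nlg:
  type: rephrase
  llm:
    provider: mistral
    model: mistral-large-latest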