Local LLM error with environment variable OPENAI_API_KEY

Hello everyone, I have installed the latest Rasa version and I am trying to use a local LLM, but I get this error:

Unable to create the LLM client for component - ContextualResponseRephraser. 
Please make sure you specified the required environment variables. 
Error: Environment variables: ['OPENAI_API_KEY'] not set. Required for API calls.

This is my config file:

recipe: default.v1
language: en
pipeline:
- name: LLMCommandGenerator
  llm:
    provider: self-hosted
    model: 'merged_model_gemma_27b_it'
    type: "openai"
    openai_api_key: foobar
    api_base: http://127.0.0.1:5000/v1 
    timeout: 800
    temperature: 0.7
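
The error is about ContextualResponseRephraser, which as far as I understand is configured separately from the pipeline, under nlg in my endpoints.yml. I have not added anything like the following sketch there (the field names just mirror my config.yml above, so they may not be exactly right):

```yaml
# endpoints.yml -- sketch only; llm fields copied from my config.yml,
# not verified against the rephraser's expected schema
nlg:
  type: rephrase
  llm:
    provider: self-hosted
    model: 'merged_model_gemma_27b_it'
    api_base: http://127.0.0.1:5000/v1
    timeout: 800
    temperature: 0.7
```

Should the rephraser get its own llm section like this, or is it supposed to pick up the pipeline settings?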

If I set the OPENAI_API_KEY environment variable to NULL, blank, or some other value, it returns a connection error from OpenAI instead.
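
Concretely, this is roughly what I tried (the key value is just a placeholder; the local server behind api_base should never validate it, but with the variable set the client then seems to try to reach OpenAI):

```shell
# Export a dummy key so the environment-variable check passes at startup.
# "sk-placeholder" is not a real key -- any non-empty value behaves the same here.
export OPENAI_API_KEY="sk-placeholder"
echo "OPENAI_API_KEY is set to: $OPENAI_API_KEY"
```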

Thank you very much

Can you try without setting the type to openai, and also without the openai_api_key? Like it is done here:
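
For reference, a sketch of the pipeline section with those two fields removed (all other values kept from the original config):

```yaml
# config.yml -- suggested variant: no "type" and no "openai_api_key";
# with provider self-hosted, requests should go to api_base only
pipeline:
- name: LLMCommandGenerator
  llm:
    provider: self-hosted
    model: 'merged_model_gemma_27b_it'
    api_base: http://127.0.0.1:5000/v1
    timeout: 800
    temperature: 0.7
```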

Yes, I have tried it like that, but it is still not working.