Ollama integration, API_BASE key - Rasa Pro CALM

This is my config.yml:

```yaml
recipe: default.v1
language: en
pipeline:
  - name: NLUCommandAdapter
  - name: MultiStepLLMCommandGenerator
    llm:
      provider: ollama
      model: example
      api_base: "http://localhost:8080/v1"
    flow_retrieval:
      embeddings:
        provider: "huggingface_local"
        model: "sentence-transformers/all-mpnet-base-v2"
        model_kwargs:
          device: "cpu"
        encode_kwargs:
          normalize_embeddings: true

policies:
  - name: FlowPolicy
assistant_id: 20241023-095359-finite-subfloor
```
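For reference, this is what I understand the `llm` section should look like with Ollama's defaults (port 11434, and as far as I can tell from the docs no `/v1` suffix); the model name here is just a placeholder for whatever was pulled with `ollama pull`:

```yaml
llm:
  provider: ollama
  model: llama3                          # placeholder: any model pulled via `ollama pull`
  api_base: "http://localhost:11434"     # Ollama's default host/port, no /v1 suffix
```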

I am trying to host this model locally using Ollama. When I start the assistant, Rasa asks me to set the OLLAMA_API_BASE environment variable.
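This is how I set the variable before starting the assistant (bash and PowerShell variants; the URL is one of the values I tried):

```sh
# bash / WSL
export OLLAMA_API_BASE="http://localhost:8080"
rasa inspect

# PowerShell
$env:OLLAMA_API_BASE = "http://localhost:8080"
rasa inspect
```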

I have tried several variations, changing the endpoint and port both in `api_base` in config.yml and in the terminal/PowerShell environment. I also tried hosting the model on different ports, like this: `OLLAMA_HOST=8080 ollama serve` (see the variant sketched below).
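As far as I can tell from the Ollama FAQ, OLLAMA_HOST expects a host:port pair rather than a bare port, so I also tried this variant, with a curl check against Ollama's `/api/tags` endpoint to confirm the server is reachable at all:

```sh
OLLAMA_HOST=127.0.0.1:8080 ollama serve    # host:port form instead of a bare port
curl http://localhost:8080/api/tags        # should return the list of pulled models
```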

No matter what I do, requests from rasa inspect never reach the hosted model. Could anyone explain which endpoint I should set when hosting the model and using it with rasa inspect?

Any help would be appreciated. @jtrasa