Getting OpenAI API key error after switching to a HuggingFace model

I have changed the default OpenAI model to Microsoft's Phi-2 model from HuggingFace, following the official tutorial: LLM Providers

Please note: I have already installed `huggingface_hub` and set the HuggingFace token as an environment variable: `set HUGGINGFACEHUB_API_TOKEN=<token>`
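To rule out the token simply not being visible to the training process (on Windows, `set` only applies to the current cmd session, so a new terminal would not inherit it), I did a quick sanity check from Python in the same shell. This is just a sketch, nothing Rasa-specific; the helper name is my own:

```python
import os

def hf_token_status(env=os.environ):
    """Return a masked prefix of the HuggingFace token, or None if unset."""
    token = env.get("HUGGINGFACEHUB_API_TOKEN")
    # Only show the first few characters so the token is never printed in full.
    return (token[:4] + "...") if token else None

print(hf_token_status())
```

It printed a masked token for me, so the variable does reach Python processes started from that shell.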

I also made the relevant changes to `config.yaml`. For reference, here is my `config.yaml`:

recipe: default.v1
language: en
pipeline:
- name: LLMCommandGenerator
  llm:
    type: "huggingface_hub"
    repo_id: "microsoft/phi-2"
    task: "text-generation"

policies:
- name: FlowPolicy
#  - name: EnterpriseSearchPolicy
#  - name: RulePolicy
assistant_id: 20240424-061357-all-patch

I’m receiving the following error:

Traceback (most recent call last):
  File "E:\Computer Science\CALM\venv\lib\site-packages\rasa\engine\graph.py", line 526, in __call__
    output = self._fn(self._component, **run_kwargs)
  File "E:\Computer Science\CALM\venv\lib\site-packages\rasa\dialogue_understanding\generator\llm_command_generator.py", line 213, in train
    self.flow_retrieval.populate(flows.user_flows, domain)
  File "E:\Computer Science\CALM\venv\lib\site-packages\rasa\dialogue_understanding\generator\flow_retrieval.py", line 181, in populate
    embeddings = self._create_embedder(self.config)
  File "E:\Computer Science\CALM\venv\lib\site-packages\rasa\dialogue_understanding\generator\flow_retrieval.py", line 153, in _create_embedder
    return embedder_factory(
  File "E:\Computer Science\CALM\venv\lib\site-packages\rasa\shared\utils\llm.py", line 245, in embedder_factory
    return embeddings_cls(**parameters)
  File "E:\Computer Science\CALM\venv\lib\site-packages\pydantic\v1\main.py", line 341, in __init__
    raise validation_error
pydantic.v1.error_wrappers.ValidationError: 1 validation error for OpenAIEmbeddings
__root__
  Did not find openai_api_key, please add an environment variable `OPENAI_API_KEY` which contains it, or pass  `openai_api_key` as a named parameter. (type=value_error)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Program Files\Python38\lib\runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Program Files\Python38\lib\runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "E:\Computer Science\CALM\venv\Scripts\rasa.exe\__main__.py", line 8, in <module>
  File "E:\Computer Science\CALM\venv\lib\site-packages\rasa\__main__.py", line 133, in main
    cmdline_arguments.func(cmdline_arguments)
  File "E:\Computer Science\CALM\venv\lib\site-packages\rasa\cli\train.py", line 61, in <lambda>
    train_parser.set_defaults(func=lambda args: run_training(args, can_exit=True))
  File "E:\Computer Science\CALM\venv\lib\site-packages\rasa\cli\train.py", line 104, in run_training
    training_result = train_all(
  File "E:\Computer Science\CALM\venv\lib\site-packages\rasa\api.py", line 105, in train
    return asyncio.run(
  File "C:\Program Files\Python38\lib\asyncio\runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "C:\Program Files\Python38\lib\asyncio\base_events.py", line 616, in run_until_complete
    return future.result()
  File "E:\Computer Science\CALM\venv\lib\site-packages\rasa\model_training.py", line 246, in train
    return await _train_graph(
  File "E:\Computer Science\CALM\venv\lib\site-packages\rasa\model_training.py", line 336, in _train_graph
    await trainer.train(
  File "E:\Computer Science\CALM\venv\lib\site-packages\rasa\engine\training\graph_trainer.py", line 106, in train
    await graph_runner.run(inputs={PLACEHOLDER_IMPORTER: importer})
  File "E:\Computer Science\CALM\venv\lib\site-packages\rasa\engine\runner\dask.py", line 103, in run
    dask_result = await execute_dask_graph(run_graph, run_targets)
  File "E:\Computer Science\CALM\venv\lib\site-packages\rasa\engine\runner\dask.py", line 254, in execute_dask_graph
    await fire_task()
  File "E:\Computer Science\CALM\venv\lib\site-packages\rasa\engine\runner\dask.py", line 246, in fire_task
    task_result = await _execute_task(dsk[key], data)
  File "E:\Computer Science\CALM\venv\lib\site-packages\rasa\engine\runner\dask.py", line 176, in _execute_task
    return await func(*awaited_args)
  File "E:\Computer Science\CALM\venv\lib\site-packages\rasa\engine\graph.py", line 533, in __call__
    raise GraphComponentException(
rasa.engine.exceptions.GraphComponentException: Error running graph component for node train_LLMCommandGenerator0.
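From the traceback, the error seems to come from the flow retrieval step (`flow_retrieval.py` → `embedder_factory` → `OpenAIEmbeddings`), not from the `llm` block I changed. So it looks like the embeddings used for flow retrieval still default to OpenAI even after swapping the LLM. Based on my reading of the docs, I think they need their own config under `flow_retrieval`; the exact keys and the embedding model below are my guess from the documentation, so please correct me if they're wrong:

```yaml
pipeline:
- name: LLMCommandGenerator
  llm:
    type: "huggingface_hub"
    repo_id: "microsoft/phi-2"
    task: "text-generation"
  # Flow retrieval embeds the flows at training time; without this block
  # it falls back to OpenAIEmbeddings, which requires OPENAI_API_KEY.
  flow_retrieval:
    embeddings:
      type: "huggingface"
      model: "sentence-transformers/all-mpnet-base-v2"
```

Alternatively, I believe flow retrieval can be disabled entirely with `flow_retrieval: active: false`, though as far as I understand that is only advisable for assistants with a small number of flows. Is this the right direction?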

Can someone help me solve this issue?

Using HuggingFace as the LLM provider doesn't work for me :frowning_face:

Did anyone get it working?