Rasa CALM: Could not import huggingface_hub python package

When attempting to run a model from Hugging Face, I get an error about the huggingface_hub package even though I have already installed it into the environment with pip, as the error message suggests:

“Please install it with pip install huggingface_hub. (type=value_error)”

Here’s the config.yml:

recipe: default.v1
language: en
pipeline:
- name: KeywordIntentClassifier
- name: NLUCommandAdapter
- name: LLMCommandGenerator
  llm:
    type: "huggingface_hub"
    repo_id: "mistralai/Mistral-7B-v0.1"
    task: "text-generation"

policies:
- name: FlowPolicy
- name: EnterpriseSearchPolicy
- name: IntentlessPolicy

I have tried many different variations of this pipeline with no luck so far. Please advise.

P.S. I was able to successfully pipeline a custom fine-tuned GPT model, which is very cool!

Update on this issue:

I had been hasty when I ran through the demo install and neglected to update Poetry, so I ran:

  • pip install huggingface_hub

  • poetry add huggingface_hub

That made `import huggingface_hub` work…
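For anyone hitting the same thing: the usual cause is that pip installed the package into a different environment than the one Poetry runs Rasa in. A quick sanity check like this (a generic sketch, not Rasa-specific) confirms whether the package is visible to the interpreter you're actually using:

```python
import importlib.util

def has_package(name: str) -> bool:
    """Return True if `name` can be imported by this interpreter."""
    return importlib.util.find_spec(name) is not None

# Run this inside the same environment Rasa uses,
# e.g. `poetry run python check.py`.
print(has_package("huggingface_hub"))
```

If it prints False under `poetry run` but True in your shell's default Python, the install landed in the wrong environment, and `poetry add huggingface_hub` is the fix.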

Now I’m on to the challenge of getting the pipeline to work with a Hugging Face AWS endpoint…

For models like meta-llama/Meta-Llama-3-8B, a dedicated endpoint URL is provided (on the paid tiers).

If anyone has thoughts on how to configure an LLM endpoint via an AWS URL, please share!
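I haven't verified this myself, but since the `llm` block appears to be passed through to the underlying LLM client, I'd guess the config would look something like the sketch below. The `type` value and the `endpoint_url` key are assumptions on my part, not something I've confirmed against the Rasa docs:

pipeline:
- name: LLMCommandGenerator
  llm:
    # assumption: an endpoint-style type instead of "huggingface_hub"
    type: "huggingface_endpoint"
    # assumption: the dedicated-endpoint URL from the Hugging Face console
    endpoint_url: "https://<your-endpoint>.endpoints.huggingface.cloud"
    task: "text-generation"

If someone has this working, please correct the key names.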

Thanks

Can you share how you got this working with Hugging Face? I want to use Rasa with a local LLM. Any help would be highly appreciated.