Below is my config file:
recipe: default.v1
language: en
pipeline:
- name: LLMCommandGenerator
  llm:
    type: "huggingface_hub"
    repo_id: "meta-llama/Llama-2-7b-chat-hf"
    task: "text-generation"
  flow_retrieval:
    embeddings:
      type: "huggingface_hub"
      repo_id: "sentence-transformers/all-mpnet-base-v2"
policies:
- name: EnterpriseSearchPolicy
- name: RulePolicy
assistant_id: 20240508-150129-azure-turmeric
ERROR - 2024-05-08 18:00:35 ERROR rasa.dialogue_understanding.generator.llm_command_generator - [error ] llm_command_generator.llm.error error=ValueError('Error raised by inference API: Cannot override task for LLM models')
It's showing the same error for all the models I tried on Hugging Face. Can anyone help me solve this issue?
fellinn (Fellinn):
I got the same error. Has anybody already figured out a solution to this?
I think we need to find a huggingface_hub package version that is compatible with langchain 0.0.329, since Rasa pins that langchain version.
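To see whether your environment matches the pinned langchain version, you can inspect what is actually installed. This is just a diagnostic sketch; the package names to check are my assumption of what is relevant here.

```python
from importlib.metadata import version, PackageNotFoundError

def installed_versions(packages):
    """Return a mapping of package name -> installed version string, or None if absent."""
    found = {}
    for pkg in packages:
        try:
            found[pkg] = version(pkg)
        except PackageNotFoundError:
            found[pkg] = None  # not installed in this environment
    return found

# Packages relevant to this error (assumed names; the PyPI name is huggingface-hub)
print(installed_versions(["langchain", "huggingface-hub"]))
```

If the reported langchain version differs from 0.0.329, that mismatch is the first thing I would rule out before changing the config.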
Please note:
- The currently accepted task for embeddings in huggingface_hub is only feature-extraction.
- Currently only embedding models from the sentence-transformers/ repos are accepted.
- A valid configuration may look like this:
pipeline:
- name: LLMCommandGenerator
  llm:
    type: "huggingface_hub"
    repo_id: "HuggingFaceH4/zephyr-7b-beta"
    task: "text-generation"
  flow_retrieval:
    embeddings:
      type: "huggingface_hub"
      repo_id: "sentence-transformers/distilbert-base-nli-mean-tokens"
      task: "feature-extraction"
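The two constraints above can be checked before starting Rasa at all. The sketch below encodes my reading of those notes (it is not an official Rasa validator), applied to a flow_retrieval embeddings block given as a plain dict:

```python
def check_embeddings_config(embeddings: dict) -> list[str]:
    """Return a list of problems with a flow_retrieval embeddings block.

    Rules are taken from the notes above (my assumption, not official Rasa behavior):
    - task, if set, must be "feature-extraction"
    - repo_id must come from the sentence-transformers/ namespace
    """
    problems = []
    if embeddings.get("task", "feature-extraction") != "feature-extraction":
        problems.append("embeddings task must be feature-extraction")
    if not embeddings.get("repo_id", "").startswith("sentence-transformers/"):
        problems.append("repo_id must be under sentence-transformers/")
    return problems

# An embeddings block matching the valid config above passes:
ok = check_embeddings_config({
    "type": "huggingface_hub",
    "repo_id": "sentence-transformers/distilbert-base-nli-mean-tokens",
    "task": "feature-extraction",
})
print(ok)  # []

# An LLM repo with a generation task fails both rules:
bad = check_embeddings_config({
    "type": "huggingface_hub",
    "repo_id": "meta-llama/Llama-2-7b-chat-hf",
    "task": "text-generation",
})
print(bad)  # two problems reported
```

A check like this only covers the embeddings side; the "Cannot override task" error itself is raised by the inference API, so the llm block's task may still need to match what the repo supports.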