How do I resolve the error `[error ] llm command generator.llm.error error=AttributeError("module 'cohere' has no attribute 'error'")`?

My config file:

```yaml
recipe: default.v1
language: en

pipeline:
  - name: LLMCommandGenerator
    llm:
      type: "cohere"
      model: "command"
    flow_retrieval:
      embeddings:
        type: "huggingface"
        model_name: "moka-ai/m3e-base"
        model_kwargs:
          device: "cpu"
        encode_kwargs:
          normalize_embeddings: True

policies:
  - name: FlowPolicy
```

I also have my Cohere API key in a .env file.
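For reference, the .env only sets the key, shown here with the value redacted and using `COHERE_API_KEY`, the variable name the Cohere client conventionally reads (confirm the exact name against the Rasa docs for your version):

```
# .env (value redacted; COHERE_API_KEY is the conventional variable name,
# check the Rasa LLM configuration docs for your version)
COHERE_API_KEY=<redacted>
```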

When I run `rasa inspect` and enter "Hi" in the inspector, no response is printed and it shows: "Sorry, I am having trouble with that. Please try again in a few minutes."

In the terminal it shows: `[error ] llm command generator.llm.error error=AttributeError("module 'cohere' has no attribute 'error'")`.

Please suggest how to resolve this. I am new to Rasa CALM and am using Python 3.10.11 on macOS.

I believe this is the same issue discussed here: post

Rasa Pro 3.8.x doesn't support Cohere at the moment, so try downgrading Rasa Pro to 3.7.9 (or a similar 3.7.x release) and also downgrade the cohere package:

```bash
pip install cohere==4.57
```
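After pinning the client, you can sanity-check that the downgrade actually took effect. This is a sketch, assuming you run it inside the same virtualenv that Rasa Pro uses; the `cohere.error` check is relevant because the 5.x SDK dropped that module, which is what produces the AttributeError above:

```bash
# Sketch: confirm the downgrade in the environment Rasa Pro runs in
python -c "import cohere; print(cohere.__version__)"        # expect 4.57
python -c "import cohere; print(hasattr(cohere, 'error'))"  # expected True on the 4.x SDK
```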