Your input → hi
2024-04-03 12:47:40 ERROR rasa.utils.log_utils - [error] llm_command_generator.llm.error error=AttributeError("module 'cohere' has no attribute 'error'")
Sorry, I am having trouble with that. Please try again in a few minutes.
Which version of rasa and cohere are you using? Please make sure that you have set COHERE_API_KEY environment variable as described here: LLM Providers.
I am using the Command R model, which provides a free trial key so that I can verify everything works before purchasing. I have also set the COHERE_API_KEY environment variable. The error says AttributeError("module 'cohere' has no attribute 'error'").
We got to the root of the problem. Langchain uses the following import in cohere.py, which fails because the code was moved: cohere.error.CohereError (it should now be cohere.CohereError).
So that import is failing. We recently switched to async execution of our LLM calls; in the sync version that particular code path was never reached and therefore did not cause any errors.
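The failure mode above can be sketched with stand-in modules (this is illustrative only, not the real cohere package): an attribute that moved from a submodule to the package top level no longer resolves at the old path.

```python
import types

# Mimic the old cohere layout (pre-5.x): cohere.error.CohereError exists.
old_cohere = types.ModuleType("cohere")
old_cohere.error = types.ModuleType("cohere.error")
old_cohere.error.CohereError = type("CohereError", (Exception,), {})

# Mimic the new layout: CohereError was moved to the package top level.
new_cohere = types.ModuleType("cohere")
new_cohere.CohereError = type("CohereError", (Exception,), {})

# Langchain's old access path works against the old layout...
assert issubclass(old_cohere.error.CohereError, Exception)

# ...but raises AttributeError against the new layout, matching the log above.
try:
    new_cohere.error.CohereError
except AttributeError as exc:
    print(exc)  # → module 'cohere' has no attribute 'error'
```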
It looks like the issue is still present in the latest version of langchain (link).
We can open an issue on the langchain side to fix this, but I'm afraid we cannot use cohere with Rasa Pro 3.8.0 for now.
A possible workaround for now is to use Rasa Pro 3.7.9 and downgrade cohere to an older version:
pip install cohere==4.57
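After pinning, you can verify that the old import path resolves before starting Rasa. A minimal check (the helper name `old_cohere_error_path_available` is mine, not part of Rasa or langchain):

```python
import importlib

def old_cohere_error_path_available() -> bool:
    """Return True if cohere.error.CohereError, the pre-5.x import path
    that langchain relies on, can be resolved in this environment."""
    try:
        module = importlib.import_module("cohere.error")
    except ModuleNotFoundError:
        # Either cohere is not installed, or the submodule was removed
        # in cohere 5.x, which is the breakage described above.
        return False
    return hasattr(module, "CohereError")
```

With cohere==4.57 installed this should return True; on cohere 5.x (or with cohere absent) it returns False, which corresponds to the AttributeError in the log.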
We will update our documentation and I will let you know if/when it is fixed. Thanks for trying out Rasa Pro!