I am integrating Rasa with an external LLM and am running into a version conflict. Rasa requires packaging>=20.0,<21.0, while langchain-core requires packaging>=23.2,<24.0. How can I resolve this?
Error messages:
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
langchain-core 0.1.32 requires packaging<24.0,>=23.2, but you have packaging 20.9 which is incompatible.
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
rasa 3.6.15 requires packaging<21.0,>=20.0, but you have packaging 23.2 which is incompatible.
Sorry for the late response.
I found a workaround.
Please install the following packages:
langchain==0.2.16
langchain-community==0.2.12
langchain-core==0.2.38
langchain-openai==0.1.23
langchain-text-splitters==0.2.4
langchain-together==0.1.5
langsmith==0.1.114
Installing these pulls in the latest packaging version. As the final step, uninstall it and reinstall the pin Rasa needs (`pip uninstall -y packaging`, then `pip install packaging==20.9`). Rasa will then run, even though pip still reports the conflict.
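To confirm the environment resolved the way you expect, a quick sanity check can print the installed versions. This is just a minimal sketch using only the standard library; the package list is simply the ones relevant here:

```python
# Sanity check: print the versions that actually resolved in the environment.
# Standard library only, so it runs in both the Rasa and LLM setups.
import importlib.metadata as metadata

for pkg in ["rasa", "langchain", "langchain-core", "packaging"]:
    try:
        print(f"{pkg}=={metadata.version(pkg)}")
    except metadata.PackageNotFoundError:
        print(f"{pkg} is not installed")
```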
Create a Separate Flask Service:
Create a Flask application specifically for LLM calls, run it in a separate container, and give it its own endpoint, as sketched below.
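Here is a minimal sketch of such a service. The /generate route, the port, and the model name are my own assumptions, not part of the original setup:

```python
# llm_service.py - standalone Flask app that wraps LLM calls.
# Runs in its own container, so its dependencies never touch the Rasa environment.
from flask import Flask, jsonify, request
from langchain_openai import ChatOpenAI  # from the pinned langchain-openai==0.1.23

app = Flask(__name__)
llm = ChatOpenAI(model="gpt-4o-mini")  # model name is an assumption; use your own


@app.route("/generate", methods=["POST"])  # hypothetical endpoint name
def generate():
    payload = request.get_json(silent=True) or {}
    prompt = payload.get("prompt", "")
    response = llm.invoke(prompt)  # returns an AIMessage
    return jsonify({"reply": response.content})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)  # port is an arbitrary choice
```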
Handle User Input for RAG:
If your LLM setup involves a Retrieval-Augmented Generation (RAG) system that requires user input, define a function within the LLM Flask app to handle that input, and expose it through an endpoint so the UI can send the user's input to the Flask app for processing, as sketched below.
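A sketch of what that could look like, with a trivial keyword matcher standing in for a real retriever. The /rag route, the document list, and the prompt format are all illustrative assumptions:

```python
# rag_service.py - Flask app exposing a RAG endpoint for user input.
from flask import Flask, jsonify, request
from langchain_openai import ChatOpenAI

app = Flask(__name__)
llm = ChatOpenAI(model="gpt-4o-mini")

# Stand-in corpus; in practice this would be a vector store and a real retriever.
DOCS = [
    "Rasa 3.6 pins packaging to >=20.0,<21.0.",
    "langchain-core 0.1.x requires packaging >=23.2,<24.0.",
]


def retrieve(query):
    # Trivial keyword match as a placeholder for semantic retrieval.
    return [d for d in DOCS if any(w.lower() in d.lower() for w in query.split())]


@app.route("/rag", methods=["POST"])  # endpoint the UI posts user input to
def rag():
    payload = request.get_json(silent=True) or {}
    query = payload.get("query", "")
    context = "\n".join(retrieve(query))
    prompt = f"Answer using this context:\n{context}\n\nQuestion: {query}"
    answer = llm.invoke(prompt)
    return jsonify({"answer": answer.content})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```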
Resolve Version Conflicts:
Running the two apps in separate environments resolves the version conflict: the Rasa container keeps its older packaging pin, while the LLM application uses the latest packages. Rasa only needs an HTTP client to talk to the LLM service, as the custom-action sketch below shows.
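For completeness, here is a minimal sketch of the Rasa side. The action name, the service URL, and the JSON fields match the hypothetical Flask routes above:

```python
# actions.py - Rasa custom action that forwards the user's message to the LLM service.
# Only needs `requests`, so the Rasa environment never imports langchain.
import requests
from rasa_sdk import Action, Tracker
from rasa_sdk.executor import CollectingDispatcher

LLM_SERVICE_URL = "http://llm-service:8080/generate"  # container hostname is an assumption


class ActionAskLLM(Action):
    def name(self) -> str:
        return "action_ask_llm"

    def run(self, dispatcher: CollectingDispatcher, tracker: Tracker, domain: dict) -> list:
        user_text = tracker.latest_message.get("text", "")
        resp = requests.post(LLM_SERVICE_URL, json={"prompt": user_text}, timeout=30)
        dispatcher.utter_message(text=resp.json().get("reply", "Sorry, no answer."))
        return []
```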
This approach keeps each component's dependencies isolated and avoids conflicts between them.