Error while using Ollama llama3.1

I am getting the following error:

ERROR rasa.dialogue_understanding.generator.llm_based_command_generator - [error] llm_based_command_generator.llm.error error=ProviderClientAPIException("\nOriginal error: litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'llama runner process no longer running: -1', 'type': 'api_error', 'param': None, 'code': None}}")
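For context: the 500 from the OpenAI-compatible endpoint wraps an Ollama-side failure ("llama runner process no longer running"), which usually means the model process itself crashed, often from running out of RAM/VRAM. A quick way to check the model outside Rasa (assuming a standard Ollama install; the `journalctl` log path is an assumption for a systemd Linux setup):

```shell
# Confirm the server is up and the model is pulled.
ollama list                       # should include llama3.1

# Run the model directly; if this also crashes, the problem is on the
# Ollama side (commonly insufficient RAM/VRAM), not in Rasa.
ollama run llama3.1 "Say hello"

# Inspect the server logs for the crash reason (systemd install assumed;
# on macOS the log is in ~/.ollama/logs/server.log instead).
journalctl -u ollama --no-pager | tail -n 50
```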

First, let me share the structure of my files: [image]

The structure of the data files: [image]

config.yml :
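(Screenshot did not upload. For reference, here is a minimal sketch of a CALM `config.yml` that routes the command generator to a local Ollama llama3.1; the component name and `llm` keys are assumptions based on the litellm-style provider syntax and vary by Rasa Pro version, so check them against the docs:)

```yaml
recipe: default.v1
language: en
pipeline:
  # Component name differs across Rasa Pro 3.x versions (assumption).
  - name: SingleStepLLMCommandGenerator
    llm:
      provider: ollama                  # routed through litellm (assumption)
      model: llama3.1
      api_base: http://localhost:11434  # default Ollama port
policies:
  - name: FlowPolicy
```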

flows.yml:

nlu.yml: [image]

patterns.yml:

rules.yml:

stories.yml:

domain.yml:

To run the project I use the following.

It needs an OPENAI_API_KEY just to start and can't run without one: export OPENAI_API_KEY=**************************************************** (I didn't pay for a key; it is only set so the Rasa CALM project can run.)

export RASA_PRO_LICENSE=<redacted Rasa Pro license JWT>

rasa train
rasa shell

But I am getting the error above.
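For completeness, the full run sketch I am using looks like this. From what I understand, the OPENAI_API_KEY is only checked for presence and never used when all LLM calls go to the local Ollama server, so a dummy value should work (this is an assumption; only the variable names are real, the values are placeholders):

```shell
# Placeholder values; only the variable names are real.
export OPENAI_API_KEY="dummy-key"              # never sent to OpenAI if all calls go to Ollama (assumption)
export RASA_PRO_LICENSE="<your Rasa Pro license>"

# Train the assistant, then open an interactive shell (note: lowercase `rasa`).
rasa train
rasa shell
```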