Rasa API: replace currently loaded model

Hello everyone.

I'm trying to load other trained NLU models during conversations. I found the API page that provides the "model" endpoint with the PUT method.

But I find the error reporting quite hard to understand.

On the application side I run this curl command:

curl -X PUT http://localhost:5005/model -H 'Content-type: application/json' -d '{"model_file":"/models/prestataire.tar.gz","model_server":{"url":"string","params":{ },"headers":{ },"basic_auth":{ },"token":"string","token_name":"string","wait_time_between_pulls":0},"remote_storage":"None"}'

response is:

{"version":"1.3.8","status":"failure","message":"An unexpected error occurred. Error: string","reason":"LoadingError","details":{},"help":null,"code":500}

OK, error 500, no problem.

But on the Rasa side I see this:

rasa_1           | 2019-10-16 14:55:23 ERROR    rasa.core.agent  - Could not load model due to string.

This is the line I really don't understand. Can you tell me what is wrong?

Model folder:

$ pwd
<....>/dockerrasa/models
$ ls
default.tar.gz  prestataire.tar.gz

The cause of this error is that the "remote_storage" field is an enum and "None" is not one of its values. But I don't want to use remote storage or a model server, so I run this instead:

curl -X PUT http://localhost:5005/model -H 'Content-type: application/json' -d '{"model_file":"/models/prestataire.tar.gz"}'

Response is:

{"version":"1.3.8","status":"failure","message":"Agent with name '\/models\/prestataire.tar.gz' could not be loaded.","reason":"BadRequest","details":{"parameter":"model","in":"query"},"help":null,"code":400}

Edit: I use Docker Compose.

Both models were trained with these Docker commands:

docker run -v $(pwd):/app rasa/rasa:latest-full train --domain domain.yml --data data --out models --config config.yml --fixed-model-name default 

docker run -v $(pwd):/app rasa/rasa:latest-full train --domain domain.yml --data data_prestataire --out models --config config.yml --fixed-model-name prestataire

$ ls data_prestataire
nlu.md

Try this:

curl -X PUT http://localhost:5005/model -H 'Content-type: application/json' -d '{"model_file":"/app/models/prestataire.tar.gz"}'

You need to provide the absolute path of the model.
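In other words, the path must be absolute from the server's point of view. Here is a small sketch that builds and sanity-checks the payload before sending it; the /app/models path is an assumption based on the `-v $(pwd):/app` mount in the training commands above, and the curl call is commented out since it needs a running server:

```shell
# Path as the Rasa SERVER sees it: with -v $(pwd):/app, the host's
# models/ directory appears at /app/models inside the container.
MODEL_PATH=/app/models/prestataire.tar.gz

# Minimal payload: just model_file. Leave out the model_server and
# remote_storage stubs entirely rather than filling them with "string".
PAYLOAD=$(printf '{"model_file":"%s"}' "$MODEL_PATH")

# Fail fast on malformed JSON (e.g. a stray trailing comma).
echo "$PAYLOAD" | python3 -m json.tool

# Then send it (requires a Rasa server listening on :5005):
# curl -X PUT http://localhost:5005/model \
#      -H 'Content-Type: application/json' \
#      -d "$PAYLOAD"
```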

Any idea how to use it with "remote_storage":"aws"?

Same question here. Did you ever find an answer for this?
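Not authoritative, but here is a sketch of how the AWS variant is generally wired up in Rasa 1.x: the S3 credentials and bucket name are read from environment variables on the server side (set them where Rasa runs, e.g. in docker-compose), and with remote_storage set, model_file becomes the model's name in the bucket rather than a filesystem path. Verify the exact variable names against your Rasa version's cloud-storage docs:

```shell
# Server-side environment (assumption: standard AWS vars plus Rasa's
# BUCKET_NAME) -- these belong in the Rasa container's environment,
# not in the API request:
# export AWS_ACCESS_KEY_ID=...
# export AWS_SECRET_ACCESS_KEY=...
# export AWS_DEFAULT_REGION=eu-west-1
# export BUCKET_NAME=my-rasa-models      # hypothetical bucket name

# With remote_storage set, model_file names the archive in the bucket.
PAYLOAD='{"model_file":"prestataire.tar.gz","remote_storage":"aws"}'
echo "$PAYLOAD" | python3 -m json.tool   # sanity-check the JSON

# curl -X PUT http://localhost:5005/model \
#      -H 'Content-Type: application/json' \
#      -d "$PAYLOAD"
```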

Please give me some guidance about the absolute path of the model. I used the path "D:/ChatbotCode/ProjectChatBot/CreateChatBot/models/20230629-094247-excited-message.tar.gz", but the server still responds with "Agent with name 'D:/ChatbotCode/ProjectChatBot/CreateChatBot/models/20230629-094247-excited-message.tar.gz' could not be loaded." Please help me. Thanks a lot.
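One likely cause, assuming your Rasa server runs in Docker like earlier in this thread: model_file is resolved on the server's own filesystem, so a Windows host path like D:/... does not exist inside the container. You would use the container-side mount path instead. A sketch (the /app mount point is an assumption, matching the earlier docker run commands):

```shell
# If the server was started with a mount like:
#   docker run -v D:/ChatbotCode/ProjectChatBot/CreateChatBot:/app rasa/rasa ...
# then the model lives at /app/models/... INSIDE the container, and
# that is the path the /model endpoint needs.
MODEL_PATH=/app/models/20230629-094247-excited-message.tar.gz
PAYLOAD=$(printf '{"model_file":"%s"}' "$MODEL_PATH")
echo "$PAYLOAD"

# curl -X PUT http://localhost:5005/model \
#      -H 'Content-Type: application/json' \
#      -d "$PAYLOAD"
```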