Nothing is stopping you from doing just that. In fact, if you run:
```
rasa run nlu --enable-api
```
then you’ll be able to host your trained model as a web service. There’s even a blog post over here that explains the details of getting such a service into production with Docker.
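Once the server is up, you can query the model over HTTP. Here's a minimal sketch, assuming the server is listening on Rasa's default port 5005; the HTTP API exposes a `/model/parse` endpoint that returns the predicted intent and extracted entities as JSON:

```shell
# Assumes a trained model and a server started with `rasa run nlu --enable-api`.
# POST a message to the parse endpoint and get back the intent ranking
# and any extracted entities.
curl -s -X POST http://localhost:5005/model/parse \
  -H "Content-Type: application/json" \
  -d '{"text": "hello there"}'
```

This is the same endpoint any downstream service would call, so it's a quick way to sanity-check the deployment before wiring it into a larger system.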