RASA NLU Use Multiple models at the same time

I have a use case for the NLU part only, where I have to use several models at the same time, so I'm using the code below. But the response time is very high, around 4-6 seconds. Is there any way to use several models at the same time and get a fast response (NLU only)? I cloned the git repo at version 1.10.x.

from rasa.nlu.model import Interpreter

def run_nlu(modelpath, predText):
    # Loading the model inside the function means it is re-loaded on every call
    interpreter = Interpreter.load(modelpath)
    print(interpreter.parse(predText))
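Most of the 4-6 s here is likely model loading, not parsing, since `Interpreter.load` runs on every call. A common fix is to load each client's model once at startup and keep it in memory. Below is a minimal sketch of such a cache; `ModelCache` and `run_nlu` are hypothetical names, and the loader is injected so the cache logic itself carries no Rasa dependency — in practice you would pass `rasa.nlu.model.Interpreter.load` as the loader.

```python
class ModelCache:
    """Keep one loaded interpreter per model path (e.g. per client)."""

    def __init__(self, loader):
        # loader: callable that takes a model path and returns an
        # interpreter-like object with a .parse(text) method.
        # With Rasa 1.x this could be rasa.nlu.model.Interpreter.load.
        self._loader = loader
        self._models = {}

    def get(self, model_path):
        # Load on first access only; later calls reuse the in-memory model.
        if model_path not in self._models:
            self._models[model_path] = self._loader(model_path)
        return self._models[model_path]


def run_nlu(cache, model_path, text):
    # Parsing is now cheap because the expensive load happens at most once.
    return cache.get(model_path).parse(text)
```

With this shape, the first request per client still pays the load cost, but every subsequent request only pays for `parse`. If memory allows, you can also warm the cache for all clients at startup.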

Why do you have multiple models? It might be faster to include them all in one pipeline. But I guess that also depends on what kind of models you have. Can you share some more details?

Hi, in our use case, let's say we have one model per client. How can we run multiple models at the same time?

What exactly do you mean by "include them all in one pipeline"?

So, you have one bot that should handle multiple tasks? If so, you can maybe have a look at Meta bots: why they're probably not the solution to your problems | Akela Drissner - YouTube.