Rasa 3.x: large amount of memory per model

Hello, I have been developing conversational agents for a long time, first at university and now at my own company. I started building them with Rasa 2.0, and at that time I wasn't monitoring memory usage very closely. Now we are building a product for a client that involves speech technologies on a robot, and we have 50 conversational agents built for them, so I would like to know whether there is any way to reduce the memory consumption of the models.

I am using the default pipeline and Rasa 3.3. Currently each model consumes between 800 MB and 1 GB of RAM, depending on the size of the model. Can this be reduced by changing the batch size? A rough sketch of the kind of change I have in mind is included below.

My second question: from my testing it appears that training of the Rasa models runs across all available CPU cores, and I'm wondering whether there is a command or setting to limit this, i.e. to choose how many cores Rasa may use during training or at startup. I have also sketched below what I assume this might look like.
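For reference, this is roughly the kind of change to config.yml I had in mind. I am on the default pipeline, so the component names (DIETClassifier, TEDPolicy) come from the defaults, but the batch_size values here are only placeholders for illustration, not something I have verified:

```yaml
# config.yml sketch (illustrative only, not my actual file)
recipe: default.v1
language: en

pipeline:
  # ... the other default components (tokenizer, featurizers, etc.) ...
  - name: DIETClassifier
    epochs: 100
    batch_size: [32, 64]   # smaller than the default; would this reduce RAM usage?

policies:
  - name: MemoizationPolicy
  - name: TEDPolicy
    epochs: 100
    batch_size: [16, 32]   # same question for the dialogue policy
  - name: RulePolicy
```

Does lowering these values only affect memory during training, or also the footprint of the loaded model while the agent is running?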

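As for limiting the CPU cores, my assumption is that TensorFlow's threading environment variables, or plain CPU pinning at the OS level, might be what I need, but I have not verified this and would appreciate confirmation:

```bash
# Sketch of what I am considering (unverified).

# Limit TensorFlow's thread pools via environment variables:
export TF_INTRA_OP_PARALLELISM_THREADS=2   # threads used within a single operation
export TF_INTER_OP_PARALLELISM_THREADS=2   # threads used across independent operations

rasa train

# Alternatively, pin the process to specific cores on Linux:
taskset -c 0-3 rasa train
```

Do these settings also apply when the models are loaded at startup (rasa run), or only during training?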
If anyone knows the answers to these questions, I would be very grateful, and perhaps you could clarify some things for me.

Kind regards, Daniel