I’ve been attempting to train a rasa_core model on about 150k stories, but unfortunately I keep running out of memory. I’ve even tried a machine with 400 GB of RAM, and it still runs out.
My question is: shouldn’t the batch_size determine how much data is loaded into RAM? Why am I hitting this problem even with a small batch_size? It seems that rasa_core was implemented such that all training data is loaded into memory at once, but hasn’t anyone tried to use a large story dataset yet?
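To illustrate the distinction the question hinges on, here is a minimal Python sketch (not Rasa’s actual code; all function names are hypothetical) of eager loading versus streaming. If the trainer loads every story up front, batch_size only slices an array that is already fully in RAM, so it cannot bound peak memory; a generator-based loader would.

```python
# Hypothetical sketch, NOT rasa_core's implementation: shows why
# batch_size alone doesn't limit RAM when data is loaded eagerly.

def load_all_stories(n_stories):
    # Eager loading: every story lives in memory at once.
    return [f"story_{i}" for i in range(n_stories)]

def batches(data, batch_size):
    # batch_size controls how many items each training step sees,
    # but `data` is already fully resident in RAM at this point.
    for i in range(0, len(data), batch_size):
        yield data[i:i + batch_size]

def stream_stories(n_stories):
    # Lazy alternative: a generator yields one story at a time,
    # so batching over it would keep peak memory proportional
    # to batch_size rather than to the full dataset.
    for i in range(n_stories):
        yield f"story_{i}"

data = load_all_stories(10)
first_batch = next(batches(data, 4))
print(first_batch)  # ['story_0', 'story_1', 'story_2', 'story_3']
```

In the eager path, 150k stories (plus their featurized representations) must all fit in memory regardless of batch_size, which would explain the behavior described above.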