Determine the largest possible batch size

While training rasa-nlu I was getting a ResourceExhaustedError, so I switched to a fixed batch size. But since I will have datasets of varying sizes, I need to know how to calculate the largest possible batch size for any given dataset. Is there any way to do that?

I know that the maximum batch size can be estimated using the formula:

max batch size = available GPU memory bytes / 4 / (size of tensors + trainable parameters)
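
For what it's worth, here is a minimal sketch of plugging numbers into that heuristic. The memory size, per-example tensor size, and parameter count below are placeholder values I made up for illustration, not numbers taken from rasa-nlu:

```python
# Placeholder inputs for the heuristic above (hypothetical values, not from rasa-nlu).
available_gpu_memory_bytes = 8 * 1024**3   # e.g. an 8 GB GPU
tensor_size_per_example = 5_000_000        # assumed: float values needed per training example
trainable_parameters = 20_000_000          # assumed: total model parameter count

# Divide by 4 because each float32 value takes 4 bytes.
max_batch_size = available_gpu_memory_bytes / 4 / (tensor_size_per_example + trainable_parameters)
print(int(max_batch_size))
```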

How do I get the tensor size and the number of trainable parameters?

What do you mean by "size of tensors"? Our algorithms contain a number of different tensors of different sizes; you can find them by looking at the source code. The number of trainable parameters can be found by printing RasaModel.trainable_variables.
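
For example, a rough sketch of counting trainable parameters with TensorFlow (using a plain Keras model as a stand-in, since loading an actual RasaModel is not shown here; inspecting RasaModel.trainable_variables would work the same way):

```python
import numpy as np
import tensorflow as tf

# Hypothetical stand-in model for illustration.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(100,)),
    tf.keras.layers.Dense(10),
])

# Each entry in trainable_variables is a tf.Variable; the number of
# parameters it holds is the product of its shape dimensions.
total_params = sum(int(np.prod(v.shape)) for v in model.trainable_variables)
print(total_params)
```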

By "size of tensors" I meant the size of my dataset. Sorry for my late reply.

@Ghostvv, can you please tell me how I can approach determining the largest possible batch size for any given dataset?

I don’t know how to do it