Rasa-NLU memory footprint and hardware requirements

I am working on a project where I am using rasa-nlu to extract entities from the user’s query, and I have a few questions about Rasa-NLU. What is the memory footprint of a rasa-nlu project built with the spaCy pipeline? At what rate does this memory footprint grow with the amount of training data? How many CPUs and how much RAM are required for rasa-nlu to run smoothly?

# Configuration for Rasa NLU.
# https://rasa.com/docs/rasa/nlu/components/
language: en
pipeline:
  - name: SpacyNLP
    case_sensitive: True
  - name: SpacyTokenizer
    intent_tokenization_flag: True
    intent_split_symbol: " "
  - name: SpacyFeaturizer
  - name: RegexFeaturizer
  - name: CRFEntityExtractor
  - name: EntitySynonymMapper
  - name: "regex.RegexEntityExtractor"
  - name: SklearnIntentClassifier

This is my config file.

Thank you in advance.


For a full setup including Rasa Open Source and Rasa X, we recommend using at least 2 (better 2-6) vCPUs, and having at least 4GB (better 8GB) RAM.

If you are using only rasa-nlu, you’ll probably get away with less memory. It depends on the size of your project; I can’t give you specific estimates for how it scales with, e.g., the number of intents. But if you follow the recommendations for the full setup, you should be on the safe side.
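Since the footprint depends on your specific model and training data, one practical option is to measure it yourself. Below is a minimal sketch (using only the standard-library `resource` module, so Unix/macOS only) that records the process's peak resident set size before and after loading something large. The `spacy.load` call shown in the comment is an assumption about how you would load your pipeline; the demo allocates a plain 80 MB buffer instead so it runs without any model installed:

```python
import resource
import sys

def peak_rss_mb():
    """Peak resident set size of this process, in MB.

    ru_maxrss is reported in kilobytes on Linux and in bytes on macOS.
    """
    rss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    if sys.platform == "darwin":
        return rss / (1024 * 1024)
    return rss / 1024

before = peak_rss_mb()

# In a real check you would load your trained pipeline here, e.g.
#   import spacy; nlp = spacy.load("en_core_web_md")  # hypothetical model name
# For a self-contained demo, allocate roughly 80 MB instead:
blob = bytearray(80 * 1024 * 1024)

after = peak_rss_mb()
print(f"approx. footprint of the loaded object: {after - before:.0f} MB")
```

Running this once with and once without the model load gives you a rough per-model figure, which you can repeat after retraining with more data to see how the footprint grows for your project.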
