NLU training on my data is consuming a lot of time.
Details are as follows:
- Data size: 3.7 MB
- RAM: 32 GB
- No. of cores: 12
- Training time: 40 mins
My NLU config is as follows:
```yaml
language: en
pipeline:
  - name: nlp_spacy
  - name: tokenizer_spacy
  - name: intent_entity_featurizer_regex
  - name: ner_synonyms
  - name: tokenizer_whitespace
  - name: ner_crf
  - name: intent_featurizer_count_vectors
    analyzer: 'WORD'
    min_ngram: 1  # int
    max_ngram: 1  # int
  - name: intent_classifier_tensorflow_embedding
    epochs: 100
```
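For reference, the pipeline above contains two tokenizers (`tokenizer_spacy` and `tokenizer_whitespace`). A minimal sketch of the same config with the duplicate tokenizer removed might look like this (assuming only the spaCy tokenizer is needed; whether this affects training time is not confirmed):

```yaml
language: en
pipeline:
  - name: nlp_spacy
  - name: tokenizer_spacy            # single tokenizer; tokenizer_whitespace dropped
  - name: intent_entity_featurizer_regex
  - name: ner_synonyms
  - name: ner_crf
  - name: intent_featurizer_count_vectors
    analyzer: 'word'                 # assumed lowercase value for the analyzer option
    min_ngram: 1
    max_ngram: 1
  - name: intent_classifier_tensorflow_embedding
    epochs: 100
```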
Can anybody tell me why training is taking so long on such a small dataset?