Problem when using transformer in NLU pipeline

@Pain Pain, I am not an expert in Arabic pipelines, but I guess you may have already seen this; if not, please take a look.

For example, let’s say you’ve found this Arabic model and you’re interested in using it. It’s based on the BERT architecture, so the configuration for Rasa would be:

YAML

- name: LanguageModelFeaturizer
  model_name: bert
  model_weights: asafaya/bert-base-arabic

From here, Rasa will automatically download the model weights on your behalf. There are many BERT models that Rasa supports via this route. The main thing to keep in mind is that BERT models tend to require significant compute resources to run. As a best practice, we recommend benchmarking the pipeline properly to make sure the gain in accuracy is worth the compute cost of these models. It’s entirely possible that adding BERT to a pipeline makes performance worse due to overfitting.
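To put that in context, here is a minimal sketch of how the featurizer typically sits inside a full config.yml. The surrounding components (WhitespaceTokenizer, DIETClassifier) and the epoch count are just common defaults I am assuming for illustration, not something specific to Arabic:

YAML

language: ar

pipeline:
  # A tokenizer is required before the featurizer
  - name: WhitespaceTokenizer
  # BERT-based featurizer from the snippet above
  - name: LanguageModelFeaturizer
    model_name: bert
    model_weights: asafaya/bert-base-arabic
  # Intent classification and entity extraction
  - name: DIETClassifier
    epochs: 100

For the benchmarking itself, running rasa test nlu --cross-validation once with this config and once with a lighter featurizer (for example CountVectorsFeaturizer instead of the BERT one) should show whether the heavier model is actually worth its compute cost.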

Reference: Non-English Tools for Rasa NLU | The Rasa Blog | Rasa

OR

If using Hugging Face: How to Use BERT in Rasa NLU | The Rasa Blog | Rasa

OR

If using spaCy: https://github.com/hashirabdulbasheer/rasa-financial-assistant-arabic-demo/blob/main/config.yml

I hope this will help you further.
