How to integrate BERT into the NLU model?

Hi, I was wondering if anyone has advice on how to integrate BERT embeddings, an intent classifier, etc. into the NLU model?

You’ll have to write custom components for that.
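For orientation, a custom component roughly means a class that plugs into the NLU pipeline and attaches features to each message. Below is a minimal, hypothetical sketch of that shape: the method names (`train`, `process`) and the `provides` attribute follow Rasa's `Component` interface, but Rasa itself and the BERT model are stubbed out here with stand-ins so the sketch runs standalone; `BertFeaturizer`, the `Message` stand-in, and the placeholder embedding are my own illustration, not an existing API.

```python
class Message:
    """Stand-in for Rasa's Message object (text plus attached data)."""

    def __init__(self, text):
        self.text = text
        self.data = {}

    def set(self, key, value):
        self.data[key] = value

    def get(self, key):
        return self.data.get(key)


class BertFeaturizer:
    """Skeleton of a custom featurizer component.

    A real component would subclass rasa.nlu.components.Component,
    declare `name`, `provides`, and `requires`, and load an actual
    BERT/DistilBERT model (e.g. via the transformers library).
    """

    provides = ["text_features"]

    def __init__(self):
        # Placeholder: load your pretrained model here.
        self.model = None

    def _embed(self, text):
        # Placeholder "embedding": a real implementation would run the
        # tokenizer + model and pool the token vectors into one vector.
        return [float(len(text)), float(len(text.split()))]

    def train(self, training_data, cfg=None, **kwargs):
        # Featurize every training example so downstream classifiers
        # (e.g. the intent classifier) can train on the features.
        for example in training_data:
            self.process(example)

    def process(self, message, **kwargs):
        # At inference time, attach features to the incoming message.
        message.set("text_features", self._embed(message.text))


# Usage: the pipeline calls process() on each incoming message.
featurizer = BertFeaturizer()
msg = Message("book a table for two")
featurizer.process(msg)
print(msg.get("text_features"))
```

Downstream components then read `text_features` from the message instead of recomputing embeddings themselves, which is what keeps the pipeline modular.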

Hi @ychen , hi @IgNoRaNt23,

I have created a repo for exactly that, without the need to write custom components yourself.

If you need help, feel free to ask!

Regards, Julian


Hi everyone,

I updated the repo today. Two new features were added:

  • Support for DistilBERT and other transformer-based architectures
  • Support for NER on a transformer-based model

Using DistilBERT, for example, gave very good intent-detection results with a quite reasonable amount of training time. You can now also add your own custom entities and simply use the hybrid of spaCy and Rasa to auto-extract them, e.g. for use in slots.
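To make the slot-filling step concrete: the hybrid approach ultimately comes down to mapping extracted entity spans into the dict format Rasa NLU uses for entities. Here is a minimal sketch of that mapping; the `to_rasa_entities` helper, the `(start, end, label)` span format, and the `extractor` name are my own illustration and not the repo's actual API.

```python
def to_rasa_entities(text, spans):
    """Convert character-offset entity spans into Rasa-style entity dicts.

    spans: list of (start, end, label) tuples, where start/end are
    character offsets into `text` (the convention spaCy also uses).
    """
    entities = []
    for start, end, label in spans:
        entities.append({
            "start": start,
            "end": end,
            "value": text[start:end],   # surface form, used to fill the slot
            "entity": label,            # your custom entity type
            "extractor": "spacy_hybrid",  # illustrative extractor name
        })
    return entities


# Usage: spans as an NER model might return them for this sentence.
text = "Book a flight to Berlin on Monday"
spans = [(17, 23, "city"), (27, 33, "date")]
print(to_rasa_entities(text, spans))
```

Each resulting dict carries everything a slot mapping needs: the entity type to match on and the extracted value to store.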

Regards, Julian
