Introducing The Algorithm Whiteboard!

Hey Rasa @community

Rasa Research Advocate Vincent Warmerdam (@koaning) presents a brand-new series on the Rasa YouTube channel: The Algorithm Whiteboard!

The field of natural language processing is expanding, and we want to make sure the algorithms in it are well understood by the developers who use our tools. The goal of this playlist is to give developers a place to learn about the ideas behind those algorithms and to explain some of our research results.

The first episodes explain our new DIET algorithm for intent classification and entity extraction:

Have suggestions for future episodes?

Vincent is looking forward to reading them in his AMA thread!

Follow Vincent

/fishnets88 /koaning

More on DIET

Looking for more information about DIET? Check out our recent blog post by Senior Evangelist @maddymantha, where she walks through what DIET is, why you'd use it, and how to use it.


The Algorithm Whiteboard: Letter Embeddings is out now! :crown:

Rasa Research Advocate Vincent (@koaning) demonstrates how to train letter embeddings in preparation for the next episode, The Algorithm Whiteboard: Word Embeddings, out March 30! Stay tuned and subscribe to be notified. :bell:

If you would like to reproduce the steps shown in this video yourself, you can find the resources, including links to live Colab notebooks, on GitHub:
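To get a feel for what "training letter embeddings" can mean, here is a minimal sketch of one classic approach: count how often letters co-occur inside words, then factorize that count matrix so each letter gets a dense vector. The word list and dimensionality here are made up for illustration; this is not the notebook from the video.

```python
import numpy as np

# Hypothetical mini-corpus; the video's notebook uses its own dataset.
words = ["cat", "car", "cart", "bat", "bar", "art"]

letters = sorted({ch for w in words for ch in w})
idx = {ch: i for i, ch in enumerate(letters)}

# Count how often two letters appear next to each other within a word.
counts = np.zeros((len(letters), len(letters)))
for w in words:
    for a, b in zip(w, w[1:]):
        counts[idx[a], idx[b]] += 1
        counts[idx[b], idx[a]] += 1

# Factorize the log-smoothed count matrix; each row is a letter embedding.
u, s, _ = np.linalg.svd(np.log1p(counts))
dim = 2
embeddings = u[:, :dim] * s[:dim]

for ch in letters:
    print(ch, embeddings[idx[ch]].round(2))
```

Letters that show up in similar neighbourhoods (like "r" and "t" after "a" here) end up with similar vectors, which is the same intuition that word embeddings build on.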


Hey everyone,

We have a new release for you!

Continuing this series of videos on word embeddings, Vincent explains the two variants of word2vec: continuous bag-of-words (CBOW) and skip-gram!
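The difference between the two variants comes down to which direction the prediction runs: CBOW predicts a centre word from its context, while skip-gram predicts each context word from the centre word. A small sketch (illustrative only, with a toy sentence and a window of 1) of the training pairs each variant generates:

```python
sentence = "the quick brown fox".split()

def cbow_pairs(tokens, window=1):
    # CBOW: predict the centre word from its surrounding context words.
    pairs = []
    for i, target in enumerate(tokens):
        context = tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window]
        pairs.append((context, target))
    return pairs

def skipgram_pairs(tokens, window=1):
    # Skip-gram: predict each context word from the centre word.
    pairs = []
    for i, target in enumerate(tokens):
        for ctx in tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window]:
            pairs.append((target, ctx))
    return pairs

print(cbow_pairs(sentence))      # e.g. (["quick"], "the"), ...
print(skipgram_pairs(sentence))  # e.g. ("the", "quick"), ...
```

In a real word2vec implementation these pairs feed a shallow neural network whose hidden layer becomes the embedding; libraries such as gensim expose the choice between the two variants as a single flag.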


Hey all,

In this latest episode, Vincent explains GloVe embeddings and shows how to train your own variant of these embeddings!

Find the code for this project on GitHub as well as Colab. :slight_smile:
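For a rough idea of what training GloVe involves: the model fits word vectors so that their dot products (plus bias terms) match the log of how often the two words co-occur, with rarer pairs down-weighted. Below is a minimal numpy sketch of that objective with a made-up 3-word co-occurrence matrix; it is an illustration of the idea, not the notebook linked above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy co-occurrence counts X[i, j]; in practice these come from
# scanning a corpus with a context window.
X = np.array([[0., 4., 1.],
              [4., 0., 8.],
              [1., 8., 0.]])
n, dim, lr = X.shape[0], 2, 0.05

# Two embedding sets plus biases, as in the GloVe paper.
W, Wc = rng.normal(size=(n, dim)), rng.normal(size=(n, dim))
b, bc = np.zeros(n), np.zeros(n)

def weight(x, x_max=10.0, alpha=0.75):
    # Down-weight rare pairs, cap the weight for frequent ones.
    return min((x / x_max) ** alpha, 1.0)

for _ in range(500):
    for i in range(n):
        for j in range(n):
            if X[i, j] == 0:
                continue  # only non-zero co-occurrences enter the loss
            diff = W[i] @ Wc[j] + b[i] + bc[j] - np.log(X[i, j])
            g = weight(X[i, j]) * diff
            W[i] -= lr * g * Wc[j]
            Wc[j] -= lr * g * W[i]
            b[i] -= lr * g
            bc[j] -= lr * g

# The paper sums the two embedding sets to get the final word vectors.
vectors = W + Wc
```

After training, word pairs with high co-occurrence counts end up with larger dot products, which is exactly the property the loss enforces.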


What’s special about GloVe vs. other kinds of word embeddings? I should probably watch the video :smiley:

If you want to know what’s special about GloVe, it would help if you saw the video. :wink:

In general, word embeddings all differ because they’re:

  • using different techniques to embed words as vectors of floats
  • trained on different datasets

Hey all,

Here’s a brand-new video for you! Vincent demonstrates a new tool that we’ve open-sourced called “whatlies”. The goal of the package is to help you find out “what lies” in word embeddings!
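One core idea behind exploring "what lies" in embeddings is simply comparing word vectors to each other, most often by cosine similarity. The toy vectors below are invented for illustration (whatlies itself wraps real embedding backends rather than hand-made arrays):

```python
import numpy as np

# Hypothetical 3-dimensional embeddings, hand-made for the example.
emb = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.8, 0.9, 0.1]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine(a, b):
    # Cosine similarity: angle between vectors, ignoring their length.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(emb["king"], emb["queen"]))  # related words: close to 1
print(cosine(emb["king"], emb["apple"]))  # unrelated words: much lower
```

Tools like whatlies build visualizations and transformations on top of exactly this kind of vector arithmetic.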

Find the documentation here, and the GitHub repository for the project below:

Need to catch up on past episodes?

Check the full Embeddings playlist!


Where can I get the original image of the DIET architecture?

The original image can be found in the paper on arXiv, linked here.