Installed Rasa but getting error: No module named 'rasa_nlu'

Hello there! I installed Rasa using a virtual environment with Python 3.8. When I try to run the following:

from rasa_nlu.converters import load_data
training_data = load_data("./training_data.json")

I get this error:

> Traceback (most recent call last):
>   File "/Users/alexmoore/Library/Mobile Documents/com~apple~CloudDocs/practice/", line 1, in <module>
>     from rasa_nlu.converters import load_data
> ModuleNotFoundError: No module named 'rasa_nlu'

If I type

rasa --version

into the terminal, I get:

Rasa Version      : 2.0.3
Rasa SDK Version  : 2.0.0
Rasa X Version    : None
Python Version    : 3.8.6
Operating System  : macOS-10.15.5-x86_64-i386-64bit
Python Path       : /Library/Frameworks/Python.framework/Versions/3.8/bin/python3.8

alexmoore@Alexs-MacBook-Pro eli % rasa_nlu --version
zsh: command not found: rasa_nlu
alexmoore@Alexs-MacBook-Pro eli % “rasa_nlu” --version
zsh: command not found: rasa_nlu

I am so frustrated. Any idea what is going on??

Ah, it looks like you might be using code from the legacy docs that’s no longer supported in 2.x. The `rasa_nlu` package was merged into the main `rasa` package, so that module doesn’t exist anymore. If you’re trying to load data to train an NLU model, the `load_data` function you want now lives in `rasa.shared.nlu.training_data.loading`.

So you’d rewrite your import like so:

from rasa.shared.nlu.training_data.loading import load_data
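If you want your script to work regardless of which Rasa generation is installed, you can resolve the function at runtime instead of hard-coding one import path. This is just a sketch: the two module paths are the 2.x location and the legacy 1.x location mentioned above, and `resolve_load_data` is a hypothetical helper name, not part of Rasa's API.

```python
import importlib

def resolve_load_data():
    """Return the load_data callable from whichever Rasa layout is installed.

    Tries the Rasa 2.x module path first, then the legacy rasa_nlu path.
    Returns None if neither is importable (e.g. Rasa isn't installed).
    """
    candidates = (
        "rasa.shared.nlu.training_data.loading",  # Rasa 2.x
        "rasa_nlu.converters",                    # legacy Rasa NLU 1.x
    )
    for module_name in candidates:
        try:
            module = importlib.import_module(module_name)
        except ImportError:
            continue  # this layout isn't installed; try the next one
        func = getattr(module, "load_data", None)
        if func is not None:
            return func
    return None

load_data = resolve_load_data()
if load_data is not None:
    training_data = load_data("./training_data.json")
```

This keeps the script from crashing with `ModuleNotFoundError` at import time and makes the failure mode explicit instead.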

Hi @rctatman, so is Rasa 2.0 fully compatible with Python 3.8? I ask because I just started a project and installed 2.0 in a virtual environment with Python 3.8. So far I haven’t encountered any problems, but I’ve only just begun and will start over with Python 3.7 if it’s more stable. :slightly_smiling_face:


Thanks all for the help. I am still exploring Rasa. When I posted my initial question I was trying to train a model manually. I decided to back up and just run

rasa init

and start with the pre-loaded documents. Everything is working fine so far.

@jbh I believe Python 3.8 is officially supported/included in our testing right now, but you may still run into some edge case bugs. Please do report those as issues on our GitHub if you run into them!