Exception: ImportError: cannot import name 'Interpreter' from 'rasa.nlu.model'

Hi Team,

I am trying to run just intent classification using Rasa NLU. It runs fine on my local machine, but I get an error when it is deployed as an Azure Function.

I am receiving an import error for `Interpreter`. The code runs in VS Code on my local machine (Windows, Python 3.6.6) but fails when I deploy it as an Azure Function (per the logs, the machine is Linux with Python 3.7). Below are the files, code, and error. I am using a pre-trained model whose directory I named “nlu_new”, and locally I am able to get predictions.

requirements.txt ------------------------------------------------------------------------>

azure-functions
rasa
rasa[transformers]

__init__.py ------------------------------------------------------------------------------------------->

import logging
import azure.functions as func
from rasa.nlu.model import Interpreter
import json


def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    message = req.get_json()
    logging.info(message)
    msg1 = message.get("text")
    model_directory1 = './nlu_new'
    interpreter = Interpreter.load(model_directory1)  # this should be an extracted model

    result = interpreter.parse(msg1, only_output_properties=False)
    new_result = dict(
        text=result['text'],
        intent=result['intent'],
        entities=result['entities'],
        intent_ranking=result['intent_ranking'],
    )
    logging.info(new_result)

    return func.HttpResponse(
        json.dumps(new_result),
        mimetype="application/json",
        status_code=200,
    )

Model Trained on ---------------------------------------------------------- >

language: en

pipeline:
- name: HFTransformersNLP
  model_weights: "bert-base-uncased"
  model_name: "bert"
- name: LanguageModelTokenizer            # splits the sentence into tokens
- name: LanguageModelFeaturizer

- name: DIETClassifier

Result: Failure
Exception: ImportError: cannot import name 'Interpreter' from 'rasa.nlu.model' (/home/site/wwwroot/.python_packages/lib/site-packages/rasa/nlu/model.py)
Troubleshooting Guide: https://aka.ms/functions-modulenotfound
Stack:
  File "/azure-functions-host/workers/python/3.7/LINUX/X64/azure_functions_worker/dispatcher.py", line 309, in _handle__function_load_request
    func_request.metadata.entry_point)
  File "/azure-functions-host/workers/python/3.7/LINUX/X64/azure_functions_worker/utils/wrappers.py", line 42, in call
    raise extend_exception_message(e, message)
  File "/azure-functions-host/workers/python/3.7/LINUX/X64/azure_functions_worker/utils/wrappers.py", line 40, in call
    return func(*args, **kwargs)
  File "/azure-functions-host/workers/python/3.7/LINUX/X64/azure_functions_worker/loader.py", line 85, in load_function
    mod = importlib.import_module(fullmodname)
  File "/usr/local/lib/python3.7/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
  File "<frozen importlib._bootstrap>", line 983, in _find_and_load
  File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/site/wwwroot/VA_AZFUNC1/__init__.py", line 5, in <module>
    from rasa.nlu.model import Interpreter

@akgarg00 can you share the output of `rasa --version`?

2.8.15

Are you sure the rasa version on your local machine is the same as the one on Azure? I see a difference in Python versions between your two environments.
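One way to confirm which version actually got deployed (a sketch, not tested on your app): log the installed version before the failing import so it reaches the Azure logs even when the import itself raises. The `installed_version` helper name is mine; `pkg_resources` ships with setuptools and works on both Python 3.6 and 3.7.

```python
import logging
import pkg_resources  # ships with setuptools; available on Python 3.6 and 3.7


def installed_version(dist_name: str) -> str:
    """Return the installed version string of a distribution, e.g. 'rasa'."""
    return pkg_resources.get_distribution(dist_name).version


# In __init__.py, place this BEFORE `from rasa.nlu.model import Interpreter`:
#   logging.info("Deployed rasa version: %s", installed_version("rasa"))
# Demo below uses setuptools, a package present in any environment:
print(installed_version("setuptools"))
```

If the two environments print different versions, that mismatch (not the code) is the cause of the import error.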

I had already tried checking the version, but I could not find it: the failing import was always hit first :frowning: However, I made a change to requirements.txt

rasa==2.8.15
transformers==2.11.0
rasa[transformers]

and my code ran. It seems the issue was indeed the rasa version. On my local machine I had Python 3.6.6, so pip resolved rasa 2.8.15 and a compatible transformers, but on the Azure Function server Python 3.7 pulled the latest rasa and latest transformers.

But now I have another issue, and I really need some help with it. The command

interpreter = Interpreter.load(model_directory1)

is taking too long to load the model (it builds the model from scratch, downloading the BERT vocabulary and other configuration files). This happens on every invocation of the Azure Function and takes nearly 2 minutes. Is there a way to save this interpreter object and just consume it in the Azure Function?

Try Python’s LRU cache (`functools.lru_cache`), so the load happens once per worker process instead of once per request.
