Unable to deploy on REST

Hi, I just ran into an issue when trying to deploy my bot using RestInput and Chatroom. I'm using the master version of Rasa Core.

I am not sure whether I should use a standard Rasa Core project or a custom Rasa Core project on the CLI, so I tried both. First I followed the steps in Chatroom ("Usage with a standard Rasa Core project"), but when I run

python -m rasa_utils.bot -d models/dialogue -u models/current/nlu

The terminal returns:

Traceback (most recent call last):
  File "/Users/wisionlearning/anaconda3/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/Users/wisionlearning/anaconda3/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/Users/wisionlearning/Documents/dd0926/rasa_utils/bot.py", line 21, in <module>
    from rasa_core.channels import (
ImportError: cannot import name 'RestInput'
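For what it's worth, here is a small diagnostic sketch I can use to check whether the installed rasa_core exposes RestInput at all (the probing logic is my own, not from the docs; it assumes a pip-installed rasa_core):

```python
import importlib.util

# Diagnostic sketch: does the installed rasa_core expose the REST channel?
spec = importlib.util.find_spec("rasa_core")
if spec is None:
    print("rasa_core is not installed in this environment")
else:
    import rasa_core
    print("rasa_core version:", rasa_core.__version__)
    try:
        from rasa_core.channels import RestInput
        print("RestInput is importable")
    except ImportError:
        print("RestInput is NOT importable -> installed version may be too old")
```

If RestInput is not importable, it would suggest the environment is picking up an older rasa_core than the master checkout.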

I also followed the steps in the Chat & Voice platforms docs for the REST channel:

I created a credentials.yml exactly as described:

rest:
  # you don't need to provide anything here - this channel doesn't
  # require any credentials

Then I ran:

python -m rasa_core.run -d models/dialogue -u models/current/nlu/ --port 5002 --credentials credentials.yml

I got:

Using TensorFlow backend.
Traceback (most recent call last):
  File "/Users/wisionlearning/anaconda3/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/Users/wisionlearning/anaconda3/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/Users/wisionlearning/rasa_core/rasa_core/run.py", line 234, in <module>
    nlu_endpoint)
  File "/Users/wisionlearning/rasa_core/rasa_core/run.py", line 203, in main
    generator=nlg_endpoint)
  File "/Users/wisionlearning/rasa_core/rasa_core/agent.py", line 81, in load
    ensemble = PolicyEnsemble.load(path)
  File "/Users/wisionlearning/rasa_core/rasa_core/policies/ensemble.py", line 198, in load
    policy = policy_cls.load(policy_path)
  File "/Users/wisionlearning/rasa_core/rasa_core/policies/keras_policy.py", line 264, in load
    model_arch = cls._load_model_arch(path, meta)
  File "/Users/wisionlearning/rasa_core/rasa_core/policies/keras_policy.py", line 240, in _load_model_arch
    arch_file = os.path.join(path, meta["arch"])
KeyError: 'arch'
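The KeyError means the persisted KerasPolicy metadata has no "arch" entry. As a diagnostic sketch (the policy directory and metadata filename below are assumptions about how my model was persisted; adjust the paths to your own model), I can inspect which keys the metadata actually contains:

```python
import json
import os

# Assumed layout of a persisted KerasPolicy; adjust to your model directory.
policy_dir = "models/dialogue/policy_0_KerasPolicy"
meta_file = os.path.join(policy_dir, "keras_policy.json")  # filename is an assumption

if os.path.exists(meta_file):
    with open(meta_file) as f:
        meta = json.load(f)
    # If "arch" is missing here, the model was trained by a rasa_core version
    # that persists the Keras model differently than the version loading it.
    print("metadata keys:", sorted(meta.keys()))
else:
    print("no policy metadata found at", meta_file)
```

A missing "arch" key would point to a mismatch between the rasa_core version that trained the model and the one serving it.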

Next, I tried "Simple usage with a custom Rasa Core project on the CLI".

credentials.yml:

</Users/wisionlearning/Documents/dd0926/rasa_utils>/bot_server_channel.BotServerInputChannel:
# pass

Then I ran:

python -m rasa_core.run -vv \
  --core models/dialogue  \
  --nlu models/current/nlu  \
  --endpoints endpoints.yml \
  --credentials credentials.yml

I got the same error:

Using TensorFlow backend.
Traceback (most recent call last):
  File "/Users/wisionlearning/anaconda3/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/Users/wisionlearning/anaconda3/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/Users/wisionlearning/rasa_core/rasa_core/run.py", line 234, in <module>
    nlu_endpoint)
  File "/Users/wisionlearning/rasa_core/rasa_core/run.py", line 203, in main
    generator=nlg_endpoint)
  File "/Users/wisionlearning/rasa_core/rasa_core/agent.py", line 81, in load
    ensemble = PolicyEnsemble.load(path)
  File "/Users/wisionlearning/rasa_core/rasa_core/policies/ensemble.py", line 198, in load
    policy = policy_cls.load(policy_path)
  File "/Users/wisionlearning/rasa_core/rasa_core/policies/keras_policy.py", line 264, in load
    model_arch = cls._load_model_arch(path, meta)
  File "/Users/wisionlearning/rasa_core/rasa_core/policies/keras_policy.py", line 240, in _load_model_arch
    arch_file = os.path.join(path, meta["arch"])
KeyError: 'arch'

Should I get the basic usage working first? When I run yarn install, it returns many errors. Thanks for the help.

It seems like a problem in the rasa_core code itself. I made a change locally and raised an issue on GitHub.