Interactive Learning on Docker

Hi,

I don't have the exact command to run interactive learning on Docker. I am trying this command:

sudo docker run -v $(pwd):/app/project -v $(pwd)/models/rasa_core:/app/models rasa/rasa_core:latest train interactive --core project/model/rasa_core/dialogue --stories project/data/stories.md -d project/domain.yml --endpoints project/config/endpoints.yml -u project/models/rasa_nlu/current

I am getting this error:

usage: train.py [-h] {default,compare,interactive} ...
train.py: error: argument mode: invalid choice: 'project/stories.md' (choose from 'default', 'compare', 'interactive')

@Juste, please help me with this.

Hi,

I had similar issues and finally got the bot to work with this command.

docker run -v $(pwd):/app/project -v $(pwd)/models/rasa_core:/app/models rasa/rasa_core:latest train --domain /app/project/domain.yml --stories /app/project/stories.md --out models

Sometimes I get this error:

File "/usr/local/lib/python3.6/shutil.py", line 482, in rmtree
    os.rmdir(path)
OSError: [Errno 16] Device or resource busy: 'models'

but when I run the same command again, it works…

… 24/24 [==============================] - 0s 495us/step - loss: 1.0936 - acc: 0.5000 - val_loss: 1.4416 - val_acc: 0.6667
2019-01-01 16:32:35 INFO rasa_core.policies.keras_policy - Done fitting keras policy model
2019-01-01 16:32:36 INFO rasa_core.agent - Persisted model to '/app/models'
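
My guess is that the OSError happens because --out models points at the bind-mounted /app/models directory, and rmtree cannot remove a directory that is a mount point. A possible workaround (untested sketch; the subdirectory name "current" is just illustrative) is to write the trained model to a subdirectory of the mount instead of the mount point itself:

# Assumption: the "Device or resource busy" error comes from rmtree() trying to
# delete the bind-mounted /app/models mount point before persisting the model.
# Writing to a subdirectory of the mount lets it remove and recreate a normal
# directory instead of the mount point.
docker run -v $(pwd):/app/project -v $(pwd)/models/rasa_core:/app/models \
  rasa/rasa_core:latest \
  train --domain /app/project/domain.yml --stories /app/project/stories.md \
  --out /app/models/current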

Hope it helps.

-Sreeni

Hi @rsreeni, train worked perfectly for me and I had the bot up and running. I was trying to start interactive training to generate stories, as shown here,

but that doesn't explain how to start it when deployed on Docker.

Yes. Sorry for the confusion… I haven’t gotten down to that yet.

Thanks,

-Sreeni

Hi @stacybot123,

I tried this command:

docker run -v $(pwd):/app/project -v $(pwd)/models/rasa_core:/app/models rasa/rasa_core:latest run python -m rasa_core.train --interactive -o models -d /app/project/domain.yml -s /app/project/stories.md --nlu /app/models --endpoints /app/project/config/endpoints.yml

I get an error about a missing pygraphviz module… do you have that dependency satisfied?

File "/app/rasa_core/training/interactive.py", line 1015, in _plot_trackers
    write_dot(graph, output_file)
File "/usr/local/lib/python3.6/site-packages/networkx/drawing/nx_agraph.py", line 191, in write_dot
    'http://pygraphviz.github.io/')
ImportError: ('requires pygraphviz ', 'http://pygraphviz.github.io/')
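
If the image really is missing pygraphviz, I suppose one workaround would be a small derived image that installs it (untested sketch; it assumes the rasa/rasa_core image is Debian-based and lets you install system packages):

# Build a derived image that adds graphviz plus the pygraphviz Python bindings,
# then use it in place of rasa/rasa_core:latest in the docker run commands above.
docker build -t rasa_core_pygraphviz - <<'EOF'
FROM rasa/rasa_core:latest
RUN apt-get update && \
    apt-get install -y --no-install-recommends gcc graphviz libgraphviz-dev && \
    pip install pygraphviz && \
    rm -rf /var/lib/apt/lists/*
EOF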

Thx,

-Sreeni

I have Python 3.7 installed on my Mac. It looks like rasa_core.train only works with Python 3.6… Is there a way to override that when I call rasa_core.train?

Any help, anyone?

Thanks,

-Sreeni

@rsreeni, --interactive is not an argument. I tried

docker run -v $(pwd):/app/project -v $(pwd)/models/rasa_core:/app/models rasa/rasa_core:latest run python -m rasa_core.train interactive -o models -d /app/project/domain.yml -s /app/project/data/stories.md --nlu /app/models --endpoints /app/project/config/endpoints.yml

This worked, but after the epochs finished, I got this error:

2019-01-02 08:44:59 INFO     rasa_core.policies.keras_policy  - Done fitting keras policy model
Processed actions: 99it [00:00, 10063.16it/s, # examples=99]
2019-01-02 08:45:00 INFO     rasa_core.agent  - Persisted model to '/app/models'
2019-01-02 08:45:00 INFO     rasa_core.training.interactive  - Rasa Core server is up and running on http://localhost:5005
Processed Story Blocks: 100%|███████Bot loaded. Visualisation at http://localhost:5005/visualization.html.Ctr-c to abort):
Type a message and press enter (press 'Ctr-c' to exit).
? Next user input (Ctr-c to abort):

Warning: Output is not to a terminal (fd=1).
Warning: Input is not to a terminal (fd=0).
2019-01-02 08:45:00 ERROR    rasa_core.training.interactive  - An exception occurred while recording messages.
Traceback (most recent call last):
  File "/app/rasa_core/training/interactive.py", line 1213, in record_messages
    _enter_user_message(sender_id, endpoint)
  File "/app/rasa_core/training/interactive.py", line 1063, in _enter_user_message
    lambda a: not a)
  File "/app/rasa_core/training/interactive.py", line 313, in _ask_or_abort
    answers = questions.ask()
  File "/usr/local/lib/python3.6/site-packages/questionary/question.py", line 17, in ask
    return self.unsafe_ask(patch_stdout)
  File "/usr/local/lib/python3.6/site-packages/questionary/question.py", line 27, in unsafe_ask
    return self.application.run()
  File "/usr/local/lib/python3.6/site-packages/prompt_toolkit/application/application.py", line 699, in run
    return run()
  File "/usr/local/lib/python3.6/site-packages/prompt_toolkit/application/application.py", line 673, in run
    return f.result()
  File "/usr/local/lib/python3.6/site-packages/prompt_toolkit/eventloop/future.py", line 149, in result
    raise self._exception
  File "/usr/local/lib/python3.6/site-packages/prompt_toolkit/eventloop/coroutine.py", line 90, in step_next
    new_f = coroutine.throw(exc)
  File "/usr/local/lib/python3.6/site-packages/prompt_toolkit/application/application.py", line 652, in _run_async2
    result = yield f
  File "/usr/local/lib/python3.6/site-packages/prompt_toolkit/eventloop/coroutine.py", line 90, in step_next
    new_f = coroutine.throw(exc)
  File "/usr/local/lib/python3.6/site-packages/prompt_toolkit/application/application.py", line 604, in _run_async
    result = yield From(f)
EOFError
Exception in thread Thread-7:
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/local/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "/app/rasa_core/training/interactive.py", line 1213, in record_messages
    _enter_user_message(sender_id, endpoint)
  File "/app/rasa_core/training/interactive.py", line 1063, in _enter_user_message
    lambda a: not a)
  File "/app/rasa_core/training/interactive.py", line 313, in _ask_or_abort
    answers = questions.ask()
  File "/usr/local/lib/python3.6/site-packages/questionary/question.py", line 17, in ask
    return self.unsafe_ask(patch_stdout)
  File "/usr/local/lib/python3.6/site-packages/questionary/question.py", line 27, in unsafe_ask
    return self.application.run()
  File "/usr/local/lib/python3.6/site-packages/prompt_toolkit/application/application.py", line 699, in run
    return run()
  File "/usr/local/lib/python3.6/site-packages/prompt_toolkit/application/application.py", line 673, in run
    return f.result()
  File "/usr/local/lib/python3.6/site-packages/prompt_toolkit/eventloop/future.py", line 149, in result
    raise self._exception
  File "/usr/local/lib/python3.6/site-packages/prompt_toolkit/eventloop/coroutine.py", line 90, in step_next
    new_f = coroutine.throw(exc)
  File "/usr/local/lib/python3.6/site-packages/prompt_toolkit/application/application.py", line 652, in _run_async2
    result = yield f
  File "/usr/local/lib/python3.6/site-packages/prompt_toolkit/eventloop/coroutine.py", line 90, in step_next
    new_f = coroutine.throw(exc)
  File "/usr/local/lib/python3.6/site-packages/prompt_toolkit/application/application.py", line 604, in _run_async
    result = yield From(f)
EOFError
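
The "Output is not to a terminal" / "Input is not to a terminal" warnings plus the EOFError from questionary suggest the container has no TTY or stdin attached, so the interactive prompt cannot read input. A hedged guess: the same command with Docker's -i and -t flags should fix that part:

# -i keeps STDIN open and -t allocates a pseudo-TTY; the questionary prompt needs both.
docker run -it -v $(pwd):/app/project -v $(pwd)/models/rasa_core:/app/models \
  rasa/rasa_core:latest run python -m rasa_core.train interactive \
  -o models -d /app/project/domain.yml -s /app/project/data/stories.md \
  --nlu /app/models --endpoints /app/project/config/endpoints.yml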

@rsreeni, I have opened an issue on GitHub; hopefully a team member will help with that.

Somehow when I run mine with interactive (without the --), I get an error telling me that --interactive is the expected option. Also, on my attempt I do get the same message showing the bot is up…

2019-01-02 13:40:50 INFO rasa_core.policies.keras_policy - Done fitting keras policy model
2019-01-02 13:40:51 INFO rasa_core.agent - Persisted model to '/app/models'
2019-01-02 13:40:51 INFO rasa_core.training.interactive - Rasa Core server is up and running on http://localhost:5005
Processed Story Blocks: 100%|██████████| 3/3 [00:00<00:00, 1517.48it/s, # trackers=1]
2019-01-02 13:40:51 ERROR rasa_core.training.interactive - An exception occurred while recording messages.
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/networkx/drawing/nx_agraph.py", line 188, in write_dot
    import pygraphviz
ModuleNotFoundError: No module named 'pygraphviz'

Then it fails further down looking for pygraphviz.

Here is the output I get indicating that --interactive should be used:

docker run -v $(pwd):/app/project -v $(pwd)/models/rasa_core:/app/models rasa/rasa_core:latest run python -m rasa_core.train interactive -o models -d /app/project/domain.yml -s /app/project/data/stories.md --nlu /app/models --endpoints /app/project/config/endpoints.yml

usage: train.py [-h] (-s STORIES | --url URL | --core CORE) [-o OUT] [-d DOMAIN]
                [-u NLU] [--history HISTORY] [--epochs EPOCHS]
                [--validation_split VALIDATION_SPLIT] [--batch_size BATCH_SIZE]
                [--interactive] [--skip_visualization] [--finetune]
                [--augmentation AUGMENTATION] [--debug_plots] [--dump_stories]
                [--endpoints ENDPOINTS] [--nlu_threshold NLU_THRESHOLD]
                [--core_threshold CORE_THRESHOLD]
                [--fallback_action_name FALLBACK_ACTION_NAME] [-v] [-vv] [--quiet]
train.py: error: unrecognized arguments: interactive

-Sreeni

I used the -it flag and it's working. The problem now is that it's not able to parse questions via the NLU server.

The command I used is:

sudo docker run -it -v $(pwd):/app/project -v $(pwd)/models/rasa_core:/app/models rasa/rasa_core:latest run python -m rasa_core.train interactive -o /app/project/models/rasa_core/ -s /app/project/data/stories.md -d /app/project/domain.yml --endpoints /app/project/config/endpoints.yml -u /app/project/rasa_nlu/current

But now I am getting this error:

ERROR    rasa_core.interpreter  - Failed to parse text 'hello' using rasa NLU over http. Error: HTTPConnectionPool(host='localhost', port=5000): Max retries exceeded with url: /parse?model=current&project=%2Fapp%2Fproject%2Frasa_nlu&q=hello (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f7df04bf710>: Failed to establish a new connection: [Errno 110] Connection timed out',))

Any help?
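
One thing to check: inside the container, localhost refers to the container itself, not your machine, so an NLU server listening on the host (or in another container) is not reachable at localhost:5000. If the NLU and action servers run directly on the Mac host, an endpoints.yml along these lines might work with Docker for Mac (host.docker.internal is Docker Desktop's alias for the host machine; this assumes the servers really are on the host rather than in containers):

action_endpoint:
  url: http://host.docker.internal:5055/webhook

nlu:
  url: http://host.docker.internal:5000

If they run in containers instead, the usual approach is a shared Docker network with the container names in the URLs (more on that further down).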

I tried mine with the -it flag and got a bit further, but mine is still stuck on pygraphviz…

:face_with_head_bandage:

With Docker, I believe the environment inside the container is created from the image you pull, so it should not depend on which version of Python you have installed on your Mac.

Can you try deleting the image and pulling it again?
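
Something like this should do it (stop or remove any containers that still use the image first):

# Drop the locally cached image and pull the latest tag again.
docker rmi rasa/rasa_core:latest
docker pull rasa/rasa_core:latest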

Thanks a lot!! That did the trick with the pygraphviz error…

I had to force-remove the image and pull latest again… Now at least it's up and asking for user input. Now I get a similar NLU parse error, but a slightly different one…

Failed to establish a new connection: [Errno -2] Name or service not known…

Trying to check the NLU services.

:+1: Can you share your endpoints.yml contents? Is it like the one below?

action_endpoint:
  url: http://localhost:5055/webhook

nlu:
  url: http://localhost:5000

Yes… here it is:

action_endpoint:
    url: http://action_server:5055/webhook
nlu:
    url: 'http://rasa_nlu:5000'

I think the action_server and rasa_nlu servers will be running once you deploy the bot with the containers on a network they share, so Rasa Core reaches these two services by the container names action_server and rasa_nlu.
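
For example, something along these lines (a sketch only; the network name is illustrative, and the NLU/action-server images and their start commands are placeholders you would replace with your own):

# Put all three containers on one user-defined network so the hostnames in
# endpoints.yml (rasa_nlu, action_server) resolve to the right containers.
docker network create rasa_net
docker run -d --name rasa_nlu      --network rasa_net <rasa-nlu-image> <nlu start command>
docker run -d --name action_server --network rasa_net <action-server-image>
docker run -it --network rasa_net \
  -v $(pwd):/app/project -v $(pwd)/models/rasa_core:/app/models \
  rasa/rasa_core:latest run python -m rasa_core.train interactive \
  -o /app/project/models/rasa_core -s /app/project/data/stories.md \
  -d /app/project/domain.yml --endpoints /app/project/config/endpoints.yml \
  -u /app/project/rasa_nlu/current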

I had the same before, but then I changed these two to localhost to check if that resolves the issue. No luck. I think it's not starting the NLU server.

I tried running both the action and NLU servers and then training Core, but it's still not working.

I tried the same too and got stuck on the same thing :disappointed:

When I started my Core and NLU servers with the endpoints referring to localhost, my regular REST call, which had been working before, started getting the same error I got in the interactive trainer… Once I switched it back to rasa_nlu:5000, my regular calls are working again, but interactive training gets the error that the service rasa_nlu is unknown. It looks like the translation you mentioned before, where Core resolves rasa_nlu:5000 via the shared network, is not happening in interactive mode…

[02/Jan/2019:23:30:58 +0000] "GET /parse?model=&project=current&q=hello HTTP/1.1" 200 684 "-" "python-requests/2.20.0"

From the interpreter:

rasa_core.interpreter - Failed to parse text 'hello' using rasa NLU over http. Error: HTTPConnectionPool(host='rasa_nlu', port=5000): Max retries exceeded with url: /parse?model=models&project=%2Fapp%2Fproject&q=hello (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fb4f30ecb70>: Failed to establish a new connection: [Errno -2] Name or service not known',))

@rsreeni, did you get this working? I got a response; here is the link to the GitHub issue.