Starter Pack running, listening on port 5005, what now, how to connect?


So I have installed the starter pack, and when I talk to the bot on the command line in Docker, it answers me.

When I run “make cmdline” the bot listens on port 5005; so far, so good.

The next step is a mystery to me: how can I send text from a web page to that port and receive the answer? I have perused the FAQ and tutorials, but to my understanding there is a missing link between the starter-pack and Sara chatbot tutorials on the one hand, and the “tutorials” for webchat (GitHub - mrbot-ai/rasa-webchat: A chat widget easy to connect to chatbot platforms such as Rasa Core) on the other.

So my best effort so far:

I have tried to use the chatroom widget (GitHub - scalableminds/chatroom: React-based Chatroom Component for Rasa Stack) on an HTML page and set the port to 5005. When I load the page I do see some requests coming in, such as: “GET /webhooks/chatroom/conversations (…) HTTP 1.1” 404

Apparently the chatroom widget does not create requests that the starter-pack server understands.

Back to my question: how do I know what kind of requests I have to send to the starter-pack chatbot? What do they have to look like, and how would I find that out?

Thank you

@lancaster Mike, the scalableminds Chatroom component requires a custom channel on the server, which they ship as part of their project (it’s in the rasa_utils subdirectory). The part of their documentation you need is this bit.

If you want to use the rasa-webchat front end, you can use the standard socketio channel that is included with the rasa_core project. You do need to set up the credentials.yml file for this. Documentation is here.
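For reference, the socketio entry in credentials.yml typically looks something like the sketch below. The event names shown are the defaults rasa-webchat expects; treat this as a sketch and verify the exact keys against the documentation for your rasa_core version:

```yaml
socketio:
  user_message_evt: user_uttered
  bot_message_evt: bot_uttered
  session_persistence: false
```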

In my experience, both front ends are a little fiddly to set up, but they do produce the correct calls as long as you follow the instructions carefully. In the end I went with the webchat component, as I found the SM chatroom component too laggy.
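As for the “what do the requests have to look like” part: each input channel defines its own endpoint and payload format, which is why a widget only works when its matching channel is running on the server. As a concrete illustration, here is a sketch of the JSON body that rasa_core’s built-in rest channel accepts (this assumes the rest channel is enabled in credentials.yml; the widget channels use the same idea but their own endpoints and event formats):

```python
import json

def build_rest_payload(sender_id, text):
    """JSON body for POST http://localhost:5005/webhooks/rest/webhook
    (assumes the built-in "rest" channel is enabled in credentials.yml)."""
    return json.dumps({"sender": sender_id, "message": text})

# Example payload for one user message:
print(build_rest_payload("user-123", "hello"))
```

Sending such a payload with curl or a small script is a quick way to verify the server side is responding before you start debugging a front-end widget.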

Can you post the code you used for embedding the webchat component into your webpage?

Hi, thanks for helping.

So I tried to use the scalableminds chatroom, and I added this to my MAKEFILE (after copying bot_utils and downloading the dependencies):

run-bot:
	python -m -d models/current/dialogue -u models/current/nlu

That results in errors that I can’t interpret:

mtr@nb17378 MINGW64 /c/Rasa/starter_pack/starter-pack-rasa-stack (master)
$ make run-bot
python -m -d models/current/dialogue -u models/current/nlu
C:\Python368\lib\site-packages\ FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.
  from ._conv import register_converters as _register_converters
2019-04-04 17:42:45 INFO root - Rasa process starting
2019-04-04 17:42:49 INFO rasa_nlu.components - Added 'nlp_spacy' to component cache. Key 'nlp_spacy-en'.
2019-04-04 17:42:49 WARNING py.warnings - C:\Python368\lib\site-packages\rasa_nlu\extractors\ UserWarning: Failed to load synonyms file from 'models/current/nlu\entity_synonyms.json'
  "".format(entity_synonyms_file))

2019-04-04 17:42:49 WARNING py.warnings - C:\Python368\lib\site-packages\pykwalify\ UnsafeLoaderWarning: The default 'Loader' for 'load(stream)' without further arguments can be unsafe. Use 'load(stream, Loader=ruamel.yaml.Loader)' explicitly if that is OK. Alternatively include the following in your code:

  import warnings
  warnings.simplefilter('ignore', ruamel.yaml.error.UnsafeLoaderWarning)

In most other cases you should consider using 'safe_load(stream)'
  data = yaml.load(stream)

2019-04-04 17:43:10.094151: I T:\src\github\tensorflow\tensorflow\core\platform\] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2
Traceback (most recent call last):
  File "C:\Python368\lib\", line 193, in _run_module_as_main
    "main", mod_spec)
  File "C:\Python368\lib\", line 85, in _run_code
    exec(code, run_globals)
  File "C:\Rasa\starter_pack\starter-pack-rasa-stack\rasa_utils\", line 150, in
    tracker_store=_tracker_store)
  File "C:\Rasa\starter_pack\starter-pack-rasa-stack\rasa_utils\", line 120, in load_agent
    action_endpoint=endpoints.action)
  File "C:\Python368\lib\site-packages\rasa_core\", line 259, in load
    ensemble = PolicyEnsemble.load(path) if path else None
  File "C:\Python368\lib\site-packages\rasa_core\policies\", line 182, in load
    policy = policy_cls.load(policy_path)
  File "C:\Python368\lib\site-packages\rasa_core\policies\", line 248, in load
    model = load_model(model_file)
  File "C:\Python368\lib\site-packages\tensorflow\python\keras\engine\", line 229, in load_model
    model = model_from_config(model_config, custom_objects=custom_objects)
  File "C:\Python368\lib\site-packages\tensorflow\python\keras\engine\", line 306, in model_from_config
    return deserialize(config, custom_objects=custom_objects)
  File "C:\Python368\lib\site-packages\tensorflow\python\keras\layers\", line 64, in deserialize
    printable_module_name='layer')
  File "C:\Python368\lib\site-packages\tensorflow\python\keras\utils\", line 173, in deserialize_keras_object
    list(custom_objects.items())))
  File "C:\Python368\lib\site-packages\tensorflow\python\keras\engine\", line 293, in from_config
    layer = layer_module.deserialize(conf, custom_objects=custom_objects)
  File "C:\Python368\lib\site-packages\tensorflow\python\keras\layers\", line 64, in deserialize
    printable_module_name='layer')
  File "C:\Python368\lib\site-packages\tensorflow\python\keras\utils\", line 193, in deserialize_keras_object
    function_name)
ValueError: Unknown layer:name
make: *** [run-bot] Fehler 1
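One thing worth noting about the “make run-bot” recipe quoted above: there is no module name after “python -m”, which was probably lost when pasting (the traceback shows that rasa_utils code did run). Based on the chatroom project’s setup instructions, the intended invocation is usually along these lines (hypothetical reconstruction; check the rasa_utils directory in your checkout for the actual module name):

```
# Hypothetical -- the module after -m is missing in the post;
# rasa_utils.bot is the name used in the chatroom project's docs.
python -m rasa_utils.bot -d models/current/dialogue -u models/current/nlu
```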

The HTML file looks like this:

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "">
<html xmlns="">
<head>
  <link rel="stylesheet" href="" />
</head>
<body>
  <div class="chat-container"></div>
  <script src=""></script>
  <script type="text/javascript">
    var chatroom = window.Chatroom({
      host: "http://localhost:5005",
      title: "Chat with Mike",
      container: document.querySelector(".chat-container"),
      welcomeMessage: "Hi, I am Mike. How may I help you?",
      speechRecognition: "en-US"
    });
  </script>
</body>
</html>

Thank you

Hi again,

Got it working :slight_smile:

I don’t know what the problem was, but I re-downloaded the starter pack from Git and then it all just worked.