Chatbot UI: connect frontend to bot

Hey guys,

So I went through most of the topics about running your bot with a webchat frontend, and I was also able to implement one of the recommended premade interfaces (scalableminds).

Now, I need an Angular version instead of the above-mentioned React-based one, but I am not too familiar with web development, so I tried forking the chatbot UI by @JiteshGaikwad. I tried the RestInput channel, and I also tried using the custom input channel from scalableminds and changing this line accordingly, but I have the following issues:

  • The RestInput version doesn’t seem to respond. I run the action server and then python -m rasa_core.run --core models/dialogue --nlu models/nlu/default/newbot --endpoints endpoints.yml --credentials credentials.yml --debug. The server is up, and I can send the bot messages through Postman and it responds correctly (the request body I use is shown after this list). But the frontend keeps showing the default error message (“Sorry I wasn’t able to understand your Query. Let’s try something else!”). In the meantime, I see this in the terminal: OPTIONS /webhooks/rest/webhook HTTP/1.1" 200 137 0.000993, and the action server also seems to be running actions.

  • Therefore I tried changing my credentials.yml back to what it is for the scalableminds version and changing the URL in the line mentioned above in the intro (again running the bot and the action server with the same commands). I can talk to the bot through Postman the same way. However, when I write to the bot in the chat window, it responds with ‘undefined’ many times. In the terminal I can see that the bot receives the input from the window, classifies it and finds a response to send, but somehow the response never ends up in the frontend.
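
For reference, the request body I send through Postman to /webhooks/rest/webhook looks like this (the rest channel expects the sender and message fields, which a reply further down in this thread also points out):

	{
		"sender": "user",
		"message": "hello"
	}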

@JiteshGaikwad, could you maybe point out where I go wrong with my implementation?

hey @lauraperge, while running the Rasa Core server you didn’t add the --cors “*” argument; that’s the reason you are getting the above error.
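
So the full command from your post would then be something like this (same model paths, just with the extra flag):

	python -m rasa_core.run --core models/dialogue --nlu models/nlu/default/newbot --endpoints endpoints.yml --credentials credentials.yml --cors "*" --debug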


Great, it is solved, thank you for the quick answer!

Okay, so for the sake of clarity:

  1. The RestInput version worked well with the modification advised by Jitesh. I also found the video where he mentions this, with some lag on my side :slight_smile:.

  2. I can also easily use the custom channel implementation of scalableminds with Jitesh’s nice UI, by making a small change in his script.js:

//------------------------------------------- Call the RASA API --------------------------------------
	function send(text) {

		// this custom channel takes recipient_id and message
		// (the plain rest channel takes sender instead, see further down in the thread)
		var dataToSend = JSON.stringify({
			"recipient_id": "user",
			"message": text
		});

		$.ajax({
			url: 'http://localhost:5005/webhooks/chatroom/webhook', //  RASA API
			type: 'POST',
			headers: {
				'Content-Type': 'application/json; charset=utf-8'
			},
			dataType: "json",
			data: dataToSend,
			success: function (data, textStatus, xhr) {
				console.log(data);

				if (Object.keys(data).length !== 0) {
					// check if buttons (suggestions) are present in the first response
					for (var i = 0; i < Object.keys(data[0]).length; i++) {
						if (Object.keys(data[0])[i] == "buttons") {
							addSuggestion(data[0]["buttons"]);
						}
					}
				}

				setBotResponse(data);
			},
			error: function (xhr, textStatus, errorThrown) {
				console.log('Error in Operation');
				setBotResponse('error');
			}
		});
	}

Hope this helps anyone interested!

Hey guys,

If I want to use the scalableminds or React web chat UI, or any other one, over a rest channel with my Dockerized NLU + Core in the cloud, do I need to Dockerize the UI too? If you have Dockerized both (Building Rasa with Docker), it is available at http://localhost:5005, so I assume that you can just use the UI in your browser without Dockerizing anything, right?

So, can I Dockerize my Rasa model and just use the UI without changing anything in it? Since the UI sends to the local port, which is mapped to the host port automatically.

I am not experienced with that.
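
Just to illustrate what I have in mind (the image name below is only a placeholder, not one from the docs):

	# publish the container's port 5005 on the host, so the UI in the browser
	# can keep calling http://localhost:5005 without any changes
	docker run -p 5005:5005 <your-rasa-image>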

Hi, in the credentials.yml file, what should I pass as the value?

hey @hemamalini, you can put the line below in your credentials.yml file:

rest:

Hi, it is still not working. I set the path as well. My endpoints.yml:

action_endpoint:
  url: http://localhost:5055/webhook/

core_endpoint:
  url: http://localhost:5005

credentials.yml:

rasa_utils.bot_server_channel.BotServerInputChannel:

Also, I tried giving rest:. I am still getting a 404.

I had this problem before. After adding the code below to credentials.yml, the 404 problem was solved.

However, the bot still doesn’t answer my messages. Are there any example frontend Python projects for Rasa you could recommend? I just can’t understand the webchat.js in those frontend projects.

socketio:
  user_message_evt: user_uttered
  bot_message_evt: bot_uttered
  session_persistence: true

You need to send the text in the following format:

	data: JSON.stringify({
			"sender": "Rasa",
			"message": text
		}),
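
If you are using the socketio channel from the credentials above rather than rest, the frontend talks over Socket.IO events instead of plain HTTP. A minimal client would look roughly like the sketch below (the event names come from that credentials.yml; the rest is just an illustration, not the actual webchat.js code, and it assumes the Socket.IO client library is loaded):

	var socket = io("http://localhost:5005");

	// with session_persistence: true the client first asks the server for a session
	socket.on("connect", function () {
		socket.emit("session_request", { "session_id": socket.id });
	});

	// send whatever the user typed as a user_uttered event
	function sendMessage(text) {
		socket.emit("user_uttered", { "message": text, "session_id": socket.id });
	}

	// the bot's replies arrive as bot_uttered events
	socket.on("bot_uttered", function (response) {
		console.log(response.text);
	});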

I am also stuck on the webchat.js in the frontend project. Is there any way to solve this?

Hello, can you let me know the full command to run in the terminal? Because I also keep getting that error in the UI: “Sorry I wasn’t able to understand your Query. Let’s try something else!” Please, it would be a big help.

Thank you