How to initiate conversation from Rasa first?

I am currently building a bot with Rasa and testing it with Rasa X. How can I make Rasa initiate the conversation first? Right now it always seems that the user has to start the conversation. I tried overriding the “action_session_start” action, but that does not work inside Rasa X. Can someone help me with this?

Hey @bhaskar1986,

That is a common “issue”, and the reasoning behind it is straightforward. As a general rule, every conversation should be initiated by the user; this is standard good practice (and in many cases a compliance constraint) in bot development, so that bots only respond when triggered and do not initiate unwanted interactions.

In practice, you can send a dummy request to simulate this first user interaction, but the details depend heavily on your IO channel, i.e. the environment where your conversations will be held.
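For illustration, a dummy first message could be posted to Rasa's REST input channel (a sketch, assuming the `rest` channel is enabled in `credentials.yml` and the server listens on `localhost:5005`; the sender id and the `/get_started` intent name are illustrative assumptions):

```python
# Sketch: trigger the bot's greeting by posting a "dummy" first message to
# Rasa's REST channel endpoint. Sender id and intent name are illustrative.
import json
from urllib import request

payload = {"sender": "web-user-1", "message": "/get_started"}

req = request.Request(
    "http://localhost:5005/webhooks/rest/webhook",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Uncomment once a Rasa server is actually running:
# with request.urlopen(req) as resp:
#     for msg in json.load(resp):  # each item has "recipient_id" and "text"
#         print(msg.get("text"))
```

The front end would send this request once on page load, so the bot's welcome message appears before the user types anything.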

How do you plan to deploy your bot?

We are planning to deploy a web chat front end for now, with Rasa deployed in Docker. We will also add mobile interfaces in the future.

If you are using Rasa Webchat, you can easily initiate the conversation with the `initPayload` attribute:

```js
WebChat.default.init({
    selector: "#webchat",
    initPayload: "/get_started",
    interval: 1000, // 1000 ms between each message
    socketUrl: "http://localhost:5005/",
    socketPath: "/",
})
```

But this won't work inside Rasa X, as this configuration lives in the front end, which Rasa X does not use.

Thank you. I will try this approach.

@bhaskar1986 can you please help me with running the Rasa server on Docker? I am unable to do so and get an error like:

> 2020-01-07 16:57:06 INFO     root  - Starting Rasa server on http://localhost:5005
> 2020-01-07 16:57:13.234759: E tensorflow/stream_executor/cuda/] failed call to cuInit: UNKNOWN ERROR (303)

Please help me with this.

Do you have any GPUs in your system? CUDA is the toolkit related to Nvidia GPUs.

Yes, it does:

> Intel(R) UHD Graphics 630
>     Driver version:

But the Rasa server works fine in my local environment; the problem only appears when I try to run it on Docker.

Try installing “tensorflow-gpu”.

@varunsapre10 I did, but it’s not helping; the error persists after installing.

Check out the reply from stephens at the end.

It is a warning and can be ignored.

I am already following that thread. Thanks!

Is this error preventing you from running (or training) models?

In any case, you can force TensorFlow not to use the GPU by setting an environment variable (it must be accessible from all of Rasa’s container instances).
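A minimal sketch of that, assuming the standard `CUDA_VISIBLE_DEVICES` mechanism that TensorFlow honors (setting it to `-1` hides all GPU devices from the process; the image name in the comment is illustrative):

```shell
# Hide all GPUs from TensorFlow inside the container; -1 means "no devices".
export CUDA_VISIBLE_DEVICES=-1

# Or pass it directly when starting the container (image name illustrative):
# docker run -e CUDA_VISIBLE_DEVICES=-1 rasa/rasa run
```

With the variable set, TensorFlow falls back to the CPU and the cuInit error should no longer appear.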