Concurrent requests

When I simulate 100 users sending HTTP requests to my bot at the same time, I find that the requests are queued: the bot only starts processing the next request after it has finished the current one. This is very slow. How can I handle it? Thanks.
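For reference, here is a minimal sketch of this kind of load test using asyncio and aiohttp. The webhook URL and the payload shape are assumptions (they follow the Rasa-style REST channel convention), so adjust both to your bot. Note that each simulated user gets its own sender ID, so the requests belong to 100 distinct conversations:

```python
import asyncio
import time

import aiohttp

# Assumed endpoint: a locally running bot exposing a Rasa-style REST webhook.
# Replace with your bot's actual URL and payload format.
URL = "http://localhost:5005/webhooks/rest/webhook"


async def send_message(session: aiohttp.ClientSession, user_id: int) -> float:
    """POST one message as a distinct sender and return the elapsed seconds."""
    payload = {"sender": f"load-test-user-{user_id}", "message": "hello"}
    start = time.perf_counter()
    async with session.post(URL, json=payload) as response:
        await response.json()
    return time.perf_counter() - start


async def main() -> None:
    async with aiohttp.ClientSession() as session:
        # Fire all 100 requests concurrently rather than one after the other.
        tasks = [send_message(session, i) for i in range(100)]
        latencies = await asyncio.gather(*tasks)
    print(f"min {min(latencies):.2f}s  max {max(latencies):.2f}s")


if __name__ == "__main__":
    asyncio.run(main())
```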

Hi @MagicMoGua,

this is due to the Sanic server: it runs requests on a single event loop, so unless a request is blocked waiting on I/O (and yields control in the meantime), requests are processed one after the other. Also, if you are sending multiple requests for the same conversation, that's desired behavior: messages within one conversation are handled in order so the dialogue state stays consistent. What exactly do you mean by "very slow"? Usually you also don't get 100 conversations within the same second; real traffic tends to be spread out a bit more.
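To make the event-loop point concrete, here is a minimal standalone Sanic sketch (a toy app for illustration, not the bot's actual server code). Because the handler awaits instead of blocking the loop, 100 concurrent requests overlap and all finish in roughly one second total; swap the `await asyncio.sleep(1)` for blocking, CPU-bound work and they serialize:

```python
import asyncio

from sanic import Sanic
from sanic.response import json as json_response

app = Sanic("demo")


@app.post("/echo")
async def echo(request):
    # Non-blocking wait: while this handler awaits, the event loop is free
    # to start serving other requests, so concurrent requests interleave.
    await asyncio.sleep(1)
    return json_response({"ok": True})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```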