3MW (Adding Async Streams to Our Shiny Chat Bot)

Guten Tag!

Many greetings from Munich, Germany. I hope you had a great start to 2025 and that you’re ready to get back into the AI-with-R game.

Today, I’m teaching you the difference between synchronous and asynchronous programming. This is a tricky topic, but it’s necessary for lots of Shiny apps.

Just like last time, our completed app can be found on GitHub. So with that said, let’s dive in:

Reviewing Shiny AI code

Let’s review the code for our app that we started to build last year. You know, this chat bot here:

The main code that powered this was the following snippet within our server:
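Here’s a rough sketch of what that snippet can look like. Note that the input IDs (input$submit, input$user_msg) and the custom-message type ("chat-stream") are placeholder names for illustration, and the chat object is the {ellmer} chat created elsewhere in the server:

    observeEvent(input$submit, {
      # Kick off a (synchronous) stream of the AI reply
      stream <- chat$stream(input$user_msg)

      # Loop over the chunks as they become available and hand each one
      # to JavaScript on the UI side, which inserts it into the chat window
      coro::loop(for (chunk in stream) {
        session$sendCustomMessage("chat-stream", chunk)
      })
    })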

If you recall, the trick to see the bits and pieces of the reply as they become available was to stream the AI response. That’s why we

  • used chat$stream() instead of chat$chat(),

  • used coro::loop() to iterate over the chunks of the stream as they become available, and

  • used session$sendCustomMessage() to have JavaScript code on the UI side insert the reply (this allowed for actual Markdown output).

The catch

Our approach has worked nicely so far, but there’s a catch. We can see it if we put a simple print() statement right after the coro::loop() call:
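In terms of the sketch from above (same placeholder names), that looks like this:

    observeEvent(input$submit, {
      stream <- chat$stream(input$user_msg)

      coro::loop(for (chunk in stream) {
        session$sendCustomMessage("chat-stream", chunk)
      })

      # A friendly greeting to see when the R console gets back to us
      print("hello")
    })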

Now, if we run this, we see that no friendly greeting gets printed to the console while the AI reply is still streaming.

But once the stream has finished, the greeting shows up in the console.

Why is this bad?

Well, the thing with this behavior is that it will block other users. Think about it: If

  • two users log into our app and one of them starts streaming, then

  • the second user will have to wait to do anything, because the R process that serves both of them is blocked.

Thankfully, there’s a way around that. And it comes in the form of non-blocking operations, i.e. async programming.

Async with {coro}

There are multiple ways to do async programming in R. For a fantastic overview, check out this great blog post by Veerle Van Leemput. Thankfully, we don’t have to switch to other packages. The {coro} package allows us to do async, too.

To do so, we have to switch from coro::loop() to coro::async(). And to make the switch, we have to do four things:

  1. Replace coro::loop() with coro::async() (obviously).

  2. Wrap the inside of the previous loop into a function().

  3. Inside the new function, use a for-loop in which you “await” each stream chunk.

  4. Call the async function that coro::async() returns, i.e. add a pair of parentheses at the end.

And in order for these four steps to work, we have to make sure that the stream is also an async stream. You can achieve that by using chat$stream_async(). If you follow these steps, here’s what you’ll get:
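Sticking with the placeholder names from the earlier sketches, the async version can look roughly like this:

    observeEvent(input$submit, {
      # Make sure the stream itself is an async stream
      stream <- chat$stream_async(input$user_msg)

      # Steps 1-4: coro::async() wraps a function, a for-loop inside it
      # "awaits" each chunk via coro::await_each(), and the trailing ()
      # calls the resulting async function right away
      coro::async(function() {
        for (chunk in coro::await_each(stream)) {
          session$sendCustomMessage("chat-stream", chunk)
        }
      })()

      # This runs immediately, while the reply is still streaming
      print("hello")
    })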

And with that, you should see the “hello” statement in the console, even before the reply has finished. Thus, we know that the R console is still kicking and ready to jump into action for other users. Hooray!

And in case you’re trying to implement this yourself: If you’re wondering why your stream gets unbearably slow, then you’re in good company. It took me forever to debug why this happened until I realized that the solution is simple:

Just update Shiny. With version 1.10.0, you get a much better experience with async streams using {coro} in Shiny apps.
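If you’re not sure which Shiny version you have installed, a quick way to check is:

    # Check the installed Shiny version; you want at least 1.10.0
    packageVersion("shiny")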

So with that, we’ve further improved our chat bot and learned a little bit more about {ellmer} (which, btw, is now spelled with “ll”). As always, if you have any questions or just want to reach out, feel free to contact me by replying to this mail or finding me on LinkedIn or on Bluesky.

See you next week,
Albert 👋
