3MW (Nice Markdown Output for our AI chat bot)
Guten Tag!
Many greetings from Munich, Germany. Last time we created a Shiny chatbot that could stream LLM responses into the chat window. However, you’ll have noticed that the replies show raw Markdown notation instead of the actual rendered text. Today we’ll figure out how to fix that.
Just like last time, our completed app can be found on GitHub.
Quick Recap
Let’s review the main things that helped us create the interactive part of the chatbot:
- When someone clicks the send button, we use insertUI() to insert the text input.
- Then we insert a new div container for the LLM reply.
- We create a stream for our chat, loop through the chunks of our stream, and use insertUI() to write the text chunks into the reply container.
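For reference, here’s a rough sketch of that server logic. The specific names (input$send, input$prompt, a #chat_box container, and an ellmer chat object) are placeholders for illustration, not necessarily the exact code from last time:

```r
# Inside the server function (placeholder names, sketch only)
observeEvent(input$send, {
  # Insert the user's message into the chat window
  insertUI(
    selector = "#chat_box",
    where = "beforeEnd",
    ui = div(class = "chat_Message", input$prompt)
  )
  # Insert an empty container that will hold the LLM reply
  insertUI(
    selector = "#chat_box",
    where = "beforeEnd",
    ui = div(class = "chat_Reply")
  )
  # Stream the reply and write each chunk into that container
  stream <- chat$stream(input$prompt)
  coro::loop(for (chunk in stream) {
    insertUI(
      selector = ".chat_Reply:last",
      where = "beforeEnd",
      ui = chunk
    )
  })
})
```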
Unfortunately, we don’t get rendered Markdown that way. Even worse: while Shiny has a markdown() function to render Markdown, we cannot use it in a stream. The UI will wait until the stream is complete before it shows the full text.
Instead, we need to take a detour by sending the data directly to the user’s browser, but not to the UI. That way, we can let JavaScript on our user’s computer handle the Markdown rendering. And if that sounds scary: Don’t worry. I’ll lead you through the process.
Custom Messages Solution
Alright, here’s what we have to do:
- We’ll send a “custom message” to the browser
- We write JS code to catch these messages
- We equip the message handler with a function to render Markdown
And the first step is actually quite easy. Within our loop, instead of using insertUI(), we’ll use the sendCustomMessage() method from the session object. This method takes two arguments:
- A message type (a name we make up, let’s go for "updateReply")
- Content to pass with the message (the current chunk)
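Inside the server, the streaming loop could then look something like this (same placeholder names as in the recap sketch):

```r
# Stream as before, but send each chunk to the browser
# instead of inserting it into the UI directly
stream <- chat$stream(input$prompt)
coro::loop(for (chunk in stream) {
  session$sendCustomMessage(
    type = "updateReply",  # the name we made up
    message = chunk        # the current text chunk
  )
})
```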
Checking the results
Now with these changes, you’ll see that nothing happens. That’s because we send the messages, but we haven’t implemented anything to respond to them yet.
Catch messages
Inside our UI function, we need to add JavaScript code that includes a custom message handler that listens to these "updateReply" messages.
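A minimal skeleton for that could look like the following, with the JavaScript embedded through tags$script(); Shiny.addCustomMessageHandler() is the client-side counterpart to sendCustomMessage():

```r
# Somewhere inside the UI definition
tags$script(HTML("
  Shiny.addCustomMessageHandler('updateReply', function(chunk) {
    // Runs in the browser whenever the server sends an
    // 'updateReply' message -- we still need to fill it in.
  });
"))
```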
Notice that here we added a function that takes an argument called chunk. This is the actual handler, so we need to fill it in.
First, we should collect the current chunk in a chunks variable together with all the previous chunks. To do so, we declare the chunks variable outside the handler as an empty string and then just add the current chunk to it.
And then we grab the last container of class chat_Reply and set its innerHTML to the content of the chunks variable.
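Putting those two steps together, the script could look like this sketch (using the chat_Reply class of our reply containers):

```r
tags$script(HTML("
  // Lives outside the handler so it keeps its value between messages
  let chunks = '';

  Shiny.addCustomMessageHandler('updateReply', function(chunk) {
    // Add the new chunk to everything we've received so far
    chunks += chunk;

    // Grab the last reply container and show the accumulated text
    const replies = document.querySelectorAll('.chat_Reply');
    const lastReply = replies[replies.length - 1];
    lastReply.innerHTML = chunks;
  });
"))
```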
Now the app should work, but the Markdown still isn’t rendered nicely.
Load a Markdown package
To render Markdown, we are going to use the JS package markdown-it. Just like with R, we have to load that package first. We can do so by including another script tag that loads its source code.
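For example, pulling it from a CDN inside the UI could look like this (the jsDelivr URL is just one option, and pinning a specific version is a good idea in a real app):

```r
# Load markdown-it before our own script runs
tags$script(src = "https://cdn.jsdelivr.net/npm/markdown-it/dist/markdown-it.min.js")
```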
Render Markdown
Now, we just have to create a Markdown renderer using the markdownit() function. Once that is done, we can replace the innerHTML of the reply container with the rendered version of the chunks text. To do so, we just have to stick the chunks into the render() method of the Markdown renderer.
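With that, our handler script only changes in a couple of lines: we create the renderer once and assign the rendered text instead of the raw chunks.

```r
tags$script(HTML("
  let chunks = '';
  // One markdown-it renderer that the handler reuses for every chunk
  const md = markdownit();

  Shiny.addCustomMessageHandler('updateReply', function(chunk) {
    chunks += chunk;
    const replies = document.querySelectorAll('.chat_Reply');
    const lastReply = replies[replies.length - 1];
    // Render the accumulated Markdown to HTML before displaying it
    lastReply.innerHTML = md.render(chunks);
  });
"))
```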
And with that our Markdown output should work nicely.
Resetting chunks
But we’re not done yet. Watch what happens when we send multiple messages:
Oh no! The previous message is still in the reply. That happens because our chunks variable was never emptied. One way we can fix that is to include another message handler that resets the chunks variable.
And on the R server, we just send this new message type right before the streaming loop starts.
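Here’s a sketch of both sides. I’m calling the new message type "resetReply" here; the actual name doesn’t matter as long as the JS handler and the R server agree on it.

```r
# UI side: a second handler that simply empties the chunks variable
# (chunks is the variable declared in the script above)
tags$script(HTML("
  Shiny.addCustomMessageHandler('resetReply', function(message) {
    chunks = '';
  });
"))

# Server side: reset the browser's chunks before the new reply streams in
session$sendCustomMessage(type = "resetReply", message = "")
coro::loop(for (chunk in stream) {
  session$sendCustomMessage(type = "updateReply", message = chunk)
})
```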
And with that, our app should be fixed.
More JS niceties
I hope you are starting to appreciate the things JS can do for us. If not, here’s another trick for better user experience.
Notice that the chat output doesn’t scroll down automatically as messages become longer. To fix this, we can use our message handler to grab the element containing all the content and set its scrollTop property to its scrollHeight.
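Inside the updateReply handler, that could look like this (assuming the element holding all messages has the id chat_box and is the one with the scrollbar):

```r
tags$script(HTML("
  let chunks = '';
  const md = markdownit();

  Shiny.addCustomMessageHandler('updateReply', function(chunk) {
    chunks += chunk;
    const replies = document.querySelectorAll('.chat_Reply');
    const lastReply = replies[replies.length - 1];
    lastReply.innerHTML = md.render(chunks);

    // Keep the chat scrolled to the bottom while new content streams in
    const chatBox = document.querySelector('#chat_box');
    chatBox.scrollTop = chatBox.scrollHeight;
  });
"))
```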
And this will give us this smooth scrolling experience:
Excellent! We’ve covered how to use a tiny bit of JS to make our app much nicer. In the next installment, we’ll cover asynchronous streaming. This becomes important when deploying the Shiny app, as the Shiny session might halt for all users if one user is currently streaming.
As always, if you have any questions, or just want to reach out, feel free to contact me by replying to this mail or finding me on LinkedIn or on Bluesky.
See you next week,
Albert 👋
Enjoyed this newsletter? Here are other ways I can help you: