I wonder if anyone has gotten chainlit

I wonder if anyone has gotten Chainlit to work after the LlamaIndex v0.10 upgrade? I see there was a Chainlit repo update that was supposed to handle the transition from ServiceContext to Settings, but I haven't been able to find any examples that have been updated. I have spent hours on it with no success so far. It would be amazing if there were a functioning example somewhere. I'm about to give up and move on to something else that has more up-to-date examples built on the newer LlamaIndex codebase.
I haven't used Chainlit explicitly, but I'm happy to suggest code changes to migrate things.
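For reference, the core of the v0.10 migration is swapping the per-object ServiceContext for the global Settings object. A minimal before/after sketch; the OpenAI model name and the documents variable are just placeholders:

# Before (pre-0.10): configuration travels with a ServiceContext
from llama_index import ServiceContext, VectorStoreIndex
from llama_index.llms import OpenAI

service_context = ServiceContext.from_defaults(llm=OpenAI(model="gpt-3.5-turbo"))
index = VectorStoreIndex.from_documents(documents, service_context=service_context)

# After (0.10+): configure Settings once, globally
from llama_index.core import Settings, VectorStoreIndex
from llama_index.llms.openai import OpenAI

Settings.llm = OpenAI(model="gpt-3.5-turbo")
index = VectorStoreIndex.from_documents(documents)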
Let me ask a question from a different angle. Any recommendations on the easiest way to get a LlamaIndex query codebase running on a site like Hugging Face Spaces? I was checking out Streamlit, but even their example is from 2023 and uses deprecated functions. It seems like lots of people would want to share their apps, so hopefully there is a good way; I'm just not finding info on it.
I got the Chainlit app figured out. I needed to set streaming=True within the query engine parameters, as well as this callback setting:
import chainlit as cl
from llama_index.core import Settings
from llama_index.core.callbacks import CallbackManager

callback_manager = CallbackManager([cl.LlamaIndexCallbackHandler()])
Settings.callback_manager = callback_manager
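Putting the two pieces together, this is roughly what my working setup looks like. A minimal sketch only: the ./data folder, the default OpenAI models, and the session-key name are my own choices, not anything Chainlit requires.

import chainlit as cl
from llama_index.core import Settings, VectorStoreIndex, SimpleDirectoryReader
from llama_index.core.callbacks import CallbackManager

@cl.on_chat_start
async def start():
    # route LlamaIndex events into Chainlit's step UI
    Settings.callback_manager = CallbackManager([cl.LlamaIndexCallbackHandler()])
    index = VectorStoreIndex.from_documents(SimpleDirectoryReader("./data").load_data())
    # streaming=True is what lets tokens be forwarded as they arrive
    cl.user_session.set("query_engine", index.as_query_engine(streaming=True))

@cl.on_message
async def main(message: cl.Message):
    query_engine = cl.user_session.get("query_engine")
    # run the blocking query off the event loop, then stream tokens back
    response = await cl.make_async(query_engine.query)(message.content)
    msg = cl.Message(content="")
    for token in response.response_gen:
        await msg.stream_token(token)
    await msg.send()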
Makes sense 👍 Yeah, Streamlit also works fine; I've used it a ton. It's mostly just a matter of taking a bit of time to read a few llama-index examples (from our docs, so that they are recent) to see how it works.
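For example, a bare-bones Streamlit app on the current import paths looks something like this (the ./data folder and the title are placeholders):

import streamlit as st
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

st.title("Ask my docs")

@st.cache_resource  # build the index once per server process, not on every rerun
def load_index():
    return VectorStoreIndex.from_documents(SimpleDirectoryReader("./data").load_data())

question = st.text_input("Ask a question")
if question:
    response = load_index().as_query_engine().query(question)
    st.write(str(response))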
QQ on this... I'm playing with Chainlit and got the query engine working fine from their docs. However, I want to capture the chat history. I got the chat engine to work, but it doesn't seem like the chat history is sent. Did you figure that out @SeaBerg?
No, I was unable to get the chat history working. I didn't see any examples covering that part and moved on.
So I finally got it to work. I replaced the query engine with a chat engine and added the chat memory buffer. The only weird thing I'm still struggling with is the callback: I got it mostly working, but it seems to spit out the first response in one of the steps within Chainlit and then simply repeats that step as the final output, which makes it appear to the end user to be much slower than it actually is.

EDIT: I'm dumb. I forgot to stream the response, so it was much slower than expected. All good now.
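For anyone who lands here later, this is roughly what the change looks like. Treat it as a sketch: chat_mode="context", the token limit, and the ./data folder are just the values I picked, not the only ones that work.

import chainlit as cl
from llama_index.core import Settings, VectorStoreIndex, SimpleDirectoryReader
from llama_index.core.callbacks import CallbackManager
from llama_index.core.memory import ChatMemoryBuffer

@cl.on_chat_start
async def start():
    Settings.callback_manager = CallbackManager([cl.LlamaIndexCallbackHandler()])
    index = VectorStoreIndex.from_documents(SimpleDirectoryReader("./data").load_data())
    # the memory buffer is what carries chat history between turns
    memory = ChatMemoryBuffer.from_defaults(token_limit=3000)
    chat_engine = index.as_chat_engine(chat_mode="context", memory=memory)
    cl.user_session.set("chat_engine", chat_engine)

@cl.on_message
async def main(message: cl.Message):
    chat_engine = cl.user_session.get("chat_engine")
    # stream_chat is the "forgot to stream" fix from the EDIT above
    response = await cl.make_async(chat_engine.stream_chat)(message.content)
    msg = cl.Message(content="")
    for token in response.response_gen:
        await msg.stream_token(token)
    await msg.send()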
Glad you got it mostly working! Thanks for the info.