Hi guys! Not sure if this is the right place to ask, but I am struggling to understand chat_engine. I have read the documentation, but my main question is: does LlamaIndex keep track of the conversation history, or do I need to feed it each message in the conversation? I have an online chat system and we'd like our customers to be able to chat with our bot using a custom knowledge base. So I would love for ChatGPT to have knowledge of the entire conversation that's going on.
17 comments
Indeed it does!

Depending on which chat engine you use, there is either a memory attribute (for the react chat engine) or a chat_history attribute (for the condense chat engine)
So when using the condense chat engine, do I have to manually create a chat_history and feed it to as_chat_engine()? Or will it keep track on its own without that?
It will keep track on its own. You only have to pass in a chat_history if you plan on saving/loading previous conversations

For the condense chat engine, the chat history is just a list of tuples, where each tuple is a human/ai message pair
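For example, that shape is just plain Python data (the values here are illustrative, and the exact chat engine API may differ between llama_index versions):

```python
# Sketch of a condense-style chat history: a list of (human, ai) tuples.
# Plain Python only; the actual llama_index interface may vary by version.
chat_history = [
    ("What products do you sell?", "We sell garden tools and supplies."),
    ("Do you ship overseas?", "Yes, we ship to most countries."),
]

# After each turn, append the new human/ai pair so the engine can
# condense the whole conversation into a standalone question.
chat_history.append(
    ("How long does shipping take?", "Usually 5 to 10 business days.")
)

print(len(chat_history))  # 3 turns recorded
```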
Awesome, that's exactly what I needed to know! Now I have one other question: we will actually be using our application on potentially hundreds of different websites that all have slightly different data. They will each hit a single Python API that will query ChatGPT for the answer. I have a plan for getting the index data for each site, but what will happen if several different people are chatting on different websites and hitting the same API? Will the context get really messed up?
Normally in an API, the requests are threaded, right?

I think it should be fine tbh, it shouuuuuld be thread safe... I think... lol
Haha ok! We'll just have to experiment with it. Worst case is each website will have to have their own OpenAI account they use separately. Do you know if creating another API key in a single account is like talking to a different instance of ChatGPT?
It should be! πŸ™‚
Ok awesome. That would be the better plan anyway so we can track usage!
Thanks a bunch for your help! Super helpful!
Hey @Haluk, you will have to create different chat engine instances for different users. The chat engine currently does not handle multiple chat histories, so all the user queries would get mixed into one chat history.
That will cause two problems:
  • Non-meaningful context for each user
  • The chat history size will exceed the token limit available for GPT-3 or ChatGPT very quickly
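To make the first problem concrete, here is a small plain-Python sketch (no llama_index involved, names are illustrative) of what happens when two users share one history:

```python
# Sketch: why one shared history mixes users' contexts.
shared_history = []

# Two different users hit the same engine, so both turns land
# in the same list:
shared_history.append(("User A: What is your refund policy?", "30 days."))
shared_history.append(("User B: Do you sell seeds?", "Yes, many varieties."))

# User A's next question would now be condensed against User B's
# turn as well, so the context is meaningful for neither user.
users_in_history = {turn[0].split(":")[0] for turn in shared_history}
contexts_mixed = len(users_in_history) > 1
print(contexts_mixed)  # True: two users share one history
```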
Oh, ok that makes perfect sense! So would simply creating a new API key for each website help?
No need to make different API keys, just make different chat engine instances:
Plain Text
query_engine_1 = index.as_chat_engine()
query_engine_2 = index.as_chat_engine()

You'll only need to manage these engine instances
Interesting...so I guess I could make variables for the query engine name so each user chatting in would have a unique instance?
Sorry, I'm new to Python!
Yes you can make a Dictionary like this

Plain Text
Chat_Dict = {
    "unique_user_1": query_engine_1,
    "unique_user_2": query_engine_2,
}


You can first check whether the unique ID exists in the dict; if it does, use that instance, and if not, create a new instance and add it to the dict so that on the next turn you already have the instance for that user
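That check-then-create pattern could be sketched like this (EchoEngine is a stand-in for whatever index.as_chat_engine() returns in your llama_index version; all names here are illustrative):

```python
# Sketch of per-user engine management. EchoEngine is a hypothetical
# stand-in for a real chat engine instance.
class EchoEngine:
    def __init__(self):
        self.history = []

    def chat(self, message):
        self.history.append(message)
        return f"reply to: {message}"


Chat_Dict = {}


def get_engine(user_id):
    # Create an engine the first time a user appears; reuse it afterwards
    # so each user's chat history stays separate.
    if user_id not in Chat_Dict:
        Chat_Dict[user_id] = EchoEngine()
    return Chat_Dict[user_id]


get_engine("site_a_user_1").chat("Hello!")
get_engine("site_b_user_7").chat("Hi there!")
get_engine("site_a_user_1").chat("Another question")

print(len(Chat_Dict))                           # 2 distinct engines
print(len(Chat_Dict["site_a_user_1"].history))  # 2 messages from user 1
```

In a real API you would key the dict by the unique chat ID your main system already passes over, so every request from the same conversation lands on the same engine.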
Oh, that's smart! My app is actually a Flask API, so I just need to pass over the unique chat ID from the main API, which handles all the other chat functions. Then I can just use that! Our chat system handles chatting with humans, but has a bot component at the beginning of a conversation
@WhiteFang_Jr thanks a bunch for your help!