I was checking `index.as_chat_engine()`
WhiteFang_Jr
last year
I was checking
index.as_chat_engine()
and had some doubts.
What happens if the history context gets too big? Will it drop some of the earlier conversation so that OpenAI can still generate a response on it?
If not, I'd be happy to work on it and create a PR.
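For context, a minimal sketch of the call being discussed, assuming a recent llama_index release where as_chat_engine() accepts a memory object; the data path and token limit below are placeholders.

```python
# Sketch only: assumes a recent llama_index release; exact imports and
# signatures may differ from the version discussed in this thread.
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.core.memory import ChatMemoryBuffer

# Build an index over local documents ("./data" is a placeholder path).
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# ChatMemoryBuffer trims the oldest messages once the stored history exceeds
# token_limit, which is the "history gets bigger" concern raised above.
memory = ChatMemoryBuffer.from_defaults(token_limit=3000)

chat_engine = index.as_chat_engine(chat_mode="context", memory=memory)
print(chat_engine.chat("What does the document say about X?"))
```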
7 comments
Logan M
last year
The react chat engine uses a memory module from LangChain, and LangChain has certain memory types that work for longer chat histories.
The other chat engines may hit some issues, though. A PR for this would be awesome!
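For illustration, these are examples of the kind of LangChain memory classes mentioned above; the module paths follow the classic LangChain API and may differ by version, and how they plug into the react chat engine is not shown here.

```python
# Illustration only: LangChain memory types that keep long chat histories
# bounded (classic langchain API; names/locations vary across versions).
from langchain.llms import OpenAI
from langchain.memory import ConversationBufferWindowMemory, ConversationSummaryMemory

# Keeps only the last k exchanges, so the prompt size stays bounded.
window_memory = ConversationBufferWindowMemory(k=5)

# Summarizes older turns with an LLM instead of dropping them outright.
summary_memory = ConversationSummaryMemory(llm=OpenAI(temperature=0))
```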
WhiteFang_Jr
last year
Got it, will work on this.
WhiteFang_Jr
last year
Also found one more case!
WhiteFang_Jr
last year
ChatEngine is supposed to be usable in a QA bot, so we should have a chat memory map for each user. Different users will have different contexts, and that isn't supported yet, in my opinion.
WhiteFang_Jr
last year
If we get some sort of `user_id` or `chat_id`, we can map it to that user's own context; if none is passed, we keep the current behavior.
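A rough sketch of the per-user mapping being proposed; the helper name, the dict-based store, and the memory setup are illustrative assumptions, not an actual implementation.

```python
# Hypothetical sketch of the proposal above: map each user_id/chat_id to its
# own chat memory, and fall back to today's behavior when no id is passed.
from llama_index.core.memory import ChatMemoryBuffer

_chat_memories = {}  # user_id/chat_id -> that user's ChatMemoryBuffer


def get_chat_engine(index, user_id=None):
    if user_id is None:
        # No id passed: behave exactly as in the current scenario.
        return index.as_chat_engine()
    memory = _chat_memories.setdefault(
        user_id, ChatMemoryBuffer.from_defaults(token_limit=3000)
    )
    return index.as_chat_engine(memory=memory)
```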
WhiteFang_Jr
last year
I'll be working on these two parts.
Logan M
last year
Makes sense!