
Updated 2 years ago

At a glance

The post suggests that being verbose in the description or using a keyword-based approach (e.g. "If user mentions the keyword [TOOL], use this tool") can be helpful. The comments discuss an alternative approach of skipping the agent and using just the index to run the chat, which involves modifying the QA prompt. Community members raise potential issues with long conversations, such as the need to summarize the chat history or limit its length. They also discuss the capabilities of the LangChain chat agent and its handling of long conversations. Overall, there is no explicitly marked answer, but the community members provide suggestions and discuss potential approaches and considerations.

In my experience, you just need to be super verbose in the description. Or if you wanted, you could even do something like "If user mentions the keyword [TOOL], use this tool"
7 comments
one more question, if you have any thoughts. I've considered just skipping the agent altogether and using just the index to run the chat.

it involves changing the QA prompt to look like this
Plain Text
# Top-level import in llama_index versions of this era; adjust to your installed version
from llama_index import QuestionAnswerPrompt

# chat_history_str is assumed to be built from the running conversation before
# each query. Note it is interpolated via an f-string at template build time,
# while {context_str} and {query_str} stay as placeholders filled in later.
QA_PROMPT_TMPL = (
    "You are a helpful AI chatbot named 'AI'. Your job is to answer questions from the 'User', based on the provided context information.\n"
    "We have provided context information below. \n"
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    "Given this context information, you will be completing the following chat. Make sure to use information from the chat to respond:\n"
    f"{chat_history_str}"
    "User: {query_str}\n"
    "AI: "
)
QA_PROMPT = QuestionAnswerPrompt(QA_PROMPT_TMPL)

In your opinion do you see any potential issues with this?
If the conversation gets too long, you might have to make some extra calls to summarize the chat history or something πŸ€” Or just add some extra processing to cut off the chat history at a certain length
But I think it should work!
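The cutoff idea could look something like this. This is a hypothetical helper, not from the thread; `truncate_chat_history` and the character budget are illustrative, and in practice you would likely budget by tokens rather than characters:

```python
def truncate_chat_history(turns, max_chars=2000):
    """Keep only the most recent chat turns that fit under a rough character budget.

    turns: list of strings like 'User: ...' / 'AI: ...', oldest first.
    Returns the kept turns joined into a single history string, oldest first.
    """
    kept = []
    total = 0
    # Walk from newest to oldest, stopping once the budget would be exceeded
    for turn in reversed(turns):
        if total + len(turn) > max_chars:
            break
        kept.append(turn)
        total += len(turn)
    # Restore chronological order before joining
    return "\n".join(reversed(kept))
```

The result can then be dropped in as `chat_history_str` when building the QA prompt, so old turns silently fall off instead of growing the prompt forever.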
thanks! also this makes me wonder what the langchain chat agent does, since i'm guessing it can't deal with infinite chat lengths either πŸ€”
Yea it will just error out at a certain point lol but they have different memory modules that deal with the length in different ways
The base ConversationBufferMemory doesn't do anything special I think
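For a sense of what the windowed memory modules do, here's a minimal, dependency-free sketch in the spirit of LangChain's ConversationBufferWindowMemory (the class name `WindowMemory` and its methods are made up for illustration):

```python
class WindowMemory:
    """Keeps only the last k (user, ai) exchanges, like a sliding window over the chat."""

    def __init__(self, k=5):
        self.k = k            # number of exchanges to retain
        self.exchanges = []   # list of (user_msg, ai_msg) pairs, oldest first

    def save(self, user_msg, ai_msg):
        self.exchanges.append((user_msg, ai_msg))
        # Drop the oldest exchanges beyond the window
        self.exchanges = self.exchanges[-self.k:]

    def history_str(self):
        lines = []
        for user_msg, ai_msg in self.exchanges:
            lines.append(f"User: {user_msg}")
            lines.append(f"AI: {ai_msg}")
        return "\n".join(lines)
```

A window like this bounds the prompt size but forgets everything older than k exchanges, which is why LangChain also offers summary-based memories that compress old turns instead of dropping them.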
ok i think this gives me enough to work with for now. thank you again!