A question not related to the new version: do the QA and REFINE prompts (and also the similarity_top_k) work with langchain agents that use llama_index as a tool? Or with GPTIndexChatMemory?
You should be able to set the prompts in the query_kwargs of each tool I think!
same with the similarity_top_k
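For example, assuming the older index.query() interface that accepts text_qa_template, refine_template and similarity_top_k as keyword arguments (and reusing the `index` object from the thread), the tool could be wired up roughly like this; the prompt templates below are purely illustrative:

from langchain.agents import Tool
from llama_index import QuestionAnswerPrompt, RefinePrompt  # import path may vary by version

# illustrative custom prompts; placeholder names follow the old llama_index templates
qa_prompt = QuestionAnswerPrompt(
    "Context information is below.\n{context_str}\nAnswer the question: {query_str}\n"
)
refine_prompt = RefinePrompt(
    "The original answer is: {existing_answer}\n"
    "New context: {context_msg}\n"
    "Refine the answer to the question: {query_str}\n"
)

query_kwargs = {
    "text_qa_template": qa_prompt,
    "refine_template": refine_prompt,
    "similarity_top_k": 3,
}

tools = [
    Tool(
        name="GPT Index",
        # pass the kwargs through to index.query so each tool call uses them
        func=lambda q: str(index.query(q, **query_kwargs)),
        description="useful for ...",
        return_direct=True,
    ),
]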
This code works well, but since it goes directly through langchain I don't think the similarity_top_k and prompt settings apply here:
from langchain.agents import Tool, initialize_agent
from langchain.llms import OpenAI
from langchain.memory import ConversationBufferMemory  # import path may vary by langchain version

tools = [
    Tool(
        name="GPT Index",
        func=lambda q: str(index.query(q)),
        description="useful for ...",
        return_direct=True,
    ),
]
# set Logging to DEBUG for more detailed outputs
memory = ConversationBufferMemory(memory_key="chat_history")
llm = OpenAI(temperature=0)
agent_chain = initialize_agent(
    tools, llm, agent="conversational-react-description", memory=memory
)
agent_chain.run(input="hi, i am bob")

OTHER SOLUTION BELOW:


from langchain.agents import initialize_agent
from langchain.llms import OpenAIChat
from llama_index.langchain_helpers.memory_wrapper import GPTIndexChatMemory  # import path may vary by version

memory = GPTIndexChatMemory(
    index=index,
    memory_key="chat_history",
    query_kwargs={"response_mode": "compact", "similarity_top_k": 3},
    # return_source returns source nodes instead of querying index
    return_source=True,
    # return_messages returns context in message format
    return_messages=True,
)
llm = OpenAIChat(temperature=0)
# llm = OpenAI(temperature=0)
agent_chain = initialize_agent(
    [], llm, agent="conversational-react-description", memory=memory
)
agent_chain.run(input="hi, i am bob")

This works fine with that query ("hi, i am bob"), but when I ask about the index content I get this error:

Out[21]: 'Agent stopped due to max iterations.'
I can't figure out why 😩
Right, so where you do index.query() you can set the prompts and top_k 💪
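For the GPTIndexChatMemory route, assuming its query_kwargs are simply forwarded to index.query() (worth verifying for your version), the same illustrative prompt objects from the sketch above could presumably be set there as well, for example:

# reusing the illustrative qa_prompt / refine_prompt objects from the earlier sketch
memory = GPTIndexChatMemory(
    index=index,
    memory_key="chat_history",
    query_kwargs={
        "response_mode": "compact",
        "similarity_top_k": 3,
        "text_qa_template": qa_prompt,
        "refine_template": refine_prompt,
    },
    return_source=True,
    return_messages=True,
)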
Yea that one is a little weird 🤔
🤦🏻‍♂️🤦🏻‍♂️🤦🏻‍♂️ I didn't even see that!

Anyway, do you think there are any differences in performance? Which one do you suggest?