Updated 2 months ago

Hey, how can I use `HyDEQueryTransform`?

Hey, how can I use HyDE Query Transform with a chat engine? I was unable to find a chat engine implementation. Is it not possible to implement it with a chat engine?

edit: if there is no chat engine implementation, can I modify a query engine to include chat history?

9 comments
HyDE transformations are essentially just query rewrites for retrieval. You can do the same in the condense plus context chat engine if you modify the condense prompt. Or you can give an existing query engine to an agent as a tool.

One example
Plain Text
from llama_index.core.chat_engine import CondensePlusContextChatEngine

condense_prompt = """
  Given the following conversation between a user and an AI assistant and a follow up question from user,
  please write a passage to answer the question
  Try to include as many key details as possible.

  Chat History:
  {chat_history}
  Follow Up Input: {question}
  Passage:"""

chat_engine = CondensePlusContextChatEngine.from_defaults(retriever, llm=llm, condense_prompt=condense_prompt)
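To make the mechanics concrete, the condense step above is doing a HyDE-style rewrite: the LLM expands the follow-up question into a hypothetical passage, and that passage (not the raw question) is what gets sent to the retriever. A minimal sketch of that flow, where `llm_complete` is a hypothetical stand-in for a real LLM call (not LlamaIndex API):

```python
# Sketch of the condense/HyDE rewrite step: fill the prompt template with
# the chat history and follow-up question, then have an LLM write a passage
# that serves as the retrieval query. `llm_complete` is a stand-in for a
# real LLM call.
CONDENSE_PROMPT = """\
Given the following conversation between a user and an AI assistant and a follow up question from user,
please write a passage to answer the question.
Try to include as many key details as possible.

Chat History:
{chat_history}
Follow Up Input: {question}
Passage:"""


def hyde_rewrite(chat_history: str, question: str, llm_complete) -> str:
    """Return a hypothetical passage to use as the retrieval query."""
    prompt = CONDENSE_PROMPT.format(chat_history=chat_history, question=question)
    return llm_complete(prompt)


# Example with a dummy LLM that returns a canned passage:
def dummy_llm(prompt: str) -> str:
    return "Hypothetical passage answering the question."


rewritten = hyde_rewrite("user: hi\nassistant: hello", "What is HyDE?", dummy_llm)
# `rewritten` is what would be embedded and sent to the retriever.
```

The key point is that the chat engine already has a rewrite hook (the condense prompt), so swapping in a HyDE-flavored prompt gets the same effect without a separate transform class.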
I was also checking the BM25 retriever with a custom query rewriter, but was facing an error using an index from a Qdrant vector store.

Apparently index.docstore was not being populated; I think this is the solution:
https://github.com/run-llama/llama_index/issues/9251#issuecomment-2105931352
@Logan M could you answer the question in that issue about what happens when the number of nodes is greater than 10,000?
Anyway, I am using Qdrant.
In the index case, the agent prompt would be to rewrite the query and send it to the tool, which would be the index?
hi @Logan M

I tried using an agent to rewrite the query for the index engine, but it's not working; it's passing the original query only.

Plain Text
agent_prompt = """
For each user input, you will:

1. Generate two search queries:
   - The first query will be the original user input, unmodified.
   - The second query will be a rephrased or expanded version of the original, aimed at capturing additional relevant information.

2. Use both queries to gather information and formulate your response.

3. Provide a comprehensive answer based on the combined results of both queries.

Remember to tailor your responses to the context of mortgage business and loan application management.

Input: {query}
Please process the input according to the steps above and provide your response.
"""

query_engine_tools = [
    QueryEngineTool(
        query_engine=mloflo_engine,
        metadata=ToolMetadata(
            name="tool-name",
            description=(
                "Provides information about ..., Use this to address user queries. "
                "Use a detailed plain text question as input to the tool."
            ),
        ),
    )
]

from llama_index.agent.openai import OpenAIAgent
agent = OpenAIAgent.from_tools(query_engine_tools, verbose=True, system_prompt=agent_prompt)


It's running the tool only once, with the same query as input.
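For reference, the behavior the agent prompt asks for can be sketched outside the agent loop. Here `rewrite` and `tool` are hypothetical stand-ins for the LLM rewriter and the query engine tool (this is not LlamaIndex API, just the intended two-query flow):

```python
# Sketch of the intended flow: call the tool once with the original query
# and once with a rewritten/expanded query, then combine both results.
# `rewrite` and `tool` are hypothetical stand-ins for the LLM rewriter
# and the query engine tool.
def answer_with_two_queries(user_input, rewrite, tool):
    queries = [user_input, rewrite(user_input)]  # original + rephrased
    results = [tool(q) for q in queries]         # the tool runs twice
    return "\n".join(results)


# Dummy stand-ins to show the shape of the flow:
rewrite = lambda q: q + " (expanded with extra context)"
tool = lambda q: f"result for: {q}"

combined = answer_with_two_queries("loan status?", rewrite, tool)
```

Doing the rewrite explicitly like this, before handing queries to the tool, sidesteps relying on the agent to follow multi-step instructions from its system prompt.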