Where does llama index extract the query

Where does llama_index come up with the query it sends to a vector store? I'm trying to alter that query to customize my chatbot's behavior
Are you using a query engine? Chat engine?
my debugger has taken me to call_function
So what happens is the default chat engine is an agent

The agent has a list of tools (i.e. your query engine) with descriptions and names

It looks at the user message and list of tools/names/descriptions and decides to use a tool or not

If it uses a tool, it calls the tool as a function, where it has written its own input for the tool/function
gotcha - thanks for the context. Is there a way we can change the way it looks at a user message and ultimately creates the input string for that tool?
Isn't that what it does though? Well technically, it's the user message + history + tools = input string for the tool
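In other words, something like this (a toy, library-free sketch of the loop described above, not the actual llama_index internals; `llm_choose` is a hypothetical stand-in for the LLM's tool-selection step):

```python
def agent_step(user_message, history, tools, llm_choose):
    """Toy sketch of the agent loop: the LLM sees the user message +
    history + tool descriptions, then either answers directly or writes
    its own input string for one of the tools."""
    messages = history + [user_message]
    decision = llm_choose(messages, tools)  # LLM picks a tool and writes its input
    if decision is None:
        return None  # no tool call: the LLM answers directly
    tool_name, tool_input = decision
    return tools[tool_name](tool_input)  # call the tool as a function
```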
Are you saying defining the tool metadata is the way to control that behavior? We don't want to override the way each tool decides to parse the user message + history?
Yea exactly.

We don't want to override the way each tool decides to parse the user message + history?
  • You could override this by making your own custom tool
Python
from typing import Any

from llama_index.agent import OpenAIAgent
from llama_index.tools import FunctionTool

# index and llm are assumed to be set up already
query_engine = index.as_query_engine()

def query_tool(query_str: str) -> Any:
    """[TOOL DESCRIPTION]"""
    # parse_query_str is your own hook: rewrite the agent's
    # input however you like before it hits the query engine
    new_query_str = parse_query_str(query_str)
    return query_engine.query(new_query_str)


query_tool = FunctionTool.from_defaults(fn=query_tool)

agent = OpenAIAgent.from_tools([query_tool], llm=llm, verbose=True)


Here, the tool name is the function name, and the tool description is the docstring
ah so query_tools are just query_engines where you control the input
im assuming in this example, parse_query_str is the custom behavior i want?
yea exactly, you got it
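For what it's worth, parse_query_str in the snippet above is entirely yours to write (the name is just a placeholder). A minimal sketch that normalizes whitespace and prepends a hypothetical domain hint before retrieval might look like:

```python
def parse_query_str(query_str: str) -> str:
    """Rewrite the agent's tool input before it hits the query engine.

    Minimal sketch: collapse whitespace and prepend a domain hint so the
    retriever matches the right documents. Swap in whatever rewrite logic
    your chatbot actually needs.
    """
    hint = "internal support docs: "  # hypothetical domain prefix
    return hint + " ".join(query_str.split())
```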
It's pretty flexible tbh, since you can write any function and the agent can call it. Just need to make sure to type the arguments properly and give it a good docstring and function name
makes sense - it is very flexible. That's why I love this library so much! You all are doing incredible work

Does the query_engine in turn use retrievers to fetch the nodes from the storage_context, in my case ChromaVectorStore?
Exactly -- the query engine is basically a pipeline of a retriever -> node postprocessors (if any) -> response synthesizer
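As a library-free sketch (toy stand-ins for each stage, not llama_index code), that pipeline is just function composition:

```python
def query(query_str, retriever, postprocessors, synthesizer):
    """Toy model of the query-engine pipeline:
    retriever -> node postprocessors (if any) -> response synthesizer."""
    nodes = retriever(query_str)          # fetch nodes from the vector store
    for postprocess in postprocessors:    # e.g. filter by similarity score
        nodes = postprocess(nodes)
    return synthesizer(query_str, nodes)  # LLM writes the final answer
```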
any way we can pass these tools to a chat_engine?
So, OpenAIAgent is a chat engine (well, the default one anyways) -- it's the same thing πŸ™‚
Declaring it like this is more flexible than as_chat_engine() is all
can I pass all the same kwargs to OpenAIAgent.from_tools() that I do to as_chat_engine()?
I thiiiink you can
you probably can't pass the list of tools
from_tools first arg is tools tho, right?
Snippet from under the hood

Python
elif chat_mode == ChatMode.OPENAI:
    return OpenAIAgent.from_tools(
        tools=[query_engine_tool],
        llm=llm,
        **kwargs,
    )
So you can pass everything else
ah nice, so the other kwargs go to the Agent.from_tools call