Is there a way to get OpenAIAgent to return retrieved nodes?

Please stand by @node0 as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
Yes, there is a way to get OpenAIAgent to return retrieved nodes. You can use the postprocess_nodes method of the RankGPTRerank class to rerank the retrieved nodes. Here is an example of how you can do this:

Plain Text
from llama_index.postprocessor.rankgpt_rerank import RankGPTRerank
from llama_index.llms.openai import OpenAI

# Configure the reranker; OPENAI_API_KEY and reranker_top_n are assumed
# to be defined elsewhere in your code
reranker = RankGPTRerank(
    llm=OpenAI(
        model="gpt-3.5-turbo-16k",
        temperature=0.0,
        api_key=OPENAI_API_KEY,
    ),
    top_n=reranker_top_n,
    verbose=True,
)

# Postprocess (rerank) the previously retrieved nodes against the query
retrieved_nodes = reranker.postprocess_nodes(
    retrieved_nodes, query_bundle
)


In this example, retrieved_nodes is the list of nodes that you have retrieved and want to rerank. query_bundle is the query that you used to retrieve the nodes. The postprocess_nodes method reranks the nodes based on their relevance to the query and returns the reranked nodes.
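For context, here is a minimal sketch of how retrieved_nodes and query_bundle might be produced before reranking. It assumes you already have a VectorStoreIndex named index; the index name and the similarity_top_k value are illustrative and not part of the original answer.

Plain Text
from llama_index.core import QueryBundle

# Assumes `index` is an existing VectorStoreIndex built elsewhere
retriever = index.as_retriever(similarity_top_k=10)

# Wrap the query string so it can be reused for retrieval and reranking
query_bundle = QueryBundle("What were the risk factors in sept 2022?")
retrieved_nodes = retriever.retrieve(query_bundle)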

You can find more details about this in the LlamaIndex documentation.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
@kapa.ai What I wanted to know is whether an agent constructed using OpenAIAgent can return nodes?
Let me check the knowledge base for you. One moment! :books:
----------------------
The OpenAIAgent in LlamaIndex is designed to interact with various tools and return responses based on the queries it receives. However, the documentation does not explicitly mention that an OpenAIAgent can return nodes directly.

Typically, the agent is used to process queries and return responses. For example, in the OpenAIAgent Query Planning example, the agent is used to process a query and return a response:

Plain Text
from llama_index.agent.openai import OpenAIAgent
from llama_index.llms.openai import OpenAI

# query_plan_tool is assumed to be a tool defined earlier, as in the
# Query Planning example from the LlamaIndex docs
agent = OpenAIAgent.from_tools(
    [query_plan_tool],
    max_function_calls=10,
    llm=OpenAI(temperature=0, model="gpt-4-0613"),
    verbose=True,
)

response = agent.query("What were the risk factors in sept 2022?")


In this example, the agent processes the query "What were the risk factors in sept 2022?" and returns a response. The agent does not return nodes directly.

If you want to work with nodes directly, you might need to use other parts of the LlamaIndex library, such as the retrieval or indexing modules, as sketched below. For more information, you can refer to the LlamaIndex documentation.
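As an illustration only, here is a minimal sketch of retrieving nodes directly from an index, bypassing the agent. The "./data" directory and the similarity_top_k value are placeholders, not values from the original answer.

Plain Text
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Build an index over local documents; "./data" is a placeholder path
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Retrieve NodeWithScore objects directly instead of going through the agent
retriever = index.as_retriever(similarity_top_k=5)
nodes = retriever.retrieve("What were the risk factors in sept 2022?")

for node in nodes:
    print(node.score, node.node.get_text()[:100])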
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------