Does the ObjectIndex retriever make any LLM calls?

Does the ObjectIndex retriever make any LLM calls? It looks like it just fetches back the corresponding nodes?
Nope. It will make embedding calls though (assuming you constructed the object index with a vector index)
I'm messing around comparing the following code snippets

Plain Text
from llama_index import VectorStoreIndex
from llama_index.objects import ObjectIndex, SimpleToolNodeMapping

# add_tool, multiply_tool, and service_context are defined elsewhere
object_mapping = SimpleToolNodeMapping.from_objects([add_tool, multiply_tool])
object_index = ObjectIndex.from_objects(
    [add_tool, multiply_tool], object_mapping, VectorStoreIndex, service_context=service_context
)

object_retriever = object_index.as_retriever(similarity_top_k=1)
object_retriever.retrieve("add two numbers")


which returns

Plain Text
[<llama_index.tools.function_tool.FunctionTool at 0x28643cb50>]

How do I actually view the content of this?
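For reference, the retrieved objects are the FunctionTool instances themselves, so one way to see what came back (a sketch, assuming the tools were built with FunctionTool and expose the usual metadata attribute) is to print their metadata:

Plain Text
tools = object_retriever.retrieve("add two numbers")
for tool in tools:
    # each FunctionTool carries its name and description in tool.metadata
    print(tool.metadata.name)
    print(tool.metadata.description)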
whereas if I use SimpleObjectNodeMapping

Plain Text
from llama_index import VectorStoreIndex
from llama_index.objects import ObjectIndex, SimpleObjectNodeMapping

tags = ['apple', 'oranges', 'grapes']

obj_node_mapping = SimpleObjectNodeMapping.from_objects(tags)
print("obj_node_mapping", obj_node_mapping)

object_index = ObjectIndex.from_objects(
    tags,
    obj_node_mapping,
    VectorStoreIndex,
    service_context=service_context,
)

object_retriever = object_index.as_retriever(similarity_top_k=1)
object_retriever.retrieve("i love grapes")


it returns the following, which is easier to read

Plain Text
['grapes']
I basically want to create a kind of tool chooser based on the query, so I can prepare the correct prompts and query engines to use.
Preferably the ObjectIndex can help me do that, but an LLM call to determine whether the query is related to grapes may be more accurate, though slower. The dilemma.
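A retrieval-only version of that chooser could look like the sketch below, built on the tags example above; the per-tag query engines (grapes_engine and friends) are hypothetical placeholders, and only embedding calls are made, no LLM call:

Plain Text
# hypothetical mapping from a retrieved tag to the query engine / prompt to use
engines_by_tag = {
    "apple": apple_engine,
    "oranges": oranges_engine,
    "grapes": grapes_engine,
}

def choose_engine(query: str):
    # embedding-based retrieval picks the closest tag; no LLM call happens here
    tag = object_retriever.retrieve(query)[0]
    return engines_by_tag[tag]

engine = choose_engine("i love grapes")
response = engine.query("i love grapes")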
Ahh, regarding the ToolMapper, I see it returns the function tool and you just invoke it by running function.call()
yup you got it
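For completeness, with the tool-based object index from the first snippet, the retrieved FunctionTool can be invoked directly (a sketch, assuming add_tool wraps a simple add(a, b) function):

Plain Text
tool = object_retriever.retrieve("add two numbers")[0]
# FunctionTool.call forwards its arguments to the wrapped function and returns a ToolOutput
result = tool.call(2, 2)
print(result)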