Anyone have success building an Auto-Retriever like the one in the docs [1]?

I ran into 3 issues:
  1. An error from the FunctionTool description being too long. This is odd, since the example's vector_store_info is 168 tokens and mine is only 244. As a result, I had to remove a bunch of metadata filters.
  2. The LLM's function calling was quite bad. I'm using the OpenAI Function API to infer function parameters, so I'm passing vector_store_info in the function's description, and it keeps picking content_info as the filter key instead of the right MetadataInfo field (in this case, medical_provider):
User: What is the patient's history with Dr. Woods?
**************************************************
=== Calling Function ===
Calling function: fabc5870-ce73-4ee0-9d09-c2c1158b1bdd with args: {
  "query": "chief complaint",
  "filter_key_list": ["content_info"],
  "filter_value_list": ["Dr. Woods"],
  "filter_operator_list": ["=="],
  "filter_condition": "AND"
}

  3. I'm realizing the filter operator I actually want is a sort of "text contains" or "approximately equal to", so that the Auto-Retriever could match "Dr Woods", "Woods", or "John Woods". Perhaps an Auto-Retriever isn't a good fit for this?
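For the third issue: some vector stores expose a text-match filter operator, but when only exact equality is available, one workaround is to do the approximate matching yourself after retrieval, against the nodes' metadata. Here is a minimal, self-contained sketch in plain Python (not a llama_index API; the helper names are made up) of an "approximately equal to" comparison for provider names:

```python
# Illustrative sketch, NOT a llama_index feature: an "approximately equal"
# name match that treats "Dr Woods", "Woods", and "John Woods" as hits for a
# stored metadata value of "Dr. Woods", by comparing normalized name tokens.
import re

HONORIFICS = {"dr", "mr", "mrs", "ms", "prof"}

def name_tokens(value: str) -> set:
    """Lowercase, strip punctuation, drop honorifics, return the word set."""
    words = re.findall(r"[a-z]+", value.lower())
    return {w for w in words if w not in HONORIFICS}

def approx_match(stored: str, query: str) -> bool:
    """True if the query shares at least one name token with the stored value."""
    return bool(name_tokens(stored) & name_tokens(query))

# All of these should match a stored medical_provider of "Dr. Woods",
# while an unrelated name should not:
for q in ["Dr Woods", "Woods", "John Woods", "Dr. Smith"]:
    print(q, "->", approx_match("Dr. Woods", q))
```

In practice this could run as a post-filter: retrieve with the equality filter relaxed (or dropped), then keep only nodes whose medical_provider metadata passes approx_match against the name the LLM extracted.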
[1] https://docs.llamaindex.ai/en/stable/examples/agent/openai_agent_query_cookbook.html#load-and-index-structured-data
CC: @jerryjliu0