If you want to make it more bulletproof, you can define a Pydantic model for query generation, so you take advantage of tool calling on LLMs and restrict what can actually be generated in a query.
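Rough sketch of what I mean (made-up field names, and I'm using OpenAI's structured-output helper here purely as an example — whatever stack you're on will have an equivalent tool-calling mechanism):

```python
from typing import Optional
from pydantic import BaseModel, Field
from openai import OpenAI

# Illustrative schema -- the LLM can only fill these typed fields,
# so it can't invent arbitrary filter keys or out-of-range values.
class SearchQuery(BaseModel):
    keywords: Optional[str] = Field(None, description="Free-text part of the request")
    category: Optional[str] = Field(None, description="Product/category, if mentioned")
    price_max: Optional[float] = Field(None, description="Upper price limit, if any")

client = OpenAI()

# One way to bind the schema (needs a recent openai SDK); most providers
# and frameworks expose something similar for tool calling / structured output.
completion = client.beta.chat.completions.parse(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "cheap gaming laptops under 1000 bucks"}],
    response_format=SearchQuery,
)
query = completion.choices[0].message.parsed  # a validated SearchQuery instance
```

The point is that the output comes back as a validated model instance instead of free text, so downstream filtering only ever sees fields you defined.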
Okay cool. I saw similarity search and re-indexing as vectors, which is possible. It's actually for real estate, so there are a lot of potential fields they could query on, but it needs to come from a sentence in layman's terms.
So: 4-bedroom houses in x neighborhood in this town, under 2m, with at least 3 bathrooms. If I can deliver that, I'm good to go. Is that something LlamaIndex is designed for, or am I better off just tuning the prompt? I feel like I could provide a few hundred sentences along with the query each one should generate, but I don't know how helpful that would be or where it would live.
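To make it concrete, this is roughly the mapping I'd want, sketched with made-up field names:

```python
from typing import Optional
from pydantic import BaseModel, Field

# Hypothetical real-estate query schema -- field names are just illustrative.
class PropertyQuery(BaseModel):
    bedrooms_min: Optional[int] = Field(None, description="Minimum bedrooms")
    bathrooms_min: Optional[int] = Field(None, description="Minimum bathrooms")
    price_max: Optional[float] = Field(None, description="Maximum price in dollars")
    neighborhood: Optional[str] = Field(None, description="Neighborhood, if mentioned")
    city: Optional[str] = Field(None, description="City or town, if mentioned")

sentence = "4 bedroom houses in x neighborhood in this town under 2m with at least 3 bathrooms"

# The structured query I'd expect that sentence to produce:
expected = PropertyQuery(
    bedrooms_min=4,
    bathrooms_min=3,
    price_max=2_000_000,
    neighborhood="x neighborhood",
    city="this town",
)
```

My guess is those sentence→query pairs would just live as examples in the prompt (or maybe a fine-tuning set), but that's the part I'm unsure about.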