Hi guys, I'm in the process of building a RAG application using LlamaIndex. I already have a reference implementation of the same RAG application done by peers using LangChain; in both cases we use ChromaDB as the vector store. In LangChain there is a function `VECTOR_STORE.search_by_vector(question_embeddings, k=3)` to search by vector, where `question_embeddings` are the embedded values of the input, and we want a similarity search within our vector store that returns the 3 closest matches. How do I achieve this using LlamaIndex? #❓py-issues-and-help
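For context, what `search_by_vector` does is just a top-k similarity search over stored vectors. A minimal stdlib sketch of that idea (the store layout, table names, and vectors here are made up for illustration):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def search_by_vector(store, query_vec, k=3):
    """Return the k (doc_id, score) pairs closest to query_vec."""
    scored = [(doc_id, cosine(vec, query_vec)) for doc_id, vec in store]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:k]

# Hypothetical store: (doc_id, embedding) pairs; a real app would use ChromaDB.
store = [
    ("customers", [1.0, 0.0, 0.2]),
    ("orders",    [0.1, 1.0, 0.0]),
    ("products",  [0.0, 0.2, 1.0]),
    ("invoices",  [0.3, 0.9, 0.1]),
]
top3 = search_by_vector(store, [0.9, 0.1, 0.3], k=3)
```

In LlamaIndex, the equivalent knob is `similarity_top_k` on a retriever, e.g. `index.as_retriever(similarity_top_k=3)`, whose `retrieve()` call returns the scored nodes rather than a synthesized answer.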
Thanks @WhiteFang_Jr for your response. I see your point, but what about when I want the list of closest matches related to my question? How do I retrieve that information, say for instance with k=3, to see the top 3 vector matches related to the question?
@WhiteFang_Jr Oh okay. In my use case the incoming questions relate to information in tables of the database. So, to identify the right table, we process a question by breaking it into multiple words (representing tables) and then convert them to vectors, so that the LLM can look at the top 3 tables and decide which one to use when executing the SQL. From your description it seems this is not possible to achieve in LlamaIndex at the moment; is that right?
@WhiteFang_Jr Partially correct. Let me describe what happens in LangChain: say the question is "How many customers do we have?". This input goes through our custom function, which generates a JSON response like `['many customers', 'How many customers do we have?']`. That JSON is converted to embeddings, and those embeddings are searched in the vector index with k=3 to get the closest results, in this case tables and their metadata.
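The expansion flow above (one question, several phrasings, merged top-3 results) can be sketched like this. The `search()` stub below returns hand-written scores in place of a real embedding-plus-vector-store lookup, so the names and numbers are purely illustrative:

```python
def search(variant, k=3):
    # Stand-in for embed + vector search; a real app would embed `variant`
    # and query ChromaDB, getting back (table, score) pairs.
    fake_results = {
        "many customers": [("customers", 0.9), ("orders", 0.4), ("invoices", 0.2)],
        "How many customers do we have?": [("customers", 0.8), ("orders", 0.5),
                                           ("products", 0.1)],
    }
    return fake_results[variant][:k]

def route(variants, k=3):
    """Search each phrasing, keep each table's best score, return top k."""
    best = {}
    for variant in variants:
        for table, score in search(variant, k):
            best[table] = max(best.get(table, 0.0), score)
    return sorted(best.items(), key=lambda pair: pair[1], reverse=True)[:k]

top = route(['many customers', 'How many customers do we have?'])
```

Merging by max score means a table only needs to rank well for one phrasing to make the final top 3.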