The community member asks whether LlamaIndex has a concept for passing the input through a language model (LLM) before querying the index. The example given is "find all the IPs in the document": IP addresses have a standard format like x.x.x.x, so processing the input to add that IP information could help retrieve better information from the documents.
In the comments, another community member suggests that the user could do this themselves by directly calling the LLM, providing an example code snippet. The original poster confirms that they asked the question to avoid starting from scratch, and wonders if this has been discussed before or if LlamaIndex has something built-in for this.
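Below is a minimal sketch of that direct-call approach, assuming the llama_index.core API (0.10+) with an OpenAI LLM; the data directory, model name, and prompt wording are illustrative assumptions, not the community member's actual snippet:

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, Settings
from llama_index.llms.openai import OpenAI

# Illustrative model and data path (assumptions, not from the thread).
Settings.llm = OpenAI(model="gpt-4o-mini")
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

user_input = "find all the IPs in the document"

# Step 1: call the LLM directly to enrich the question before retrieval,
# e.g. spelling out that IP addresses follow the x.x.x.x format.
rewrite_prompt = (
    "Rewrite the question below so it is more precise for document retrieval. "
    "If it mentions IP addresses, add that they follow the x.x.x.x format.\n\n"
    f"Question: {user_input}"
)
rewritten_query = Settings.llm.complete(rewrite_prompt).text

# Step 2: query the index with the enriched question.
query_engine = index.as_query_engine()
response = query_engine.query(rewritten_query)
print(response)
```

Doing the rewrite yourself like this keeps full control over both the rewriting prompt and the final query, which is essentially the "start from scratch" approach recommended below.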
Another community member thinks that starting from scratch is probably the easiest approach, as it gives the user full control over the input and output.
There is no explicitly marked answer in the comments.
Hi, is there some concept in LlamaIndex that passes the input through the LLM before querying the index? For example, "find all the IPs in the document": IPs have a standard format like x.x.x.x, so maybe processing the input to add IP information would help get better information from the documents?