
Hi, is there some concept in LlamaIndex that passes the input through the LLM before querying the index?

At a glance

The community member is asking if there is a concept in LlamaIndex that passes the input through a language model (LLM) before querying the index. The example given is "find all the IPs in the document": IP addresses have a standard format like x.x.x.x, so processing the input to add IP information could potentially help get better information from the documents.

In the comments, another community member suggests that the user could do this themselves by directly calling the LLM, providing an example code snippet. The original poster confirms that they asked the question to avoid starting from scratch, and wonders if this has been discussed before or if LlamaIndex has something built-in for this.

Another community member thinks that starting from scratch is probably the easiest approach, as it gives the user full control over the input and output.

There is no explicitly marked answer in the comments.

Hi, is there some concept in LlamaIndex that passes the input through the LLM before querying the index?
For example, "find all the IPs in the document": IP addresses have a standard format like x.x.x.x, so maybe processing the input to add IP information would help get better info from the documents?
5 comments
You could do this yourself, right? By calling the LLM directly?

Python
# rewrite or augment the user's query with the LLM before querying the index
refined_query = str(llm.complete("..."))
Yes, of course. I asked the question to avoid starting from scratch; maybe this has been discussed before, or LlamaIndex already has something for it.
I think doing it from scratch is probably the easiest -- you have full control over the input and output 👍
Thank you Logan
And for the hard work on the framework
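For reference, here is a minimal sketch of the from-scratch approach suggested above: pass the raw input through the LLM to rewrite it, then query the index with the rewritten query. The model name, data path, prompt wording, and index setup are illustrative assumptions, not anything built into LlamaIndex.

Python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms.openai import OpenAI

llm = OpenAI(model="gpt-4o-mini")  # assumed model; any LlamaIndex LLM works

# Build (or load) an index over your documents -- placeholder path.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Step 1: pass the user input through the LLM to rewrite/augment it,
# e.g. spelling out that IP addresses follow the x.x.x.x pattern.
user_input = "find all the IPs in the document"
refined_query = str(
    llm.complete(
        "Rewrite the following request as a retrieval query, adding any "
        "useful detail (e.g. that IP addresses follow the x.x.x.x pattern).\n"
        f"Request: {user_input}"
    )
)

# Step 2: query the index with the refined query.
query_engine = index.as_query_engine(llm=llm)
response = query_engine.query(refined_query)
print(response)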