----------------------
Yes, it is possible to query or prompt an index in LlamaIndex. You can customize your prompts and pass them in directly at query time, either via the high-level API or the low-level composition API.
Here's an example of how you can do it using the high-level API:
query_engine = index.as_query_engine(
    text_qa_template=custom_qa_prompt, refine_template=custom_refine_prompt
)
And here's how you can do it using the low-level composition API:
# Assumed import paths for a recent llama_index version:
# from llama_index.core import get_response_synthesizer
# from llama_index.core.query_engine import RetrieverQueryEngine
retriever = index.as_retriever()
synth = get_response_synthesizer(
    text_qa_template=custom_qa_prompt, refine_template=custom_refine_prompt
)
query_engine = RetrieverQueryEngine(retriever, synth)
In both cases, custom_qa_prompt and custom_refine_prompt are your custom prompts. The two approaches are equivalent: the first is essentially syntactic sugar for the second and hides the underlying complexity. Use the first to quickly modify common parameters, and the second when you need more granular control.
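For a rough idea of what these prompts contain: the QA template is filled with {context_str} and {query_str}, and the refine template additionally receives {existing_answer}. Here is a minimal plain-Python sketch of hypothetical template strings and how a response synthesizer would fill them (real code would wrap these strings in LlamaIndex's PromptTemplate before passing them as text_qa_template / refine_template):

```python
# Hypothetical template strings for illustration; the exact wording is up to you.
qa_tmpl = (
    "Context information is below.\n"
    "{context_str}\n"
    "Given the context and no prior knowledge, answer the query.\n"
    "Query: {query_str}\n"
    "Answer: "
)

refine_tmpl = (
    "The original query is: {query_str}\n"
    "We have an existing answer: {existing_answer}\n"
    "Refine the existing answer (only if needed) with this new context:\n"
    "{context_str}\n"
    "Refined answer: "
)

# Simulate the refine-style synthesis loop: answer from the first retrieved
# chunk, then refine that answer using the second chunk.
first_prompt = qa_tmpl.format(context_str="chunk 1 text", query_str="What is X?")
refine_prompt = refine_tmpl.format(
    query_str="What is X?",
    existing_answer="X is ...",
    context_str="chunk 2 text",
)
print(first_prompt)
print(refine_prompt)
```

The placeholder names ({context_str}, {query_str}, {existing_answer}) are the ones the synthesizer substitutes, so any custom template should keep them.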
For more details on which classes use which prompts, you can visit the Query class references. You can also check out the reference documentation for a full list of all prompts.
Source:
LlamaIndex Documentation
----------------------
Note: you can ask me a follow-up question by @-mentioning me again :speech_balloon:
----------------------