
# Imports assume the llama_index >= 0.10 package layout.
from llama_index.core import PromptTemplate
from llama_index.core.query_engine import CustomQueryEngine
from llama_index.core.response_synthesizers import BaseSynthesizer
from llama_index.core.retrievers import BaseRetriever
from llama_index.llms.openai import OpenAI


class RAGStringQueryEngine(CustomQueryEngine):
    """RAG String Query Engine."""

    retriever: BaseRetriever
    response_synthesizer: BaseSynthesizer
    llm: OpenAI
    qa_prompt: PromptTemplate

    async def acustom_query(self, query_str: str):
        # Retrieve context nodes asynchronously.
        nodes = await self.retriever.aretrieve(query_str)

        # Stuff the retrieved node text into the prompt and call the LLM.
        context_str = "\n\n".join([n.node.get_content() for n in nodes])
        response = await self.llm.acomplete(
            self.qa_prompt.format(context_str=context_str, query_str=query_str)
        )

        return str(response)
How do I ask a question using aquery here?
2 comments
Please stand by @Joey as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
To ask a question using the RAGStringQueryEngine class, you need to create an instance of the class and then await its acustom_query method with your question as the argument (aquery also works, since CustomQueryEngine dispatches it to acustom_query). Here's how you can do it:

Plain Text
# Create an instance of RAGStringQueryEngine
query_engine = RAGStringQueryEngine(
    retriever=retriever,
    response_synthesizer=synthesizer,
    llm=llm,
    qa_prompt=qa_prompt,
)

# Ask a question
response = await query_engine.acustom_query("Your question here")

# Print the response
print(str(response))
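
Note that this constructor assumes the retriever, synthesizer, llm, and qa_prompt objects already exist. Here is a rough sketch of one way to build them (assumptions: the llama_index >= 0.10 package layout, documents in a hypothetical data/ directory, and an OpenAI API key in the environment):

Plain Text
from llama_index.core import (
    PromptTemplate,
    SimpleDirectoryReader,
    VectorStoreIndex,
    get_response_synthesizer,
)
from llama_index.llms.openai import OpenAI

# Build an index over local documents ("data" is a hypothetical folder).
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

retriever = index.as_retriever()
synthesizer = get_response_synthesizer()
llm = OpenAI(model="gpt-3.5-turbo")
qa_prompt = PromptTemplate(
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge, "
    "answer the query.\n"
    "Query: {query_str}\n"
    "Answer: "
)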


Please replace "Your question here" in the acustom_query call with the actual question you want to ask. Note that this code is asynchronous, so it needs to be run inside an async function or an event loop.
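
For example, a minimal sketch of a driver script (assuming the query_engine built above; on a CustomQueryEngine, aquery awaits acustom_query under the hood):

Plain Text
import asyncio

async def main() -> None:
    # aquery() on a CustomQueryEngine awaits acustom_query() internally,
    # so either call works here.
    response = await query_engine.aquery("Your question here")
    print(str(response))

asyncio.run(main())

In a notebook that already has a running event loop, you can await query_engine.aquery(...) directly (or use nest_asyncio) instead of calling asyncio.run.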

This information is based on the code provided in the extracts from the LlamaIndex documentation and GitHub repository.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------