Updated 2 months ago

Hi, is there any way to disable the "Retrieval" and "Augmentation" steps when sending a query?

query_engine = index.as_query_engine()
response = query_engine.query("What did the author do growing up?")
I would like to send this question directly to my LLM and get a response, without any augmentation. I know this defeats the purpose of using LlamaIndex, but it would help me with debugging, since I want to see what the raw response looks like without augmentation.

Thanks a lot
5 comments
You can use it like this
Plain Text
from llama_index.core.llms import ChatMessage
from llama_index.llms.openai import OpenAI

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="What is your name"),
]
resp = OpenAI().chat(messages)

print(resp)


https://docs.llamaindex.ai/en/stable/examples/llm/openai.html#openai
Thanks to both of you for answering, @redfield01 and @Teemu .
I understand your reply, but I was hoping to use almost the same API, perhaps just changing a parameter, to get a direct response from the LLM. That would let me test with nearly identical calls.
For example, I was thinking like below
query_engine = index.as_query_engine(similarity_top_k=0)
response = query_engine.query("What did the author do growing up?")

Here I am asking for 0 similar documents instead of the default 2. Maybe this will give me what I want, but I wanted to ask if there is a standard way.
Yea, if no nodes are returned, it won't generate a response.
What you probably want to do is wrap an LLM in a custom query engine.
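To make that concrete, here is a minimal plain-Python sketch of the "wrap an LLM in a custom query engine" idea. The FakeLLM and LLMOnlyQueryEngine classes below are illustrative stand-ins, not real llama_index classes; in llama_index itself you would subclass its CustomQueryEngine base class and call a real LLM, but the shape of the wrapper is the same.
Plain Text
class FakeLLM:
    """Stand-in for an LLM client (e.g. llama_index's OpenAI wrapper)."""

    def complete(self, prompt: str) -> str:
        # A real LLM call would go here; we just echo for illustration.
        return f"(raw LLM answer to: {prompt})"


class LLMOnlyQueryEngine:
    """A query engine that skips retrieval and augmentation entirely,
    forwarding the query string straight to the LLM."""

    def __init__(self, llm):
        self.llm = llm

    def query(self, query_str: str) -> str:
        # No retrieval, no context injection: the prompt is the query itself.
        return self.llm.complete(query_str)


engine = LLMOnlyQueryEngine(FakeLLM())
print(engine.query("What did the author do growing up?"))

This keeps the same query_engine.query(...) call shape as the retrieval-backed engine, which makes it easy to swap in for debugging.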