Retriever
Chiken1 · last year
I have a question: instead of getting a response from the LLM via query_engine, how do I get the retrieved context from the step before it is sent to the LLM to generate a response?
4 comments
Emanuel Ferreira · last year
https://gpt-index.readthedocs.io/en/stable/core_modules/query_modules/retriever/root.html#retriever
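For reference, the linked page covers using an index directly as a retriever rather than a query engine. A minimal sketch of that pattern (the `./data` path and query string are placeholders, not from this thread):

```python
from llama_index import SimpleDirectoryReader, VectorStoreIndex

# Placeholder setup: build an index over files in ./data
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Use the index as a retriever instead of a query engine:
# retrieve() returns the matching nodes without calling an LLM.
retriever = index.as_retriever(similarity_top_k=3)
nodes = retriever.retrieve("example question")

for node_with_score in nodes:
    print(node_with_score.score)
    print(node_with_score.node.get_content())
```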
Chiken1 · last year
Thanks
Chiken1 · last year
Do I still have to define an LLM in the ServiceContext if I only intend to use retrieval?
Chiken1 · last year
Ah, I found it: I set llm=None.
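A minimal sketch of the retrieval-only setup the thread arrives at, assuming the ServiceContext-based API from the linked docs. Passing llm=None skips configuring an LLM; note that an embedding model is still needed to build and query the vector index:

```python
from llama_index import ServiceContext, SimpleDirectoryReader, VectorStoreIndex

# llm=None: no LLM is configured; retrieval only uses the embedding model
# (which still defaults to OpenAI embeddings unless overridden).
service_context = ServiceContext.from_defaults(llm=None)

documents = SimpleDirectoryReader("./data").load_data()  # placeholder data directory
index = VectorStoreIndex.from_documents(documents, service_context=service_context)

# Retrieval works without any LLM call.
retriever = index.as_retriever()
nodes = retriever.retrieve("example query")
```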