
Updated 2 months ago

direct LLM query (no context)

Rather dumb question -- if I simply want to query the LLM directly (via the service_context) without sending any index/reference information, how do I do that?
2 comments
I'm assuming that's almost pure LangChain
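To illustrate the idea behind the question: a query engine retrieves chunks from the index and stuffs them into the prompt, while a "direct" query just sends your text to the LLM. The sketch below uses a stand-in `llm` function rather than the real LlamaIndex or LangChain API (in LlamaIndex, the direct path is roughly calling the service context's LLM object itself, e.g. something like `llm.complete(...)`, but the exact call depends on your version):

```python
# Sketch only: a stand-in for a real LLM call (e.g. an OpenAI completion).
def llm(prompt: str) -> str:
    return f"LLM saw: {prompt}"

def query_with_index(question: str, retrieved_chunks: list[str]) -> str:
    # A query engine prepends retrieved context before the question.
    context = "\n".join(retrieved_chunks)
    return llm(f"Context:\n{context}\n\nQuestion: {question}")

def query_direct(question: str) -> str:
    # Querying "directly" sends only the question -- no index/reference text.
    return llm(question)

print(query_direct("What is RAG?"))
print(query_with_index("What is RAG?", ["RAG = retrieval-augmented generation."]))
```

The point of the second path is that nothing from the index ever reaches the model, which is why the comment above calls it "almost pure LangChain": at that point you are only using the raw LLM wrapper.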