Updated last year

Providing context to completion

Is there some way to pre-provide context to a completion? I know this is possible with chat, but how about completion?
Would that just be a system prompt/query wrapper prompt? Or just using a node postprocessor to insert your extra context?
Hello! Hope all is well

I am thinking with:
  • Node postprocessor: does it just prepend to the same completion prompt?
  • System prompt/query wrapper prompt: what do these do? Are they also just prepending to the same prompt?
All is well! Just lurking while on vacation lol

A node postprocessor modifies the list of retrieved nodes before they are sent to the LLM

The system prompt is prepended

The query wrapper wraps the entire text that would be sent to the LLM in some additional template
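To make the three mechanisms concrete, here is a minimal sketch of how they could each shape the single completion string. The function and parameter names here are made up for illustration; this is not the library's actual API, just the assembly order described above (postprocess nodes, wrap the query, prepend the system prompt):

```python
# Illustrative sketch only -- helper names are invented, not the real API.
# Shows how all three mechanisms end up shaping one completion string.

def postprocess_nodes(nodes):
    """Stand-in for a node postprocessor: modify/filter retrieved nodes."""
    return [n for n in nodes if n.strip()]  # e.g. drop empty nodes

def build_prompt(system_prompt, query_wrapper, nodes, question):
    # 1. Node postprocessor runs on the retrieved nodes first.
    context = "\n".join(postprocess_nodes(nodes))
    core = f"Context:\n{context}\n\nQuestion: {question}"
    # 2. Query wrapper wraps the entire text in an extra template.
    wrapped = query_wrapper.format(query_str=core)
    # 3. System prompt is simply prepended.
    return system_prompt + "\n" + wrapped

prompt = build_prompt(
    system_prompt="You are a helpful assistant.",
    query_wrapper="<|user|>\n{query_str}\n<|assistant|>",
    nodes=["Paris is the capital of France.", ""],
    question="What is the capital of France?",
)
print(prompt)
```

All three paths converge on one string, which is why they can all be seen as ways of pre-providing context to a completion call.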
Lol at the vacation lurking, nice and appreciated!

So in a way, are all three just modifying the one-shot completion prompt (a single `str`) sent to the LLM?

For example, with
  • Query wrapper: uses templating, but still just expands the prompt
  • Node postprocessing: how does a node get sent to the LLM? Isn't it just stringified somehow?
All the nodes' text gets inserted into a prompt template, yeah.

`node.get_content(metadata_mode="llm")`
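A toy mimic of that stringification step, assuming (as described above) that a node's text plus its LLM-visible metadata collapse into one string; this `Node` class and its fields are hypothetical, not the real implementation:

```python
# Toy mimic of node stringification -- invented class, just to show the
# idea: text plus LLM-visible metadata becomes a single string.

class Node:
    def __init__(self, text, metadata, llm_metadata_keys):
        self.text = text
        self.metadata = metadata
        # Only these metadata keys are shown to the LLM.
        self.llm_metadata_keys = llm_metadata_keys

    def get_content(self, metadata_mode="none"):
        if metadata_mode == "llm":
            header = "\n".join(
                f"{k}: {v}"
                for k, v in self.metadata.items()
                if k in self.llm_metadata_keys
            )
            return f"{header}\n\n{self.text}" if header else self.text
        return self.text

node = Node(
    text="Completions take a single prompt string.",
    metadata={"source": "docs.md", "file_size": 1024},
    llm_metadata_keys=["source"],
)
print(node.get_content(metadata_mode="llm"))
```

The stringified nodes are then what gets inserted into the prompt template mentioned above.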
Thanks for providing lots of context though! πŸ‘