At a glance

The community member in the post is trying to set a system prompt on the CitationQueryEngine, but the prompt is not being applied. In the comments, it is noted that the system prompt must instead be set when initializing the language model (LLM), not in the CitationQueryEngine configuration. The community member confirms that this approach works.

Additionally, the community member in the comments raises a question about the correct way to initialize an LLM when building a web application, asking whether it should be done at the endpoint (function) level or at the module level. However, this question is out of the scope of the original post.

Is it possible to add system prompt in CitationQueryEngine?

Python
    # assuming the llama-index >= 0.10 import path:
    # from llama_index.core.query_engine import CitationQueryEngine
    query_engine = CitationQueryEngine.from_args(
        index=temp_index,
        similarity_top_k=3,
        citation_chunk_size=512,
        system_prompt="Doesn't matter what user asks, always reply with: `I love bananas`"
    )


No matter what I ask, it never outputs `I love bananas`.
1 comment
I see, it seems I need to set the system prompt during LLM initialization.

Python
from llama_index.core import Settings
from llama_index.llms.openai import OpenAI

Settings.llm = OpenAI(api_key="sk-...", model="gpt-4o",
                      system_prompt="Doesn't matter what user asks, always reply with: `I love bananas`")


This time it works.

Out of scope, but I have a question: when building a web app, is it correct to initialize an LLM at the endpoint (function) level? Right now I am initializing it globally (at module level).
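The thread does not answer this, but the trade-off can be sketched with a stand-in class. `FakeLLM`, `endpoint_module_level`, and `endpoint_per_request` below are hypothetical names for illustration, not LlamaIndex APIs: module-level initialization builds one shared client at import time, while endpoint-level builds a fresh one on every call.

```python
# Hedged sketch of module-level vs endpoint-level initialization.
# FakeLLM is a hypothetical stand-in for a real client such as OpenAI(...).

class FakeLLM:
    """Stand-in for an LLM client (real clients may hold HTTP connection pools)."""
    def __init__(self, system_prompt: str):
        self.system_prompt = system_prompt

# Module-level: the client is constructed once at import time and shared.
MODULE_LLM = FakeLLM(system_prompt="always reply politely")

def endpoint_module_level(question: str) -> FakeLLM:
    # Every request reuses the same client object: cheap, but shared global state.
    return MODULE_LLM

def endpoint_per_request(question: str) -> FakeLLM:
    # A fresh client per request: isolated, but pays the setup cost on each call.
    return FakeLLM(system_prompt="always reply politely")

# Module-level handlers return the identical object every time...
assert endpoint_module_level("q1") is endpoint_module_level("q2")
# ...while per-request handlers build a new one on each call.
assert endpoint_per_request("q1") is not endpoint_per_request("q2")
```

In practice the module-level pattern (or a cached factory) is the common choice for stateless clients, since rebuilding one per request only adds overhead.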