Find answers from the community

vampir
Offline, last seen 3 months ago · Joined September 25, 2024

Prompts

What is the proper way to pass custom input_variables to a Prompt? The docs say this is supported at query time. I have a custom {history} variable, for instance; should it be passed in the query engine call where I define the custom QA and refine prompts?
9 comments
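One possible approach (a minimal sketch, not taken from the thread: it assumes a llama_index version where prompt templates expose partial_format, and index and chat_history_text are hypothetical stand-ins for your own objects) is to pre-fill the custom {history} variable on the template itself, so the engine only needs to supply {context_str} and {query_str} at query time:

Plain Text
# Minimal sketch, assuming a llama_index version where prompt templates
# expose partial_format(); `index` and `chat_history_text` are hypothetical
# stand-ins for your own index and accumulated conversation history.
from llama_index.prompts import PromptTemplate

qa_template = PromptTemplate(
    "Conversation so far:\n{history}\n\n"
    "Context information:\n{context_str}\n\n"
    "Given the context, answer the question: {query_str}\n"
)

# Bind {history} up front; {context_str} and {query_str} stay open for the
# query engine to fill in at query time.
qa_with_history = qa_template.partial_format(history=chat_history_text)

query_engine = index.as_query_engine(text_qa_template=qa_with_history)
response = query_engine.query("What did we decide earlier?")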
What is the standard way to format the response sent back? For example, how do you strip the <|im_end|> tags and similar tokens?
5 comments
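There is no single built-in answer here; a common pattern (a sketch, assuming the <|im_end|> markers are ChatML-style control tokens leaking from the model's chat template) is to post-process the response string before returning it:

Plain Text
import re

# Sketch only: strips ChatML-style control tokens such as <|im_end|>.
# The token pattern is an assumption; adjust it to whatever your model's
# chat template actually leaks into the output.
CONTROL_TOKENS = re.compile(r"<\|im_(?:start|end)\|>")

def clean_response(text: str) -> str:
    return CONTROL_TOKENS.sub("", text).strip()

response = query_engine.query(query)
print(clean_response(str(response)))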
I'm on 0.6.5 and trying to use the composable graph. I pass it custom_query_engines keyed by the matching index IDs, but the subsequent query doesn't seem to use the service context at all and complains about a missing engine or deployment ID. Any ideas what I might be doing wrong?

Plain Text
custom_query_engines = {
    i.index_id: i.as_query_engine(
        mode="embedding",
        text_qa_template=_get_prompt_template(lang),
        similarity_top_k=k,
        # https://github.com/jerryjliu/llama_index/blob/main/docs/guides/primer/usage_pattern.md#configuring-response-synthesis
        response_mode="tree_summarize",  # other modes are "default" and "compact"
        refine_template=_get_refined_prompt(lang),
        service_context=service_context,
    )
    for i in indices.values()
}

query_engine = graph.as_query_engine(custom_query_engines=custom_query_engines)

response = query_engine.query(query)
15 comments
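One pattern that often resolves this on 0.6.x (a hedged sketch, not a confirmed fix: it assumes the "missing deployment ID" error comes from the graph's root index synthesizing with a default LLM instead of the configured one, and GPTListIndex plus the generated summaries are placeholders for the actual setup) is to pass the service context when building the graph and to register a query engine for the root index as well, since the graph only applies custom_query_engines entries whose index IDs it finds:

Plain Text
# Sketch for llama_index 0.6.x; the root index class and the summaries below
# are placeholders standing in for the actual setup.
from llama_index import GPTListIndex
from llama_index.indices.composability import ComposableGraph

graph = ComposableGraph.from_indices(
    GPTListIndex,
    list(indices.values()),
    index_summaries=[f"Summary for {i.index_id}" for i in indices.values()],
    service_context=service_context,  # so the root index uses this LLM config too
)

# Give the root index its own entry; otherwise it falls back to defaults and
# can fail with a missing engine/deployment ID on Azure-style setups.
custom_query_engines[graph.root_id] = graph.root_index.as_query_engine(
    response_mode="tree_summarize",
    service_context=service_context,
)

query_engine = graph.as_query_engine(custom_query_engines=custom_query_engines)
response = query_engine.query(query)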