
Is it possible to pass a custom prompt template when using OpenAILike?

At a glance

The post asks whether a custom prompt template can be passed when using OpenAILike. A community member confirms that it can, using LangchainPromptTemplate, and provides an example that maps the context and question variables in a LangChain prompt to the variables expected by the response synthesizer in the query engine. They note that the specific details may vary with the use case and the structure of the custom prompt, and suggest referring to the LlamaIndex documentation for more information.

@kapa.ai Is it possible to pass a custom prompt template when using OpenAILike?
In particular, I want it to use a ChatML template for an auto-merging RAG retrieval service context.
Let me check the knowledge base for you. One moment! :books:
----------------------
Yes, it is possible to pass a custom prompt template when using OpenAILike. You can use LangchainPromptTemplate to wrap a LangChain prompt and map its variables to the ones LlamaIndex expects. Here is an example of how you can do it:

Plain Text
from langchain import hub
from llama_index.prompts import LangchainPromptTemplate

# Pull the langchain prompt
langchain_prompt = hub.pull("rlm/rag-prompt")

# Create a LangchainPromptTemplate with the necessary variable mappings
lc_prompt_tmpl = LangchainPromptTemplate(
    template=langchain_prompt,
    template_var_mappings={"query_str": "question", "context_str": "context"},
)

# Update the prompts in the query engine
query_engine.update_prompts(
    {"response_synthesizer:text_qa_template": lc_prompt_tmpl}
)


In this example, the LangchainPromptTemplate maps the query_str and context_str variables that the query engine's response synthesizer supplies onto the question and context variables that the LangChain prompt expects.
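
The query_engine here is assumed to already exist. For context, a minimal setup sketch using OpenAILike against a local OpenAI-compatible server (the model name, endpoint, and data directory are placeholders, and a plain vector index stands in for the auto-merging setup; the same update_prompts call applies either way):

Plain Text
from llama_index import ServiceContext, VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms import OpenAILike

# Placeholder model name and endpoint for a local OpenAI-compatible server
llm = OpenAILike(
    model="my-local-model",
    api_base="http://localhost:8000/v1",
    api_key="not-needed",
    is_chat_model=True,  # send chat messages so the server applies the model's chat template
)

service_context = ServiceContext.from_defaults(llm=llm)
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents, service_context=service_context)
query_engine = index.as_query_engine()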

You can then retrieve and display the updated prompts using the following code:

Plain Text
# display_prompt_dict is a helper defined in the LlamaIndex docs notebooks
prompts_dict = query_engine.get_prompts()
display_prompt_dict(prompts_dict)


Please note that the specific details might vary depending on your exact use case and the structure of your custom prompt.
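
Since the original question asks for a ChatML-style template specifically: when OpenAILike is constructed with is_chat_model=True, prompts are sent as chat messages and the OpenAI-compatible server applies the model's own ChatML template, so you can also stay within LlamaIndex and use its ChatPromptTemplate instead of a LangChain prompt. A sketch under the same assumptions and import paths as above:

Plain Text
from llama_index.llms import ChatMessage, MessageRole
from llama_index.prompts import ChatPromptTemplate

# Chat-style QA prompt; the server renders these messages with the model's chat template
chat_qa_msgs = [
    ChatMessage(
        role=MessageRole.SYSTEM,
        content="Answer the question using only the given context.",
    ),
    ChatMessage(
        role=MessageRole.USER,
        content=(
            "Context information is below.\n"
            "---------------------\n"
            "{context_str}\n"
            "---------------------\n"
            "Question: {query_str}\n"
        ),
    ),
]
text_qa_template = ChatPromptTemplate(chat_qa_msgs)

query_engine.update_prompts(
    {"response_synthesizer:text_qa_template": text_qa_template}
)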

For more information, you can refer to the LlamaIndex documentation.
----------------------
Note: you can ask me a follow-up question by @ mentioning me again :speech_balloon:
----------------------