To run a query with custom QuestionAnswerPrompt templates and a chat-based OpenAI model, you can follow these steps:
- Create custom QuestionAnswerPrompt templates. You can refer to the example in the tutorial "A Guide to Extracting Terms and Definitions", section "Improvement #2 — Better Prompts", part 1.
- Import the custom QuestionAnswerPrompt templates into your application. You can refer to the example in the same tutorial, section "Improvement #2 — Better Prompts", part 3.
- Use the custom QuestionAnswerPrompt templates during the query, also shown in part 3 of that section.
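The first step could look something like the sketch below: a `constants.py` that holds the template strings. The template text here is illustrative, not the tutorial's exact wording; the `{context_str}`, `{query_str}`, `{existing_answer}`, and `{context_msg}` placeholders are the variables the older LlamaIndex 0.x prompt classes expect, but double-check the names against the version you are running.

```python
# constants.py -- sketch of custom prompt templates.
# Template wording is illustrative; the placeholder names assume
# the older LlamaIndex 0.x prompt API.

TEXT_QA_TEMPLATE_STR = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Using only the context information, answer the question: {query_str}\n"
)

REFINE_TEMPLATE_STR = (
    "The original question is: {query_str}\n"
    "We have an existing answer: {existing_answer}\n"
    "We have additional context below.\n"
    "---------------------\n"
    "{context_msg}\n"
    "---------------------\n"
    "Refine the existing answer using the new context, if needed.\n"
)

# In the tutorial these strings are wrapped in prompt objects before use,
# along the lines of (LlamaIndex 0.x API -- adjust for newer versions):
#
#   from llama_index import QuestionAnswerPrompt, RefinePrompt
#   TEXT_QA_TEMPLATE = QuestionAnswerPrompt(TEXT_QA_TEMPLATE_STR)
#   REFINE_TEMPLATE = RefinePrompt(REFINE_TEMPLATE_STR)
```

The wrapped `TEXT_QA_TEMPLATE` and `REFINE_TEMPLATE` objects are what the query call imports and passes in.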
Here's an example of how to use the custom QuestionAnswerPrompt with a chat OpenAI model:
```python
from constants import REFINE_TEMPLATE, TEXT_QA_TEMPLATE
...
if "llama_index" in st.session_state:
    query_text = st.text_input("Ask about a term or definition:")
    if query_text:
        query_text = query_text  # Notice we removed the old instructions
        with st.spinner("Generating answer..."):
            response = st.session_state["llama_index"].query(
                query_text, similarity_top_k=5, response_mode="compact",
                text_qa_template=TEXT_QA_TEMPLATE, refine_template=REFINE_TEMPLATE
            )
        st.markdown(str(response))
...
```