Find answers from the community

Updated 4 months ago

How do I use a query with a QuestionAnswerPrompt and a chat OpenAI model?

Please stand by, @cincy, as I search the knowledge base. We are experiencing especially high traffic right now, and this might take longer than usual. :wave:
----------------------
To use a query with a custom QuestionAnswerPrompt and a chat OpenAI model, you can follow these steps:

  1. Create custom QuestionAnswerPrompt templates. You can refer to the example in the extracted code (#extract-guides--tutorials--terms_definitions_tutorial--a-guide-to-extracting-terms-and-definitions--improvement-2----better-prompts--part-1).
  2. Import the custom QuestionAnswerPrompt templates into your application. You can refer to the example in the extracted code (#extract-guides--tutorials--terms_definitions_tutorial--a-guide-to-extracting-terms-and-definitions--improvement-2----better-prompts--part-3).
  3. Use the custom QuestionAnswerPrompt templates during the query. You can refer to the example in the extracted code (#extract-guides--tutorials--terms_definitions_tutorial--a-guide-to-extracting-terms-and-definitions--improvement-2----better-prompts--part-3).
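For step 1, the templates typically live in a separate module (the query code below imports them from `constants`). Here is a minimal sketch of what such a module might contain; the wording of the prompts is illustrative, but the placeholder names ({context_str}, {query_str}, {existing_answer}, {context_msg}) are the ones LlamaIndex's QuestionAnswerPrompt and RefinePrompt classes expect:

```python
# constants.py -- a sketch of custom prompt templates.
# The prompt wording is illustrative; only the placeholder names are
# fixed by LlamaIndex's prompt classes.

TEXT_QA_TEMPLATE_TMPL = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information, answer the question: {query_str}\n"
)

REFINE_TEMPLATE_TMPL = (
    "The original question is: {query_str}\n"
    "We have an existing answer: {existing_answer}\n"
    "Refine the existing answer (only if needed) using this new context:\n"
    "{context_msg}\n"
)

# With the legacy llama_index 0.x API, these strings would be wrapped
# before export, e.g.:
#   from llama_index.prompts.prompts import QuestionAnswerPrompt, RefinePrompt
#   TEXT_QA_TEMPLATE = QuestionAnswerPrompt(TEXT_QA_TEMPLATE_TMPL)
#   REFINE_TEMPLATE = RefinePrompt(REFINE_TEMPLATE_TMPL)
```

Anything you put in these strings besides the placeholders is up to you; this is how the tutorial steers the model toward term/definition answers.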
Here's an example of how to use the custom QuestionAnswerPrompt with a chat OpenAI model:

from constants import REFINE_TEMPLATE, TEXT_QA_TEMPLATE
...
if "llama_index" in st.session_state:
    query_text = st.text_input("Ask about a term or definition:")
    if query_text:
        # Note: the old step that appended extra instructions to the
        # query text has been removed
        with st.spinner("Generating answer..."):
            response = st.session_state["llama_index"].query(
                query_text,
                similarity_top_k=5,
                response_mode="compact",
                text_qa_template=TEXT_QA_TEMPLATE,
                refine_template=REFINE_TEMPLATE,
            )
        st.markdown(str(response))
...
This example assumes you have already built a LlamaIndex index backed by a chat OpenAI model and stored it in `st.session_state["llama_index"]`. If you need help setting up a custom LLM, you can refer to the extracted code (#extract-how_to--customization--custom_llms--defining-llms--example-using-a-custom-llm-model--part-1) for an example of how to define a custom LLM model.
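To see how the two templates cooperate at query time: the index answers the question from the first batch of retrieved text using the QA template, then passes each further batch through the refine template together with the answer so far ("compact" mode simply packs as much retrieved text as fits into each LLM call before refining). A rough pure-Python sketch of that loop, with `fake_llm` as a hypothetical stand-in for the chat model and illustrative template strings:

```python
# Conceptual sketch of refine-style response synthesis.
# `fake_llm` is a stand-in for the chat model; real code would call OpenAI.
def fake_llm(prompt: str) -> str:
    # Echo a trivial "answer" so the control flow is visible.
    return f"answer derived from: [{prompt[:40]}...]"

TEXT_QA_TEMPLATE = "Context: {context_str}\nQuestion: {query_str}\nAnswer:"
REFINE_TEMPLATE = (
    "Question: {query_str}\nExisting answer: {existing_answer}\n"
    "New context: {context_msg}\nRefined answer:"
)

def synthesize(query_str, chunks, llm):
    """First chunk -> QA template; each remaining chunk -> refine template."""
    answer = llm(TEXT_QA_TEMPLATE.format(context_str=chunks[0],
                                         query_str=query_str))
    for chunk in chunks[1:]:
        answer = llm(REFINE_TEMPLATE.format(query_str=query_str,
                                            existing_answer=answer,
                                            context_msg=chunk))
    return answer
```

This is why the `query(...)` call above takes both `text_qa_template` and `refine_template`: with `similarity_top_k=5`, more than one batch of context may be retrieved, and the refine template governs every pass after the first.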
----------------------
Note: you can ask me a follow-up question by @-mentioning me again :speech_balloon:
----------------------