GPT-3.5

Hi. I've been running into a problem, and I'm not sure how to resolve it. Hoping one of you knows the answer πŸ™‚

I've been running the following (to analyse an annual report of a company):
Plain Text
llm_predictor = LLMPredictor(llm=OpenAI(temperature=0, model_name="gpt-3.5-turbo", max_tokens=512))
service_context = ServiceContext.from_defaults(llm_predictor=llm_predictor)

index = GPTSimpleVectorIndex.from_documents(documents, service_context=service_context)

response = index.query(
    "What does the report mention about stakeholder engagement? Does it provide specific examples?",
    similarity_top_k=3,
)

When I use 'gpt-3.5-turbo' I get this type of response:
The existing answer is still relevant and provides a comprehensive list of specific examples of stakeholder engagement [...]
Why does it talk about an existing answer? What is happening here?

When I use 'text-davinci-003', I receive a response that is more to my liking, and it doesn't talk about an "existing answer".
1 comment
So in LlamaIndex, the text retrieved for a query can't always fit into a single LLM call. When that happens, it refines the answer across multiple LLM calls, which is why the response mentions an existing answer.
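The refine pattern can be sketched roughly like this. This is a simplified illustration using a generic `llm` callable, not LlamaIndex's actual internals:

```python
def refine_answer(llm, query, chunks):
    """Sketch of the refine pattern: when retrieved chunks don't fit
    in one LLM call, the answer is built up chunk by chunk.
    (Simplified illustration, not the library's actual code.)"""
    # First call: answer the query using only the first chunk.
    answer = llm(f"Context: {chunks[0]}\nQuestion: {query}\nAnswer:")
    # Each later call sees the existing answer plus one new chunk,
    # and is asked to update the answer only if the new context helps.
    for chunk in chunks[1:]:
        answer = llm(
            f"The existing answer is: {answer}\n"
            f"New context: {chunk}\n"
            f"Update the existing answer to the question '{query}' "
            "if the new context is relevant; otherwise repeat it."
        )
    return answer
```

Because gpt-3.5-turbo sometimes echoes that refine instruction ("The existing answer is still relevant...") instead of just returning the updated answer, you see that phrasing leak into the response.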

This is a pretty common problem with gpt-3.5 lately. OpenAI updated the model and it seems a lot less smart and harder to get it to follow instructions lol

But! I've been working on a refine prompt that might work a little better, try it out!


Plain Text
from langchain.prompts.chat import (
    AIMessagePromptTemplate,
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
)

from llama_index.prompts.prompts import RefinePrompt

# Refine Prompt
CHAT_REFINE_PROMPT_TMPL_MSGS = [
    HumanMessagePromptTemplate.from_template("{query_str}"),
    AIMessagePromptTemplate.from_template("{existing_answer}"),
    HumanMessagePromptTemplate.from_template(
        "I have more context below which can be used "
        "(only if needed) to update your previous answer.\n"
        "------------\n"
        "{context_msg}\n"
        "------------\n"
        "Given the new context, update the previous answer to better "
        "answer my previous query. "
        "If the previous answer remains the same, repeat it verbatim. "
        "Never reference the new context or my previous query directly.",
    ),
]


CHAT_REFINE_PROMPT_LC = ChatPromptTemplate.from_messages(CHAT_REFINE_PROMPT_TMPL_MSGS)
CHAT_REFINE_PROMPT = RefinePrompt.from_langchain_prompt(CHAT_REFINE_PROMPT_LC)
...
index.query("my query", similarity_top_k=3, refine_template=CHAT_REFINE_PROMPT)