
Hello again, just trying to refine the behavior between Vicuna and llama_index. I can get the responses from the model, but it looks like they're lost because of the "second question".

I have this prompt template:

Plain Text
QA_PROMPT_TMPL = (
    "### Human: Considering the following code:\n"
    "{context_str}\n"
    "{query_str}\n ### Assistant: \n"
)
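
For context, this template string gets wrapped in a prompt object before being passed to the query call. A minimal sketch, assuming the pre-0.6 llama_index API (where index.query and QuestionAnswerPrompt are still top-level); the index itself is assumed to be built elsewhere:

Python
from llama_index import QuestionAnswerPrompt

QA_PROMPT_TMPL = (
    "### Human: Considering the following code:\n"
    "{context_str}\n"
    "{query_str}\n ### Assistant: \n"
)
# Wrap the raw template string so it can be passed as text_qa_template.
QA_PROMPT = QuestionAnswerPrompt(QA_PROMPT_TMPL)

# `index` is assumed to be an already-built GPTSimpleVectorIndex.
response = index.query(
    "Who creates the code?",
    text_qa_template=QA_PROMPT,
    similarity_top_k=1,
)
print(response)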


If I print the response inside the CustomLLM._call method, I see this response:

  1. Production Machine Data Source is a data source class for vending machines that provides a set of APIs to interact with the machine. The creator of this class is "Sergio Casero" and it was created on 18/04/2023.
for this question: "Who creates the code?"

This is so nice, but if I print the response from response = index.query("Who creates the code?", text_qa_template=QA_PROMPT, similarity_top_k=1), I get an empty response. Any ideas?
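
For reference, the CustomLLM mentioned here is typically a LangChain LLM subclass whose _call method runs the local Vicuna model; a minimal sketch (the model invocation below is a placeholder, not the poster's actual Vicuna code):

Python
from typing import List, Optional
from langchain.llms.base import LLM

class CustomLLM(LLM):
    """Thin wrapper that forwards prompts to a locally hosted Vicuna model."""

    @property
    def _llm_type(self) -> str:
        return "custom-vicuna"

    def _call(self, prompt: str, stop: Optional[List[str]] = None) -> str:
        # Placeholder: call the local Vicuna model here and return its text.
        completion = "...Vicuna output for the prompt..."
        # Printing here shows the answer before llama_index post-processes it.
        print(completion)
        return completion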
6 comments
Did you also customize the refine template?
Honestly, I didn't know that it could be done.
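
For anyone hitting the same thing: the refine template is the second prompt llama_index uses when it revisits an answer, and the default one is not written in Vicuna's Human/Assistant format. A sketch of passing a custom refine template alongside the QA template, assuming the same pre-0.6 API with RefinePrompt (the template wording is only an example):

Python
from llama_index import RefinePrompt

# Illustrative Vicuna-style refine template; {query_str}, {existing_answer}
# and {context_msg} are the placeholders the refine step fills in.
REFINE_PROMPT_TMPL = (
    "### Human: The original question is: {query_str}\n"
    "We have an existing answer: {existing_answer}\n"
    "Refine the existing answer if needed using this additional context:\n"
    "{context_msg}\n"
    "### Assistant: \n"
)
REFINE_PROMPT = RefinePrompt(REFINE_PROMPT_TMPL)

# `index` and QA_PROMPT are the same objects as in the question above.
response = index.query(
    "Who creates the code?",
    text_qa_template=QA_PROMPT,
    refine_template=REFINE_PROMPT,
    similarity_top_k=1,
)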