How to use a prompt template with VectorStoreIndex?
Hi, this can help you 🙂
thank you, let me check
I am trying to output the result as HTML, not text, using a prompt template. It worked before, but nowadays it is not working. I guess this might be related to the version.
@WhiteFang_Jr
In the previous LlamaIndex version, you were able to get the response in the required format?

And now, with the same prompt in the latest version, the response is not coming as expected?
QA_PROMPT_TMPL = (
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    + setting_docs["companyPrompt"]
    + "\n Question: {query_str}\n"
)
QA_PROMPT = QuestionAnswerPrompt(QA_PROMPT_TMPL)
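A minimal sketch of how that template string assembles, with `setting_docs` stubbed out as a plain dict (the real value comes from the poster's JSON config, which is not shown in the thread). LlamaIndex fills the two placeholders at query time; the same effect can be previewed with plain `str.format()`:

```python
# setting_docs is an assumption here: a stand-in for the poster's JSON config.
setting_docs = {"companyPrompt": "You are a helpful company assistant."}

# Same template structure as in the thread above.
QA_PROMPT_TMPL = (
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    + setting_docs["companyPrompt"]
    + "\n Question: {query_str}\n"
)

# Preview what the LLM actually receives once the placeholders are filled.
preview = QA_PROMPT_TMPL.format(
    context_str="<retrieved chunks go here>",
    query_str="What does the company do?",
)
print(preview)
```

In the older LlamaIndex API this template would then be wrapped in `QuestionAnswerPrompt` and passed to the index when querying.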
I defined QuestionAnswerPrompt before.
Before, I got all responses as HTML, not text.

Now, all responses are plain text regardless of my prompt template.
The prompt is not correctly explaining the task here. You'll need to add instructions like "Return the output in HTML format."
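The suggestion above could look like the following. This is only a sketch: `html_instruction` is a hypothetical string standing in for whatever `setting_docs["companyPrompt"]` contains, with the output format stated explicitly instead of implied.

```python
# Hypothetical instruction text: state the desired output format explicitly.
html_instruction = (
    "Return the answer strictly as HTML markup "
    "(use tags such as <p>, <ul>, <b>); do not return plain text."
)

# Same template shape as in the thread, with the explicit instruction spliced in.
QA_PROMPT_TMPL = (
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    + html_instruction
    + "\n Question: {query_str}\n"
)

# Preview the final prompt the LLM would see.
preview = QA_PROMPT_TMPL.format(
    context_str="<retrieved chunks>",
    query_str="Describe the company.",
)
print(preview)
```

Whether the model honors the instruction still depends on the LLM, but an explicit format request gives it a much better chance than relying on behavior that an older library version happened to produce.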
setting_docs["companyPrompt"]

This is JSON, and that context is defined in setting_docs.
As I said, it worked well before. 🙂
@WhiteFang_Jr
You could try new variations of the prompt, since it all depends on the LLM. GPT-3 is very bad at following instructions. Try GPT-3.5 or GPT-4; they are much better at following instructions.
@WhiteFang_Jr, I have used GPT-4.