
Updated 2 years ago

I used the example notebook @jerryjliu0 created, but a lot of the query responses talk about the "previous context" or "existing answer"
The current "internal prompts" for llama_index are optimized for davinci-003. People here are trying to figure out prompts/methods that will work with ChatGPT, but I haven't seen any solution yet. ChatGPT is pretty... obtuse to use haha

You can customize the default prompts yourself if you'd like to experiment πŸ’ͺ
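A minimal sketch of what customizing the refine prompt could look like, assuming the llama_index API of that era (a `RefinePrompt` class and a `refine_template` keyword on `query`) — names and import paths may differ in your version. The template variables `{query_str}`, `{existing_answer}`, and `{context_msg}` are the ones the default refine prompt uses; the rewording below is a hypothetical example aimed at keeping "previous context" / "existing answer" chatter out of ChatGPT's responses.

```python
# Hypothetical ChatGPT-friendly refine template. The explicit instruction to
# return the existing answer unchanged (rather than discuss it) is the key change.
CHATGPT_REFINE_TMPL = (
    "The original question is: {query_str}\n"
    "We have an existing answer: {existing_answer}\n"
    "Here is additional context:\n"
    "{context_msg}\n"
    "Given the new context, refine the existing answer. "
    "If the context is not useful, return the existing answer unchanged, "
    "and do not mention the context or the previous answer in your reply."
)

# Assumed usage (adjust imports to your llama_index version):
# from llama_index.prompts.prompts import RefinePrompt
# refine_prompt = RefinePrompt(CHATGPT_REFINE_TMPL)
# response = index.query("my question", refine_template=refine_prompt)
```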
yeah we're actively working on optimizing this for llamaindex!
Awesome! Is there a way to pass this into a composable graph query?
I tried passing it in directly which didn't work. Setting it in query configs didn't seem to do much either.
@foggyeyes i think so! you need to specify the chatgpt llm_predictor when creating the graph, and also set the refine_template kwarg in the query_kwargs section of each QueryConfig
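A sketch of the setup described in that reply, assuming the composable-graph API of the time: a ChatGPT `llm_predictor` passed at graph-build time, plus per-index query configs whose `query_kwargs` carry the `refine_template`. The predictor construction, index struct type, and `RefinePrompt` instance are placeholders, not verified against any particular release.

```python
# Assumed construction (imports and signatures vary by version):
# from langchain.chat_models import ChatOpenAI
# from llama_index import LLMPredictor, ComposableGraph
# chatgpt_predictor = LLMPredictor(llm=ChatOpenAI(model_name="gpt-3.5-turbo"))
# graph = ComposableGraph.build_from_index(index, llm_predictor=chatgpt_predictor)

refine_template = None  # stand-in for a ChatGPT-tuned RefinePrompt instance

# One config per sub-index type in the graph; the custom refine prompt
# is passed through query_kwargs rather than to query() directly.
query_configs = [
    {
        "index_struct_type": "tree",  # placeholder: match your sub-index type
        "query_mode": "default",
        "query_kwargs": {"refine_template": refine_template},
    },
]

# response = graph.query("my question", query_configs=query_configs)
```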