
Updated 2 months ago

Prompts

Hi all, does anyone have any reading materials on building prompts for index construction? In the docs (https://docs.llamaindex.ai/en/stable/core_modules/model_modules/prompts.html#modify-prompts-used-in-index-construction) what exactly do these prompts look like?
Second, is it possible to create prompt templates for query engines where you can pass both a system instruction and a user instruction, or is that only for chat completions? Thanks kindly for any guidance!
The prompts for index construction already have defaults; the customization hooks are for anyone who wants to try something different, so I'd recommend only customizing them if you have a specific purpose.

And yes, you can customize the system prompt for query engines through the service context:
Plain Text
from llama_index import ServiceContext

service_context = ServiceContext.from_defaults(system_prompt="your prompt")

and you pass the user query when running the query:

Plain Text
query_engine.query('user query')
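To make the two pieces concrete: the system prompt set on the service context is prepended to the prompt the query engine sends to the LLM, while the user query fills the question slot of the QA template. Here is a toy sketch of that combination (the template text and function names below are illustrative, not llama_index's actual internals):

```python
# Toy illustration of how a system prompt and a user query end up in
# one LLM prompt. The real llama_index templates differ; this only
# shows that system_prompt and query_str are separate slots.
QA_TEMPLATE = (
    "{system_prompt}\n\n"
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context, answer the query.\n"
    "Query: {query_str}\n"
    "Answer: "
)

def build_prompt(system_prompt: str, context_str: str, query_str: str) -> str:
    """Fill the template's three slots and return the final prompt string."""
    return QA_TEMPLATE.format(
        system_prompt=system_prompt,
        context_str=context_str,
        query_str=query_str,
    )

prompt = build_prompt(
    system_prompt="You answer only from the given context.",
    context_str="LlamaIndex supports prompt customization.",
    query_str="Can I customize prompts?",
)
print(prompt)  # system prompt first, then context, then the user query
```

So the system prompt is fixed once on the service context, while each `query_engine.query(...)` call supplies a new user query.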
Jerry Liu has a thread about the new prompt customization module as well:

https://x.com/jerryjliu0/status/1716843000704434374?s=46&t=N7mwIlS7__yWTrTCsWEN2w
ok! thanks so much for sharing @Emanuel Ferreira
@Emanuel Ferreira do I need to use the PromptHelper class in some capacity? I just tried to create my first prompt and got a cryptic ValueError:
ValueError: Got a larger chunk overlap (-36) than chunk size (-366), should be smaller.

coming from here:

C:\anaconda3\envs\kedro_workbench_venv\lib\site-packages\llama_index\text_splitter\token_splitter.py:57 in __init__
What I don't understand is that my service_context specifies a SentenceWindowNodeParser for chunking, so I'm guessing this chunking is coming from the prompt text?
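For context on that ValueError: the token splitter checks that the chunk overlap is smaller than the chunk size, and both numbers can come out negative when the prompt plus reserved output tokens exceed the model's context window, since the usable chunk size is derived by subtracting those from the window. A sketch of that arithmetic follows; the formula and numbers are illustrative assumptions, not llama_index's exact implementation:

```python
# Illustrative arithmetic for how an over-long prompt can drive the
# computed chunk size (and overlap) negative before the splitter's
# validation runs. The exact formula in llama_index differs; this is
# only a sketch of the failure mode.
def available_chunk_size(context_window: int, prompt_tokens: int,
                         num_output: int, chunk_overlap_ratio: float):
    # Tokens left over for context once the prompt and the reserved
    # output budget are subtracted from the model's context window.
    chunk_size = context_window - prompt_tokens - num_output
    chunk_overlap = int(chunk_overlap_ratio * chunk_size)
    return chunk_size, chunk_overlap

# Hypothetical numbers: a prompt longer than the context window.
size, overlap = available_chunk_size(
    context_window=4096, prompt_tokens=4300,
    num_output=256, chunk_overlap_ratio=0.1,
)
print(size, overlap)  # both negative; overlap > size, so the splitter
# raises: ValueError: Got a larger chunk overlap (...) than chunk size (...)
```

That would explain why the error mentions chunking even though your node parser is a SentenceWindowNodeParser: the negative values come from the prompt-size arithmetic, not from your parser's settings.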
Maybe you can share some snippets of your code?