Hello @Logan M, how do I set an instruction prompt for an index?
Before, I was setting it like this:
index.index_struct.summary = companyPrompt
I don't know what an instruction prompt for an index is πŸ˜…

What are you doing with the index?
I was setting an instruction prompt to control how our index would answer.
It was working fine last year 😉
Btw, in the latest version, I'm not sure how to do it.
Please guide me.
Is it impossible to set a prompt for an index? @Logan M
Prompts aren't attached to indexes 🤔 Not sure what your overall code looks like or what classes/modules you are using.

You can provide a name/description for a query engine tool.

You can also modify the prompts of a query engine: https://docs.llamaindex.ai/en/stable/examples/prompts/prompt_mixin/?h=prompts#accessingcustomizing-prompts-within-higher-level-modules (a sketch of both options follows below)
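For illustration, a rough sketch of both options. It assumes an existing index variable, and the tool name, description, and prompt wording are placeholders, not anything the library requires:

Plain Text
from llama_index.core import PromptTemplate
from llama_index.core.tools import QueryEngineTool, ToolMetadata

query_engine = index.as_query_engine()  # assumes `index` already exists

# Option 1: give the engine a name/description when wrapping it as a tool
# (agents and routers use these to decide when to call it)
tool = QueryEngineTool(
    query_engine=query_engine,
    metadata=ToolMetadata(
        name="company_docs",  # placeholder name
        description="Answers questions about our company documents.",
    ),
)

# Option 2: override the QA prompt the engine uses to synthesize answers
qa_tmpl = PromptTemplate(
    "You are a helpful company assistant.\n"  # your instruction text
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information, answer the query.\n"
    "Query: {query_str}\n"
    "Answer: "
)
query_engine.update_prompts(
    {"response_synthesizer:text_qa_template": qa_tmpl}
)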
I was setting the prompt like this before:

Plain Text
llm_predictor = LLMPredictor(llm=ChatOpenAI(temperature=0.7, model_name=getAIEngine(setting_docs), streaming=True), system_prompt="My prompt")

Btw, I saw that llm_predictor is no longer supported.
Ah, this is an LLM prompt, not an index prompt.

An index, in simple terms, contains vectors, a docstore, etc.
The prompts are used by the LLM while generating the answer.

LLMPredictor was deprecated way back. You can define the llm object directly.
Could you guide me on how to set prompts for the LLM then?
Plain Text
from llama_index.llms.openai import OpenAI

# system_prompt is applied to every call made with this llm object
resp = OpenAI(system_prompt="add here").complete("Paul Graham is ")
Hmm, before, the llm_predictor was added to the index. Now I see that we don't pass an llm_predictor; is there any way to pass the LLM and set a system prompt?
Yes. Just pass the llm object anywhere you want to use it, or add it globally:

Plain Text
from llama_index.core import Settings

Settings.llm = llm

This will make sure your defined llm is used everywhere, so you don't have to pass it explicitly. A full end-to-end sketch is below.
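Putting it together, a minimal end-to-end sketch. The model name, system prompt text, data path, and query are placeholders, and it assumes llama-index plus the llama-index-llms-openai integration are installed:

Plain Text
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.llms.openai import OpenAI

# Define the LLM once, with your instruction as its system prompt
llm = OpenAI(
    model="gpt-4o-mini",        # placeholder model name
    temperature=0.7,
    system_prompt="My prompt",  # the old LLMPredictor system_prompt moves here
)

# Register it globally so every index and query engine picks it up
Settings.llm = llm

documents = SimpleDirectoryReader("./data").load_data()  # placeholder path
index = VectorStoreIndex.from_documents(documents)

# No llm argument needed: the global Settings.llm is used
response = index.as_query_engine().query("What does the company do?")
print(response)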