Hi there, a quick question. I've looked at the prompt templates, but it looks like you have to use each of them for a separate use case. Is there an option to just write a prompt for general purpose?

Like in LangChain, you define a prompt_template, create prompt = Prompt(template=prompt_template), and then just use llm_chain = LLMChain(prompt=prompt, llm=OpenAI(temperature=0)).

Here in LlamaIndex, when you build an index, you have to pass something like summary_template=SUMMARY_PROMPT, which is tied to a specific use case.
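
For reference, the LangChain pattern I mean is roughly this (a minimal sketch against the early APIs, where Prompt was basically an alias for PromptTemplate; class names may differ in newer releases):

Python
# Sketch of the general-purpose LangChain pattern described above
# (early API; imports and class names have moved around since).
from langchain import LLMChain, OpenAI, PromptTemplate

prompt_template = (
    "Answer the question below.\n"
    "question: {question}\n"
    "answer:"
)
prompt = PromptTemplate(template=prompt_template, input_variables=["question"])
llm_chain = LLMChain(prompt=prompt, llm=OpenAI(temperature=0))
print(llm_chain.run(question="What is an index?"))

# versus the LlamaIndex side, where the prompt argument is tied to one use case,
# e.g. something like (gpt_index-era API):
# index = GPTTreeIndex(documents, summary_template=SUMMARY_PROMPT)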
By default, you don't have to worry about the prompt templates, as it's all handled under the hood using the default prompts.

So you only have to pass in prompts when you are trying to instruct the LLM to do something very specific, or when you have a better prompt that is specific to your data.

The reason there are so many default prompts is because each index does things slightly differently.
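
If you do want to override one of the defaults, it looks roughly like this (a sketch assuming the gpt_index-era API, where a query-time prompt is passed as text_qa_template; the prompt classes have been renamed in later versions):

Python
# Rough sketch of passing a custom prompt instead of the default one
# (early llama_index / gpt_index API; names differ in newer versions).
from llama_index import GPTSimpleVectorIndex, QuestionAnswerPrompt, SimpleDirectoryReader

QA_TMPL = (
    "You're a coding bot. Explain and give an example based on the context.\n"
    "context: {context_str}\n"
    "question: {query_str}\n"
    "explain: "
)
qa_prompt = QuestionAnswerPrompt(QA_TMPL)

documents = SimpleDirectoryReader("data").load_data()
index = GPTSimpleVectorIndex(documents)

# Leaving out text_qa_template falls back to the default prompt under the hood.
response = index.query("How do I build an index?", text_qa_template=qa_prompt)
print(response)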
Thank you. So, for example, what if I want to set an introduction, like I normally do in GPT-3 apps or in ChatGPT, where I say something like:
Plain Text
you're a coding bot. explain and give an example based on the user's input
user: {input}
explain: 

or like
Plain Text
 you're a doctor. based on the problem give a prescription
problem:{problem}
prescription:

So I don't need to set this kind of master prompt?
Ah, I see what you mean. LlamaIndex is meant to be an interface for finding specific information/context and providing that to an LLM to generate a response to a query.

In your case, you are looking for an actual "chat" persona. I think this notebook does what you are looking for: https://github.com/jerryjliu/gpt_index/blob/main/examples/langchain_demo/LangchainDemo.ipynb
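
The gist of that notebook is wrapping the index as a LangChain Tool inside a conversational agent, roughly like this (a sketch of the early APIs on both sides; the agent setup and index classes have been reorganized since):

Python
# Sketch: a llama_index index exposed as a Tool for a conversational LangChain
# agent (early APIs; imports and class names have changed in newer releases).
from langchain.agents import Tool, initialize_agent
from langchain.chains.conversation.memory import ConversationBufferMemory
from langchain.llms import OpenAI
from llama_index import GPTSimpleVectorIndex, SimpleDirectoryReader

documents = SimpleDirectoryReader("data").load_data()
index = GPTSimpleVectorIndex(documents)

tools = [
    Tool(
        name="GPT Index",
        func=lambda q: str(index.query(q)),
        description="useful for answering questions about the indexed documents",
    )
]
memory = ConversationBufferMemory(memory_key="chat_history")
agent = initialize_agent(
    tools,
    OpenAI(temperature=0),
    agent="conversational-react-description",
    memory=memory,
)
print(agent.run("Hi, I'm building a coding bot. What's in my documents?"))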
Oh yeah, I was also looking at this.
Looks like a combination of LangChain and LlamaIndex is what I'm looking for, and it's really powerful.
Yea! So it sounds like that is more what you are looking for -- langchain is the user interface/frontend, and llama index is a tool it uses to talk to/help people 🧠
Makes sense, thank you.
So, to summarize:
It's better to use LlamaIndex for making an index, and after that just use LangChain?
Because you can make an index in other ways as well, but the features around indexing are what make LlamaIndex useful? (I mean, it has "index" in its name lol)
I might have missed something, because the query function from LlamaIndex can also find specific texts... but can't a similarity search from Facebook or OpenAI do the same?
Right, at the most basic level it finds the most similar texts using embeddings. But then it calls an LLM to synthesize a natural language response using that similar text as context. And depending on the index used (vector, tree, knowledge graph) there are a few variations on this that can be helpful.

Furthermore, llama index takes care of all the text chunking, storage, and more for you.

So, it's pretty unique at the end of the day. I definitely recommend playing around with it a bit to get some firsthand experience in what it is doing πŸ’ͺ
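
Here's the whole loop in a few lines, just to make the chunk/embed/retrieve/synthesize flow concrete (again a sketch of the early GPTSimpleVectorIndex API; the persistence method names are from that era too):

Python
# End-to-end sketch (early llama_index API): load -> chunk/embed/store -> query.
from llama_index import GPTSimpleVectorIndex, SimpleDirectoryReader

# Documents are chunked, embedded, and stored for you when the index is built.
documents = SimpleDirectoryReader("data").load_data()
index = GPTSimpleVectorIndex(documents)

# At query time the most similar chunks are retrieved by embedding similarity,
# then handed to the LLM as context to synthesize a natural-language answer.
response = index.query("What do these documents say about prompts?")
print(response)

# The built index can be persisted and reloaded (early-API method names).
index.save_to_disk("index.json")
index = GPTSimpleVectorIndex.load_from_disk("index.json")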
Awesome, yeah, that makes sense.
Thanks for the answer!