Hi there, a quick question. I've looked at the prompt templates, but it looks like each one is meant for a separate use case. Is there an option to just write a general-purpose prompt?
Like in langchain you define a prompt_template, then prompt = Prompt(template=prompt_template), and then you just use llm_chain = LLMChain(prompt=prompt, llm=OpenAI(temperature=0)).
Here in llamaindex, when you're building an index you have to pass something use-case-specific like summary_template=SUMMARY_PROMPT. Something like this:
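Roughly what I mean (going from memory, so the exact imports might be off for the versions you're on):

```python
# langchain: one general-purpose prompt you reuse for any question
from langchain import LLMChain, PromptTemplate
from langchain.llms import OpenAI

prompt = PromptTemplate(
    template="Answer the following question:\n{question}\n",
    input_variables=["question"],
)
llm_chain = LLMChain(prompt=prompt, llm=OpenAI(temperature=0))
llm_chain.run(question="What is LlamaIndex?")

# llamaindex: the prompt is tied to one step of one index type,
# e.g. the summary prompt used while building a tree index
from llama_index import GPTTreeIndex, SimpleDirectoryReader
from llama_index.prompts.prompts import SummaryPrompt

documents = SimpleDirectoryReader("data").load_data()
SUMMARY_PROMPT = SummaryPrompt(
    "Write a concise summary of the following:\n{context_str}\n"
)
index = GPTTreeIndex(documents, summary_template=SUMMARY_PROMPT)
```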
By default, you don't have to worry about the prompt templates, as it's all handled under the hood using the default prompts.
So you only have to pass in prompts when you want to instruct the LLM to do something very specific, or when you have a better prompt that is tailored to your data (rough example below).
The reason there are so many default prompts is because each index does things slightly differently.
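For example, with the vector index (this is a sketch against the older-style API, so double-check the names against the version you're running):

```python
from llama_index import GPTSimpleVectorIndex, QuestionAnswerPrompt, SimpleDirectoryReader

documents = SimpleDirectoryReader("data").load_data()

# default prompts: nothing to configure, this just works
index = GPTSimpleVectorIndex(documents)
index.query("What does the author do?")

# custom prompt only when you need very specific behaviour for your data
QA_PROMPT = QuestionAnswerPrompt(
    "Context information is below:\n{context_str}\n"
    "Answer the question in one short sentence: {query_str}\n"
)
index.query("What does the author do?", text_qa_template=QA_PROMPT)
```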
Ah I see what you mean. Llama Index is meant to be an interface for finding specific information/context and providing it to an LLM so it can generate a response to a query.
Yea! So it sounds like that is more what you are looking for: langchain is the user interface/frontend, and llama index is a tool it uses under the hood to help answer people 🧠
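To make that concrete, this is the kind of glue people write. It's just a sketch, and the tool name/description here are made up for illustration:

```python
from langchain.agents import Tool, initialize_agent
from langchain.llms import OpenAI
from llama_index import GPTSimpleVectorIndex, SimpleDirectoryReader

# llama index handles retrieval + answering over your documents
index = GPTSimpleVectorIndex(SimpleDirectoryReader("data").load_data())

# langchain sits in front as the agent/"frontend", calling the index as a tool
tools = [
    Tool(
        name="DocsIndex",  # hypothetical name, just for this sketch
        func=lambda q: str(index.query(q)),
        description="Answers questions about the loaded documents.",
    )
]
agent = initialize_agent(tools, OpenAI(temperature=0), agent="zero-shot-react-description")
agent.run("What do the docs say about prompt templates?")
```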
Makes sense, thank you. So to summarize: it's better to use llama index for building the index, and after that just use langchain? Because you can build an index in other ways as well, but the indexing features are what make llama index useful? (I mean, it has index in its name lol)
I might have missed something though, because the query function from llama index can also find specific texts.. but can't a similarity search from Facebook or OpenAI do the same?
Right, at the most basic level it finds the most similar texts using embeddings. But then it calls an LLM to synthesize a natural language response using that similar text as context. And depending on the index used (vector, tree, knowledge graph) there are a few variations on this that can be helpful.
Furthermore, llama index takes care of all the text chunking, storage, and more for you.
So, it's pretty unique at the end of the day. I definitely recommend playing around with it a bit to get some firsthand experience with what it is doing 💪 Something like:
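A minimal end-to-end run (again a sketch against the older-style API, names may have moved between versions):

```python
from llama_index import GPTSimpleVectorIndex, SimpleDirectoryReader

# chunking, embedding, and storage of the text are handled for you
documents = SimpleDirectoryReader("data").load_data()
index = GPTSimpleVectorIndex(documents)

# one call = embedding-based retrieval + LLM synthesis of the answer
response = index.query("What did the author work on?")
print(response)

# persist the index and load it back later instead of re-embedding everything
index.save_to_disk("index.json")
index = GPTSimpleVectorIndex.load_from_disk("index.json")
```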