
Updated 6 months ago

Prompts

At a glance

The community member is trying to integrate LlamaIndex and LangChain Hub, specifically wanting to use the JSON loader from LlamaIndex and the node parser, as well as the QA_CHAIN_PROMPT from LangChain Hub. Another community member suggests that the prompts in the LangChain Hub may not be compatible with the RAG pipelines, but notes that the community member can likely adapt them easily. The second community member also mentions that they are in the process of changing how they expose prompts, which should make it easier in the future.

Is there a way to integrate LlamaIndex and LangChain Hub? I want to use the JSON loader and node parser from LlamaIndex, but I also need this from LangChain:
from langchain import hub
QA_CHAIN_PROMPT = hub.pull("rlm/rag-prompt-mistral") for the model...


I thought LLMs were supposed to make life easier.
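For reference, a minimal sketch of what combining the two could look like, assuming a recent llama-index install (with the llama-index-readers-json package) plus langchain and langchainhub; the data.json path and chunk size are illustrative:

```python
from langchain import hub
from llama_index.core import VectorStoreIndex
from llama_index.core.node_parser import SentenceSplitter
from llama_index.readers.json import JSONReader

# LlamaIndex side: load a JSON file and split it into nodes.
documents = JSONReader().load_data("data.json")  # illustrative path
nodes = SentenceSplitter(chunk_size=512).get_nodes_from_documents(documents)
index = VectorStoreIndex(nodes)

# LangChain side: pull the Mistral RAG prompt from LangChain Hub.
# Note it is written around {context} and {question} variables.
QA_CHAIN_PROMPT = hub.pull("rlm/rag-prompt-mistral")
```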
3 comments
I would think the prompts in LangChain Hub are not compatible with our RAG pipelines.

The RAG prompts have some specific variable requirements

You can likely adapt easily though

https://gpt-index.readthedocs.io/en/stable/examples/customization/prompts/completion_prompts.html
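For example, here is one way to adapt a Hub prompt, as a sketch only: it assumes the hub prompt is a chat-style template whose first message carries the raw template string, and a 0.10+ llama-index import layout. LlamaIndex's text_qa_template expects {context_str} and {query_str}, while the rlm prompts use {context} and {question}, so the variables need to be renamed.

```python
from langchain import hub
from llama_index.core import PromptTemplate

lc_prompt = hub.pull("rlm/rag-prompt-mistral")
# Chat-style hub prompts wrap the text in messages; grab the underlying template string.
template_str = lc_prompt.messages[0].prompt.template

# Rename LangChain's variables to the ones LlamaIndex's QA prompt expects.
template_str = (
    template_str
    .replace("{context}", "{context_str}")
    .replace("{question}", "{query_str}")
)
qa_prompt = PromptTemplate(template_str)

# Then pass it to a query engine, e.g.:
# query_engine = index.as_query_engine(text_qa_template=qa_prompt)
```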
We are in the middle of changing how we expose prompts too. Should be easier at some point 🙏