Prompts

Is there a way to integrate LlamaIndex and LangChain hub? I want to use the JSON loader and node parser from LlamaIndex, but I also need this from LangChain:

from langchain import hub
QA_CHAIN_PROMPT = hub.pull("rlm/rag-prompt-mistral") for the model...


I thought LLMs were supposed to make life easier.
I would think the prompts in LangChain hub are not compatible with our RAG pipelines out of the box.

The RAG prompts have some specific variable requirements.

You can likely adapt them easily, though:

https://gpt-index.readthedocs.io/en/stable/examples/customization/prompts/completion_prompts.html
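As a minimal sketch of that adaptation: LangChain's common RAG hub prompts (like rlm/rag-prompt) use {context} and {question} placeholders, while LlamaIndex's text QA templates expect {context_str} and {query_str}. Assuming those placeholder names, renaming them is just string surgery; the hub_template string below is a stand-in for whatever hub.pull() actually returns, not the real rlm/rag-prompt-mistral text.

```python
# Sketch: rename LangChain-hub-style RAG placeholders to the names
# LlamaIndex's text QA template expects.
# Assumption: the source template uses {context} and {question}.

def adapt_hub_template(template: str) -> str:
    """Rename {context} -> {context_str} and {question} -> {query_str}."""
    return (template
            .replace("{context}", "{context_str}")
            .replace("{question}", "{query_str}"))

# Hypothetical stand-in for the string pulled from LangChain hub.
hub_template = (
    "Use the following context to answer the question.\n"
    "Context: {context}\n"
    "Question: {question}\n"
    "Answer:"
)

llama_template = adapt_hub_template(hub_template)
```

In practice you would pull the real prompt with hub.pull(), extract its template string, wrap the adapted string in a LlamaIndex PromptTemplate, and set it as the query engine's text QA template (the customization doc linked above shows the LlamaIndex side).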
We're in the middle of changing how we expose prompts, too. It should be easier at some point 🙏