Prompts
DangFutures
last year
Is there a way to integrate LlamaIndex and the LangChain hub? I want to use the JSON loader and node parser from LlamaIndex, but I also need this from LangChain for the model:

from langchain import hub
QA_CHAIN_PROMPT = hub.pull("rlm/rag-prompt-mistral")

I thought LLMs were supposed to make life easier.
3 comments
Logan M
last year
I would think the prompts in the LangChain hub are not compatible with our RAG pipelines out of the box; the RAG prompts have some specific variable requirements. You can likely adapt them easily though:
https://gpt-index.readthedocs.io/en/stable/examples/customization/prompts/completion_prompts.html
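The adaptation Logan describes is mostly a matter of renaming the template variables: LlamaIndex's QA templates expect `{context_str}` and `{query_str}`, while LangChain-hub RAG prompts typically use `{context}` and `{question}`. A minimal sketch of that rename, using a stand-in template string (in practice you would pull the real text from the hub prompt object, e.g. via `hub.pull(...)`):

```python
# Sketch: rename LangChain-style prompt variables to the names
# LlamaIndex QA templates expect. The variable names below are
# assumptions based on common RAG prompts, not pulled from the hub.

def adapt_hub_prompt(template: str) -> str:
    """Swap {context}/{question} for LlamaIndex's {context_str}/{query_str}."""
    return (template
            .replace("{context}", "{context_str}")
            .replace("{question}", "{query_str}"))

# Stand-in for the text of a hub RAG prompt:
hub_style = (
    "Use the following context to answer the question.\n"
    "Context: {context}\n"
    "Question: {question}\n"
    "Answer:"
)

llama_style = adapt_hub_prompt(hub_style)
print(llama_style)
```

The adapted string can then be wrapped in a LlamaIndex prompt template and set on the query engine, as shown in the linked prompt-customization docs; the exact class and method names depend on your LlamaIndex version.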
Logan M
last year
We are in the middle of changing how we expose prompts too. Should be easier at some point.
DangFutures
last year
Ty senpai