Updated 9 months ago

I used to use the following format to get a response:

response = index.query(
    query,
    service_context=service_context,
    response_mode="compact",
    similarity_top_k=1,
    node_postprocessors=[SimilarityPostprocessor(similarity_cutoff=0.75)],
    text_qa_template=QNA_PROMPT,
    refine_template=REFINE_PROMPT,
)

What would the equivalent code look like with query_engine.query? What imports are needed?
2 comments
You can pass the same arguments when you build the query engine!

No extra imports are needed if you are simply using the default OpenAI setup.
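A minimal sketch of what this looks like in the newer query-engine style. This assumes a recent llama-index (0.10+), where the old `index.query(...)` keyword arguments move into `index.as_query_engine(...)`; the `SimplePromptTemplate` objects `QNA_PROMPT` and `REFINE_PROMPT` stand in for your existing prompt templates, and module paths may differ in other versions:

```python
# Sketch: migrating index.query(...) kwargs to the query-engine API.
# Assumes llama-index >= 0.10; module paths may differ in other versions,
# and QNA_PROMPT / REFINE_PROMPT are your existing prompt templates.
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.core.postprocessor import SimilarityPostprocessor

# Build (or load) an index as before.
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# The kwargs previously passed to index.query(...) now configure the engine.
query_engine = index.as_query_engine(
    response_mode="compact",
    similarity_top_k=1,
    node_postprocessors=[SimilarityPostprocessor(similarity_cutoff=0.75)],
    text_qa_template=QNA_PROMPT,    # your existing QA prompt template
    refine_template=REFINE_PROMPT,  # your existing refine prompt template
)

response = query_engine.query("your question here")
print(response)
```

Note that in 0.10+ the `service_context` argument was retired in favor of the global `Settings` object (e.g. `Settings.llm = ...`), so it is dropped from the call above.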

I see you are using service_context; I would suggest creating a fresh environment and doing pip install llama-index.
For every integration, e.g. if you are using Hugging Face, there is a separate PyPI package that needs to be installed.
Here's the package guide: https://pretty-sodium-5e0.notion.site/ce81b247649a44e4b6b35dfb24af28a6?v=53b3c2ced7bb4c9996b81b83c9f01139