
Hello,
I'm trying to follow this tutorial on the ensemble query engine, using a local LLM rather than the OpenAI API.
https://github.com/jerryjliu/llama_index/blob/main/docs/examples/query_engine/ensemble_query_engine.ipynb
For this I'm using the HuggingFaceLLM class with a fine-tuned version of Llama 2.
However, when I get to the LLMMultiSelector step at the end of the tutorial, it reports that it's using the default LlamaCPP model, and running the query raises this error:
KeyError: 'choice'
Can someone help me figure this out?
Thanks 😄
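For reference, here is roughly the setup being described, as a minimal sketch (not the exact notebook): the model name, data path, and import locations are placeholders and may differ across llama_index versions.

```python
# Minimal sketch: local LLM + router over a keyword index and a vector index.
# Model name and data path are placeholders, not values from this thread.
from llama_index import (
    ServiceContext,
    SimpleDirectoryReader,
    SimpleKeywordTableIndex,
    VectorStoreIndex,
)
from llama_index.llms import HuggingFaceLLM
from llama_index.query_engine import RouterQueryEngine
from llama_index.selectors.llm_selectors import LLMMultiSelector
from llama_index.tools import QueryEngineTool

llm = HuggingFaceLLM(
    model_name="meta-llama/Llama-2-7b-chat-hf",      # placeholder: your fine-tuned model
    tokenizer_name="meta-llama/Llama-2-7b-chat-hf",
    context_window=3900,
    max_new_tokens=256,
    device_map="auto",
)
service_context = ServiceContext.from_defaults(llm=llm, embed_model="local")

documents = SimpleDirectoryReader("./data").load_data()
vector_index = VectorStoreIndex.from_documents(documents, service_context=service_context)
keyword_index = SimpleKeywordTableIndex.from_documents(documents, service_context=service_context)

vector_tool = QueryEngineTool.from_defaults(
    query_engine=vector_index.as_query_engine(),
    description="Useful for semantic similarity questions about the documents.",
)
keyword_tool = QueryEngineTool.from_defaults(
    query_engine=keyword_index.as_query_engine(),
    description="Useful for exact keyword lookups in the documents.",
)

# If the selector is not built from the same service_context, it falls back to
# the default LLM (hence the "using default LlamaCPP" message when no OpenAI key is set).
query_engine = RouterQueryEngine(
    selector=LLMMultiSelector.from_defaults(service_context=service_context),
    query_engine_tools=[vector_tool, keyword_tool],
)
response = query_engine.query("An example question about your documents")
```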
Hmm, local LLMs are not great for this type of problem.

The multi-selector relies on the LLM to produce output in a very specific format. If I had to guess, your local model just isn't doing that right now.
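For context, the selector's output parser expects the LLM to reply with JSON along these lines (a rough illustration; the exact prompt and parser vary by version). The KeyError appears when the reply can't be parsed into entries containing a "choice" key.

```python
# Roughly what LLMMultiSelector's prompt asks the model to return:
# a JSON list where each entry names a 1-based tool index and a reason.
expected_reply = '[{"choice": 1, "reason": "The question needs semantic search."}]'

# A local model will often answer in free-form prose instead, e.g.
# "I would pick option 1 because ...", so the parsed result has no
# "choice" key and the selector raises KeyError: 'choice'.
```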
I see. Is there an alternative way to build a RAG pipeline that combines a keyword index and a vector store index, or should I just stick with a vector store index for the time being? For example, something like the custom retriever sketched below.
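One common pattern from the LlamaIndex docs ("A Simple, Custom Retriever") is to skip the LLM selector entirely and merge the two retrievers yourself. A rough sketch, reusing vector_index, keyword_index, and service_context from the snippet above; import paths may differ by version:

```python
from llama_index import QueryBundle
from llama_index.query_engine import RetrieverQueryEngine
from llama_index.retrievers import (
    BaseRetriever,
    KeywordTableSimpleRetriever,
    VectorIndexRetriever,
)


class KeywordVectorRetriever(BaseRetriever):
    """Union of keyword-table and vector retrieval results (no LLM selection step)."""

    def __init__(self, vector_retriever, keyword_retriever):
        self._vector_retriever = vector_retriever
        self._keyword_retriever = keyword_retriever
        super().__init__()

    def _retrieve(self, query_bundle: QueryBundle):
        vector_nodes = self._vector_retriever.retrieve(query_bundle)
        keyword_nodes = self._keyword_retriever.retrieve(query_bundle)
        # Union of both result sets, deduplicated by node id
        # (the vector result wins when both retrievers return the same node).
        merged = {n.node.node_id: n for n in keyword_nodes}
        merged.update({n.node.node_id: n for n in vector_nodes})
        return list(merged.values())


retriever = KeywordVectorRetriever(
    VectorIndexRetriever(index=vector_index, similarity_top_k=2),
    KeywordTableSimpleRetriever(index=keyword_index),
)
query_engine = RetrieverQueryEngine.from_args(retriever, service_context=service_context)
response = query_engine.query("An example question about your documents")
```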
Thanks, will try that!