At a glance
Hello everyone, not sure this is the right channel, but after the relevant text is retrieved from a document, what question-answering model are you using at the moment?
4 comments
So after the relevant text is retrieved, it is sent to an LLM (it can be any LLM: GPT-3, 3.5, 4, or any open-source model). There are different prompt templates you can use to present the retrieved text to the model in a more meaningful manner, roughly as sketched below.
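For reference, a minimal sketch of that pattern: the retrieved chunks are pasted into a prompt template and the finished prompt is sent to whichever model you use. `QA_TEMPLATE`, `answer`, and `call_llm` are hypothetical placeholders here, not part of any specific library.

```python
# Minimal sketch of the "stuff the retrieved context into a prompt" pattern.
# `call_llm` is a hypothetical placeholder for your model client
# (OpenAI GPT-3.5/4, a hosted open-source model, etc.).

QA_TEMPLATE = (
    "Context information is below.\n"
    "---------------------\n"
    "{context}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge, "
    "answer the question: {question}\n"
)

def answer(question: str, retrieved_chunks: list[str], call_llm) -> str:
    # Join the retrieved chunks, fill in the template, and hand the
    # finished prompt to the LLM.
    prompt = QA_TEMPLATE.format(
        context="\n\n".join(retrieved_chunks),
        question=question,
    )
    return call_llm(prompt)
```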
Yes, sorry, my question wasn't clear. I was just curious about which LLM you are using at the moment; I would like to use the smallest model I can, of course.
I was just curious about what's performing best for the people in the community.
The smallest models would be open-source models, but small open-source models are not good at performing these tasks.

LlamaIndex's default model is Llama 2 if an OpenAI API key is not available in your code.
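If you do want to run a small local model instead of OpenAI, here is a minimal sketch of pointing LlamaIndex at a local Llama 2 file via llama.cpp. It assumes a recent llama-index (0.10+) with the llama-index-llms-llama-cpp package installed; the model path is a hypothetical example, and parameter names can differ between versions, so check the docs for your release.

```python
# Minimal sketch: use a local Llama 2 GGUF model as LlamaIndex's LLM
# instead of the OpenAI default. Assumes:
#   pip install llama-index llama-index-llms-llama-cpp
from llama_index.core import Settings
from llama_index.llms.llama_cpp import LlamaCPP

Settings.llm = LlamaCPP(
    model_path="./models/llama-2-7b-chat.Q4_K_M.gguf",  # hypothetical local path
    temperature=0.1,
    context_window=3900,
)
```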