
I thought that llama-index was essentially a way to add extra information on top of ChatGPT or other models
So there's no way that it could perform worse than not having the extra info :THONK:
What is "ChatGPT 3.5"?

The models used in the OpenAI ChatGPT interface are not the same as gpt-3.5-turbo.

Tbh, gpt-3.5-turbo is generally not great in a lot of cases, but it is very cheap.
Since you input an entire book, I would increase the top k a bit
Maybe to 4 or 5, given that your chunk size is 512
When using LlamaIndex, it's only supposed to answer using the top k nodes it retrieved, so it might not be retrieving enough to answer the question πŸ€”
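
For reference, a minimal sketch of how the chunk size and retrieval top-k might be set. This assumes a recent llama-index release with the `Settings` object (older versions used `ServiceContext`), and the folder name and query string are placeholders:

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, Settings

# Chunk size used when the book is split into nodes (512 tokens, as mentioned above).
Settings.chunk_size = 512

# Load the book and build a vector index over its chunks ("data" is a placeholder folder).
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Retrieve more chunks per query; the default top-k is small, so with a whole
# book indexed the answer may otherwise miss the relevant passages.
query_engine = index.as_query_engine(similarity_top_k=5)
response = query_engine.query("What does the book say about X?")  # placeholder question
print(response)
```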
Ahh okay, that makes sense. Maybe I should use another model, too
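
If you do swap the model, a minimal sketch of pointing llama-index at a different OpenAI model, assuming the current `llama-index-llms-openai` integration (the model name here is just an example):

```python
from llama_index.core import Settings
from llama_index.llms.openai import OpenAI

# Synthesize answers with a stronger model than gpt-3.5-turbo.
Settings.llm = OpenAI(model="gpt-4", temperature=0)
```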