

LLM settings

Hello community - a sort-of-noob question: when a) creating an index from docs, and also b) setting up inference (e.g., loading that same index file and querying it), in BOTH cases we can instantiate an LLM object and set "temperature" (higher values mean more 'creative' responses). Does setting 'temperature' have any meaning at index-creation time? If so, what? I can see that it would have an impact at inference time. Thanks.
Some indexes use the LLM at index-creation time (knowledge graph, tree, keyword).

For these indexes, you sometimes need a higher-quality LLM when creating the index, since the index is built from LLM output.

So yes, in both cases you can specify the LLM and temperature.
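
For what it's worth, here's a minimal sketch of both settings in LlamaIndex. This assumes a recent (0.10+) package layout and the OpenAI LLM integration; module paths and kwargs differ across versions, and "./docs" and the model names are just placeholders:

```python
from llama_index.core import Settings, SimpleDirectoryReader, KnowledgeGraphIndex
from llama_index.llms.openai import OpenAI

# Index-creation time: KnowledgeGraphIndex calls the LLM to extract
# triples from each chunk, so a low temperature keeps extraction consistent.
Settings.llm = OpenAI(model="gpt-4", temperature=0.0)
documents = SimpleDirectoryReader("./docs").load_data()  # placeholder path
index = KnowledgeGraphIndex.from_documents(documents)

# Inference time: pass a different LLM for response synthesis,
# e.g. a higher temperature for more 'creative' answers.
query_engine = index.as_query_engine(
    llm=OpenAI(model="gpt-4", temperature=0.7)
)
print(query_engine.query("Summarize the main themes in these docs."))
```

By contrast, a plain VectorStoreIndex doesn't call the LLM at build time (only the embedding model), so for that index type the temperature only matters at query time.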