
Instantiating a SummaryIndex without Setting the LLM in the Settings Global Class

Hi everyone,

Is it possible to instantiate a SummaryIndex without setting the LLM in the Settings global class? I’m looking for something similar to how it’s done in VectorStoreIndex, where I can pass the embed_model as an argument.

When I use the Settings class, everything works as expected, but this approach isn’t an option for me due to concurrency issues. In my app, I use dependency injection to handle LLM instances.
2 comments
The LLM is only needed when creating a query engine/chat engine — you can pass it there instead of setting it in Settings.
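
As a minimal sketch of that answer (assuming recent LlamaIndex `llama_index.core` imports; the OpenAI model name is just an illustrative choice — substitute whatever LLM instance your dependency injection provides):

```python
from llama_index.core import Document, SummaryIndex
from llama_index.llms.openai import OpenAI

# SummaryIndex itself doesn't need an LLM at construction time,
# so nothing has to be set on the Settings global.
index = SummaryIndex.from_documents([Document(text="Some text to summarize.")])

# Pass your injected LLM only when creating the query engine.
llm = OpenAI(model="gpt-4o-mini")  # stand-in for the DI-provided instance
query_engine = index.as_query_engine(llm=llm)

response = query_engine.query("Summarize the document.")
```

Because the LLM is scoped to the query engine rather than a global, each request in a concurrent app can use its own instance.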
Thank you! 🤘