Instantiating a SummaryIndex without Setting the LLM in the Settings Global Class

At a glance

The community member is asking if it's possible to instantiate a SummaryIndex without setting the LLM in the Settings global class, and is looking for something similar to how it's done in VectorStoreIndex, where they can pass the embed_model as an argument. The community member mentions that when they use the Settings class, everything works as expected, but this approach isn't an option due to concurrency issues, and they use dependency injection to handle LLM instances.

In the comments, another community member suggests that this is only possible when creating a query engine or chat engine.

Hi everyone,

Is it possible to instantiate a SummaryIndex without setting the LLM in the Settings global class? I’m looking for something similar to how it’s done in VectorStoreIndex, where I can pass the embed_model as an argument.

When I use the Settings class, everything works as expected, but this approach isn’t an option for me due to concurrency issues. In my app, I use dependency injection to handle LLM instances.
2 comments
Only when creating a query engine or chat engine — that's where you can pass the LLM directly.
Thank you! 🤘