
Hello, does anyone have an idea about which functionality would be the least expensive to integrate from LlamaIndex into my project: LlamaIndex agents, Sub Question Query, or summarization?
Open source LLMs are free 👀

But in general, agents can be more expensive to run because they tend to require a more powerful LLM.

Summarization can also be expensive since you're passing in a lot of content to the LLM = high token usage.
Thank you so much for your response!
You could add this to your code and benchmark it. Open source = free... sometimes: if self-hosted, yes; if you're paying another service to host it, likely not.

https://docs.llamaindex.ai/en/stable/examples/callbacks/TokenCountingHandler/?h=token
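The linked `TokenCountingHandler` reports prompt and completion token counts for your runs. As a minimal self-contained sketch (the per-1K-token prices and workload sizes below are illustrative assumptions, not real pricing), here's how those counts could be turned into a rough cost comparison between, say, a multi-call agent run and a single big summarization call:

```python
# Illustrative sketch: convert token counts (e.g. as reported by
# LlamaIndex's TokenCountingHandler) into a rough USD cost estimate.
# The per-1K-token prices are placeholder assumptions, not real rates.

PRICES_PER_1K = {"prompt": 0.0010, "completion": 0.0020}  # hypothetical USD

def estimate_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Estimate LLM spend in USD from prompt/completion token counts."""
    return (
        (prompt_tokens / 1000) * PRICES_PER_1K["prompt"]
        + (completion_tokens / 1000) * PRICES_PER_1K["completion"]
    )

# Hypothetical workloads: an agent run makes several LLM calls (more
# completion tokens), while summarization stuffs lots of context into
# one large prompt (more prompt tokens).
agent_run = estimate_cost(prompt_tokens=6_000, completion_tokens=2_000)
summary_run = estimate_cost(prompt_tokens=20_000, completion_tokens=500)
print(f"agent: ${agent_run:.4f}, summarize: ${summary_run:.4f}")
```

Whichever path you pick, plugging the real token counts from the handler into a calculation like this tells you the actual answer for your workload.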