emy
8 months ago
Hello, does anyone have an idea about which functionality would be the least expensive if I want to integrate it from Llamaindex to my project: Llamaindex agents, Sub Question Query, or summarization?
3 comments
Teemu
8 months ago
Open source LLMs are free.
But in general, agents can be more expensive to run because they tend to require a more powerful LLM.
Summarization can also be expensive since you're passing in a lot of content to the LLM = high token usage.
emy
8 months ago
Thank you so much for your response!
PwnosaurusRex
8 months ago
You could add this to your code and benchmark. Open source = free... sometimes: if self-hosted, yes; if you're calling a hosted service, likely not.
https://docs.llamaindex.ai/en/stable/examples/callbacks/TokenCountingHandler/?h=token
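The TokenCountingHandler linked above hooks into LlamaIndex's callback manager and tallies prompt and completion tokens per call, which is what you'd use to benchmark the real pipelines. As a rough, self-contained sketch of the benchmarking idea (the whitespace "tokenizer" and the per-token price below are placeholder assumptions, not real figures; real code would use the model's tokenizer, e.g. tiktoken, and your provider's actual pricing):

```python
# Rough sketch: estimate cost by counting tokens sent to the LLM.
# NOTE: the whitespace tokenizer and the $0.50 / 1M-token price are
# placeholder assumptions for illustration only.

def count_tokens(text: str) -> int:
    """Naive stand-in tokenizer (real code: the model's own tokenizer)."""
    return len(text.split())

def estimate_cost(prompts: list[str], price_per_million_tokens: float = 0.50) -> float:
    """Total estimated cost for a list of prompts sent to the LLM."""
    total = sum(count_tokens(p) for p in prompts)
    return total / 1_000_000 * price_per_million_tokens

# Compare hypothetical prompt loads for two approaches:
# summarization = one very large prompt; an agent = many smaller prompts.
summarization_prompts = ["summarize: " + "word " * 4000]
agent_prompts = [f"step {i} reasoning prompt" for i in range(10)]

print(f"summarization: ${estimate_cost(summarization_prompts):.6f}")
print(f"agent:         ${estimate_cost(agent_prompts):.6f}")
```

Swapping the naive counter for the TokenCountingHandler's `total_llm_token_count` gives you the same comparison against your actual LlamaIndex workloads.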