Token usage
krishnan99
2 years ago
Ok, I'll check that out. In terms of calculating tokens manually, what components contribute to the total token usage? Just thinking of doing a manual calculation to understand it.
Logan M
2 years ago
The embedding model and the LLM predictor are the two root components that contribute to token counts.
Internally, a function decorator is used to count tokens for the different functions in different places.
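
For a rough manual estimate along those lines, one approach is to tokenize everything sent to the embedding model and everything sent to (and returned by) the LLM, then sum the two. The sketch below assumes OpenAI-style models and the tiktoken tokenizer; the encoding name, the example texts, and the exact embedding/LLM split are illustrative assumptions, not the library's internal accounting.

```python
# Rough manual token estimate, assuming OpenAI-style models and the
# tiktoken tokenizer. The encoding name and example texts below are
# illustrative; adjust them to match the models you actually use.
import tiktoken


def count_tokens(text: str, encoding_name: str = "cl100k_base") -> int:
    """Count tokens in a string with a tiktoken encoding."""
    enc = tiktoken.get_encoding(encoding_name)
    return len(enc.encode(text))


# Embedding side: every chunk sent to the embedding model costs tokens.
chunks = ["First document chunk...", "Second document chunk..."]
embedding_tokens = sum(count_tokens(c) for c in chunks)

# LLM side: the full prompt (query plus retrieved context) and the
# generated completion both count toward LLM token usage.
prompt = "Context: ...retrieved chunks...\n\nQuestion: What is token usage?"
completion = "Token usage is the number of tokens consumed by..."
llm_tokens = count_tokens(prompt) + count_tokens(completion)

total = embedding_tokens + llm_tokens
print(f"embedding: {embedding_tokens}, llm: {llm_tokens}, total: {total}")
```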