
Costs

At a glance

The community member is exploring the use of the OpenAI API within the LlamaIndex library and is looking for a way to limit the number of tokens generated in order to control costs. In the comments, another community member suggests using the token usage predictor feature, which allows functions to be tested before any money is spent. The original poster acknowledges the suggestion with a thank-you reply.
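As background, one way to bound generation cost is to cap `max_tokens` on the LLM that LlamaIndex calls. The sketch below is illustrative only and assumes the LangChain-backed `LLMPredictor` wrapper and `GPTSimpleVectorIndex` from the library versions current around the time of this thread; names and keyword arguments may differ in newer releases.

```python
# Illustrative sketch (not from the thread): cap completion length to bound cost.
# Assumes an older llama_index release with the LangChain-backed LLMPredictor;
# newer versions configure the LLM differently.
from langchain.llms import OpenAI
from llama_index import GPTSimpleVectorIndex, LLMPredictor, SimpleDirectoryReader

# max_tokens limits how many tokens each completion may generate,
# which puts a ceiling on the output cost of every API call.
llm_predictor = LLMPredictor(llm=OpenAI(temperature=0, max_tokens=256))

documents = SimpleDirectoryReader("data").load_data()  # "data" is a placeholder folder
index = GPTSimpleVectorIndex.from_documents(documents, llm_predictor=llm_predictor)

print(index.query("Summarize the documents in one paragraph."))
```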

hi, is there a way to limit the OpenAI API tokens generated in llamaindex? just wanted to control cost since I am exploring using my own funds. πŸ˜„
There is a token usage predictor, so that you can test functions before spending money πŸ’°

https://gpt-index.readthedocs.io/en/latest/how_to/analysis/cost_analysis.html
ok thank you!!!
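For reference, the linked cost-analysis guide describes a mock predictor that counts tokens without calling the API. Below is a minimal sketch along those lines; the imports and keyword arguments are assumptions based on that era of the docs and may differ in other llama_index versions.

```python
# Sketch based on the linked cost-analysis guide: MockLLMPredictor tallies the
# tokens a call would consume without sending anything to the OpenAI API.
# Exact imports and keyword arguments depend on the llama_index version.
from llama_index import GPTTreeIndex, MockLLMPredictor, SimpleDirectoryReader

documents = SimpleDirectoryReader("data").load_data()  # "data" is a placeholder folder
llm_predictor = MockLLMPredictor(max_tokens=256)  # acts as a token counter, not a real LLM

# Building the index with the mock predictor records usage instead of spending money.
index = GPTTreeIndex.from_documents(documents, llm_predictor=llm_predictor)
print(llm_predictor.last_token_usage)  # estimated tokens the real build would have used
```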