

How can I obtain the total token usage (like `llm_predictor.last_token_usage`) when using an AgentExecutor created from the `create_llama_chat_agent` function?
2 comments
Please stand by @TennisPrestigious624 as I search the knowledge base. We are experiencing especially high traffic right now and might take longer than usual. :wave:
----------------------
Hmm, I don't know enough to give you a confident answer yet. However, you can refer to the LlamaIndex documentation for more information: https://gpt-index.readthedocs.io/en/latest
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
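One possible approach, as a minimal sketch only (untested, and assuming the gpt-index-era `llama_index` and `langchain` APIs referenced in the question): since `create_llama_chat_agent` returns a regular LangChain `AgentExecutor`, you can wrap the run in LangChain's `get_openai_callback()` context manager to tally tokens across the whole agent run, or keep a reference to the `LLMPredictor` you built your indices with and read its `total_tokens_used` counter afterwards. The `toolkit` construction below is elided and stands in for your own `LlamaToolkit`.

```python
# Minimal sketch, untested, assuming the 0.5.x-era llama_index / langchain APIs.
# `toolkit` is a placeholder for a LlamaToolkit built from your own indices.
from langchain.callbacks import get_openai_callback
from langchain.chat_models import ChatOpenAI
from llama_index import LLMPredictor
from llama_index.langchain_helpers.agents import create_llama_chat_agent

llm = ChatOpenAI(temperature=0)
llm_predictor = LLMPredictor(llm=llm)  # pass this into the ServiceContext used by your indices

toolkit = ...  # build a LlamaToolkit from IndexToolConfig(s), as in the LlamaIndex docs

agent_executor = create_llama_chat_agent(toolkit, llm, verbose=True)

# Option 1: LangChain's OpenAI callback tallies tokens for every OpenAI call made
# while the context manager is active, including calls the AgentExecutor routes
# through the LlamaIndex tools.
with get_openai_callback() as cb:
    agent_executor.run("What did the author do growing up?")
print(cb.total_tokens, cb.prompt_tokens, cb.completion_tokens)

# Option 2: read the running counter on the LLMPredictor used inside your indices
# (the same object that exposes `last_token_usage`).
print(llm_predictor.total_tokens_used)
```

Note that option 1 only counts calls made through LangChain's OpenAI wrappers, while option 2 only counts calls routed through that `LLMPredictor`, so which total you want depends on where your LLM calls happen.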