Can anyone direct me to an example of how to use TokenCountingCallback?
Let me know if anything doesn't make sense!
@Logan M I'm double checking everything, but every time I try version 0.6.27 or above I'm getting
Plain Text
llama_index.chat_engine.react.ReActChatEngine.from_query_engine() got multiple values for keyword argument 'service_context'
do you have the full traceback and/or the code? Might be an easy fix
Plain Text
demo_server.py", line 169, in query_ind
    chat_engine = index.as_chat_engine(service_context=service_context, chat_mode='react', chat_history=custom_chat_history, verbose=False, similarity_top_k=4, text_qa_template=custom_prompt)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/llama_index/indices/base.py", line 360, in as_chat_engine
    return ReActChatEngine.from_query_engine(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: llama_index.chat_engine.react.ReActChatEngine.from_query_engine() got multiple values for keyword argument 'service_context'
And the setup:
Plain Text
token_counter = TokenCountingHandler(tokenizer=tiktoken.encoding_for_model("gpt-3.5-turbo").encode)
callback_manager = CallbackManager([token_counter])
llm_predictor = LLMPredictor(llm=ChatOpenAI(temperature=0, model_name="gpt-3.5-turbo", max_tokens=1024, request_timeout=120))
service_context = ServiceContext.from_defaults(llm_predictor=llm_predictor, callback_manager=callback_manager, chunk_size=512)
set_global_service_context(service_context)
No need to pass it in for as_chat_engine(..), it's already being passed in under the hood
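In other words, since set_global_service_context(service_context) was already called in the setup, the fix is simply to drop the service_context keyword from the call (a sketch against the 0.6.x-era API; index, custom_chat_history, and custom_prompt are the poster's own objects from earlier in their script):
Plain Text
# service_context is picked up from the global default set via
# set_global_service_context, so passing it again causes the
# "multiple values for keyword argument" TypeError.
chat_engine = index.as_chat_engine(
    chat_mode="react",
    chat_history=custom_chat_history,   # defined in the poster's setup
    verbose=False,
    similarity_top_k=4,
    text_qa_template=custom_prompt,     # defined in the poster's setup
)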

I suppose there should probably be an extra check for this scenario, though
Ahh.. checking
@Logan M That did the trick. And the new TokenCountingHandler is working as expected. Thanks again.
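For anyone landing here later: once queries run through the engine, the counts can be read back from the handler created in the setup above (attribute names as on the 0.6.x-era TokenCountingHandler; a sketch, not verified against every version):
Plain Text
# After one or more queries, the handler has accumulated counts:
print("prompt tokens:    ", token_counter.prompt_llm_token_count)
print("completion tokens:", token_counter.completion_llm_token_count)
print("total LLM tokens: ", token_counter.total_llm_token_count)
print("embedding tokens: ", token_counter.total_embedding_token_count)

# Counters can be zeroed between requests:
token_counter.reset_counts()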
Nice! πŸ’ͺ