I am using CTransformers and trying to run it in a LlamaIndex query pipeline (`response = qp.run(query_str="What is the correlation between survival and age?")`), but I am getting the following error:
AttributeError: 'CTransformers' object has no attribute 'set_callback_manager'
Can someone help me understand how to get a query pipeline working with my CTransformers local model?
import os
from langchain_community.llms import CTransformers
from llama_index.core import Settings

# Set the TRANSFORMERS_OFFLINE environment variable to 1
os.environ["TRANSFORMERS_OFFLINE"] = "1"
Yeah, I have it wrapped when I pass it here; the problem is when I call the query pipeline.

llm = CTransformers(
    model=LLM_MODEL_NAME,
    model_type="gguf",
    callbacks=callback,
    config=config,
)
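For anyone hitting the same error: the query pipeline calls `set_callback_manager` on the LLM, which LangChain's CTransformers object does not have. A fix is to wrap the LangChain LLM in LlamaIndex's `LangChainLLM` adapter everywhere it is used, not only in `Settings`. A minimal sketch, assuming `LLM_MODEL_NAME`, `callback`, and `config` are defined as above, `prompt_tmpl` is a placeholder for whatever prompt template the pipeline uses, and the `llama-index-llms-langchain` package is installed:

```python
from langchain_community.llms import CTransformers
from llama_index.core import Settings
from llama_index.core.query_pipeline import QueryPipeline
from llama_index.llms.langchain import LangChainLLM

# Raw LangChain LLM: this object has no set_callback_manager,
# which is why passing it to the pipeline directly raises AttributeError.
raw_llm = CTransformers(
    model=LLM_MODEL_NAME,   # path/name of your local GGUF model (as in the thread)
    model_type="gguf",
    callbacks=callback,
    config=config,
)

# Wrap it in the LlamaIndex adapter, which provides the
# set_callback_manager method the pipeline expects.
llm = LangChainLLM(llm=raw_llm)
Settings.llm = llm

# Build the pipeline with the *wrapped* LLM, not the raw one.
qp = QueryPipeline(chain=[prompt_tmpl, llm])  # prompt_tmpl: hypothetical prompt template
response = qp.run(query_str="What is the correlation between survival and age?")
```

The key point is that wrapping in `Settings.llm` alone is not enough; any LLM instance handed directly to `QueryPipeline` must also be the wrapped `LangChainLLM` object.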
Ah, that worked. So even though I wrapped it in the Settings, I also need to wrap it everywhere else? Understood. Thank you for your help! You are awesome, and thanks for the prompt solutions!