
Updated 11 months ago

Hi, how can I set up a DocumentSummaryIndex using a custom LLM instead of the OpenAI default? I have set Mistral as my LLM in the service context, but it still makes some requests to OpenAI and I hit the OpenAI rate limit. BTW, my embed_model is also a custom embed model.
llm_model_name = "Mistralai/Mistral-7B-Instruct-v0.2"
llm = HuggingFaceLLM(model_name=llm_model_name)
service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)
What am I missing to make it work with my own LLM rather than the OpenAI GPT models?
1 comment
Try setting the service_context global once
Plain Text
from llama_index import set_global_service_context

set_global_service_context(service_context)
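Alternatively, you can pass the service context directly to the index at construction time. A minimal sketch, assuming the legacy (pre-1.0) llama_index ServiceContext API and hypothetical document paths:

```python
from llama_index import (
    DocumentSummaryIndex,
    ServiceContext,
    SimpleDirectoryReader,
)

# Assumes `llm` and `embed_model` are already built as in the question above,
# e.g. llm = HuggingFaceLLM(model_name="Mistralai/Mistral-7B-Instruct-v0.2")
service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)

# "./data" is a placeholder directory for this sketch
documents = SimpleDirectoryReader("./data").load_data()

# Passing service_context here ensures the summary generation uses your
# custom LLM and embed model instead of falling back to the OpenAI defaults.
index = DocumentSummaryIndex.from_documents(
    documents,
    service_context=service_context,
)
```

Either approach works; setting the global service context once is usually simpler because any component you forget to configure explicitly will still pick up your custom LLM instead of defaulting to OpenAI.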