DocumentSummaryIndex
I'm using a custom LLM instead of the OpenAI default. I have set Mistral as my LLM in the service context, but it still makes some requests to OpenAI and I hit the OpenAI rate limit. My embed_model is also a custom embed model.
from llama_index import ServiceContext, set_global_service_context
from llama_index.llms import HuggingFaceLLM

llm_model_name = "mistralai/Mistral-7B-Instruct-v0.2"
llm = HuggingFaceLLM(model_name=llm_model_name)
service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)
set_global_service_context(service_context)
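
For comparison, here is a minimal sketch of building the DocumentSummaryIndex with the same service context passed explicitly to both the response synthesizer and the index, so no component can silently fall back to the OpenAI defaults. The documents variable and the tree_summarize mode are placeholders/assumptions, not taken from my actual setup:

from llama_index import DocumentSummaryIndex, get_response_synthesizer

# documents is a placeholder for whatever documents are loaded elsewhere
response_synthesizer = get_response_synthesizer(
    response_mode="tree_summarize",
    service_context=service_context,
)
doc_summary_index = DocumentSummaryIndex.from_documents(
    documents,
    service_context=service_context,
    response_synthesizer=response_synthesizer,
)

With the global service context set and service_context also passed explicitly like this, any remaining OpenAI traffic would presumably have to come from some other component.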