Is it possible to use MistralAI as the LLM in a ServiceContext? If I swap in OpenAI instead, everything works fine.
from llama_index import ServiceContext, VectorStoreIndex
from llama_index.llms import MistralAI

llm = MistralAI(model="mistral-tiny", api_key=MISTRAL_API_KEY)
....
print(type(llm))
# <class 'llama_index.llms.mistral.MistralAI'>
....
service_context = ServiceContext.from_defaults(
    llm=llm,
    embed_model=embed_model,
    system_prompt=SPF,
)
....
index = VectorStoreIndex.from_documents(
    documents=[web_docs, pdf_docs],
    service_context=service_context,
    storage_context=storage_context,
    show_progress=True,
)
Instead, it throws an OpenAI error and appears to be falling back to the OpenAI default:

Could not load OpenAI model. If you intended to use OpenAI, please check your OPENAI_API_KEY.