
Service context

Curious if it's possible to use Mistral for the service context? If I replace it with OpenAI, it works just fine.

Plain Text
from llama_index import ServiceContext, VectorStoreIndex
from llama_index.llms import MistralAI

llm = MistralAI(model="mistral-tiny",
                api_key=MISTRAL_API_KEY)
....
print(type(llm))
# <class 'llama_index.llms.mistral.MistralAI'>
....

service_context = ServiceContext.from_defaults(llm=llm,
                                               embed_model=embed_model,
                                               system_prompt=SPF)
....
index = VectorStoreIndex.from_documents(
    documents=web_docs + pdf_docs,  # concatenated, assuming each is a list of Documents
    service_context=service_context,
    storage_context=storage_context,
    show_progress=True
)


It's throwing an OpenAI error and appears to be falling back to the OpenAI default: "Could not load OpenAI model. If you intended to use OpenAI, please check your OPENAI_API_KEY."
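A quick sanity check that the service context itself picked up the Mistral LLM (the llm_predictor attribute is from the legacy ServiceContext API, so treat this as an assumption about the version in use):

Plain Text
# Inspect the LLM the service context hands to downstream modules.
print(type(service_context.llm_predictor.llm))
# expected: <class 'llama_index.llms.mistral.MistralAI'>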
5 comments
Are you passing the service context to the other modules in your code?

(Some modules also take the LLM directly as a kwarg)
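For example, a minimal sketch of routing everything through the Mistral service context. set_global_service_context comes from the legacy llama_index API, and passing service_context to as_query_engine is an assumption about the modules in use:

Plain Text
from llama_index import set_global_service_context

# Every module that would otherwise build its own service context
# (and silently default to OpenAI) now inherits this one.
set_global_service_context(service_context)

# Or pass it explicitly to the modules that accept it:
query_engine = index.as_query_engine(service_context=service_context)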
After doing some more reading, I think it may not be supported. These are the LLMs listed in the API reference for the service context:

https://docs.llamaindex.ai/en/stable/api_reference/llms.html
I guess it might be possible to call it via the Hugging Face API with HuggingFaceLLM; I'm going to try that. I'd prefer to call Mistral directly, though.
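For reference, a minimal sketch of that route. Note that in the legacy llama_index.llms namespace, the hosted-API route is typically HuggingFaceInferenceAPI rather than HuggingFaceLLM (which loads models locally); the model name and the HF_TOKEN variable are illustrative, not from the thread:

Plain Text
from llama_index.llms import HuggingFaceInferenceAPI

# Calls a hosted Mistral checkpoint through the Hugging Face
# Inference API rather than Mistral's own endpoint.
llm = HuggingFaceInferenceAPI(
    model_name="mistralai/Mistral-7B-Instruct-v0.1",
    token=HF_TOKEN,  # hypothetical: your Hugging Face API token
)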
No, it's supported; the API ref is just bad.
Hmm, I couldn't get it working. I'll keep trying.