Hi! I'm trying to create a local version
Hi! I'm trying to create a local version of the semantic search example but it keeps prompting me for an OpenAI key. I cannot use OpenAI at all. Could someone point me to an example that only uses the local computer? Ideally I'd like to use Ollama as the LLM for answering the query.
https://docs.llamaindex.ai/en/latest/understanding/putting_it_all_together/q_and_a.html#semantic-search
You need to configure both a local LLM and a local embedding model.
Usually, it's easiest to set up a service context and then set that context as the global default:
Plain Text
from llama_index import ServiceContext, set_global_service_context

# llm and embed_model are your locally-running LLM and embedding model
ctx = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)
set_global_service_context(ctx)
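For reference, llm and embed_model can be any locally-running models. A minimal sketch, assuming Ollama serves the LLM and a HuggingFace model handles embeddings (the model names below are just examples):

Plain Text
from llama_index.llms import Ollama
from llama_index.embeddings import HuggingFaceEmbedding

# Local LLM served by Ollama (use any model you have pulled)
llm = Ollama(model="mistral", request_timeout=60.0)

# Local embedding model downloaded from HuggingFace; runs on your machine, no OpenAI key needed
embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

You can also pass embed_model="local" to ServiceContext.from_defaults to use the default local embedding model without constructing it yourself.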
Fantastic, that worked really well! Here is the final code. Thank you!

Plain Text
from llama_index import VectorStoreIndex, SimpleDirectoryReader, ServiceContext, set_global_service_context

from llama_index.llms import Ollama

# Local LLM served by Ollama
llm = Ollama(model="mistral", request_timeout=60.0)

# embed_model="local" uses the default local embedding model, so no OpenAI key is required
ctx = ServiceContext.from_defaults(llm=llm, embed_model="local")
set_global_service_context(ctx)

# Load documents from the ./data directory and build the index locally
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

query_engine = index.as_query_engine()
response = query_engine.query("What is the author's name?")
print(response)
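As a small follow-up, the built index can be persisted to disk so the documents don't have to be re-embedded on every run. A minimal sketch using llama_index's storage utilities (the "storage" directory name is just an example):

Plain Text
from llama_index import StorageContext, load_index_from_storage

# Save the index after building it
index.storage_context.persist(persist_dir="storage")

# Later, reload it instead of re-indexing the documents
storage_context = StorageContext.from_defaults(persist_dir="storage")
index = load_index_from_storage(storage_context)

Because the service context was set globally above, the reloaded index will keep using the local Ollama LLM and local embeddings.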