
Updated 9 months ago

Noob question - in the quickstart

Noob question: in the quickstart tutorial there is this code:

Plain Text
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

query_engine = index.as_query_engine()
response = query_engine.query("What did the author do growing up?")


How do I change the base URL to point at a different server, not the OpenAI server? I've combed the Internet and the site and can't find a mention anywhere.
7 comments
Not local. I am calling Llama 2 running remotely through the Oobabooga API, which emulates the OpenAI API for GPT-3.5. In LangChain you can just feed "base_url" to the calls.
You can use OpenAILike, available in LlamaIndex. It emulates the OpenAI API as well.
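For anyone landing here later, a minimal configuration sketch of that approach might look like this. The model name, port, and API key below are placeholders, not values from this thread:

```python
from llama_index.llms.openai_like import OpenAILike

# OpenAILike speaks the OpenAI wire protocol but skips OpenAI-specific
# model-name checks, so it can target any OpenAI-compatible server.
llm = OpenAILike(
    model="llama-2-13b-chat",             # placeholder model name
    api_base="http://localhost:5000/v1",  # placeholder: your OpenAI-compatible endpoint
    api_key="fake",                       # most local servers ignore the key
)
```

This only configures the client; actually querying it requires the server to be up at `api_base`.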
Thank you so much. Looks promising. Where does the URL get set? Part of LOCALAI_DEFAULTS? Not finding docs on that variable.
Yeah they are hard to find 😆


Use these:
{
    "api_key": "localai_fake",
    "api_type": "localai_fake",
    "api_base": "http://localhost:<port>/v1",
}
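As a side note on what `api_base` means: an OpenAI-compatible client appends the standard endpoint paths to that base URL, which is why it typically ends in `/v1`. A tiny illustrative helper (hypothetical, not part of any library) shows the mapping:

```python
def endpoint(api_base: str, path: str) -> str:
    """Join an OpenAI-compatible base URL with an endpoint path."""
    return api_base.rstrip("/") + "/" + path.lstrip("/")

# With api_base = "http://localhost:8080/v1", the client would call:
print(endpoint("http://localhost:8080/v1", "chat/completions"))
# http://localhost:8080/v1/chat/completions
```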
You've been so helpful, thanks. Just wanted to share how I ended up skinning this. It took lots of trial and error and piecing together ideas from fragmented docs, but this actually works...

Plain Text
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.core import Settings
from llama_index.llms.openai import OpenAI
from llama_index.embeddings.openai import OpenAIEmbedding

api_key = "[REDACTED]"
base_url = "[REDACTED]"

Settings.embed_model = OpenAIEmbedding(
    api_base=base_url,
    api_key=api_key
)

Settings.llm = OpenAI(
    api_base=base_url,
    api_key=api_key
)

documents = SimpleDirectoryReader("data").load_data()

index = VectorStoreIndex.from_documents(documents)

query_engine = index.as_query_engine()

response = query_engine.query("What did the author do growing up?")
print(response)