Updated 4 months ago

I am using the Ollama local model API

At a glance

The community member is using the Ollama local model API and can reach it with a plain OpenAI-style API request after swapping in the local base URL. When using the llamaindex library, however, the code reports that the API is invalid, and the community member asks whether the llamaindex baseUrl can be pointed at the local model API.

The comments point out that there is a dedicated Ollama class, with instructions on installing the required package and constructing the class with the correct base URL. The community members also discuss how to pass the Ollama instance into existing code that uses the PandasQueryEngine, or set it as the global default LLM.

There is no explicitly marked answer, but the community members provide suggestions and guidance on how to resolve the issue.

I am using the Ollama local model API. I tested it with an OpenAI-style API request and it works; the only thing I changed was the baseUrl to my API. But now when I use llamaindex, put my API in the env file, and run the code, it says the API is invalid. I was just wondering if I can change the llamaindex baseUrl to my local model API.
17 comments
There is a specific ollama class too lol
pip install llama-index-llms-ollama
from llama_index.llms.ollama import Ollama
llm = Ollama(model="llama2", request_timeout=60.0, base_url="http://localhost:11434")
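A minimal sketch of calling that instance directly, assuming an Ollama server is running locally with the llama2 model already pulled:

Plain Text
from llama_index.llms.ollama import Ollama

# Point the client at the local Ollama server (11434 is Ollama's default port)
llm = Ollama(model="llama2", request_timeout=60.0, base_url="http://localhost:11434")

# complete() sends one prompt and returns a CompletionResponse; .text holds the output
response = llm.complete("Say hello in one short sentence.")
print(response.text)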
Wow thanks man
How would I add that to this code : `from dotenv import load_dotenv
import os
import pandas as pd
from llama_index.core.query_engine import PandasQueryEngine
from prompts import new_prompt, instruction_str
from llama_index.core.llms.custom import CustomLLM

load_dotenv()

# my quran data file (data.csv)
data_path = os.path.join('data', 'data.csv')
# load it into a dataframe
data_df = pd.read_csv(data_path)

data_query_engine = PandasQueryEngine(df=data_df, verbose=True, instruction_str=instruction_str)
data_query_engine.update_prompts({"pandas_prompt": new_prompt})
data_query_engine.query("how many total hasanat in the whole quran")`
data_query_engine = PandasQueryEngine(df=data_df, verbose=True, instruction_str=instruction_str, llm=llm)
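Putting the pieces together, a minimal sketch of the full script with the local Ollama model wired into the query engine (the data/data.csv path and the prompts module are taken from the code above):

Plain Text
import os

import pandas as pd
from dotenv import load_dotenv
from llama_index.core.query_engine import PandasQueryEngine
from llama_index.llms.ollama import Ollama
from prompts import new_prompt, instruction_str

load_dotenv()

# The local Ollama server stands in for a hosted API, so no OpenAI key is needed
llm = Ollama(model="llama2", request_timeout=60.0, base_url="http://localhost:11434")

data_df = pd.read_csv(os.path.join('data', 'data.csv'))

# Passing llm= makes the engine use the local model instead of the default OpenAI LLM
data_query_engine = PandasQueryEngine(df=data_df, verbose=True, instruction_str=instruction_str, llm=llm)
data_query_engine.update_prompts({"pandas_prompt": new_prompt})
response = data_query_engine.query("how many total hasanat in the whole quran")
print(response)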
Thanks man I really appreciate your help man thanks a lot
Or you can just set a global default

Plain Text
from llama_index.core import Settings

Settings.llm = llm
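With the global default in place, components that aren't handed an llm explicitly fall back to Settings.llm, so the query engine no longer needs the llm= argument. A minimal sketch, assuming the same Ollama setup as above:

Plain Text
import pandas as pd
from llama_index.core import Settings
from llama_index.core.query_engine import PandasQueryEngine
from llama_index.llms.ollama import Ollama

# Anything in LlamaIndex that isn't given an llm explicitly will use this default
Settings.llm = Ollama(model="llama2", request_timeout=60.0, base_url="http://localhost:11434")

df = pd.read_csv('data/data.csv')

# No llm= argument here; the engine picks up Settings.llm automatically
query_engine = PandasQueryEngine(df=df, verbose=True)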
Yea no worries :dotsCATJAM:
So like this to set global:

Plain Text
from llama_index.llms.ollama import Ollama
from llama_index.core import Settings

llm = Ollama(model="llama2", request_timeout=60.0, base_url="http://localhost:11434")
Settings.llm = llm
i got an error:
ModuleNotFoundError: No module named 'llama_index.core.llms.ollama'
when i do this:
from llama_index.core.llms.ollama import Ollama

the full traceback:
Traceback (most recent call last):
File "/Users/ahmednadiir/Desktop/agency/main.py", line 6, in <module>
from llama_index.core.llms.ollama import Ollama
ModuleNotFoundError: No module named 'llama_index.core.llms.ollama'

even though I ran: pip install llama-index-llms-ollama
@WhiteFang_Jr @Logan M
Try importing it like:
from llama_index.llms.ollama import Ollama
The llama-index-llms-ollama package installs under the llama_index.llms namespace, not llama_index.core, which is why the core import path fails.