nischalmdnvr
Joined September 25, 2024
Hello! I am trying to create a service where I can load files and create an index. Does anyone know how to load the file directly into the loader/PDFReader instead of passing a path?

The code below isn't working:

Plain Text
@app.post("/add-sources/")
async def create_file(file: UploadFile):
    print(file.filename)
    loader = SimpleDirectoryReader("")
    print("Processing PDF files...")
    source_doc = loader.load_data(file=file)
    return {"file_size": file.filename}
12 comments
Agent

Do we have something similar for locally hosted models?
Agents for locally hosted models?

Plain Text
from llama_index.agent.openai import OpenAIAgent
from llama_index.llms.openai import OpenAI

# import and define tools
...

# initialize llm
llm = OpenAI(model="gpt-3.5-turbo-0613")

# initialize openai agent
agent = OpenAIAgent.from_tools(tools, llm=llm, verbose=True)
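For reference, a hedged sketch of a locally hosted equivalent, assuming llama-index v0.10+ with the Ollama integration installed: ReActAgent is the model-agnostic counterpart of OpenAIAgent, and the model tag and the multiply tool below are placeholders.

Plain Text
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import FunctionTool
from llama_index.llms.ollama import Ollama

# define a toy tool (placeholder)
def multiply(a: float, b: float) -> float:
    """Multiply two numbers."""
    return a * b

multiply_tool = FunctionTool.from_defaults(fn=multiply)

# initialize a local llm served by Ollama ("llama3" is an example tag)
llm = Ollama(model="llama3", request_timeout=120.0)

# initialize a ReAct agent with the same from_tools pattern as above
agent = ReActAgent.from_tools([multiply_tool], llm=llm, verbose=True)
print(agent.chat("What is 2.5 times 4?"))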
2 comments
n
W
I need help with this issue:

Plain Text
    100 callback_manager = callback_manager_from_settings_or_context(
    101     Settings, service_context
    102 )
    103 if len(query_engine_tools) > 0:
--> 104     callback_manager = query_engine_tools[0].query_engine.callback_manager
    106 llm = llm or llm_from_settings_or_context(Settings, service_context)
    107 if question_gen is None:
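The frame above looks like SubQuestionQueryEngine.from_defaults, where line 104 reads .query_engine off the first tool, so every entry in query_engine_tools must be a QueryEngineTool wrapping a query engine. A minimal sketch of a well-formed call, assuming llama-index v0.10+; `index` and `llm` stand in for your own objects:

Plain Text
from llama_index.core.query_engine import SubQuestionQueryEngine
from llama_index.core.tools import QueryEngineTool, ToolMetadata

# each tool must expose .query_engine, which is what line 104 accesses
query_engine_tools = [
    QueryEngineTool(
        query_engine=index.as_query_engine(),  # `index`: your own index (placeholder)
        metadata=ToolMetadata(name="docs", description="Project documents"),
    )
]

engine = SubQuestionQueryEngine.from_defaults(
    query_engine_tools=query_engine_tools,
    llm=llm,  # `llm`: your own LLM instance (placeholder)
)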
3 comments
Hello! I need help with something. Let's say you want to chat with multiple data sources, each containing multiple PDFs, and switch between sources whenever needed.
How do I approach this with LlamaIndex?

PS: I am using Ollama.
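One hedged sketch, assuming llama-index v0.10+ with the Ollama LLM and embedding integrations installed (the directory layout and model tags below are placeholders): build one index per source, keep a dict of query engines, and switching sources is just picking a different engine. LlamaIndex's RouterQueryEngine can instead pick the source automatically from tool descriptions, if you prefer that.

Plain Text
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings.ollama import OllamaEmbedding
from llama_index.llms.ollama import Ollama

# local models via Ollama; the tags are examples
Settings.llm = Ollama(model="llama3", request_timeout=120.0)
Settings.embed_model = OllamaEmbedding(model_name="nomic-embed-text")

# one index per source directory ("./sources/<name>" is an assumed layout)
engines = {}
for name in ("contracts", "manuals"):
    docs = SimpleDirectoryReader(f"./sources/{name}").load_data()
    engines[name] = VectorStoreIndex.from_documents(docs).as_query_engine()

# switching sources = switching engines
active = "manuals"
print(engines[active].query("Summarize the key points."))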
2 comments