Find answers from the community

Updated 3 months ago

openaiagent with vector store?

4 comments
Please stand by @andysingal as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
Yes, you can use the OpenAIAssistantAgent with a vector store in LlamaIndex. Here's a step-by-step guide on how to do it:

  1. Import the necessary modules:
```python
from llama_index.agent.openai import OpenAIAssistantAgent
from llama_index.core import (
    SimpleDirectoryReader,
    VectorStoreIndex,
    StorageContext,
)
from llama_index.vector_stores.supabase import SupabaseVectorStore
from llama_index.core.tools import QueryEngineTool, ToolMetadata
```

  2. Load your data:
```python
!mkdir -p 'data/10k/'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/docs/examples/data/10k/uber_2021.pdf' -O 'data/10k/uber_2021.pdf'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/docs/examples/data/10k/lyft_2021.pdf' -O 'data/10k/lyft_2021.pdf'

# load data
reader = SimpleDirectoryReader(input_files=["./data/10k/lyft_2021.pdf"])
docs = reader.load_data()
for doc in docs:
    doc.id_ = "lyft_docs"
```
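The loop above assigns the same id to every chunk of the PDF, so repeated runs upsert into the same logical document. If you index several files, deriving the id from the file name keeps ids stable per file. A minimal sketch — the helper name is made up here, and the `file_path` metadata key is an assumption about what your reader attaches, so check your documents' metadata:

```python
from pathlib import Path

def doc_id_for(path: str) -> str:
    # Derive a stable document id from the file name, e.g.
    # "./data/10k/lyft_2021.pdf" -> "lyft_2021_docs"
    return f"{Path(path).stem}_docs"

# Hypothetical usage, assuming the reader stored the path in metadata:
# for doc in docs:
#     doc.id_ = doc_id_for(doc.metadata["file_path"])
```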

  3. Set up your vector store and index:
```python
vector_store = SupabaseVectorStore(
    postgres_connection_string=(
        "postgresql://<user>:<password>@<host>:<port>/<db_name>"
    ),
    collection_name="base_demo",
)
storage_context = StorageContext.from_defaults(vector_store=vector_store)
index = VectorStoreIndex.from_documents(docs, storage_context=storage_context)
```
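The connection string above is a placeholder. One stdlib-only way to assemble it is from environment variables, URL-encoding the credentials so characters like `@` or `:` in the password don't break parsing. The variable names below are just examples, not anything Supabase or LlamaIndex requires:

```python
import os
from urllib.parse import quote_plus

def build_postgres_connection_string(
    user: str, password: str, host: str, port: str, db_name: str
) -> str:
    """Build a postgresql:// connection string with URL-encoded credentials."""
    return (
        f"postgresql://{quote_plus(user)}:{quote_plus(password)}"
        f"@{host}:{port}/{db_name}"
    )

# Read the details from environment variables (names here are hypothetical)
conn_str = build_postgres_connection_string(
    user=os.environ.get("SUPABASE_DB_USER", "postgres"),
    password=os.environ.get("SUPABASE_DB_PASSWORD", "secret"),
    host=os.environ.get("SUPABASE_DB_HOST", "localhost"),
    port=os.environ.get("SUPABASE_DB_PORT", "5432"),
    db_name=os.environ.get("SUPABASE_DB_NAME", "postgres"),
)
```

You can then pass `conn_str` as `postgres_connection_string` instead of hardcoding credentials.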

  4. Create your agent:
```python
lyft_tool = QueryEngineTool(
    query_engine=index.as_query_engine(similarity_top_k=3),
    metadata=ToolMetadata(
        name="lyft_10k",
        description=(
            "Provides information about Lyft financials for year 2021. "
            "Use a detailed plain text question as input to the tool."
        ),
    ),
)
agent = OpenAIAssistantAgent.from_new(
    name="SEC Analyst",
    instructions="You are a QA assistant designed to analyze SEC filings.",
    tools=[lyft_tool],
    verbose=True,
    run_retrieve_sleep_time=1.0,
)
```
  5. Use your agent to chat:
```python
response = agent.chat(
    "Tell me about Lyft's risk factors, as well as response to COVID-19"
)
print(str(response))
```
Please replace the placeholders in the `postgres_connection_string` with your actual database details.
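Since `agent.chat` takes one message at a time, a small helper can batch several questions and pair each with its answer. This is a sketch, not an official API — it works with any object exposing a `chat(message)` method, which `OpenAIAssistantAgent` does:

```python
def ask_all(agent, questions):
    """Send each question through agent.chat and return
    a list of (question, response_text) pairs."""
    results = []
    for q in questions:
        response = agent.chat(q)
        results.append((q, str(response)))
    return results

# Example usage (assumes `agent` was created as in step 4):
# answers = ask_all(agent, [
#     "Tell me about Lyft's risk factors.",
#     "How did Lyft respond to COVID-19?",
# ])
# for q, a in answers:
#     print(q, "->", a)
```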

For more details, you can refer to the source.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
@WhiteFang_Jr @Logan M any examples (not notebook) to use a vector db with openaiagent for chat? https://docs.llamaindex.ai/en/stable/examples/agent/openai_agent_query_cookbook/...