Find answers from the community

Hi, can anyone help me with this:

"How can I add a new text Node to an existing Document?"

This Document will then be inserted into a VectorStoreIndex.
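A minimal sketch of the two usual routes, assuming a recent llama-index where VectorStoreIndex exposes insert() for Documents and insert_nodes() for nodes (existing_index is a placeholder name):
Plain Text
from llama_index.core import Document, VectorStoreIndex
from llama_index.core.schema import TextNode

# An existing index (placeholder).
existing_index = VectorStoreIndex.from_documents([Document(text="initial content")])

# Option 1: wrap the new text in its own Document and insert it.
existing_index.insert(Document(text="new text to add", metadata={"source": "manual"}))

# Option 2: build a TextNode directly and insert it as a node.
existing_index.insert_nodes([TextNode(text="another piece of new text")])

Nodes are the unit the index actually stores, so appending text to an already-indexed Document does not update the index by itself; the new content has to be inserted as a Document or node.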
2 comments
I think there is a bug in SimpleDirectoryReader, or the documentation needs to be updated.
I am unable to parse/index .ppt files, but .pptx files work just fine.

Can anyone help or confirm what needs to be done?
Thanks,
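For what it's worth, the default PowerPoint reader relies on python-pptx, which only understands the newer .pptx format, so legacy .ppt files failing is expected rather than a bug. A rough workaround sketch, assuming LibreOffice (soffice) is available for the conversion step:
Plain Text
import subprocess
from llama_index.core import SimpleDirectoryReader

# Convert a legacy .ppt file to .pptx with LibreOffice (assumes soffice is on PATH).
subprocess.run(
    ["soffice", "--headless", "--convert-to", "pptx", "--outdir", "./converted", "old_deck.ppt"],
    check=True,
)

# Then load/index only the .pptx files.
docs = SimpleDirectoryReader(input_dir="./converted", required_exts=[".pptx"]).load_data()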
24 comments
Hi, I am using VectorStoreIndex, persisting each index locally on disk and then storing it in cloud storage. I am handling multiple indices, one per user...
I have noticed that retrieval and adding data are quite slow, because I have to fetch the index from cloud storage every time I want to read from or add to it.
Is there any way I can speed that up, perhaps by using one of the other vector store options?
I was looking at this article:
https://docs.llamaindex.ai/en/latest/module_guides/storing/vector_stores/#vector-store-options-feature-support

It compares different databases; can anyone recommend one or comment on this? What would be a good choice here?
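One common way to avoid pulling the whole persisted index down from cloud storage on every call is to point the index at a dedicated vector database instead of the default simple store. A rough sketch using Chroma as an example (any store from the linked feature table would work similarly; the per-user collection name is an assumption):
Plain Text
import chromadb
from llama_index.core import Document, StorageContext, VectorStoreIndex
from llama_index.vector_stores.chroma import ChromaVectorStore

# One persistent Chroma collection per user.
client = chromadb.PersistentClient(path="./chroma_db")
collection = client.get_or_create_collection("user_123")

vector_store = ChromaVectorStore(chroma_collection=collection)
storage_context = StorageContext.from_defaults(vector_store=vector_store)

# Build once; later inserts and queries go straight to the store,
# with no full-index download/upload round trip.
documents = [Document(text="example content for this user")]
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

A server-hosted option (Qdrant, Pinecone, pgvector, etc., per the feature table) also lets several app instances share the same per-user collections without syncing files between them.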
4 comments
Is anyone else having this issue while using Assistants?
Plain Text
from llama_index.agent.openai import OpenAIAssistantAgent

agent = OpenAIAssistantAgent.from_new(
    name="Math Tutor",
    instructions="You are a personal math tutor. Write and run code to answer math questions.",
    openai_tools=[{"type": "code_interpreter"}],
    instructions_prefix="Please address the user as Jane Doe. The user has a premium account.",
)

Error:
Plain Text
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/home/trike/api/env/lib/python3.11/site-packages/llama_index/agent/openai/openai_assistant_agent.py", line 230, in from_new
    assistant = client.beta.assistants.create(
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: Assistants.create() got an unexpected keyword argument 'file_ids'


I am referencing https://docs.llamaindex.ai/en/v0.10.34/examples/agent/openai_assistant_agent/

I am using the following versions:
Plain Text
llama-hub==0.0.79.post1
llama-index==0.10.5
llama-index-agent-openai==0.1.1
llama-index-core==0.10.5
llama-index-embeddings-openai==0.1.1
llama-index-legacy==0.9.48
llama-index-llms-openai==0.1.1
llama-index-multi-modal-llms-openai==0.1.1
llama-index-program-openai==0.1.1
llama-index-question-gen-openai==0.1.1
llama-index-readers-file==0.1.3
llama-index-readers-web==0.1.3
openai==1.33.0
2 comments
How can I filter with any one of the following categories?
Plain Text
filters = [
    ("source", "file"),
    ("source", "url"),
    ("source", "twitter"),
]
Plain Text
from llama_index.core.vector_stores import MetadataFilters, MetadataFilter, FilterCondition

filters = MetadataFilters(
    filters=[
        MetadataFilter(key="source", value="file"),
        MetadataFilter(key="source", value="url"),
        MetadataFilter(key="source", value="twitter"),
    ],
    condition=FilterCondition.OR,
)

retriever = index.as_retriever(filters=filters)
retriever.retrieve("Your query here")


I am getting an empty nodes list; what could be the issue? Can anyone help?
Thanks
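The filter code looks right for an OR across values, so a quick sanity check (just a sketch) is to retrieve without filters and confirm the key really is "source" and the stored values match exactly:
Plain Text
# Retrieve without filters and inspect the metadata the nodes actually carry.
for node_with_score in index.as_retriever().retrieve("Your query here"):
    print(node_with_score.node.metadata)

If the values match, it may also be worth checking whether the particular vector store integration supports FilterCondition.OR, since metadata filter support varies between stores.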
5 comments
Why does LlamaIndex call the OpenAI embeddings API when creating a VectorStoreIndex or inserting into it, even though the embeddings never show up on the nodes/documents?

So why is it calling it?
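A short sketch of the default behaviour: the index embeds each node at build/insert time and stores the vectors in its vector store, keyed by node id, rather than on the Document/node objects you passed in, which is why those still show embedding=None:
Plain Text
from llama_index.core import Document, VectorStoreIndex

doc = Document(text="hello world")
print(doc.embedding)  # None -- nothing is attached to the Document itself

# The OpenAI embeddings API is called here, once per chunk/node being indexed.
index = VectorStoreIndex.from_documents([doc])

# The vectors now live inside the index's vector store, keyed by node id.
print(doc.embedding)  # still None

As far as I can tell, nodes that already have .embedding set are not re-embedded, so pre-computing embeddings yourself is one way to avoid the API call.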
7 comments
I have indexed many documents in a VectorStoreIndex. How can I delete a document from the index using the documents themselves? I do not keep track of which document has which id in the index.
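If the original Document objects (or at least their ids) are still around, a minimal sketch is to use delete_ref_doc, since the nodes created from a Document keep that Document's id as their ref_doc_id (documents_to_remove is a placeholder):
Plain Text
# Each Document has a stable id (doc.doc_id) that its nodes reference as ref_doc_id.
for doc in documents_to_remove:  # placeholder: the Document objects to delete
    index.delete_ref_doc(doc.doc_id, delete_from_docstore=True)

If the Document objects are gone as well, index.ref_doc_info (where the underlying store supports it) maps each ref_doc_id to its metadata, which can help work out which id belongs to which source.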
3 comments