Find answers from the community

Updated 5 months ago


At a glance

The post discusses an issue with accessing the index after the initial creation of documents. A community member suggests trying the following to access the index in the second iteration:

vector_store = SupabaseVectorStore(postgres_connection_string=connectionString, collection_name="llama_demo")
index = VectorStoreIndex.from_vector_store(vector_store=vector_store)

Another community member provides a bonus tip to define the service context globally to avoid adding it everywhere.

The comments also mention encountering an error related to the OpenAI API response and a warning about the query not having a covering index. The community members discuss updating the code and resolving a filtering issue.

Finally, a community member mentions getting an error when trying to use the sub-query engine, specifically an OutputParserException due to an invalid return object.

There is no explicitly marked answer in the provided information.

Useful resources
Plain Text
# Imports for the legacy (pre-0.10) llama_index package layout
from llama_index import StorageContext, VectorStoreIndex
from llama_index.vector_stores import SupabaseVectorStore
from llama_index.vector_stores.types import ExactMatchFilter, MetadataFilters

# Connect to the Supabase (pgvector via vecs) collection
vector_store = SupabaseVectorStore(
    postgres_connection_string=connectionString,
    collection_name="llama_demo",
)

# Build the index from documents on the first run
storage_context = StorageContext.from_defaults(vector_store=vector_store)
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

# Restrict retrieval to a single workspace via metadata filters
filters = MetadataFilters(
    filters=[ExactMatchFilter(key="workspaceId", value="25juldeplo482af4cd83")]
)

retriever = index.as_retriever(filters=filters)
ans = retriever.retrieve("query?")
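The retriever returns a list of NodeWithScore objects; a minimal sketch of inspecting the filtered results, assuming the legacy llama_index node API:

Plain Text
# ans is a list of NodeWithScore objects; print each node's score and content
for node_with_score in ans:
    print(node_with_score.score, node_with_score.node.get_content())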


Issue: I don't have access to the index after the initial creation of documents.
11 comments
You can try the following.

Plain Text
from llama_index import VectorStoreIndex
from llama_index.vector_stores import SupabaseVectorStore

vector_store = SupabaseVectorStore(
    postgres_connection_string=connectionString,
    collection_name="llama_demo",
)

# For the second run, when the index has already been created in the first run,
# load it straight from the existing vector store instead of re-ingesting documents
index = VectorStoreIndex.from_vector_store(vector_store=vector_store)
Bonus tip: you can also define the service context globally so you do not have to pass it everywhere.
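A minimal sketch of that tip, assuming the legacy ServiceContext / set_global_service_context API; the OpenAI model shown is only illustrative:

Plain Text
from llama_index import ServiceContext, set_global_service_context
from llama_index.llms import OpenAI

# Build a service context once and register it globally so every index,
# retriever, and query engine picks it up without being passed explicitly
service_context = ServiceContext.from_defaults(llm=OpenAI(model="gpt-3.5-turbo"))
set_global_service_context(service_context)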
Thanks @WhiteFang_Jr
But I'm getting this error:
Plain Text
message='OpenAI API response' path=https://api.openai.com/v1/embeddings processing_ms=80 request_id=12405e2561e906cf281aefb36c97809f response_code=200
/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/vecs/collection.py:182: UserWarning: Query does not have a covering index for IndexMeasure.cosine_distance. See Collection.create_index
  warnings.warn(
DEBUG:llama_index.indices.utils:> Top 0 nodes:
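The UserWarning above comes from the vecs library and typically means no vector index has been created on the collection yet. A minimal sketch of creating one, assuming the vecs client API; the connection string and the 1536 dimension (OpenAI ada-002 embeddings) are placeholders:

Plain Text
import vecs

# Connect with the same Postgres connection string used by SupabaseVectorStore
vx = vecs.create_client(connectionString)

# Open the collection backing the vector store; dimension must match the embedding model
collection = vx.get_or_create_collection(name="llama_demo", dimension=1536)

# Create the covering index so cosine-distance queries stop emitting the warning
collection.create_index(measure=vecs.IndexMeasure.cosine_distance)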
Hey, can you check with the updated code once?
Ok, I think there is a filtering issue on my side.
It works, thanks!
But I am getting an error trying to use the sub query engine: llama_index.output_parsers.base.OutputParserException: Got invalid return object. Expected markdown code snippet with JSON object, but got: [
{
"sub_question": "What is the ...",
"tool_name": "vector index"
}
]
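For context, this exception is raised by the sub-question output parser when it rejects the LLM's response. The engine being discussed is typically wired up roughly like the following sketch, assuming the legacy SubQuestionQueryEngine API; the tool description is a placeholder:

Plain Text
from llama_index.query_engine import SubQuestionQueryEngine
from llama_index.tools import QueryEngineTool, ToolMetadata

# Wrap the existing vector index as a tool the sub-question engine can call
query_engine_tools = [
    QueryEngineTool(
        query_engine=index.as_query_engine(),
        metadata=ToolMetadata(
            name="vector index",
            description="Workspace documents stored in the Supabase vector store",
        ),
    )
]

# The engine decomposes a question into sub-questions and routes them to the tools
sub_query_engine = SubQuestionQueryEngine.from_defaults(query_engine_tools=query_engine_tools)
response = sub_query_engine.query("query?")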