Query

I have a table hosted on Supabase with id, content, metadata, and embeddings columns; I did not use LlamaIndex or LangChain to create it.
Is it possible to run an index over it and start querying it with llama-index? Thanks!
4 comments
Sadly not, the vector store class expects a pretty specific table structure.

You'd have to edit the query method here:
https://github.com/run-llama/llama_index/blob/2de0badc7e009a8eabe587c8f48a58fae8cd58e9/llama_index/vector_stores/supabase.py#L139
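
For anyone trying this before that lands, one rough workaround is a small custom store that satisfies the same VectorStore interface but queries the existing table through a Supabase RPC. Everything below is a sketch, not a drop-in fix: the match_documents function name, its parameters, and the id/content/metadata/similarity columns are assumptions about the pre-existing setup, and the import paths are the pre-0.10 llama-index ones matching the linked file.

from typing import Any, List

from llama_index.schema import TextNode
from llama_index.vector_stores.types import VectorStoreQuery, VectorStoreQueryResult


class ExistingSupabaseStore:
    """Read-only store over a pre-existing Supabase table, queried via an RPC.

    Hypothetical sketch: it only implements the parts of llama-index's
    VectorStore protocol that retrieval touches.
    """

    stores_text = True
    is_embedding_query = True

    def __init__(self, supabase_client: Any, query_name: str = "match_documents") -> None:
        self._client = supabase_client
        self._query_name = query_name

    @property
    def client(self) -> Any:
        return self._client

    def query(self, query: VectorStoreQuery, **kwargs: Any) -> VectorStoreQueryResult:
        # Call the SQL function that already exists on the Supabase side.
        # The parameter names must match that function's signature (assumed here).
        response = self._client.rpc(
            self._query_name,
            {
                "query_embedding": query.query_embedding,
                "match_count": query.similarity_top_k,
            },
        ).execute()

        nodes: List[TextNode] = []
        similarities: List[float] = []
        ids: List[str] = []
        for row in response.data:
            # Column names assumed to mirror the table: id, content, metadata, similarity.
            nodes.append(
                TextNode(
                    id_=str(row["id"]),
                    text=row["content"],
                    metadata=row.get("metadata") or {},
                )
            )
            similarities.append(row.get("similarity", 0.0))
            ids.append(str(row["id"]))
        return VectorStoreQueryResult(nodes=nodes, similarities=similarities, ids=ids)

In principle an instance of this can then be handed to VectorStoreIndex.from_vector_store() like any other store, with add/delete left unimplemented since the data already lives in Supabase.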
argh... I created a search function on Supabase, indexed the vectors with HNSW, and can now run queries using LangChain's SupabaseVectorStore:
vector_store = SupabaseVectorStore(
    embedding=embeddings,
    client=supabase,
    table_name="documents",
    query_name="match_documents",
)

it'd be great if you could enable something similar
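
For reference, the full LangChain setup around that snippet looks roughly like this; the environment variable names, the OpenAI embedding model, and the match_documents function are assumptions based on the standard Supabase + LangChain guide, and the import paths are the ones current at the time of this thread:

import os

from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.vectorstores import SupabaseVectorStore
from supabase import create_client

# Client for the project that already holds the documents table.
supabase = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_SERVICE_KEY"])

# Must be the same embedding model that filled the embeddings column.
embeddings = OpenAIEmbeddings()

vector_store = SupabaseVectorStore(
    embedding=embeddings,
    client=supabase,
    table_name="documents",
    query_name="match_documents",
)

# similarity_search runs the match_documents SQL function over the existing rows.
docs = vector_store.similarity_search("what does the doc say about pricing?", k=4)
for doc in docs:
    print(doc.metadata, doc.page_content[:80])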
Langchain also has the table schema hardcoded, but you've created the table to match their schema 😅
but in general, yeah, we are trying to move our vector stores to be able to work with pre-existing data. It's just been a tedious process.