Retrieving scientific papers for client chat

I am a bit confused here. I have a client who wants to chat with his data, and that data will be scientific papers. Before LlamaIndex I was doing it the long way with OpenAI: generate embeddings, save them to Supabase, then query. So I'm not sure how to start here — I'd appreciate a somewhat advanced example if there is one, please.
3 comments
You can follow this notebook: https://github.com/run-llama/llama_parse/blob/main/examples/demo_advanced_weaviate.ipynb

It covers:
- LlamaParse — one of the best PDF parsers out there.
- Query engine — lets you query the docs that you add.
- Weaviate — a vector store for keeping nodes in a vector DB; it's optional.

This can get you going!
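To make the flow concrete, here is a toy sketch of the parse → index → query pipeline the notebook walks through. Everything here is a stand-in written for illustration: a real pipeline would use LlamaParse to parse the PDFs and a vector store like Weaviate for retrieval, while this version uses paragraph splitting and keyword overlap so it runs with no API keys. All function names and data are hypothetical.

```python
import re

# Stand-in for LlamaParse: split each raw document into chunks ("nodes").
def parse_documents(raw_docs):
    return [chunk.strip()
            for doc in raw_docs
            for chunk in doc.split("\n\n")
            if chunk.strip()]

# Stand-in for embedding + a vector store: map each node to its token set.
def tokens(text):
    return set(re.findall(r"\w+", text.lower()))

def build_index(nodes):
    return [(node, tokens(node)) for node in nodes]

# Stand-in for a query engine: rank nodes by token overlap with the question.
def query(index, question, top_k=2):
    q = tokens(question)
    ranked = sorted(index, key=lambda pair: len(q & pair[1]), reverse=True)
    return [node for node, _ in ranked[:top_k]]

papers = [
    "Transformers use self-attention.\n\n"
    "Attention scales quadratically with sequence length.",
    "CNNs use convolutions.\n\n"
    "Convolutions exploit spatial locality.",
]
nodes = parse_documents(papers)
index = build_index(nodes)
print(query(index, "How does attention scale with sequence length?", top_k=1))
```

Swapping the stand-ins for LlamaParse, an embedding model, and Weaviate gives you the notebook's actual pipeline; the shape of the code stays the same.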
thank you so much
@mahdicodex999 : The pipeline I wrote consumes general content, embeds it, and stores the embeddings in pgvector. The nice thing about that is I can use it directly on Supabase without having to stand up any other services. You can write a single fast query joining your embeddings with your other relational content.
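For intuition, here is a minimal sketch of what that pgvector retrieval boils down to: cosine similarity between the question's embedding and each stored row's embedding (in pgvector itself you'd write something like `ORDER BY embedding <=> query_embedding LIMIT k`, since `<=>` is its cosine-distance operator). The titles and vectors below are made up for illustration:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity of two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical rows, as they might sit in a table with a pgvector column:
# (paper_title, embedding)
rows = [
    ("Attention Is All You Need", [0.9, 0.1, 0.0]),
    ("ImageNet Classification with CNNs", [0.1, 0.9, 0.1]),
]

# Hypothetical embedding of the user's question.
query_embedding = [0.8, 0.2, 0.1]

# Pick the row most similar to the question -- what the SQL ORDER BY does.
best = max(rows, key=lambda r: cosine_similarity(query_embedding, r[1]))
print(best[0])  # -> Attention Is All You Need
```

In the real setup the embeddings come from your embedding model and the `max` becomes a SQL query, which is what lets you join the similarity search against the rest of your relational data in one statement.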

@WhiteFang_Jr always has great answers, but I wanted to jump in to say it's pretty straightforward with llama-index. I tend to adopt boring technologies that just work and keep things as simple as possible.