
In memory

Thank you, Logan. Now I understand what you meant when you said, "Keep it in memory." In the PHP world I come from ;), we used to use Redis, for example, to keep data in memory. What would be the Python or LlamaIndex way?
Heh, when I said "in memory," I meant that with Python you would load the index at the start of your application, and it would live in memory, in some kind of global variable, until the application dies.

Then while it's loaded, you can answer queries

In a server setting this makes sense, since the index is loaded when the server starts up, and then there might be a specific endpoint to handle queries
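A minimal sketch of the "keep it in memory" pattern described above. The `llama_index` calls shown in the comments are the library's persistence API, but this runnable version uses a stand-in `FakeIndex` object (a hypothetical placeholder, not part of LlamaIndex) so the pattern itself is clear:

```python
# Sketch of keeping an index in a module-level global for the app's lifetime.
# FakeIndex is a hypothetical stand-in for a real LlamaIndex index.

class FakeIndex:
    """Stand-in for a LlamaIndex index (e.g. VectorStoreIndex)."""
    def query(self, text: str) -> str:
        return f"answer for: {text}"

# Loaded once at application startup, then lives in a module-level
# global until the process dies -- no Redis needed.
INDEX = None

def load_index() -> FakeIndex:
    # With llama_index this would be something like:
    #   from llama_index.core import StorageContext, load_index_from_storage
    #   storage_context = StorageContext.from_defaults(persist_dir="./storage")
    #   return load_index_from_storage(storage_context)
    return FakeIndex()

def get_index() -> FakeIndex:
    global INDEX
    if INDEX is None:          # load lazily, exactly once
        INDEX = load_index()
    return INDEX

# While the app is alive, every query reuses the same in-memory index.
print(get_index().query("What is LlamaIndex?"))
```

Because `get_index()` always returns the same object, repeated queries pay the index-loading cost only once per process.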

Tbh if you can, deploying LlamaIndex as a server/microservice is probably the ideal way if your application isn't built entirely in Python lol
Got it, that makes sense: load the index and keep it there until the app dies. And no, my plan is to create an API using FastAPI to handle the queries and build the front end separately. Thank you again! You have been so helpful.
Awesome sounds good! FastAPI is a great choice πŸ’ͺπŸ’ͺ