Is the llama-index networks component (also) designed to set up a client-server model, e.g. a Streamlit app that talks to a LlamaIndex service to answer the query?
9 comments
I want to separate the web service more from the backend, so that e.g. the web service doesn't need as much access to secrets (the secrets for the LLM, the vector database, etc.).
I could write a separate REST API of course, but I was wondering if the llama-index-networks code can already do it.
Currently llama-index-networks sets up a FastAPI server over a query engine. Definitely open to contributions and ideas for expanding it though πŸ‘
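For context, that wrapping looks roughly like the sketch below. The class and module names (`ContributorService`, `ContributorServiceSettings`, `llama_index.networks.contributor`) follow the llama-index-networks README as I recall it, so treat them as assumptions to verify against the package version you install:

```python
# contributor.py -- sketch: exposing a query engine via llama-index-networks.
# Import paths/class names are from memory of the package README; verify them.
# Assumes default LLM/embedding settings (e.g. OPENAI_API_KEY in the env).
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.networks.contributor import (
    ContributorService,
    ContributorServiceSettings,
)

# Build any query engine as usual; this process holds all the secrets.
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

# Wrap it; the service owns a FastAPI app you can serve or extend.
settings = ContributorServiceSettings()
service = ContributorService(config=settings, query_engine=query_engine)
app = service.app  # run with: uvicorn contributor:app
```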
I guess create-llama with a FastAPI backend kind of gets to what you're looking for, but the frontend is a Next.js app.
I gave it some more thought, and will set up something simple with FastAPI.
The main reason is that even if llama_index offers a server-client mode, I will have logic around the usual llama_index API that I also want to keep out of the web service part (simply because it's not web-service related). A sketch of that split is below.
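A minimal sketch of that split, assuming a plain FastAPI service that owns the query engine (and therefore the LLM and vector-database secrets); the `/query` endpoint and `QueryRequest` model are made up for illustration:

```python
# backend.py -- FastAPI service that keeps all secrets server-side.
# The /query route and QueryRequest model are illustrative, not a fixed API.
from fastapi import FastAPI
from pydantic import BaseModel
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

app = FastAPI()

# The index and any API keys it needs live only in this process.
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

class QueryRequest(BaseModel):
    question: str

@app.post("/query")
def query(req: QueryRequest) -> dict:
    # Custom logic around the query engine (non-webservice code) would go here.
    response = query_engine.query(req.question)
    return {"answer": str(response)}
```

The Streamlit front end then needs nothing but the service URL:

```python
# frontend.py -- Streamlit client; no LLM or vector-db secrets needed here.
import requests
import streamlit as st

question = st.text_input("Ask a question")
if question:
    resp = requests.post("http://localhost:8000/query", json={"question": question})
    st.write(resp.json()["answer"])
```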
Sounds like you've given it some good thought and landed on something that makes sense for you -- kudos!

Do you think you'd be able to provide feedback on why wrapping your query engine with a ContributorService that exposes it behind a FastAPI REST app, and then modifying that app to your needs, would not work?
My thought process here was that you'd be able to customize the app to your needs, but perhaps that doesn't cover your case?
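For what it's worth, "modifying the app" could be as small as registering extra routes on the FastAPI instance the service exposes; this assumes the `service.app` attribute from the earlier sketch:

```python
# Sketch: extending the FastAPI app owned by a ContributorService.
# Assumes `service` from the contributor.py sketch above.
@service.app.get("/healthz")
def healthz() -> dict:
    return {"status": "ok"}
```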