Quick question: is there a public roadmap for llama-index as a framework? Or are we adding more features as we discover new use cases? I would be interested in the general future direction of the framework: what are we trying to achieve, say, for 2024?
The space evolves so quickly that it can be hard to specify exact features.

I think in general, the overall goals are:
  • More "deployment"-focused examples and fixes (i.e. memory usage, load times, better storage, etc.)
  • Better support/examples for "large dataset" type problems
  • Improved open-source LLM support (specifically with prompts, structured data)
  • Code refactors (knowledge graph, tree index)
  • More/improved async support
  • Batch support
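On the async/batch bullets, here is a rough idea of what batch-style concurrency looks like in plain asyncio. This is a generic sketch, not llama-index's API; `fetch_one` is a stand-in for any awaitable per-item LLM call:

```python
import asyncio

async def fetch_one(item: str) -> str:
    # Stand-in for an awaitable LLM call (e.g. an aquery-style method).
    await asyncio.sleep(0)  # yield control; a real call would do network I/O
    return item.upper()

async def fetch_batch(items: list[str], batch_size: int = 8) -> list[str]:
    # Process items in fixed-size batches, running each batch concurrently
    # so at most `batch_size` calls are in flight at once.
    results: list[str] = []
    for i in range(0, len(items), batch_size):
        batch = items[i:i + batch_size]
        results.extend(await asyncio.gather(*(fetch_one(x) for x in batch)))
    return results

print(asyncio.run(fetch_batch(["a", "b", "c"])))  # ['A', 'B', 'C']
```

The batching cap matters in practice because firing every request at once tends to hit provider rate limits.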
Yeah, I agree. This space is moving so fast. πŸ˜„. I hope next year open-source LLMs will finally catch up, or at least be reliably functional on crucial features like tool calling, agents, etc.
@Logan M Also a bit related to this topic: do you have any plan to release v1? What features would you expect before the v1 release?
I think for me, V1 means stability. So I think V1 is more about "vibes" than actual features haha

For example
  • Most LLMs work fine out of the box with core features
  • I want to better organize/package the never-ending integrations
  • Remove experimental or dead features from the core repo
  • Deploying to AWS, Render, etc. and setting up a FastAPI service is a breeze
All that would make me more comfortable labelling something as V1 πŸ™‚