My use case is the following: I want a locally running application that works without internet access and uses only local LLMs. The chat should include citations showing where it found the relevant information.
My question: should I use LlamaIndex with Python or with TypeScript? I would prefer TypeScript and would build a Next.js app, but that route seems more difficult since not everything is supported.
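For context, this is roughly the pipeline I have in mind. A minimal sketch on the Python side, assuming Ollama serves the local LLM and a local HuggingFace embedding model handles retrieval; the `./data` folder, model names, and chunk sizes are placeholders, not a tested setup:

```python
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.query_engine import CitationQueryEngine
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.llms.ollama import Ollama

# Local LLM via Ollama and a local embedding model, so nothing leaves the machine
# (the embedding model still has to be downloaded once).
Settings.llm = Ollama(model="llama3", request_timeout=120.0)  # assumed model name
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

# Index documents from a placeholder folder.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# CitationQueryEngine annotates the answer with numbered source citations.
query_engine = CitationQueryEngine.from_args(
    index,
    similarity_top_k=3,
    citation_chunk_size=512,
)

response = query_engine.query("What does the document say about X?")
print(response)  # answer text with [1], [2]-style citations

# The cited chunks, so the UI can show where the information came from.
for source in response.source_nodes:
    print(source.node.metadata.get("file_name"), "->", source.node.get_text()[:100])
```

The open question for me is whether LlamaIndex.TS offers an equivalent of `CitationQueryEngine`, or whether I would have to assemble the citations myself from the returned source nodes in a Next.js route handler.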