The community member wants to build a locally running application that works without internet access and uses only local large language models (LLMs). They want the application to cite the sources of the information it provides. They are considering LlamaIndex with Python or TypeScript, or LangChain. Another community member suggests Python, as it has broader support, especially for open-source LLMs.
My use case is the following: I want a locally running application that works without internet access and uses only local LLMs. The chat should cite where it found the relevant information.
My question: Should I use LlamaIndex with Python or with TypeScript? I would prefer TypeScript and build a Next.js app, but that seems more difficult since not everything is supported.