Hey, I saw in the README that the LlamaIndex TS library supports llama2 models. Is there a guide on how to use it? I'd like to hook it up to a local Ollama instance.
I'm super new to this, so I don't think I can do it myself. I've just managed to understand Ollama's interface, and I'm trying to learn more about indexing and feeding local data into the models. I'd be interested to see how it's done, though.
Not yet, I haven't had a chance to pick it up, @Brum. It looks like I'll only be able to get to it in about 10 days, as I've been really busy with a family function. Sorry about that.
No problem. I looked into it, and I won't be able to do it within a month; I need to understand more about LLMs first. Sorry for bothering you, and I'd be happy whenever you're able to look at it. Good luck with everything going on in your life!
No worries. I'll have a look once I'm back from the function. Thank you! I'm also a beginner, so it might take some time for me to raise a PR. Just thought I'd let you know. I'll keep you in the loop.
Hello @disco.dr, any update on this? I think I've reached a point where I can slowly make time to learn more and try to understand this task, so I'm just checking in so we don't accidentally duplicate effort 🙂 This would make an awesome addition to a LlamaIndex Obsidian plugin for local Ollama usage ^^
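For anyone following along: while the library-level integration gets sorted out, you can already talk to a local Ollama instance directly over its REST API. A minimal TypeScript sketch, assuming Ollama is running on its default port 11434 with the llama2 model pulled (`ollama pull llama2`); the helper name `buildGenerateRequest` is just for illustration, not part of any library:

```typescript
// Minimal sketch: call a local Ollama instance's /api/generate endpoint.
// Assumptions: Ollama runs on localhost:11434 (its default) and "llama2"
// has been pulled locally.

interface GenerateRequest {
  model: string;
  prompt: string;
  stream: boolean;
}

// Pure helper (illustrative name): build the JSON body Ollama expects.
function buildGenerateRequest(model: string, prompt: string): GenerateRequest {
  // stream: false asks Ollama for a single JSON reply instead of chunks
  return { model, prompt, stream: false };
}

// Fire one completion request against the local instance.
async function generate(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGenerateRequest("llama2", prompt)),
  });
  const data = await res.json();
  return data.response; // Ollama puts the completion text in `response`
}

// Example usage (requires a running Ollama instance):
// generate("Why is the sky blue?").then(console.log);
```

Once this works, the remaining piece would be wiring it up as an LLM backend inside LlamaIndex.TS, which is what this thread is about.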