Hey, I saw in the readme that the LlamaIndex TS library supports Llama 2 models. Is there a guide on how to use it? I'd like to hook it up to a local Ollama instance.
It supports Llama 2 only through Replicate right now; there's no Ollama integration yet.
If you want to contribute one, though, that would be amazing. It should be pretty easy.
I'm super new to this, so I don't think I can do it. I've only just managed to understand the Ollama interface, and I'm still learning about indexing and feeding local data into models. I'd be interested in how it's done, though.
Also, I saw 3 PRs closed on the LlamaIndex GitHub with the message "I want to do it differently", so I'm not sure about that 😄
Which PRs did you see that said this?
Ah yeah, I would ignore anything Sweep made lol
Also, just a question while I'm here: is there a way to skip OpenAI token validation?
In LlamaIndexTS? Nope (not unless you set your embeddings and your LLM to be something other than OpenAI).
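For reference, here is a minimal sketch of the Replicate route mentioned above, assuming the `LlamaDeuce` LLM and `serviceContextFromDefaults` helper that the `llamaindex` package exported at the time, plus a `REPLICATE_API_TOKEN` in the environment. Note that the embedding model still defaults to OpenAI unless it is swapped out too, so this alone does not skip the OpenAI key.

```ts
import {
  Document,
  DeuceChatStrategy,
  LlamaDeuce,
  VectorStoreIndex,
  serviceContextFromDefaults,
} from "llamaindex";

async function main() {
  // Llama 2 hosted on Replicate; assumes REPLICATE_API_TOKEN is set.
  const llm = new LlamaDeuce({ chatStrategy: DeuceChatStrategy.META });

  // Route LLM calls through LlamaDeuce instead of the default OpenAI LLM.
  const serviceContext = serviceContextFromDefaults({ llm });

  const index = await VectorStoreIndex.fromDocuments(
    [new Document({ text: "Ollama is a tool for running LLMs locally." })],
    { serviceContext },
  );

  const queryEngine = index.asQueryEngine();
  const response = await queryEngine.query("What is Ollama?");
  console.log(response.toString());
}

main().catch(console.error);
```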
I would be interested to have a look at this. Let me know how to get started. Thank you!
We have a small contributing guide here covering the setup:
https://github.com/run-llama/LlamaIndexTS/blob/main/CONTRIBUTING.md

As for what to actually add, you probably just need to add an LLM file for Ollama here:
https://github.com/run-llama/LlamaIndexTS/tree/main/packages/core/src/llm
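In case it helps whoever picks this up, here is a rough, hypothetical sketch of what such a file could do, talking to Ollama's local REST API (`/api/generate` and `/api/chat` on port 11434). The class and method names here are illustrative only; a real PR would implement the package's actual LLM interface (metadata, streaming, etc.) rather than this standalone version.

```ts
// Hypothetical sketch only; a real contribution would implement the
// LLM interface used by packages/core/src/llm.

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

class Ollama {
  constructor(
    private model: string = "llama2",
    private baseURL: string = "http://localhost:11434",
  ) {}

  // One-shot completion via Ollama's /api/generate endpoint.
  async complete(prompt: string): Promise<string> {
    const res = await fetch(`${this.baseURL}/api/generate`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model: this.model, prompt, stream: false }),
    });
    const json = (await res.json()) as { response: string };
    return json.response;
  }

  // Multi-turn chat via Ollama's /api/chat endpoint.
  async chat(messages: ChatMessage[]): Promise<string> {
    const res = await fetch(`${this.baseURL}/api/chat`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model: this.model, messages, stream: false }),
    });
    const json = (await res.json()) as { message: ChatMessage };
    return json.message.content;
  }
}

// Quick check against a local `ollama serve` with llama2 pulled.
async function main() {
  const llm = new Ollama();
  console.log(await llm.complete("Why is the sky blue?"));
}

main().catch(console.error);
```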
Sure! I will look into it sometime today. Thank you.
Thank you for looking into this. Could you ping me when you have something? I'd like to see how it's done and also try it out.
Any news here?
Not yet. I haven't had a chance to pick it up, @Brum. It looks like I will only be able to get to it after 10 days, as I have been really busy with a family function. Sorry about that.
No problem. I looked into it and I won't be able to do it within a month; I need to understand more about LLMs. Sorry for bothering you. I'd be happy whenever you're able to look at it, and good luck with everything going on in your life.
No worries. Will have a look once I am back from the function. Thank you!
I am also a beginner, so it might take some time for me to raise a PR. Just thought I'd let you know. Will keep you in the loop.
Hello @disco.dr, any update on this? I think I've gotten to a point where I can slowly make time to learn more and try to understand how to do this task, so I'm just checking in so we don't accidentally duplicate effort 🙂 This would make an awesome addition to a LlamaIndex Obsidian plugin for local Ollama usage ^^
@Brum Please pick it up if you can. Let me know if I could be of any help. Have been busy with assignments.
By the way, the team has come up with an Ollama package in LlamaHub: https://docs.llamaindex.ai/en/latest/examples/llama_hub/llama_pack_ollama.html#