I have llama2 hosted on my own server
Mike
last year
I have llama2 hosted on my own server with oobabooga. I've been doing requests.post(url, json=parameters) to get a response. How can I do the same in Llamaindex?
3 comments
Logan M
last year
Probably implement a custom LLM
https://docs.llamaindex.ai/en/stable/module_guides/models/llms/usage_custom.html#example-using-a-custom-llm-model-advanced
Mike
last year
I just found that page myself. Thanks! Hope it's the right solution for me
Logan M
last year
Yea, we don't have an official oobabooga integration, so that will be the next best thing