sha-sy · last year

How to integrate a model running on another server into llama-index?
WhiteFang_Jr · last year
You can use the CustomLLM class from LlamaIndex to plug in a custom LLM.
For reference:
https://gpt-index.readthedocs.io/en/stable/core_modules/model_modules/llms/usage_custom.html#example-using-a-custom-llm-model-advanced