I am looking to use Xinference through their OpenAI-compatible REST API
stdweird
9 months ago
I am looking to use Xinference through their OpenAI-compatible REST API. For LLM usage there is llama-index-llms-openai-like, but not for embeddings.
stdweird
9 months ago
Do we need to somehow extend the OpenAI embedding metadata with the hosted Xinference models?
stdweird
9 months ago
Or should we wait for llama-index-embeddings-openai-like (i.e. open a feature request)?
WhiteFang_Jr
9 months ago
You can try creating a custom embedding class:
https://docs.llamaindex.ai/en/stable/examples/embeddings/custom_embeddings.html#custom-embeddings-implementation
Define your endpoint and update the method so that it calls your model.
The second option would be to use
https://github.com/run-llama/llama_index/blob/main/llama-index-integrations/embeddings/llama-index-embeddings-ollama/llama_index/embeddings/ollama/base.py
Not sure if this will work out of the box just by providing the base_url for your model, but it can be a good place to start.
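A minimal sketch of the first option: a custom embedding class that POSTs to an OpenAI-compatible /v1/embeddings route. The class name, the endpoint http://localhost:9997/v1, the dummy API key and the model name bge-large-en are assumptions for illustration, not details from this thread.

```python
from typing import Any, List

import requests

from llama_index.core.bridge.pydantic import PrivateAttr
from llama_index.core.embeddings import BaseEmbedding


class MyXinferenceEmbedding(BaseEmbedding):
    """Embeddings via an OpenAI-compatible /v1/embeddings endpoint (e.g. Xinference)."""

    _api_base: str = PrivateAttr()
    _api_key: str = PrivateAttr()

    def __init__(
        self,
        model_name: str = "bge-large-en",            # assumed model name
        api_base: str = "http://localhost:9997/v1",  # assumed Xinference URL
        api_key: str = "dummy",                      # assumed: server ignores the key
        **kwargs: Any,
    ) -> None:
        super().__init__(model_name=model_name, **kwargs)
        self._api_base = api_base
        self._api_key = api_key

    def _embed(self, texts: List[str]) -> List[List[float]]:
        # POST in the OpenAI embeddings request format and unpack the response.
        resp = requests.post(
            f"{self._api_base}/embeddings",
            headers={"Authorization": f"Bearer {self._api_key}"},
            json={"model": self.model_name, "input": texts},
            timeout=60,
        )
        resp.raise_for_status()
        return [d["embedding"] for d in resp.json()["data"]]

    def _get_query_embedding(self, query: str) -> List[float]:
        return self._embed([query])[0]

    def _get_text_embedding(self, text: str) -> List[float]:
        return self._embed([text])[0]

    def _get_text_embeddings(self, texts: List[str]) -> List[List[float]]:
        return self._embed(texts)

    async def _aget_query_embedding(self, query: str) -> List[float]:
        return self._get_query_embedding(query)

    async def _aget_text_embedding(self, text: str) -> List[float]:
        return self._get_text_embedding(text)
```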
WhiteFang_Jr
9 months ago
Also: if you want to contribute this feature, you are most welcome to!! 💪
stdweird
9 months ago
@WhiteFang_Jr I am not familiar with Ollama, but from
https://github.com/ollama/ollama/issues/305
it's unclear if they also have an OpenAI-compatible API.
stdweird
9 months ago
The custom embedding class is not what I want, I think.
stdweird
9 months ago
I will try to monkeypatch the OpenAI predefined models
stdweird
9 months ago
and see how far I get with that.
stdweird
9 months ago
the new code structure is a bit ... euhm ... complex
stdweird
9 months ago
in volume it's 90% poetry.lock files
stdweird
9 months ago
anyway
Logan M
9 months ago
It's pretty organized tbh
For openai-like embeddings, you can just change the api_base and model_name kwargs for OpenAIEmbedding
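A minimal sketch of that suggestion, assuming a Xinference server at http://localhost:9997/v1 and an embedding model launched under the name bge-large-en (endpoint, key and model name are assumptions):

```python
from llama_index.embeddings.openai import OpenAIEmbedding

# model_name is passed through to the server as-is, and api_base points at the
# OpenAI-compatible Xinference endpoint (both values are assumptions).
embed_model = OpenAIEmbedding(
    model_name="bge-large-en",
    api_base="http://localhost:9997/v1",
    api_key="dummy",  # assumed: the server does not validate the key
)

print(len(embed_model.get_text_embedding("hello world")))
```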
stdweird
9 months ago
@Logan M thx, I'll have a look
stdweird
9 months ago
Wrt the code, I guess it's in some hybrid state and the intent is to make them all separate repos? Anyway, atm it looks odd but I understand why this is better.
stdweird
9 months ago
The get_engine code won't like unknown models.
stdweird
9 months ago
but it looks easy to patch it
Logan M
9 months ago
If you pass model_name="model", it will skip get_engine.
stdweird
9 months ago
@Logan M Ah, thanks. Now I see it.
stdweird
9 months ago
@Logan M Wrt LLM openai vs openai-like, is it only the context length and whether a model is chat and/or generate?
Logan M
9 months ago
Pretty much
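For the LLM side, a minimal sketch of OpenAILike with that metadata spelled out explicitly; the model name, endpoint and context window are assumptions:

```python
from llama_index.llms.openai_like import OpenAILike

llm = OpenAILike(
    model="qwen-chat",                    # assumed Xinference model name
    api_base="http://localhost:9997/v1",  # assumed OpenAI-compatible endpoint
    api_key="dummy",
    context_window=32768,                 # context length of the hosted model
    is_chat_model=True,                   # chat vs. completion ("generate") behaviour
    is_function_calling_model=False,
)

print(llm.complete("Hello"))
```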