Find answers from the community

Updated 4 months ago

Is there a way I can use a vLLM embedding endpoint?

At a glance

The community member asked if there is a way to use a vLLM embedding endpoint (vLLM is an open-source LLM inference and serving library). In the comments, another community member responded that they did not find any information related to vLLM embeddings in the documentation, but suggested extending a custom embedding class to point at the vLLM embedding endpoint, providing a link to an example in the documentation.

Useful resources
Is there a way I can use a vLLM embedding endpoint?
1 comment
Didn't find anything related to vLLM embedding in the docs. You can extend a custom embedding class to point towards your vLLM embedding endpoint: https://docs.llamaindex.ai/en/stable/examples/embeddings/custom_embeddings/#custom-embeddings
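To make the suggestion above concrete, here is a minimal sketch of the transport side, assuming the vLLM server is running with an embedding model and exposes its usual OpenAI-compatible `/v1/embeddings` route. The base URL and model name are placeholders, and using only the standard library is a simplification; the linked documentation shows how to wrap calls like this in a LlamaIndex custom embedding subclass (e.g. by implementing `_get_text_embedding` with this helper).

```python
# Sketch, not a verified integration: call an OpenAI-compatible
# /v1/embeddings route (as served by vLLM for embedding models)
# using only the Python standard library. Base URL and model name
# below are hypothetical placeholders.
import json
import urllib.request


def build_embedding_request(model: str, texts: list[str]) -> dict:
    """Build the JSON payload for an OpenAI-compatible embeddings call."""
    return {"model": model, "input": texts}


def get_embeddings(base_url: str, model: str, texts: list[str]) -> list[list[float]]:
    """POST to <base_url>/v1/embeddings and return one vector per input text."""
    payload = json.dumps(build_embedding_request(model, texts)).encode("utf-8")
    req = urllib.request.Request(
        f"{base_url}/v1/embeddings",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    # Response shape: {"data": [{"index": 0, "embedding": [...]}, ...]}
    return [item["embedding"] for item in sorted(data["data"], key=lambda d: d["index"])]
```

A custom LlamaIndex embedding class would then delegate its text- and query-embedding methods to `get_embeddings`, pointed at the vLLM server's address.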