llama_index/llama_index/llms/openai_like...

Would someone be able to assist here? I am trying to use the OpenAILike class with vLLM and can't find where it actually makes the call to vLLM, even after searching for a while. https://github.com/run-llama/llama_index/blob/main/llama_index/llms/openai_like.py
Thanks @Logan M, but how do you know this is calling vLLM?
I don't see this call
Or I can't follow the logic of it
You have to set the base url to point to your vLLM server, and then it calls that server
You can set this to see some slightly more detailed logs from the client:

Plain Text
import os

# enables verbose request/response logging from the openai client
os.environ['OPENAI_LOG'] = "debug"
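
Here is a minimal sketch of pointing OpenAILike at a vLLM server. The base URL, model name, and api_key value are placeholders, and the import path may differ across llama_index versions:

Plain Text
from llama_index.llms import OpenAILike

# Point the OpenAI-style client at a locally running vLLM server.
# The URL and model name below are assumptions; use whatever host,
# port, and model your vLLM instance is actually serving.
llm = OpenAILike(
    model="mistralai/Mistral-7B-Instruct-v0.1",
    api_base="http://localhost:8000/v1",
    api_key="fake",  # vLLM ignores the key, but the client requires one
    is_chat_model=True,
)

print(llm.complete("Hello!"))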
@Logan M however, there isn't really an explicit call to vLLM that I can see?
I'm still not sure where this is happening - can you point me to an example of this?
The openai client calls the vLLM server at the provided host URL.

Since vLLM has the same API as OpenAI, we only need to adjust the base URL, and the openai client does the work.

Any time self._client is called, it's using the openai client to route requests to vLLM.

https://github.com/run-llama/llama_index/blob/1033d30a0d2bdd5119834d72d0b8c283a00acf13/llama_index/llms/openai.py#L350

That exact line is an example of calling vLLM
This is as specific as I can get 😆
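
To make the routing concrete, here is a rough sketch of what that line reduces to once the wrapper is stripped away: a stock openai client whose base URL points at vLLM instead of api.openai.com. The URL and model name are placeholders:

Plain Text
from openai import OpenAI

# The same request path OpenAILike uses: the openai client simply
# sends the HTTP request to whatever base_url it was given.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="fake")

completion = client.chat.completions.create(
    model="mistralai/Mistral-7B-Instruct-v0.1",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(completion.choices[0].message.content)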