hello @everyone, I have a question: how do I set a vLLM server endpoint on Settings.llm? Please help.
You'll have to create a vLLM instance and assign it to Settings:

Settings.llm = <your vLLM instance here>
https://docs.llamaindex.ai/en/stable/examples/llm/vllm/?h=vllm#completion-response
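Here's a minimal sketch, assuming the llama-index-llms-vllm integration package is installed and you already have a vLLM server running (the api_url below is illustrative, swap in your own endpoint):

```python
from llama_index.core import Settings
from llama_index.llms.vllm import VllmServer

# Point the client at your running vLLM server endpoint.
llm = VllmServer(
    api_url="http://localhost:8000/generate",  # replace with your server's URL
    max_new_tokens=256,
    temperature=0.0,
)

# Make it the default LLM for LlamaIndex.
Settings.llm = llm

# Quick sanity check that the endpoint responds.
print(llm.complete("Hello, world!"))
```

After that, anything that uses the default Settings (query engines, chat engines, etc.) will go through your vLLM server unless you pass a different llm explicitly.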