
Updated 6 months ago

Sorry, been struggling setting a custom class using the vLLM wrapper

At a glance

The community member is struggling to set up a custom class using the vLLM wrapper. They are trying to use the LangChain library, but it does not accept quantization. The community member provides sample prompts, creates a sampling parameters object, and an LLM object using the "TheBloke/Llama-2-7b-Chat-AWQ" model with "AWQ" quantization. They then generate text from the prompts and print the outputs.

In the comments, another community member asks if the npx (likely referring to a tool or library) accepts local LLMs. Another community member responds that the TS package has limited support for LLMs, but the FastAPI backend uses the Python package, which can change the LLM. They are not entirely sure about the other issue the original poster is having.

Sorry, I've been struggling setting a custom class using the vLLM wrapper. LangChain doesn't accept quantization.
Python
from vllm import LLM, SamplingParams
# Sample prompts.
prompts = [
    "Hello, my name is",
    "The president of the United States is",
    "The capital of France is",
    "The future of AI is",
]
# Create a sampling params object.
sampling_params = SamplingParams(temperature=0.8, top_p=0.95)

# Create an LLM.
llm = LLM(model="TheBloke/Llama-2-7b-Chat-AWQ", quantization="AWQ")
# Generate texts from the prompts. The output is a list of RequestOutput objects
# that contain the prompt, generated text, and other information.
outputs = llm.generate(prompts, sampling_params)
# Print the outputs.
for output in outputs:
    prompt = output.prompt
    generated_text = output.outputs[0].text
    print(f"Prompt: {prompt!r}, Generated text: {generated_text!r}")
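One common workaround for the original question is that LangChain's community vLLM wrapper forwards engine-only options (such as quantization) to the underlying `vllm.LLM` constructor through a `vllm_kwargs` dict, while sampling options like temperature are first-class fields. The sketch below shows that passthrough pattern; `FakeEngine` and `VLLMWrapper` are stand-ins (not real library classes) so it runs without a GPU or vLLM installed.

```python
# Minimal sketch of the kwargs-passthrough pattern used by wrappers such as
# langchain_community.llms.VLLM: sampling options are first-class fields,
# while engine-only options (e.g. quantization) ride along in a dict that
# is splatted into the underlying engine constructor. FakeEngine and
# VLLMWrapper are stand-ins so the sketch runs without a GPU.

class FakeEngine:
    """Stand-in for vllm.LLM: records the constructor arguments it receives."""
    def __init__(self, model, **engine_kwargs):
        self.model = model
        self.engine_kwargs = engine_kwargs

class VLLMWrapper:
    """Toy wrapper mirroring the passthrough: vllm_kwargs -> engine ctor."""
    def __init__(self, model, temperature=0.8, top_p=0.95, vllm_kwargs=None):
        self.temperature = temperature
        self.top_p = top_p
        # Extra engine options are forwarded untouched to the engine.
        self.engine = FakeEngine(model, **(vllm_kwargs or {}))

llm = VLLMWrapper(
    "TheBloke/Llama-2-7b-Chat-AWQ",
    vllm_kwargs={"quantization": "awq"},
)
print(llm.engine.engine_kwargs["quantization"])  # -> awq
```

With the real wrapper, the equivalent call would pass `vllm_kwargs={"quantization": "awq"}` so the quantization setting reaches the engine even though the wrapper itself has no such field.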
4 comments
also, does the npx accept local LLMs?
It defaults to OpenAI, but you can change the LLM in the generated code (the TS package has limited support for LLMs, but the FastAPI backend uses the Python package).
Not entirely sure on the other issue though
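The comment above says the generated Python backend defaults to OpenAI but lets you swap the LLM in the generated code. The usual shape of that swap is reassigning the app's global LLM setting; the `Settings` and stub classes below are hypothetical stand-ins (not the scaffold's real API) so the pattern runs anywhere.

```python
# Hedged sketch of swapping the default LLM in a generated Python backend.
# The real scaffold would assign an LLM instance to a global settings object;
# Settings, OpenAIStub, and LocalVllmStub are stand-ins for illustration only.

class OpenAIStub:
    name = "openai"

class LocalVllmStub:
    name = "local-vllm"

class Settings:
    llm = OpenAIStub()          # the scaffold defaults to OpenAI

# In the generated backend, replacing the default is a one-line swap:
Settings.llm = LocalVllmStub()  # point the app at a local model instead
print(Settings.llm.name)        # -> local-vllm
```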