
Updated last month

I tried to set this:

Plain Text
service_context = ServiceContext.from_defaults(llm='local', chunk_size_limit=3000)


but I am still getting this error with llama2-13B, the default model:

Plain Text
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/llama_cpp/llama.py", line 900, in _create_completion
    raise ValueError(
ValueError: Requested tokens (3993) exceed context window of 3900
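For context, this error comes from a pre-generation budget check in llama-cpp-python: the tokenized prompt plus the requested completion length must fit inside the model's context window, so a 3000-token chunk plus prompt template and completion budget can easily overshoot a 3900-token window. A minimal sketch of that kind of check (illustrative function and parameter names, not the library's actual internals):

```python
def check_token_budget(prompt_tokens: int, max_new_tokens: int,
                       context_window: int) -> None:
    """Raise if the prompt plus the completion budget cannot fit
    in the model's context window (hypothetical helper)."""
    requested = prompt_tokens + max_new_tokens
    if requested > context_window:
        raise ValueError(
            f"Requested tokens ({requested}) "
            f"exceed context window of {context_window}"
        )
```

With the numbers from the traceback, a combined request of 3993 tokens against a 3900-token window fails this check, which is why either the chunk size must shrink or the configured context window must grow.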


Any ideas what I am doing wrong?
3 comments
I can't remember if you figured this out already, but you actually want to set context_window
OK, I'll try that. Thanks!
I still get the same error using this:

Plain Text
ServiceContext.from_defaults(llm='local', chunk_size_limit=1024, context_window=3000)