Is it possible to use local LLMs such as llama.cpp for DocumentSummaryIndex?
Chiken1 · last year
Is it possible to use local LLMs such as llama.cpp for DocumentSummaryIndex? I keep getting a "llama_tokenize_with_model: too many tokens" error.
Logan M · last year
Hmm, that's kind of a weird error. Maybe decrease the context_window a bit on the LLM?
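For reference, context_window is a constructor argument on LlamaIndex's LlamaCPP wrapper. A minimal sketch of what lowering it might look like, assuming the legacy ServiceContext API from around the time of this thread; the model path, the "./data" directory, and the numeric values are placeholders, not from the thread:

```python
from llama_index import DocumentSummaryIndex, ServiceContext, SimpleDirectoryReader
from llama_index.llms import LlamaCPP

# Hypothetical model path; point this at a real local model file.
llm = LlamaCPP(
    model_path="/path/to/model.gguf",
    context_window=3072,   # lowered from the default to leave headroom
    max_new_tokens=256,
)

service_context = ServiceContext.from_defaults(llm=llm)

documents = SimpleDirectoryReader("./data").load_data()
index = DocumentSummaryIndex.from_documents(
    documents, service_context=service_context
)
```

Leaving some slack between context_window and the model's true limit gives the prompt template and summary instructions room, so a slightly-undercounted chunk is less likely to overflow.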
Logan M · last year
Might be due to token counting errors
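One way to see how a token-counting mismatch could trigger this: the framework sizes chunks with its own counter, while llama.cpp tokenizes with the model's actual vocabulary. If the counter undercounts, a chunk that appears to fit can exceed the real context window at tokenize time. A toy illustration; both counters and the 1.3x inflation factor are made up for the sketch, not taken from either library:

```python
# The chunker sizes text with an estimate; the model tokenizer has the final
# say. When the estimate is optimistic, "fits on paper" overflows in practice.

def approx_count(text: str) -> int:
    """Optimistic stand-in for the counter used to size chunks."""
    return len(text.split())

def model_count(text: str) -> int:
    """Stand-in for the model tokenizer; 1.3x inflation is an assumed figure."""
    return int(len(text.split()) * 1.3)

context_window = 100
chunk = ("word " * 90).strip()   # 90 estimated tokens -> appears to fit

fits_by_estimate = approx_count(chunk) <= context_window  # True
fits_for_model = model_count(chunk) <= context_window     # False: "too many tokens"
print(fits_by_estimate, fits_for_model)
```

This is also why lowering context_window below the model's real limit works around the problem: it forces smaller chunks, absorbing the counting error.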