Found the solution via the kapa gpt πŸ™‚

from llama_index.core import Settings

Settings.context_window = 2048  # set the context window to your desired value
Yeah, since your LLM has more room, you can increase the context window even further.
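Before raising the context window, it can help to sanity-check that your prompts actually need the extra room. Below is a minimal, library-free sketch of that idea; the ~4-characters-per-token ratio is a rough heuristic (an assumption, not an exact tokenizer), and the function names `rough_token_count` and `fits_context` are hypothetical helpers, not LlamaIndex APIs.

```python
def rough_token_count(text: str) -> int:
    """Very rough token estimate, assuming ~4 characters per token."""
    return max(1, len(text) // 4)


def fits_context(prompt: str, context_window: int, reserve_for_output: int = 256) -> bool:
    """Check whether the prompt likely fits, leaving room for the model's reply."""
    return rough_token_count(prompt) + reserve_for_output <= context_window


# A prompt of ~3400 characters -> roughly 850 tokens.
prompt = "Summarize the following document. " * 100

print(fits_context(prompt, 2048))  # fits in a 2048-token window
print(fits_context(prompt, 512))   # too large for a 512-token window
```

If the check fails for your typical prompts and your model supports a larger window, bumping `Settings.context_window` as shown above is the straightforward fix.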