Hi! Doesn't the gpt-4o model have a 128k context window? I'm getting this error while using gpt-4o:
BadRequestError: Error code: 400 - {'error': {'message': "This model's maximum context length is 16385 tokens. However, you requested 16543 tokens (15446 in the messages, 73 in the functions, and 1024 in the completion). Please reduce the length of the messages, functions, or completion.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}
Here's my setup:

from typing import List

from pydantic import BaseModel
from llama_index.core import Settings
from llama_index.llms.openai import OpenAI

Settings.llm = OpenAI(api_key="...", model_name="gpt-4o", max_tokens=1024)

class SummaryAndThemes(BaseModel):
    """Data model containing a summary and key themes for users' comments."""
    summary: str
    themes: List[str]

response = summarizer.get_response(
    "Summarize and extract key themes",
    comment_chunks,
)
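In case it's useful context: while debugging, I've been working around the error by trimming the chunk list before the call. This is only a rough sketch under my own assumptions; the helper name and the ~4-characters-per-token heuristic are mine, not part of LlamaIndex:

```python
# Hypothetical workaround (not a LlamaIndex API): keep whole comment chunks
# until an approximate token budget is reached, so the request stays under
# the model's context limit. ~4 characters per token is a rough heuristic,
# not an exact tokenizer count.
def fit_to_budget(chunks, budget_tokens, approx_chars_per_token=4):
    kept, used = [], 0
    for chunk in chunks:
        cost = len(chunk) // approx_chars_per_token + 1  # crude token estimate
        if used + cost > budget_tokens:
            break
        kept.append(chunk)
        used += cost
    return kept
```

Then I call something like summarizer.get_response(prompt, fit_to_budget(comment_chunks, 12000)), which avoids the 400, but obviously I'd rather use the full 128k window.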