
Updated last month

Prompt Caching In OpenAI Models

At a glance

A community member asks whether there is a tutorial for prompt caching with OpenAI models similar to the existing one for Anthropic. The comments note that OpenAI prompt caching is automatic and point to the OpenAI documentation on prompt caching. No comment is explicitly marked as the answer.

Useful resources
Is there a similar tutorial (https://docs.llamaindex.ai/en/stable/examples/llm/anthropic_prompt_caching/) for prompt caching with OpenAI models?
2 comments
OpenAI prompt caching is automatic, AFAIK.
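Since OpenAI's prompt caching is automatic (no opt-in flag in the request), the main thing you can do is observe it: the Chat Completions `usage` object reports how many prompt tokens were served from cache. Below is a minimal sketch; the helper function and sample payload are illustrative, and exact field availability depends on the model and SDK version.

```python
# Sketch: OpenAI prompt caching needs no configuration. Sufficiently long
# prompts (roughly 1024+ tokens) with a repeated prefix are cached
# automatically, and the response's `usage` object reports the cache hit
# via `prompt_tokens_details.cached_tokens`.

def cached_prompt_tokens(usage: dict) -> int:
    """Return the number of prompt tokens served from cache (0 if absent)."""
    details = usage.get("prompt_tokens_details") or {}
    return details.get("cached_tokens", 0)

# With the official SDK, a call would look roughly like this
# (requires OPENAI_API_KEY; shown as a comment so the sketch runs offline):
#
#   from openai import OpenAI
#   client = OpenAI()
#   resp = client.chat.completions.create(
#       model="gpt-4o-mini",
#       messages=[
#           {"role": "system", "content": long_shared_prefix},
#           {"role": "user", "content": "question"},
#       ],
#   )
#   print(cached_prompt_tokens(resp.usage.model_dump()))

# Example usage with a sample `usage` payload:
sample_usage = {
    "prompt_tokens": 2006,
    "completion_tokens": 300,
    "total_tokens": 2306,
    "prompt_tokens_details": {"cached_tokens": 1920},
}
print(cached_prompt_tokens(sample_usage))  # -> 1920
```

A `cached_tokens` value of 0 on the first request and a large value on an immediate repeat of the same prefix is the usual sign that caching kicked in.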