Add Prompt Caching Feature for Anthropic and OpenAI clients #29660
smartinezbragado
started this conversation in
Ideas
Replies: 0 comments
Feature request
Add the prompt caching features offered by Anthropic and OpenAI:
Motivation
This feature is key to reducing the cost of LLM calls, since both providers bill cached prompt tokens at a reduced rate.
Proposal (If applicable)
No response
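For reference, a minimal sketch of what the feature would map to at the API level. Anthropic's Messages API takes an explicit `cache_control` marker on content blocks, while OpenAI caches automatically for prompts of 1024+ tokens with no request changes. The model name, prompt text, and the suggested client-side parameter are illustrative assumptions, not part of this request:

```python
# Sketch of how Anthropic's prompt caching is expressed in a request
# payload. The system prompt below is a placeholder standing in for a
# large, reusable prefix (e.g. long instructions or documents).

LONG_SYSTEM_PROMPT = "You are a helpful assistant. " * 200

request_body = {
    "model": "claude-3-5-sonnet-20241022",  # illustrative model name
    "max_tokens": 1024,
    "system": [
        {
            "type": "text",
            "text": LONG_SYSTEM_PROMPT,
            # Marks this block as cacheable; later calls that reuse the
            # same prefix are billed at the cheaper cache-read rate.
            "cache_control": {"type": "ephemeral"},
        }
    ],
    "messages": [
        {"role": "user", "content": "Summarize our refund policy."}
    ],
}
```

On the client side, this could be surfaced as, say, a constructor flag (hypothetical name: `ChatAnthropic(..., cache_system_prompt=True)`) that injects the `cache_control` block on the caller's behalf.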