LangChain with LiteLLM Proxy Gemini Cache #13367
Unanswered
orgabay-orca asked this question in Q&A
Replies: 0 comments
I am using LangChain with a LiteLLM proxy server, and I am trying to use all three cloud vendors (Azure OpenAI, Bedrock Anthropic, Google Gemini).
I am using the `cache_control_injection_points:` setting. It works for Bedrock, but not for Gemini 2.5 Pro; it seems I always get 0 cached tokens.
Any idea why, and how to tackle it?
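For context, a LiteLLM proxy config using this setting would look roughly like the sketch below. The model names, key placeholders, and injection point are illustrative assumptions, not my exact config:

```yaml
# Hypothetical LiteLLM proxy config sketch (names/keys are placeholders)
model_list:
  - model_name: gemini-2.5-pro
    litellm_params:
      model: gemini/gemini-2.5-pro
      api_key: os.environ/GEMINI_API_KEY
      # Auto-inject cache_control on the system message;
      # this works for my Bedrock Anthropic models but yields
      # 0 cached tokens on Gemini 2.5 Pro
      cache_control_injection_points:
        - location: message
          role: system
```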