Plans to integrate anthropics with other models for faster computation
drewskidang
2 months ago
Any plans of doing Anthropic's RAG context with other models? It's so slow because I'm on tier 1 for Anthropic
7 comments
Logan M
2 months ago
You can technically follow the cookbook and use any model
Logan M
2 months ago
But Anthropic makes it so viable because of cheap prompt caching
Logan M
2 months ago
OpenAI will automatically prompt cache as well now, but it's only a 50% price reduction
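To make the cost difference concrete, here is a small back-of-the-envelope sketch of the 50% reduction on cached input tokens. The per-token prices and token counts are illustrative assumptions, not current rates:

```python
# Illustrative arithmetic for OpenAI-style automatic prompt caching:
# repeated prompt prefixes are billed at half the normal input price,
# with no code changes needed on the caller's side.
input_price_per_mtok = 2.50                          # hypothetical $/1M input tokens
cached_price_per_mtok = input_price_per_mtok * 0.5   # the 50% reduction

prompt_tokens = 100_000
cached_tokens = 90_000                               # repeated prefix hit the cache
uncached_tokens = prompt_tokens - cached_tokens

cost = (uncached_tokens * input_price_per_mtok
        + cached_tokens * cached_price_per_mtok) / 1_000_000
full_cost = prompt_tokens * input_price_per_mtok / 1_000_000

print(cost, full_cost)  # caching saves ~45% here, versus ~82% with a 90% discount
```

The point of the comparison in the thread: a deeper discount on cached tokens (as Anthropic offers) matters a lot for RAG workloads, where the same large context is resent on every query.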
drewskidang
2 months ago
Oh dang, will try. Wait, how does it work with local models, assuming I'm renting a GPU lmao
drewskidang
2 months ago
extra_headers={"anthropic-beta": "prompt-caching-2024-07-31"},
do we need to change this to anything?
Logan M
2 months ago
That's just specific to using Anthropic
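To make that concrete: the `extra_headers` line above is an Anthropic-only beta opt-in and has no meaning for other providers. A minimal sketch of where it goes with the official `anthropic` SDK (the model name, context variable, and message are placeholders; the live call is commented out since it needs an API key):

```python
# Anthropic-specific beta header that enabled prompt caching
# (the feature has since graduated out of beta).
extra_headers = {"anthropic-beta": "prompt-caching-2024-07-31"}

# Sketch of the call shape with the `anthropic` SDK (not run here):
#
# import anthropic
# client = anthropic.Anthropic()
# resp = client.messages.create(
#     model="claude-3-5-sonnet-20240620",       # placeholder model name
#     max_tokens=256,
#     system=[{
#         "type": "text",
#         "text": long_rag_context,              # the big repeated context
#         "cache_control": {"type": "ephemeral"} # mark this block cacheable
#     }],
#     messages=[{"role": "user", "content": "your question"}],
#     extra_headers=extra_headers,
# )

print(extra_headers)
```

For any other backend (OpenAI, a local server), simply drop the header; there is nothing to change it *to*.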
Logan M
2 months ago
Local models will probably be slower lol, I would use vLLM or TGI to host
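For the rented-GPU path: vLLM exposes an OpenAI-compatible server, so the same client code works against a local model by swapping the base URL. A sketch, assuming vLLM's default port and an example model name (the live call is commented out since it needs a running server):

```python
# Sketch: point an OpenAI-compatible client at a local vLLM server.
# Start the server on the GPU box first, e.g.:
#   vllm serve meta-llama/Llama-3.1-8B-Instruct
base_url = "http://localhost:8000/v1"  # vLLM's default OpenAI-compatible endpoint

# With the `openai` SDK the call would look like (not run here):
#
# from openai import OpenAI
# client = OpenAI(base_url=base_url, api_key="EMPTY")  # vLLM doesn't check the key
# resp = client.chat.completions.create(
#     model="meta-llama/Llama-3.1-8B-Instruct",        # must match the served model
#     messages=[{"role": "user", "content": "hello"}],
# )

print(base_url)
```

Note that vLLM handles repeated-prefix reuse on its own (via paged KV-cache / prefix caching), so no provider-specific caching header applies.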