Is there a way to use bge-large and Anthropic with my existing Pinecone index? It keeps using bge-small and trying to use llama.cpp in the service context.. also would 10/10 hire Logan if I could
(LOL Thanks!)

Yea you should be able to! Something like this:

Python
from llama_index import ServiceContext, set_global_service_context
from llama_index.llms import Anthropic

# Anthropic as the LLM, bge-large as the local embed model
service_context = ServiceContext.from_defaults(llm=Anthropic(...), embed_model="local:BAAI/bge-large-en", ...)

# Apply globally so every index/query engine picks it up
set_global_service_context(service_context)
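
To point that at your existing Pinecone index, a rough sketch (legacy llama_index API; the key, environment, and index name are placeholders):

Python
import pinecone
from llama_index import VectorStoreIndex
from llama_index.vector_stores import PineconeVectorStore

# Placeholders -- swap in your real Pinecone credentials and index name
pinecone.init(api_key="YOUR_API_KEY", environment="us-west1-gcp")
pinecone_index = pinecone.Index("your-existing-index")

# Wrap the existing index; the global service context set above
# supplies Anthropic as the LLM and bge-large for embeddings
vector_store = PineconeVectorStore(pinecone_index=pinecone_index)
index = VectorStoreIndex.from_vector_store(vector_store)
query_engine = index.as_query_engine()

One caveat: if the existing vectors were embedded with bge-small (384 dims), they won't match bge-large (1024 dims), so the index would need to be re-embedded.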
DUDE THANK YOU was struggling all year
sorry, do I just set the token limit on the LLM?
Yea exactly πŸ‘

The embed model is limited to 512 tokens, but that gets handled automatically
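
For example (whether Anthropic takes max_tokens as a constructor kwarg may vary by version, so treat this as a sketch):

Python
from llama_index.llms import Anthropic

# Cap the LLM's output length; the model name and limit are just examples
llm = Anthropic(model="claude-2", max_tokens=512)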
Logan's the man, will hire him some day