
Updated last year

At a glance
my llm object is like this -> llm = Bedrock(model="anthropic.claude-v2", profile_name="default")
2 comments
from llama_index import ServiceContext, get_response_synthesizer
from llama_index.llms import Bedrock

llm = Bedrock(model="anthropic.claude-v2", profile_name="default")

# Wrap the LLM (and your embedding model, defined elsewhere) in a ServiceContext
service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)

# Pass the ServiceContext to the synthesizer so it uses the Bedrock LLM
# (other arguments elided in the original)
get_response_synthesizer(...., service_context=service_context)
Ahhh, thanks Logan, damn, I missed it.