Hey @Logan M!

How are you doing?

Could you please assist me a bit?

I've been trying to use Anthropic in the basic Q&A flow, but I guess I did something wrong :(

Here is my code example:
from llama_index.llms import Anthropic
from llama_index import LLMPredictor

llm = Anthropic(api_key=api_key)
llm_predictor = LLMPredictor(llm=llm)
No need to use the LLMPredictor; you can just pass in the llm directly.

So something like:
ServiceContext.from_defaults(llm=llm)
Or what was the issue you ran into?
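
For reference, here is a minimal sketch of the full Q&A flow using that approach. It assumes the legacy llama_index ServiceContext API, that api_key is already defined, and a hypothetical local "data" folder of documents:

from llama_index import SimpleDirectoryReader, ServiceContext, VectorStoreIndex
from llama_index.llms import Anthropic

# Build the Anthropic LLM and hand it to the service context directly;
# no LLMPredictor wrapper is needed.
llm = Anthropic(api_key=api_key)  # api_key assumed to be defined elsewhere
service_context = ServiceContext.from_defaults(llm=llm)

# Basic Q&A flow: load documents, build an index, and query it.
documents = SimpleDirectoryReader("data").load_data()  # hypothetical folder
index = VectorStoreIndex.from_documents(documents, service_context=service_context)

query_engine = index.as_query_engine()
print(query_engine.query("What is this document about?"))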