korzhov_dm
last year
Hey @Logan M !
How are you doing?
Can you assist me a bit, please?
I've been trying to use Anthropic in the basic Q&A flow, but I guess I did something wrong :(
Here is my code example:
from llama_index import LLMPredictor
from llama_index.llms import Anthropic

llm = Anthropic(api_key=api_key)
llm_predictor = LLMPredictor(llm=llm)
Logan M
last year
No need to use the LLMPredictor; you can just pass in the llm directly.
So something like:
ServiceContext.from_defaults(llm=llm)
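For context, here is a minimal sketch of how that suggestion slots into a basic Q&A flow, assuming the legacy (pre-0.10) llama_index API that still used ServiceContext; the data directory and query string are placeholders, and api_key is assumed to be defined elsewhere:

```python
from llama_index import ServiceContext, SimpleDirectoryReader, VectorStoreIndex
from llama_index.llms import Anthropic

# Build the Anthropic llm and hand it to ServiceContext directly,
# with no LLMPredictor wrapper in between.
llm = Anthropic(api_key=api_key)  # api_key assumed to be defined elsewhere
service_context = ServiceContext.from_defaults(llm=llm)

# Standard basic Q&A flow: load documents, index them, then query.
documents = SimpleDirectoryReader("data").load_data()  # "data" is a placeholder path
index = VectorStoreIndex.from_documents(documents, service_context=service_context)
query_engine = index.as_query_engine()
response = query_engine.query("What does the document say?")
print(response)
```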
Logan M
last year
Or what was the issue you ran into?