For document summary index, can I use other models instead of GPT to generate the summary?
Utine
last year
For document summary index, can I use another model instead of GPT to generate the summary? For example, "pszemraj/long-t5-tglobal-base-16384-book-summary" from Hugging Face.
4 comments
WhiteFang_Jr
last year
Yes, you can. You just need to update llm and embed_model in the service_context with the LLM of your choice.
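For reference, a minimal sketch of that swap, assuming the legacy llama_index ServiceContext API; exact import paths differ between versions, and wrapping a seq2seq T5 checkpoint in HuggingFaceLLM may need extra care, so treat this as an illustration rather than a drop-in answer:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
from llama_index import ServiceContext
from llama_index.embeddings import OpenAIEmbedding
from llama_index.llms import HuggingFaceLLM

# The long-T5 checkpoint from the question. T5 is an encoder-decoder model,
# so it is loaded explicitly with AutoModelForSeq2SeqLM and handed to
# HuggingFaceLLM instead of relying on the default causal-LM loader.
model_name = "pszemraj/long-t5-tglobal-base-16384-book-summary"
llm = HuggingFaceLLM(
    model=AutoModelForSeq2SeqLM.from_pretrained(model_name),
    tokenizer=AutoTokenizer.from_pretrained(model_name),
    context_window=16384,
    max_new_tokens=512,
)

# Keep embeddings on a dedicated embedding model (see Logan M's note below:
# T5 is a summarizer, not an embedding model).
service_context = ServiceContext.from_defaults(
    llm=llm,
    embed_model=OpenAIEmbedding(),
)
```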
Utine
last year
Okay, good! Thank you!
Logan M
last year
(T5 is probably not what you want to use for embeddings, though.)
Utine
last year
There are llm and embed_model settings in ServiceContext. I set llm to the T5 model and embed_model to the OpenAI embedding.
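Continuing from the service_context sketch above, building and querying the document summary index with that setup would look roughly like this; SimpleDirectoryReader and the ./data path are placeholders:

```python
from llama_index import DocumentSummaryIndex, SimpleDirectoryReader

documents = SimpleDirectoryReader("./data").load_data()

# Per-document summaries are generated with the T5 llm; retrieval embeddings
# come from the OpenAI embed_model configured in service_context.
index = DocumentSummaryIndex.from_documents(
    documents,
    service_context=service_context,
)

response = index.as_query_engine().query("What is this document collection about?")
print(response)
```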