hi logan for the fine-tuned gpt3.5 examples

At a glance
Hi Logan, for the fine-tuned GPT-3.5 examples, would BERT models / T5 Flan models work with the GPT-3.5 ServiceContext? I tried other models and they worked, but I can't get embeddings to work.
7 comments
You can mix any embedding model with any LLM

However, some models aren't really meant to be embedding models (like flan-t5)

What models have you tried? Generally anything on this page will work (and is probably where I would start)
https://huggingface.co/spaces/mteb/leaderboard
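For example, any model on that leaderboard can be dropped in as the embedding model. A minimal sketch, assuming the legacy ServiceContext API, with bge-small as one arbitrary leaderboard pick:

Plain Text
from llama_index import ServiceContext
from llama_index.embeddings import HuggingFaceEmbedding

# pull any embedding model from HuggingFace (this one is an MTEB leaderboard entry)
embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

# pair it with whatever LLM you like; the embedding model and LLM are independent
service_context = ServiceContext.from_defaults(embed_model=embed_model)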
gpt_4_context = ServiceContext.from_defaults(
    llm=Anthropic(model='claude-2', temperature=0.3), callback_manager=callback_manager
)

I was hoping to swap this out for T5. Side note: Claude is the worst teacher ever
I think you might be confusing embedding models and LLMs?

In any case, below will probably work alright

Plain Text
from llama_index import ServiceContext
from llama_index.llms import Anthropic

gpt_4_context = ServiceContext.from_defaults(
    llm=Anthropic(model='claude-2', temperature=0.3),
    # any local HuggingFace embedding model works here
    embed_model="local:BAAI/bge-base-en-v1.5",
    callback_manager=callback_manager,
)
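For reference, a rough sketch of how that service context gets used once it's built (assuming `documents` has already been loaded, e.g. with SimpleDirectoryReader):

Plain Text
from llama_index import VectorStoreIndex

# build the index with the custom service context so claude-2 handles
# generation and the local bge model handles embeddings
index = VectorStoreIndex.from_documents(documents, service_context=gpt_4_context)

query_engine = index.as_query_engine()
response = query_engine.query("your question here")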


flan-t5 will not make a good LLM either; it's a pretty weak model for LlamaIndex
oh, I meant that lol, I wanted to use Flan as the LLM
but I guess it's no bueno
I would use something like mistral-7b or llama2-13b if you want to use a local LLM

You could also try Google PaLM as another hosted option, if you have access
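If it helps, here's a hedged sketch of what the local-LLM route could look like with Ollama serving mistral (assumes Ollama is installed and you've already run `ollama pull mistral`):

Plain Text
from llama_index import ServiceContext
from llama_index.llms import Ollama

# mistral-7b served locally by Ollama
llm = Ollama(model="mistral")

service_context = ServiceContext.from_defaults(
    llm=llm,
    embed_model="local:BAAI/bge-base-en-v1.5",
)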
ty ty, I maxed out my OpenAI credits lmao