
CustomLLM

Hello
I can't get a custom LLM to work with a CustomRetriever. Here is what I'm trying to do:



I'm trying to set up a simple hybrid search like this one:
https://gpt-index.readthedocs.io/en/latest/examples/query_engine/CustomRetrievers.html

I use a custom LLM:

llm_predictor = LLMPredictor(llm=CustomLLM())
embed_model = LangchainEmbedding(
    HuggingFaceEmbeddings(model_name="T-Systems-onsite/cross-en-fr-roberta-sentence-transformer")
)
service_context = ServiceContext.from_defaults(llm_predictor=llm_predictor, embed_model=embed_model)

And an index built for this LLM:
vector_index = VectorStoreIndex.from_documents(docs, service_context=service_context)


I set up the query engine like this:
vector_retriever = VectorIndexRetriever(index=vector_index, similarity_top_k=4)


The response synthesizer uses a refined text_qa_template:
refined_response_synth = get_response_synthesizer(text_qa_template=qa_template2, response_mode=ResponseMode.COMPACT)

vector_query_engine = RetrieverQueryEngine(
    retriever=vector_retriever,
    response_synthesizer=refined_response_synth,
)

When I query, I get this error:
response = vector_query_engine.query("Quel est le contenu de l'article 222-29?")
print(response)

AuthenticationError: No API key provided. You can set your API key in code using 'openai.api_key = ', or you can set the environment variable OPENAI_API_KEY=). If your API key is stored in a file, you can point the openai module at it with 'openai.api_key_path = '. You can generate API keys in the OpenAI web interface. See https://platform.openai.com/account/api-keys for details.

If I initialize the VectorIndexRetriever with the service_context, I get the same error...

I haven't managed to get the custom LLM working.
2 comments
There are a bunch more places to pass in the service context 🥲

But to make it easy when working at this low level, just set a global service context and don't worry about passing it in

from llama_index import set_global_service_context

set_global_service_context(service_context)
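The fallback the comment describes can be sketched without llama_index: components that aren't handed a local service context fall back to a module-level global default. The names below (set_global_config, resolve_config) are illustrative stand-ins, not llama_index's actual internals.

```python
# Sketch of the global-default pattern behind set_global_service_context.
# set_global_config / resolve_config are illustrative names, not llama_index's API.
_global_config = None

def set_global_config(cfg):
    """Register a process-wide default configuration."""
    global _global_config
    _global_config = cfg

def resolve_config(local_cfg=None):
    """Return the locally supplied config if given, else the global default."""
    return local_cfg if local_cfg is not None else _global_config

set_global_config({"llm": "CustomLLM", "embed_model": "roberta"})
print(resolve_config())                  # falls back to the global default
print(resolve_config({"llm": "other"}))  # an explicit argument still wins
```

With llama_index itself, the two lines above do the equivalent: once set_global_service_context(service_context) has run, components created afterwards (retrievers, response synthesizers, query engines) pick up the custom LLM even when no service_context argument is passed, which avoids the default OpenAI fallback that raised the AuthenticationError.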
Thanks a lot!