How to use Azure OpenAI with LlamaIndex?
I am able to connect to Azure OpenAI. Now I want to query over my own documents.
Plain Text
from llama_index import SimpleDirectoryReader, VectorStoreIndex

documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

query_engine = index.as_query_engine()
response = query_engine.query("What did the author do growing up?")
print(response)
The above code is giving me an error. But when I run a plain completion it works fine:

Plain Text
response = llm.complete("The sky is a beautiful blue and")
print(response)
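For context, a working `llm.complete` call like the one above is typically produced by an `AzureOpenAI` client constructed along these lines (llama_index 0.9.x import path; the deployment name, key, endpoint, and API version below are placeholders, not values from this thread):

```python
from llama_index.llms import AzureOpenAI

# All Azure-specific values here are placeholder assumptions.
llm = AzureOpenAI(
    engine="my-gpt-35-deployment",   # name of your Azure deployment
    model="gpt-35-turbo",
    api_key="<azure-api-key>",
    azure_endpoint="https://<resource>.openai.azure.com/",
    api_version="2023-07-01-preview",
)
response = llm.complete("The sky is a beautiful blue and")
print(response)
```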
Try setting the service_context globally once.

Plain Text
from llama_index import set_global_service_context

set_global_service_context(service_context)
Do you have an example of service_context? What will it look like?
I did that, but the same error still appeared:

Plain Text
service_context = ServiceContext.from_defaults(
    llm=llm,
    embed_model=OpenAIEmbedding()
)

set_global_service_context(service_context)
You need to replace the embedding model with the Azure embedding class as well; `OpenAIEmbedding()` still sends embedding requests to the regular OpenAI API.

Here is the full example implementing Azure: https://docs.llamaindex.ai/en/stable/examples/customization/llms/AzureOpenAI.html
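Following that doc, a minimal sketch of a fully Azure-backed service context might look like this (llama_index 0.9.x import paths; every deployment name, key, endpoint, and API version is a placeholder assumption):

```python
from llama_index import ServiceContext, set_global_service_context
from llama_index.llms import AzureOpenAI
from llama_index.embeddings import AzureOpenAIEmbedding

# Placeholder Azure settings -- substitute your own resource values.
llm = AzureOpenAI(
    engine="my-gpt-35-deployment",
    model="gpt-35-turbo",
    api_key="<azure-api-key>",
    azure_endpoint="https://<resource>.openai.azure.com/",
    api_version="2023-07-01-preview",
)

# The embedding model must also point at Azure, not the regular OpenAI API.
embed_model = AzureOpenAIEmbedding(
    model="text-embedding-ada-002",
    deployment_name="my-embedding-deployment",
    api_key="<azure-api-key>",
    azure_endpoint="https://<resource>.openai.azure.com/",
    api_version="2023-07-01-preview",
)

service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)
set_global_service_context(service_context)
```

With both the LLM and the embedding model set, `VectorStoreIndex.from_documents(...)` and `query_engine.query(...)` should route all calls through Azure.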
Ok, I will try it.
Could you also tell me how to store embeddings in Chroma DB and then search from there?
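A sketch of persisting the index to Chroma and querying it back, using the llama_index 0.9.x `ChromaVectorStore` integration (the collection name and storage path are placeholders; this assumes a global service context is already configured for Azure):

```python
import chromadb
from llama_index import SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_index.vector_stores import ChromaVectorStore

# Placeholder path and collection name -- choose your own.
db = chromadb.PersistentClient(path="./chroma_db")
collection = db.get_or_create_collection("quickstart")

# Wire Chroma in as the vector store backing the index.
vector_store = ChromaVectorStore(chroma_collection=collection)
storage_context = StorageContext.from_defaults(vector_store=vector_store)

documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

# Later (e.g. in a new process): rebuild the index straight from the
# embeddings already stored in Chroma, without re-ingesting documents.
index = VectorStoreIndex.from_vector_store(vector_store)
query_engine = index.as_query_engine()
response = query_engine.query("What did the author do growing up?")
print(response)
```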
Thanks, you helped me a lot. I completed that project. Thanks again!