I'm working on implementing GraphRAG

I'm working on implementing GraphRAG using LlamaIndex, but the current examples use OpenAI models for summarization. Is it possible to use AWS Bedrock models instead?
Yes, you can use any LLM of your choice.
Just create the LLM instance and pass it directly into each method, or define it globally:

from llama_index.core import Settings
from llama_index.llms.bedrock import Bedrock  # requires llama-index-llms-bedrock

# The model ID below is just an example -- use any Bedrock model you have access to
Settings.llm = Bedrock(model="anthropic.claude-3-sonnet-20240229-v1:0")

# Then proceed as usual
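
If you'd rather not set it globally, you can also pass the instance in directly. A rough sketch (the model ID and region below are placeholders, and I'm showing a generic index's as_query_engine rather than the cookbook's custom engine):

from llama_index.llms.bedrock import Bedrock

# Placeholder model ID and region -- swap in whatever you use on Bedrock
llm = Bedrock(
    model="anthropic.claude-3-sonnet-20240229-v1:0",
    region_name="us-east-1",
)

# Pass the instance directly instead of relying on Settings.llm
query_engine = index.as_query_engine(llm=llm)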
I’ve been following the https://github.com/run-llama/llama_index/blob/main/docs/docs/examples/cookbooks/GraphRAG_v1.ipynb notebook. I replaced the OpenAI model with an AWS Bedrock LLM, and everything seems to work fine until I run the query at the end. I’m using this query to test the model:

response = query_engine.query("What are the main news discussed in the document?")
print(response)

This raises the following error:

EmptyNetworkError: EmptyNetworkError
llama_index\core\query_engine\custom.py:45, in CustomQueryEngine.query
---> 45 raw_response = self.custom_query(query_str)
...

Has anyone encountered a similar issue when using Bedrock models, or does anyone have advice on how to resolve it? I’ve double-checked my network connection and settings but can’t figure out what’s causing this.
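
If it helps anyone diagnose this: my current guess is that the Bedrock model's output doesn't match the format the extractor expects, so no triplets get parsed and the graph ends up empty before community detection runs. A quick check along those lines (just a sketch, assuming the cookbook's index variable):

# Count the extracted triplets before the community-building step;
# 0 here would mean extraction, not the network, is the problem
triplets = index.property_graph_store.get_triplets()
print(f"extracted triplets: {len(triplets)}")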

Any help would be appreciated!