I'm working on implementing GraphRAG

At a glance

The community member is implementing GraphRAG with LlamaIndex, but the current examples use OpenAI models for summarization, and they ask whether AWS Bedrock models can be used instead. The replies confirm that any LLM works: create the LLM instance and either pass it directly to each method or set it globally. However, when the community member runs the query at the end with a Bedrock model, they receive an EmptyNetworkError. They have checked their network connection and settings but cannot determine the cause, and are asking for advice on resolving the problem.

I'm working on implementing GraphRAG using LlamaIndex, but the current examples use OpenAI models for summarization. Is it possible to use AWS Bedrock models instead?
3 comments
Yes, you can use any LLM of your choice.
Just create the LLM instance and pass it directly in each method, or define it globally.

Python

from llama_index.core import Settings
from llama_index.llms.bedrock import Bedrock  # requires the llama-index-llms-bedrock package

# The model id below is illustrative; use a Bedrock model you have access to
Settings.llm = Bedrock(model="anthropic.claude-3-sonnet-20240229-v1:0")

# Then proceed further
I’ve been following the https://github.com/run-llama/llama_index/blob/main/docs/docs/examples/cookbooks/GraphRAG_v1.ipynb documentation. I replaced the OpenAI model with an AWS Bedrock LLM. Everything seems to work fine until I run the query at the end. I’m using this query to test the model:

response = query_engine.query("What are the main news discussed in the document?")
print(response)

The error:

EmptyNetworkError: EmptyNetworkError
llama_index\core\query_engine\custom.py:45, in CustomQueryEngine.query
---> 45 raw_response = self.custom_query(query_str)
...

Has anyone encountered a similar issue when using Bedrock models, or does anyone have advice on how to resolve it? I’ve double-checked my network connection and settings but can’t figure out what’s causing this.

Any help would be appreciated!
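One avenue worth checking (an assumption, not something confirmed in this thread): despite the name, EmptyNetworkError in the GraphRAG cookbook appears to be raised by the graph-clustering step (graspologic) when the graph it receives has no nodes, not by a network-connectivity failure. If the Bedrock model's extraction output isn't parsed into any entities or relationships, the graph stays empty and the query fails. A minimal, self-contained sketch of that sanity check, where `triplets` stands in for whatever (subject, relation, object) tuples your pipeline extracted:

```python
# Hypothetical diagnostic: inspect the extraction output before querying.
# If the LLM returned summaries in a format the parser didn't recognize,
# `triplets` ends up empty and downstream clustering raises EmptyNetworkError.
triplets = []  # e.g. [("Acme", "acquired", "Widget Co"), ...] after extraction

# Collect the distinct entities (graph nodes) implied by the triplets
nodes = {s for s, _, _ in triplets} | {o for _, _, o in triplets}

if not nodes:
    print("No entities extracted: the graph is empty, which would trigger EmptyNetworkError.")
else:
    print(f"Extracted {len(nodes)} entities across {len(triplets)} relationships.")
```

If the count is zero, the fix is likely in the extraction prompt or response parsing for the Bedrock model, rather than in network settings.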