I’ve been following the GraphRAG cookbook at https://github.com/run-llama/llama_index/blob/main/docs/docs/examples/cookbooks/GraphRAG_v1.ipynb. I replaced the OpenAI model with an AWS Bedrock LLM. Everything seems to work fine until I run the query at the end. I’m using this query to test the model:
response = query_engine.query("What are the main news discussed in the document?")
print(response)
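For context, my Bedrock setup looks roughly like the sketch below. It assumes the llama-index-llms-bedrock integration, and the model ID and region are illustrative placeholders, not necessarily the ones I use:

from llama_index.llms.bedrock import Bedrock

# Placeholder model ID and region -- swap in your own.
llm = Bedrock(
    model="anthropic.claude-3-sonnet-20240229-v1:0",
    region_name="us-east-1",
    temperature=0.0,
)

I then pass this llm object everywhere the notebook uses the OpenAI one (the extractor and the query engine).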
Running the query raises this error:
EmptyNetworkError: EmptyNetworkError
llama_index\core\query_engine\custom.py:45, in CustomQueryEngine.query
---> 45 raw_response = self.custom_query(query_str)
...
Has anyone encountered a similar issue when using Bedrock models, or does anyone have advice on how to resolve it? I’ve double-checked my network connection and AWS settings but can’t figure out what’s causing this.
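As part of that double-checking, I also tried calling the LLM directly, outside the query engine, roughly along these lines:

# Standalone sanity check that the Bedrock LLM itself responds.
print(llm.complete("Say hello").text)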
Any help would be appreciated!