What if the LLM doesn't want to use query_engine_tool and gives me answers about some imaginary book, not the one I embedded?
Response: Observation: query_engine_tool response
Title: The Da Vinci Code (WRONG!!!)
Summary: This book is a thriller that follows Robert Langdon, a Harvard symbologist, as he unravels ancient secrets and solves codes to save the life of a British Royal Family member.
Main characters include:
- Robert Langdon - A Harvard professor of symbology.
... (rest of the hallucinated answer omitted)
My code:
service_context = ServiceContext.from_defaults(llm=llm, embed_model="local")
data = SimpleDirectoryReader(input_dir="C:/temp_my/text_embeddings").load_data()
index = VectorStoreIndex.from_documents(data, service_context=service_context)
chat_engine = index.as_chat_engine(service_context=service_context, chat_mode="react", verbose=True)
response = chat_engine.chat("What this book is about? List the names of main characters. And tell the story short.")
print(response)
The directory contains one large .docx file, converted from FB2 and translated to English with Google Translate.
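For anyone hitting the same issue: in "react" mode the agent itself decides whether to call query_engine_tool, so it can answer from the model's built-in knowledge and hallucinate a book it never retrieved. One way to force retrieval on every turn is the "context" chat mode, which always fetches nodes from the index before answering. A minimal sketch against the same legacy ServiceContext-style API as the snippet above (the system_prompt wording is my own, and `llm` is assumed to be defined as in the original code):

```python
from llama_index import ServiceContext, SimpleDirectoryReader, VectorStoreIndex

# Same setup as in the original snippet; `llm` is assumed to exist already.
service_context = ServiceContext.from_defaults(llm=llm, embed_model="local")
data = SimpleDirectoryReader(input_dir="C:/temp_my/text_embeddings").load_data()
index = VectorStoreIndex.from_documents(data, service_context=service_context)

# "context" mode retrieves from the index on every turn, unlike "react",
# where the agent may decide to skip query_engine_tool entirely.
chat_engine = index.as_chat_engine(
    chat_mode="context",
    service_context=service_context,
    system_prompt=(
        "Answer only from the retrieved context. "
        "If the context does not contain the answer, say you do not know."
    ),
    verbose=True,
)

response = chat_engine.chat(
    "What is this book about? List the main characters and summarize the story."
)
print(response)
```

This doesn't guarantee the model obeys the system prompt, but because retrieval always runs, the answer is at least grounded in the embedded document rather than left to the agent's tool-choice step.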