```python
chat_engine = index.as_chat_engine(
    chat_mode="best",
    service_context=service_context,
)

print(chat_engine.chat("What did the author do growing up?"))
```
The `best` and `context` modes do not strictly use the index:

- `best` defaults to an agent. The agent looks at the chat history and decides whether to use the query engine or respond directly.
- `context` retrieves context on every user message and inserts it into the system prompt. The LLM can then respond using the context in the system prompt, or directly from the chat history.
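As a minimal sketch of `context` mode, reusing the `index` and `service_context` from above (the `system_prompt` text is illustrative, not from the original example):

```python
chat_engine = index.as_chat_engine(
    chat_mode="context",
    service_context=service_context,
    # Retrieved context is inserted into this system prompt on every message.
    system_prompt="You are a chatbot that answers questions about the essay.",
)

print(chat_engine.chat("What did the author do growing up?"))
```

For more control over the agent that `best` mode defaults to, you can construct it yourself from a query engine tool: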
```python
from llama_index.agent import OpenAIAgent
from llama_index.tools import QueryEngineTool

# Wrap the query engine as a tool the agent can decide to call
tool = QueryEngineTool.from_defaults(
    index.as_query_engine(...),
    name="my_query_engine",
    description="Useful for querying information about X.",
)

agent = OpenAIAgent.from_tools([tool], verbose=False)

agent.chat("...", tool_choice="my_query_engine")
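```

Here `tool_choice="my_query_engine"` forces the agent to call that tool for this message rather than letting it decide on its own; omit the argument to let the agent choose between the tool and a direct response.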