I want to make a chatbot that has access to three different indexes, always queries all of them, and synthesizes an answer out of three GPT-4 responses, one per index. How do I query GPT-4 directly with a query engine that has no index when sending it the three responses to synthesize?
4 comments
You can use an LLM without an index like this:
Plain Text
from llama_index.llms import ChatMessage, OpenAI

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="What is your name"),
]
resp = OpenAI().chat(messages)
print(resp)


Docs on it:
https://docs.llamaindex.ai/en/stable/understanding/using_llms/using_llms.html
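For the synthesis step itself, here is a minimal sketch of packaging the three per-index responses into a single prompt for a direct LLM call. `build_synthesis_messages` is a hypothetical helper (not part of llama_index); the final call would use the same `ChatMessage`/`OpenAI` pattern shown above.

```python
# Hypothetical helper (not from llama_index): package the question and
# the per-index answers into one synthesis prompt.
def build_synthesis_messages(question: str, answers: list[str]) -> tuple[str, str]:
    numbered = "\n\n".join(
        f"Answer {i}:\n{a}" for i, a in enumerate(answers, start=1)
    )
    system = (
        "You combine several candidate answers into one final answer, "
        "resolving any conflicts between them."
    )
    user = f"Question: {question}\n\n{numbered}\n\nSynthesized answer:"
    return system, user

# With the ChatMessage/OpenAI snippet above, the final call would look like:
#   system, user = build_synthesis_messages(q, [str(r1), str(r2), str(r3)])
#   resp = OpenAI().chat([
#       ChatMessage(role="system", content=system),
#       ChatMessage(role="user", content=user),
#   ])
```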

You could also take a look at the query engines and see if one of them fits your use case:

https://docs.llamaindex.ai/en/stable/module_guides/deploying/query_engine/root.html
If you want it to always query all three, you could make a custom tool that calls all three query engines too.


Plain Text
from llama_index.tools import FunctionTool

def query(input: str) -> str:
  """Useful for getting information about XX."""
  response1 = query_engine1.query(input)
  response2 = query_engine2.query(input)
  response3 = query_engine3.query(input)

  # join the three responses; you will want to format this better
  return str(response1) + "\n\n" + str(response2) + "\n\n" + str(response3)

# this parses the function schema for name + description/docstring
tool = FunctionTool.from_defaults(fn=query)


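To see the fan-out pattern in isolation, here is a runnable sketch that swaps the real query engines for stub objects (so it runs without llama_index or an API key); `StubQueryEngine` is purely illustrative and stands in for `query_engine1`/`query_engine2`/`query_engine3` above.

```python
# Stub stand-in for a real llama_index query engine.
class StubQueryEngine:
    def __init__(self, name: str):
        self.name = name

    def query(self, q: str) -> str:
        return f"[{self.name}] answer to: {q}"

engines = [StubQueryEngine(f"index{i}") for i in (1, 2, 3)]

def query_all(input: str) -> str:
    """Fan out to every engine and join the responses."""
    return "\n\n".join(str(e.query(input)) for e in engines)

print(query_all("What is X?"))
```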
You could also use a SubQuestionQueryEngine with a QueryEngineTool per index, but this might be slower.
I ended up using python multiprocessing to query all three in parallel
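The same parallel fan-out can also be done with threads instead of multiprocessing, which is usually enough since each query is network-bound. A sketch using the stdlib `concurrent.futures`, again with a hypothetical stub engine in place of the real query engines:

```python
from concurrent.futures import ThreadPoolExecutor

# Stub stand-in for a real llama_index query engine.
class StubEngine:
    def __init__(self, name: str):
        self.name = name

    def query(self, q: str) -> str:
        return f"[{self.name}] {q}"

engines = [StubEngine(f"index{i}") for i in (1, 2, 3)]

def query_parallel(question: str) -> list[str]:
    # threads suffice here: each query spends its time waiting on the network
    with ThreadPoolExecutor(max_workers=len(engines)) as pool:
        futures = [pool.submit(e.query, question) for e in engines]
        return [f.result() for f in futures]

print(query_parallel("What is X?"))
```

Order is preserved because results are collected from the futures in submission order.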