Is it possible to use agent tools and integrate that with the chat engine at the same time?

An agent is a chat engine though? πŸ‘€
I mean like, how exactly, if I'm querying the chat engine, is it possible to make it so I can use agent tools while querying the chat engine?
sorry if my wording is weird, I could be misunderstanding stuff
So like right now I have an index initialized to use as a chat engine that I can input queries into. Is it possible that, as I'm chatting with the engine, I can also give it access to agent tools such as Gmail, etc.?
Actually, I think I may have it if I write it as an assistant, like this:

Python
from llama_index.agent import OpenAIAssistantAgent

agent = OpenAIAssistantAgent.from_new(
    name="Test Assistant",
    instructions="You are a QA assistant designed to answer the question",
    # query engine tool(s) plus any other tools (e.g. gmail tools) in one flat list
    tools=[*query_engine_tools, *gmail_tools],
    instructions_prefix="x",
    verbose=True,
    run_retrieve_sleep_time=1.0,
)

and initialize the query engine tools with my current vector store and docs:
Python
from llama_index.tools import QueryEngineTool, ToolMetadata

# chat_mode is a chat-engine option, so it isn't needed when building a query engine
testengine = index.as_query_engine(verbose=True)
query_engine_tools = [
    QueryEngineTool(
        query_engine=testengine,
        metadata=ToolMetadata(
            name="testenginequery",
            description="Provides information for the test engine",
        ),
    ),
]

The agent should be able to access the vector store (as a query engine) as well as any other agent tools, in a sense?
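(Sketch only -- one possible way to build the gmail_tools referenced above; the GmailToolSpec import path is an assumption and depends on the llama_index / llama_hub version installed.)

Python
# Hypothetical setup for the gmail tools -- the import path is an assumption.
from llama_hub.tools.gmail.base import GmailToolSpec

gmail_tools = GmailToolSpec().to_tool_list()

# the assistant above then sees both the query engine tool and the gmail tools
tools = [*query_engine_tools, *gmail_tools]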
hey @Logan M sorry for ping, just one quick question. I thought I overrode the system prompt in the service context, but tracing through the application I still get this prompt. What exactly do I need to change?
I placed the system prompt in the ServiceContext, is there something I need to do to make it global?
Actually, that prompt is part of the internal template. Setting it in the service context just prepends that prompt while still using the existing templates
I would expect to see two system prompts in that case
(if the service context is being passed around correctly that is)
I'm using it as a chat engine, maybe I can override it during index initialization under the summary template
Hi, currently trying to get this system prompt thing working
but I get this
and it says that they have a system prompt option... am I implementing something incorrectly?
it's a kwarg but it's not implemented for the condense plus context chat engine πŸ˜… Because it doesn't reaaaallly make sense to implement it? But other engines do, so the kwarg is there for consistency I guess?

It's rewriting the user message, and querying an index, and then returning the exact response from the index query engine
so really, the query engine has full control over the response
If I initialize an OpenAI agent, I can pass the query_engine as a tool right?
I take it back, I have horrible memory, we just talked about that
No worries haha, you are right!
So I may be a bit confused -- I can't just inject a system prompt into a chat engine? I can do it with a query engine.
I may just be blind or something
Maybe chat engine has prompt dicts I can update?
Any other chat engine lets you set a system prompt -- just not condense question and condense plus context
Yes you can do it in the query engine
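(A minimal sketch, assuming the index is already built -- the "context" chat mode is one of the modes that does accept a system prompt:)

Python
# Sketch: the "context" chat mode accepts a system_prompt,
# unlike condense_question / condense_plus_context.
chat_engine = index.as_chat_engine(
    chat_mode="context",
    system_prompt="You are a QA assistant designed to answer the question.",
)
response = chat_engine.chat("What does the test engine cover?")
print(str(response))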
What exactly does the system prompt do in the service context?
I'm just curious because that's one thing I tried
It prepends a system prompt to every LLM call
(Within a query engine)
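(A rough sketch of that service-context route, assuming the documents are already loaded:)

Python
# Sketch: the system_prompt set here is prepended to LLM calls made
# inside engines built from this service context.
from llama_index import ServiceContext, VectorStoreIndex

service_context = ServiceContext.from_defaults(
    system_prompt="You are a QA assistant designed to answer the question.",
)
index = VectorStoreIndex.from_documents(documents, service_context=service_context)
query_engine = index.as_query_engine()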
@Logan M sorry for ping but I'm trying to do the same thing outlined in our previous convos and the examples, basically getting output from agent.chat_repl() like this:
https://docs.llamaindex.ai/en/stable/examples/agent/openai_agent_with_query_engine.html#openai-agent-with-query-engine-tools

Is it possible to get the output saved to a variable, like response = agent.chat("input")?
If you want the output, you can't use chat_repl -- repl is really just meant as a quick way to test an agent
What's the issue with using agent.chat() ?
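(For reference, a quick sketch of the difference:)

Python
# chat_repl() just runs an interactive loop; chat() returns a response
# object you can keep in a variable.
response = agent.chat("What information does the test engine provide?")
print(str(response))     # the final answer text
print(response.sources)  # tool outputs the agent used along the way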
Oh I'm being an idiot, agent.chat() is what I'm looking for lol
Thanks for the help Logan
I gotchu :dotsCATJAM: