Maybe I'm a bit confused by the code sample, but what are you trying to do?
I'm trying to provide a way to perform specific actions via tools (like multiplying, in your examples) alongside answering questions. For example, here is the prompt: "Answer questions based on provided context. If the question is about pricing, calculate it based on the volume." To calculate this pricing I provide the tool, but any other questions should be answered from the context (the context comes from a vector store). I've already played with tools and found them a fantastic way to do whatever I want, but I'm a bit stuck trying to mix my current approach with this agent-tool approach.
So, a user starts chatting, and the custom_chat_engine from my example answers their questions based on the provided context, but if they provide a volume and ask to calculate the price, my custom tool (or function) is called.
Seems like you need some top-level agent to handle routing between the custom query engine and other tools, right?
Very rough guess:
from llama_index.core.tools import FunctionTool
from llama_index.agent.openai import OpenAIAgent

def custom_chat_engine_tool(query: str) -> str:
    """Useful for X."""
    return str(custom_chat_engine.chat(query))

# wrap the tool function (not the chat engine itself) in a FunctionTool
chat_engine_tool = FunctionTool.from_defaults(fn=custom_chat_engine_tool)
tools = [chat_engine_tool, ...]
agent = OpenAIAgent.from_tools(tools, llm=llm)
Ah, very interesting! Is "routing" a special term or do you just use it in a casual way?
Routing is a general idea with RAG and query pipelines -- given a query, can I send it to the best possible tool/index for replying?
So the OpenAIAgent is essentially a router (picking tools to use)
There is also a specific RouterQueryEngine in llama-index, but it only works with other query engines right now (and does not keep track of chat history like an agent)
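For reference, it looks roughly like this (an untested sketch assuming v0.10-style imports; both query engines are placeholders you'd build yourself):

from llama_index.core.query_engine import RouterQueryEngine
from llama_index.core.selectors import LLMSingleSelector
from llama_index.core.tools import QueryEngineTool

# each tool wraps a query engine plus a description the selector routes on
router = RouterQueryEngine(
    selector=LLMSingleSelector.from_defaults(),
    query_engine_tools=[
        QueryEngineTool.from_defaults(
            query_engine=vector_query_engine,  # placeholder
            description="Answers general questions from the context.",
        ),
        QueryEngineTool.from_defaults(
            query_engine=pricing_query_engine,  # placeholder
            description="Calculates pricing based on volume.",
        ),
    ],
)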
Ah, I see. In my example, what should I put instead of """Useful for X."""?
And should I pass the system prompt to this agent, or is that a task for the chat_engine only?
Also, should I upgrade llama-index to the latest version?
This is the description of the tool. Put whatever you think best describes what it should be used for. This will help the OpenAI agent select the right tool.
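For example, in your case something like this might work (the wording is just an illustration):

def custom_chat_engine_tool(query: str) -> str:
    """Useful for answering general questions from the provided context.
    For pricing calculations based on volume, use the pricing tool instead."""
    return str(custom_chat_engine.chat(query))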
You can probably pass it to both?
Oh, I see, thanks a lot!!
One question about the migration, though. I see the service context is deprecated and you've introduced a kind of global Settings object. In my case, I have different settings for every single client, including the model and chunk size. How can I use this new approach if every single call requires its own settings, and some calls run asynchronously, so I can't use a global variable for them?
Yeah, you can pass things in right where they are used:
index = VectorStoreIndex(..., embed_model=embed_model)
query_engine = index.as_query_engine(llm=llm)
chat_engine = index.as_chat_engine(llm=llm)
# etc.
Every component that uses an LLM or embedding model should accept it in the interface
So, do you mean that in this case I can skip the service context entirely? I'm using index.as_retriever; does it accept an llm too?
If the retriever uses an LLM, yes.
With a vector index, you probably just need the embed model: index.as_retriever(embed_model=embed_model)
If it's not passed in, it inherits the embed model from the index.
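So for per-client settings you'd build everything locally per call instead of touching the global Settings. A rough sketch (client_config is a hypothetical dict holding each client's model and chunk size):

from llama_index.core import VectorStoreIndex
from llama_index.core.node_parser import SentenceSplitter
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.llms.openai import OpenAI

def build_query_engine(documents, client_config):
    # everything is scoped to this call, so it's safe for async/concurrent use
    llm = OpenAI(model=client_config["model"])
    embed_model = OpenAIEmbedding()
    index = VectorStoreIndex.from_documents(
        documents,
        embed_model=embed_model,
        transformations=[SentenceSplitter(chunk_size=client_config["chunk_size"])],
    )
    return index.as_query_engine(llm=llm)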
Hi, sorry to bother you, but I can't figure out how to install the integrations (I need the one for Qdrant; no documentation, nothing, the Notion page doesn't have it either). The upgrade script is not working either.
How is the upgrade script not working? It prints the packages to install; you can copy/paste the printed packages and run pip install.
The package registry also lists the package containing qdrant
They all follow predictable patterns.
pip install llama-index-vector-stores-qdrant
It's also listed in the Qdrant example:
https://docs.llamaindex.ai/en/stable/examples/vector_stores/QdrantIndexDemo.html
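Once installed, usage looks roughly like the linked demo (collection name and path are placeholders, and documents is assumed to be loaded already):

import qdrant_client
from llama_index.core import VectorStoreIndex, StorageContext
from llama_index.vector_stores.qdrant import QdrantVectorStore

client = qdrant_client.QdrantClient(path="./qdrant_data")  # local on-disk instance
vector_store = QdrantVectorStore(client=client, collection_name="my_collection")
storage_context = StorageContext.from_defaults(vector_store=vector_store)
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)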
Hi, thanks. The script throws this error:
  File "lib\site-packages\llama_index\core\command_line\upgrade.py", line 253, in upgrade_py_md_file
    lines = f.readlines()
  File "Python388\lib\encodings\cp1252.py", line 23, in decode
    return codecs.charmap_decode(input,self.errors,decoding_table)[0]
UnicodeDecodeError: 'charmap' codec can't decode byte 0x90 in position 528: character maps to <undefined>
What is the package registry? I also had a problem with OpenAI, as it's not mentioned in the migration docs; I finally found it. For those who are stuck too:
from llama_index.llms.openai import OpenAI
Simple, but not obvious.
(also https://llamahub.ai; it's about halfway through being updated, but most packages are in there)
Thanks. I saw it and looked one more time, but I have no idea how it can be used for updating imports. Also, it's not clear what exactly I should install. I reinstalled OpenAI and fixed the imports. What is llama-index-llms-openai, for example? Is it a separate package that should be installed?
It is indeed a separate package
All integrations are PyPI packages
The import path is the package name: llama-index-llms-openai == llama_index.llms.openai
I have no idea why the upgrade script didn't work for you; it must have hit a weird file encoding. You could try running it one file at a time:
llamaindex-cli upgrade-file <filepath>
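If it really is an encoding issue (the traceback shows cp1252, the Windows default), you could also try re-saving the offending file as UTF-8 before upgrading. An untested sketch, with a placeholder path:

# hypothetical workaround: re-encode a file to UTF-8 before running the upgrade
path = "my_script.py"  # placeholder: the file the script choked on
with open(path, "r", encoding="cp1252", errors="replace") as f:
    text = f.read()
with open(path, "w", encoding="utf-8") as f:
    f.write(text)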
Thanks, I will try this approach. I still don't understand: if it's a separate package, how does it land in the llama_index package anyway? It's very confusing...
It's called namespace packaging. It keeps the imports familiar and works well with IntelliSense.
For example, you can see the openai LLM package has all its code in the llama_index/llms/openai subfolder
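So conceptually something like this (paths illustrative):

# each pip package contributes a subfolder to the shared llama_index namespace:
#   llama-index-core        -> llama_index/core/
#   llama-index-llms-openai -> llama_index/llms/openai/
from llama_index.llms.openai import OpenAI  # resolves into the separate package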
Ah, okay. I didn't know about that. But... does it mean I need to reinstall all these packages manually instead of installing just one? And why is IntelliSense working even though I didn't install that (OpenAI) package separately? I just reinstalled llama-index, but now I can see this class. Is it the right one?
llama-index is a wrapper around several packages, sort of like a starter bundle
Hi again, one more question... I'm trying to get rid of the service_context but can't figure out where I should pass the system prompt if I use index.as_chat_engine. Thanks!
index.as_chat_engine(..., system_prompt="Talk like a pirate.") shouuuuld work
Thanks! Let me try. Is there any documentation that shows all the parameters? I was trying the readthedocs search, Algolia (sucks!), and some other search; no result.
Algolia does indeed suck. Mendable (bottom right) has been doing much better lately.
I tried it too, but it couldn't find the API reference for this method with all the possible parameters.
Hi, I'm still trying to implement this logic but I'm stuck. The problem is that custom_chat_engine is different for each call (because calls may come from different users), so it's unclear how to pass it into custom_chat_engine_tool.
I was trying to use QueryEngineTool, but its query_engine param can't be a ContextChatEngine (which is what I get from query_engine = index.as_chat_engine()).
What should I do? Thanks!
So, right now my agent looks like this:
def custom_chat_engine_tool(user_chat_engine, query: str) -> str:
    """This tool is for asking a question."""
    return str(user_chat_engine.chat(query))

def setup_agent(user_llm, user_chat_engine, system_agent_prompt, query_tool_name, query_tool_description, chat_history):
    chat_tool = FunctionTool.from_defaults(fn=custom_chat_engine_tool)  # How to pass user_chat_engine to it????

    # OR - not working, as user_chat_engine is a ContextChatEngine, not a QueryEngine
    query_tool = QueryEngineTool(
        query_engine=user_chat_engine,
        metadata=ToolMetadata(
            name=query_tool_name,
            description=query_tool_description,
            # e.g. "Provides information about Lyft financials for year 2021. "
            #      "Use a detailed plain text question as input to the tool."
        ),
    )

    agent = OpenAIAgent.from_tools(
        [query_tool], llm=user_llm, verbose=True, system_prompt=system_agent_prompt, chat_history=chat_history
    )
    return agent
I think you probably need to recreate the agent for each user? Or, inside custom_chat_engine_tool(), grab the user_chat_engine from some parent scope.
I recreate an agent for each user, but I still don't understand how to pass the user_chat_engine to the function linked to a FunctionTool.
Can I just explicitly pass some parameters to the function specified in the FunctionTool.from_defaults?
No, they have to be provided ahead of time, whether that's pulling from a global variable/parent scope or something else.
A global var can't be used, but I can use something like a dictionary. Thanks for the idea.
So, do I understand properly that the parameters passed to the function specified in FunctionTool come solely from the agent, i.e., whatever understands the problem, selects a tool, and forms the parameters to pass?
Wait a sec, I can't use a dictionary, because the function doesn't know which value to use. I need an explicit way to pass a parameter to it, or something like QueryEngineTool but for a chat engine. Can I just use QueryEngineTool instead? Will it understand the chat history?
I think I can isolate the code for each user via classes.
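For what it's worth, a closure (or a class wrapping the same idea) is one way to bind user_chat_engine ahead of time, so the agent only ever supplies the query. A rough, untested sketch with hypothetical names:

from llama_index.core.tools import FunctionTool
from llama_index.agent.openai import OpenAIAgent

def make_chat_tool(user_chat_engine, name, description):
    # the closure captures user_chat_engine ahead of time;
    # the agent only fills in the `query` argument at call time
    def ask(query: str) -> str:
        """This tool is for asking a question."""
        return str(user_chat_engine.chat(query))
    return FunctionTool.from_defaults(fn=ask, name=name, description=description)

def setup_agent(user_llm, user_chat_engine, system_agent_prompt, chat_history):
    # one tool and one agent per user, so nothing is shared across async calls
    chat_tool = make_chat_tool(
        user_chat_engine, "ask_docs", "Answers questions from the user's context."
    )
    return OpenAIAgent.from_tools(
        [chat_tool], llm=user_llm, system_prompt=system_agent_prompt, chat_history=chat_history
    )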