Upgrading from 0.6.38 to 0.7.9: getting this error

Upgrading from 0.6.38 to 0.7.9. Getting this error when submitting a question.
Plain Text
File "/app/./index_server.py", line 197, in query_index
    chat_engine = index.as_chat_engine(chat_mode='react',
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/llama_index/indices/base.py", line 383, in as_chat_engine
    return ReActAgent.from_tools(
           ^^^^^^^^^^^^^^^^^^^^^^
TypeError: ReActAgent.from_tools() got an unexpected keyword argument 'similarity_top_k'
Any direction would be greatly appreciated. Some of the migration doc links appear to 404.
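For the curious, the TypeError deep inside as_chat_engine is the classic symptom of a wrapper forwarding keyword arguments it should have consumed itself. A minimal, LlamaIndex-free sketch of the same failure mode:

```python
# Dependency-free illustration (not actual LlamaIndex code) of the traceback:
# a wrapper blindly forwards **kwargs to a constructor that rejects them.
def from_tools(tools, verbose=False):
    return {"tools": tools, "verbose": verbose}

def as_chat_engine(**kwargs):
    # similarity_top_k should be consumed here (it configures retrieval),
    # but instead it is passed straight through to from_tools
    return from_tools([], **kwargs)

try:
    as_chat_engine(similarity_top_k=3)
except TypeError as exc:
    error_message = str(exc)

print(error_message)  # mentions the unexpected keyword argument
```

The same message shape ("got an unexpected keyword argument 'similarity_top_k'") appears in the traceback above.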
5 comments
We did a docs refactor, which is why some links 404, but the new search bar in the docs works pretty well

Seems like this is a bug with the ReAct agent though :Sadge:

You can create the react agent without as_chat_engine though, which should work well

https://gpt-index.readthedocs.io/en/latest/core_modules/agent_modules/agents/usage_pattern.html#query-engine-tools
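Following that link, here's a sketch of the workaround, wrapped in a function so nothing executes without an index. The module paths and the from_defaults/from_tools signatures are assumptions based on the 0.7.x docs, so double-check them against your installed version:

```python
def build_react_agent(index):
    # Imports are local so this sketch stays importable without llama_index.
    from llama_index.tools import QueryEngineTool
    from llama_index.agent import ReActAgent

    # Retrieval settings such as similarity_top_k belong on the query engine,
    # not on ReActAgent.from_tools -- which is why as_chat_engine raised.
    query_engine = index.as_query_engine(similarity_top_k=3)
    tool = QueryEngineTool.from_defaults(
        query_engine=query_engine,
        description="Answers questions over the indexed documents.",
    )
    return ReActAgent.from_tools([tool], verbose=True)
```

With that in place, `build_react_agent(index).chat(question)` stands in for the old `index.as_chat_engine(...).chat(question)` call.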
@kapa.ai how do you send a custom prompt to a ReactAgent?
Let me check the knowledge base for you. One moment! :books:
----------------------
To send a custom prompt to a ReactAgent in LlamaIndex, you can use the as_chat_engine method of the index and specify the system_prompt parameter with your custom prompt. Here's an example:

Plain Text
from llama_index.prompts.system import SHAKESPEARE_WRITING_ASSISTANT

chat_engine = index.as_chat_engine(
    chat_mode="react",
    system_prompt=SHAKESPEARE_WRITING_ASSISTANT,
    verbose=True,
)

response = chat_engine.chat("Hi, tell me a tragic story")


In this example, SHAKESPEARE_WRITING_ASSISTANT is a predefined prompt. If you want to use a custom prompt, you can define it using the Prompt class and then pass it to the system_prompt parameter. Here's an example of how to define a custom prompt:

Plain Text
from llama_index import Prompt

template = (
    "We have provided context information below. \n"
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    "Given this information, please answer the question: {query_str}\n"
)
custom_prompt = Prompt(template)
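Since the template is just a Python format string with {context_str} and {query_str} slots, you can sanity-check it with plain str.format before handing it to Prompt (the sample values below are made up for illustration):

```python
template = (
    "We have provided context information below. \n"
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    "Given this information, please answer the question: {query_str}\n"
)

# The engine fills context_str with retrieved text and query_str with the
# user's question; str.format shows exactly what the LLM would receive.
filled = template.format(
    context_str="LlamaIndex is a data framework for LLM apps.",
    query_str="What is LlamaIndex?",
)
print(filled)
```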


You can then use custom_prompt in place of SHAKESPEARE_WRITING_ASSISTANT in the as_chat_engine method.

For more details, you can refer to the LlamaIndex documentation.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
The bot is right, the system prompt is just a string here :thumbsup: