
hi ... is anybody else having this problem??

INFO:openai:error_code=None error_message="This model's maximum context length is 4097 tokens,
You sent something to the LLM that's longer than 4097 tokens 🙂
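The error above means the prompt (plus the requested completion) exceeded the model's 4097-token context window. A rough sketch of guarding against that before calling the API, using the common "~4 characters per token" heuristic for English text (the function names and the reserve value are illustrative assumptions, not part of any library; a real tokenizer such as tiktoken would give exact counts):

```python
# Rough sketch: estimate prompt size before sending it to the model.
# The ~4 chars/token ratio is only a rule of thumb for English text;
# for exact counts you would run the model's actual tokenizer.

MAX_CONTEXT_TOKENS = 4097  # the limit reported in the error message


def rough_token_estimate(text: str) -> int:
    """Very rough heuristic: ~4 characters per token."""
    return max(1, len(text) // 4)


def fits_in_context(prompt: str, reserved_for_completion: int = 256) -> bool:
    """Check the prompt plausibly leaves room for the completion too."""
    return rough_token_estimate(prompt) + reserved_for_completion <= MAX_CONTEXT_TOKENS


print(fits_in_context("hello " * 100))   # short prompt, fits
print(fits_in_context("hello " * 5000))  # far over the limit
```

If the check fails, the usual fixes are truncating the chat history, summarizing older turns, or retrieving fewer/shorter documents per query.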
Yes yes ... fixed.
Now I have a problem with chat_history ... I am using initialize_agent and don't know why it's not being picked up.
Yeah, not so sure about that one 🤔 I haven't worked with memory that much, but it usually works with initialize_agent from what I remember trying.
Actually the problem is the prompt I was using ... because of the prompt, the agent wasn't using chat_memory and only responded with the information in the indexed docs. My solution was to remove the section of the prompt that limited answers to the results from the tools ... but I still need the agent to NOT use external information. Do you know how to do that?? Or have a clue ... 🙂 thanks ...
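One way to get both behaviors (chat history available, answers restricted to tool output) is to keep the restriction in the prompt but also give it an explicit `{chat_history}` placeholder. A minimal sketch of that prompt wording, using plain string templating rather than LangChain's own classes (the placeholder names mirror LangChain's convention, but `AGENT_PREFIX` and `build_prompt` are hypothetical names for illustration):

```python
# Hypothetical prompt prefix: restricts the agent to tool results while
# still exposing the conversation memory via a {chat_history} placeholder.
# This is plain string templating, not LangChain's actual prompt API.

AGENT_PREFIX = (
    "Answer the user's question using ONLY information returned by your "
    "tools. If the tools return nothing relevant, say you don't know; "
    "do not use outside knowledge.\n\n"
    "Previous conversation:\n{chat_history}\n\n"
    "Question: {input}"
)


def build_prompt(chat_history: str, user_input: str) -> str:
    """Fill in the memory and the current question."""
    return AGENT_PREFIX.format(chat_history=chat_history, input=user_input)


prompt = build_prompt("Human: hi\nAI: hello!", "what do the docs say about X?")
print(prompt)
```

The idea is that the "ONLY ... tools" instruction and the chat history are independent parts of the prompt, so removing one shouldn't require removing the other.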
How are you customizing the agent prompt?