The community member is setting up LlamaIndex in an API and wonders if it's rational to have a static "chat_engine" and replace the "chat_history" with each request. Another community member responds that the "chat_history" property is read-only, and suggests passing the chat history on the function call instead, like chat(msg, chat_history=chat_history). The original poster thanks the commenter and mentions looking into the "npx create-llama backend" to see how they handle this.
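The suggested pattern (one shared engine, with history passed per call rather than assigned to the engine) can be sketched as follows. This uses a hypothetical `StubChatEngine` stand-in rather than the real LlamaIndex class, since the point is the call shape, not the library internals:

```python
from typing import List, Optional, Tuple

class StubChatEngine:
    """Hypothetical stand-in for a chat engine whose chat_history
    property is read-only, so history must be passed into chat()."""

    def chat(self, message: str,
             chat_history: Optional[List[Tuple[str, str]]] = None) -> str:
        history = chat_history or []
        # Illustrative reply showing the engine saw the supplied history.
        return f"seen {len(history)} prior turns; replying to: {message}"

# One shared ("static") engine for the whole API process...
engine = StubChatEngine()

def handle_request(message: str,
                   chat_history: List[Tuple[str, str]]) -> str:
    # ...but chat history stays per-request state, passed on each call,
    # never mutated onto the shared engine.
    return engine.chat(message, chat_history=chat_history)

print(handle_request("hi", [("user", "hello"), ("assistant", "hey")]))
```

This keeps the engine itself stateless across requests, which avoids one request's history leaking into another's when the API handles concurrent calls.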
If I'm setting up LlamaIndex in an API, is it reasonable for me to have a static "chat_engine" and simply replace the chat_history every time I get a request? By literally doing