
I Asked About the Company That Created You

Hi, I have a question about the chat store. I save the chat store as:

{"store": {"chat_history": [{"role": "user", "content": "which company create you?", "additional_kwargs": {}}, {"role": "assistant", "content": "I wasn't created by a specific company, but rather I am a product of Meta AI, a subsidiary of Meta Platforms, Inc.", "additional_kwargs": {}}, {"role": "user", "content": "Repeat the question I asked you", "additional_kwargs": {}}, {"role": "assistant", "content": "You asked: \"Which company created you?\"\n\n\nLet me know if you have any other questions!", "additional_kwargs": {}}]}, "class_name": "SimpleChatStore"}

What is "additional_kwargs" for? I want the chat store to contain the response time, token info, and source nodes. How do I do that? Is it possible to add that data into "additional_kwargs"?

Currently, I am using

self.memory = ChatMemoryBuffer.from_defaults(
token_limit=3000,
chat_store=self.chat_store,
)

self.chat_engine = self.index.as_chat_engine(
chat_mode="context",
llm=self.cur_lm,
memory=self.memory
)
10 comments
additional_kwargs is usually used to store extra things like tool calls. I thiiiink you can put whatever you want in it, but you'd have to either modify the chat message before putting it in the store, or modify it after it's in the store

Since you are using a prebuilt chat engine, there's no way to do the former, since the message gets inserted for you
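The latter (editing a message after it's in the store) can be sketched with plain dicts in the same shape as the SimpleChatStore JSON dump above — note the metadata keys like response_time_s are made up for illustration, not a LlamaIndex convention:

```python
import json

# A store shaped like the SimpleChatStore JSON dump above.
store = {
    "store": {
        "chat_history": [
            {"role": "user", "content": "which company create you?", "additional_kwargs": {}},
            {"role": "assistant", "content": "I am a product of Meta AI.", "additional_kwargs": {}},
        ]
    },
    "class_name": "SimpleChatStore",
}

def tag_last_assistant_message(store: dict, metadata: dict) -> None:
    """Attach extra metadata to the most recent assistant message, in place."""
    for msg in reversed(store["store"]["chat_history"]):
        if msg["role"] == "assistant":
            msg["additional_kwargs"].update(metadata)
            return

# Hypothetical metadata matching what the question asks for.
tag_last_assistant_message(store, {
    "response_time_s": 1.42,
    "token_count": 57,
    "source_nodes": ["node-abc123"],
})

print(json.dumps(store["store"]["chat_history"][-1]["additional_kwargs"]))
```

With the real ChatMessage objects the idea is the same: additional_kwargs is just a dict you can update.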
In my case, if I don't use a prebuilt chat engine, how do I do it? Do you have any advice?
what is the meaning of "tool calls" here?
LLMs can call tools/functions (i.e. this is how an agent works). Don't worry about it for now, since you aren't using this feature
Plain Text
from llama_index.core.llms import ChatMessage

# Retrieve context relevant to the user's message
retriever = index.as_retriever(similarity_top_k=2)
chat_history = memory.get()

nodes = retriever.retrieve(user_msg_str)
nodes_context = "\n\n".join(x.text for x in nodes)
system_prompt = ChatMessage(
    role="system",
    content=f"Here is some relevant context to help you assist the user:\n\n{nodes_context}",
)

# Assemble the full prompt: context + history + new user message
llm_input = [system_prompt] + chat_history + [ChatMessage(role="user", content=user_msg_str)]

resp = llm.chat(llm_input)

# Record both sides of the turn in memory
memory.put(ChatMessage(role="user", content=user_msg_str))
memory.put(resp.message)

print(resp.message)


This is roughly what the context chat engine is doing for you
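Since you control each step in that loop, you can capture the extra data around the LLM call and attach it to the assistant message before it goes into memory. A standalone sketch with a stubbed LLM call (with the real API you'd update resp.message.additional_kwargs the same way, then memory.put(resp.message); the metadata key names here are hypothetical):

```python
import time

def fake_llm_chat(messages):
    """Stand-in for llm.chat(); returns (response text, token count)."""
    time.sleep(0.01)
    return 'You asked: "Which company created you?"', 12

user_msg_str = "Repeat the question I asked you"
retrieved_node_ids = ["node-1", "node-2"]  # stand-in for [n.node_id for n in nodes]

# Time the LLM call
start = time.monotonic()
text, tokens = fake_llm_chat([{"role": "user", "content": user_msg_str}])
elapsed = time.monotonic() - start

# Attach the captured data to the assistant message's additional_kwargs
# before it is stored, mirroring resp.message.additional_kwargs.update(...)
assistant_msg = {
    "role": "assistant",
    "content": text,
    "additional_kwargs": {
        "response_time_s": round(elapsed, 3),
        "token_count": tokens,
        "source_nodes": retrieved_node_ids,
    },
}
print(assistant_msg["additional_kwargs"])
```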
Is there any way to turn the chat store into a dictionary, then update the dictionary and write it back to the chat store?
chat_store.json() somehow turns it into a string....
ok.. it looks like I can get chat_store.store and edit it with set_messages..
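Right — chat_store.json() returns a JSON string, so json.loads gives you a plain dict to edit; the edited messages can then go back via set_messages as noted above. A sketch of that round trip on the dump format shown earlier (the "reviewed" flag is just an example edit):

```python
import json

# What chat_store.json() returns: a string, not a dict.
dump = ('{"store": {"chat_history": [{"role": "user", "content": "hi", '
        '"additional_kwargs": {}}]}, "class_name": "SimpleChatStore"}')

data = json.loads(dump)  # str -> dict

# Edit the dict in place (hypothetical example edit)
for msg in data["store"]["chat_history"]:
    msg["additional_kwargs"]["reviewed"] = True

# dict -> str again; the per-key message lists could then be pushed
# back with chat_store.set_messages(key, messages)
edited = json.dumps(data)
print(edited)
```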