A community member experimenting with chat.llamaindex.ai is confused about the different categories (system, user, assistant, function, data, and tool) shown in the setup. The comments explain that "system" is the default prompt, and that the other categories come into play when indexed context is retrieved and put into the prompt along with the chat history. However, there is no clear explanation of what the content of these other categories would be. The community members also discuss whether Chat LlamaIndex is meant to be used standalone or requires additional steps, with one member clarifying that it can be used standalone but won't have access to the user's data unless files are uploaded to be indexed.
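To make the categories more concrete, here is a minimal sketch of what a chat transcript using these roles might look like. The message shape and the example contents are assumptions based on common chat-message conventions (for instance, the Vercel AI SDK's Message roles), not code taken from Chat LlamaIndex itself:

```ts
// Sketch only: the role names match the categories shown in the Chat LlamaIndex setup,
// but the exact structure and example contents are assumptions, not the app's source code.

interface ChatMessage {
  role: "system" | "user" | "assistant" | "function" | "data" | "tool";
  content: string;
}

const messages: ChatMessage[] = [
  // "system": the default prompt that frames the bot's behavior
  { role: "system", content: "You are a helpful assistant. Answer using the provided context." },
  // "user": what the person typed into the chat box
  { role: "user", content: "What does our refund policy say about digital goods?" },
  // "assistant": the model's earlier replies (together these form the chat history)
  { role: "assistant", content: "Digital goods can be refunded within 14 days." },
  // "function" / "tool": the result of a function or tool call the model made (assumed usage)
  { role: "tool", content: '{"name": "query_index", "result": "Refunds: 14 days for digital goods."}' },
  // "data": structured data attached to the conversation, e.g. retrieved context (assumed usage)
  { role: "data", content: "Source: refund-policy.pdf, page 3" },
];
```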
So system is sort of the training you would give to the OpenAI API or Playground? Like something general? Would you be so kind as to elaborate on how the other categories would be trained? I looked everywhere online and couldn't find any documentation for this.
Ok, I'm a bit confused. Below Chat LlamaIndex it says "Create chat bots that know your data". So does that refer to context about these six categories, or perhaps to the convos that you had with other bots (I see that the drop-down in data source points to other bots)?
If this is the case, would you please give me an example of what the content of the categories would be for user, assistant, function, data, and tool?
Wait a second, Chat LlamaIndex isn't meant to be used standalone, is it? That's what you mean by "Context that you indexed is retrieved, and it's put into a prompt (along with the chat history)", right? So there are steps to be done before using Chat LlamaIndex?
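For reference, here is a rough sketch of the "retrieved and put into a prompt" flow described in that quote: files you upload are indexed, and at question time the most relevant chunks are looked up and placed into the prompt together with the chat history. The helpers retrieveRelevantChunks and callChatModel are hypothetical placeholders, not actual Chat LlamaIndex functions:

```ts
// Sketch of the retrieve-then-prompt flow, with hypothetical helpers standing in
// for the index lookup and the LLM call. Not the actual Chat LlamaIndex code.

interface ChatMessage {
  role: "system" | "user" | "assistant"; // simplified role set for this sketch
  content: string;
}

// Hypothetical: look up chunks of the uploaded/indexed files relevant to the question.
declare function retrieveRelevantChunks(question: string): Promise<string[]>;
// Hypothetical: send the assembled messages to the chat model and return its reply.
declare function callChatModel(messages: ChatMessage[]): Promise<string>;

async function answerWithContext(
  history: ChatMessage[],
  question: string
): Promise<string> {
  // 1. Retrieve context from the index built over the uploaded files.
  const chunks = await retrieveRelevantChunks(question);

  // 2. Put the retrieved context into the prompt, alongside the chat history.
  const messages: ChatMessage[] = [
    {
      role: "system",
      content: `Answer using only this context:\n${chunks.join("\n---\n")}`,
    },
    ...history,
    { role: "user", content: question },
  ];

  // 3. The model answers grounded in the user's own data.
  return callChatModel(messages);
}
```

In other words, Chat LlamaIndex works standalone out of the box; the only extra step is uploading files so that there is something in the index to retrieve.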