Hi πŸ™‚ I am trying to build my first customized chatbot with an LLM, LlamaIndex and LangChain. So far I have managed to integrate all my documents into a LlamaIndex vector index and store it locally. I would like to create a chatbot that uses the index to get context information. So far I can use the query functions, but I also want a chat history, so the chatbot can respond based on the previous questions and answers as well. My first tries with the Llama chat agent led to a chatbot with history, but one which also answers things completely outside the context of my documents, and that is absolutely not my goal. Does somebody know how to integrate a chat history and restrict the chatbot to the context of my documents?
8 comments
You can use our own agents, and add a prefix/system prompt with extra instructions.

There will be full details/docs on this later today I think, but here are some notebooks that should help (the system prompt here is just a string!)

Also happy to help answer questions as you go
https://gpt-index.readthedocs.io/en/latest/examples/agent/openai_agent.html#agent-with-personality

And here's how to use it with query engines
https://gpt-index.readthedocs.io/en/latest/examples/agent/openai_agent_with_query_engine.html
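To make the control flow in those notebooks concrete, here is a toy, self-contained sketch of what the agent setup does: keep a chat history, route each question through a query-engine "tool", and answer only from what the tool returns. Note `MockQueryEngineTool` and `RestrictedChatAgent` are made-up names for illustration, not real LlamaIndex classes; the real agent delegates the refusal behavior to the LLM via the system prompt rather than hard-coding it.

```python
class MockQueryEngineTool:
    """Stands in for a query engine tool over your vector index (mock retrieval)."""
    def __init__(self, corpus):
        self.corpus = corpus  # {keyword: answer snippet}

    def query(self, question):
        # Toy retrieval: return snippets whose keyword appears in the question.
        hits = [text for key, text in self.corpus.items() if key in question.lower()]
        return " ".join(hits) if hits else None


class RestrictedChatAgent:
    """Chat loop that keeps history and answers only from tool output."""
    def __init__(self, tool, system_prompt):
        self.tool = tool
        self.system_prompt = system_prompt
        self.history = []  # list of (role, message) tuples across turns

    def chat(self, user_message):
        self.history.append(("user", user_message))
        context = self.tool.query(user_message)
        if context is None:
            # Out-of-context question: refuse instead of using prior knowledge.
            reply = "Sorry, I could not find the answer within the context."
        else:
            reply = f"Based on your documents: {context}"
        self.history.append(("assistant", reply))
        return reply


agent = RestrictedChatAgent(
    MockQueryEngineTool({"refund": "Refunds are processed within 14 days."}),
    system_prompt="Only answer from tool output.",
)
print(agent.chat("What is your refund policy?"))
print(agent.chat("Who won the World Cup?"))  # refused: not in the documents
print(len(agent.history))  # history keeps all turns: 4
```

With a real LLM agent the "refuse when the tool can't help" step is not guaranteed code; it depends on the model following the system prompt, which is why prompt wording matters so much here.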
If you want to stick with langchain, I think you have to modify the prefix of the agent. But it's a little complicated to set up, ngl πŸ˜…
@Logan M thank you, sounds very interesting. I'll check the documentation πŸ™‚
@Logan M I managed to implement my index as a query engine tool and then use it with the OpenAI Agent. Everything works fine if I ask specific stuff about my documents, but I can still ask things completely out of context and it answers them anyway. I tried specifying `system_prompt` and it kind of works. For example I changed the language to German, but I can not restrict the knowledge. Do you know if there is a simple way to restrict the agent to the query engine tools? I am not an expert at setting up my own agent so far πŸ˜…
Basically the only way to restrict is with the prompt πŸ˜… You might have to get creative.

For example

system_prompt = "You are a Q&A chatbot. Users will ask you questions, and you can only use answers provided by tools to help users. Do not use any prior knowledge to answer questions. If a tool does not help you answer, just inform the user that you are unable to help with their current question"
No idea if that will work haha, but an example anyways
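For context on where a system prompt like that ends up: chat models take a list of role-tagged messages, and the system prompt is prepended before the accumulated history on every turn. A minimal sketch (the `build_messages` helper is a made-up name, not part of any library; the role layout follows the common OpenAI-style convention):

```python
# The system prompt suggested above, verbatim.
system_prompt = (
    "You are a Q&A chatbot. Users will ask you questions, and you can only "
    "use answers provided by tools to help users. Do not use any prior "
    "knowledge to answer questions. If a tool does not help you answer, "
    "just inform the user that you are unable to help with their current "
    "question"
)

def build_messages(history, new_question):
    """history: list of {'role': ..., 'content': ...} dicts from earlier turns."""
    return (
        [{"role": "system", "content": system_prompt}]  # instructions first
        + history                                       # then prior turns
        + [{"role": "user", "content": new_question}]   # then the new question
    )

history = [
    {"role": "user", "content": "What topics do my documents cover?"},
    {"role": "assistant", "content": "They cover billing and refunds."},
]
messages = build_messages(history, "Tell me more about refunds.")
print([m["role"] for m in messages])  # ['system', 'user', 'assistant', 'user']
```

Because the system message rides along with every request, tightening its wording is the main lever for restricting the agent, which is why the advice above is "get creative with the prompt".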
@Logan M Okay, I'll try out some things πŸ™‚ thanks again πŸ™Œ
A similar case, not entirely the same, but this prompt helped me restrict the answers to the context:
"We have provided context information below. \n"
"---------------------\n"
"{context_str}"
"\n---------------------\n"
"Using ONLY the context information and no other sources, please answer the question: {query_str}\n"
"If you don't find answer within the context, SAY 'Sorry, I could not find the answer within the context.' and DO NOT provide a generic response. \n"
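For reference, `{context_str}` and `{query_str}` are the placeholder names LlamaIndex uses in its text QA prompts; the retrieved context and the user's question are substituted in before the text is sent to the model. A quick sketch of that substitution (the sample context and question are made up):

```python
# The template from the comment above, as one Python string.
qa_template = (
    "We have provided context information below. \n"
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    "Using ONLY the context information and no other sources, please answer "
    "the question: {query_str}\n"
    "If you don't find answer within the context, SAY 'Sorry, I could not "
    "find the answer within the context.' and DO NOT provide a generic "
    "response. \n"
)

# Fill in retrieved context and the user's question, as the query engine would.
prompt = qa_template.format(
    context_str="Refunds are processed within 14 days.",
    query_str="How long do refunds take?",
)
print(prompt)
```

Unlike the agent's system prompt, this template wraps every retrieved chunk on every query, so the "answer ONLY from the context" instruction sits right next to the context itself, which tends to make the restriction stick better.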