Don't know response

Hi, how can I change the response when the answer is not found? Right now I get the following text: "There is no information provided in the context about where employees will use this for internal needs." Is it possible to answer something like "I'm sorry, I don't know where employees will use this for internal needs." instead? Thanks!
You'll probably need to add that instruction to the prompt

Either you can try adding that instruction to the end of the query string, or you can create new prompt templates to replace the ones used internally by LlamaIndex.

The second option is more annoying, so I would try option 1 first lol
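Option 1 can be sketched in a few lines. This is a minimal, hedged example: the question text and the commented-out `index.query(...)` call are illustrative assumptions, not the exact code from the thread.

```python
# Option 1 sketch: append a fallback instruction to the query string.
fallback = (
    "If the answer is not in the context, respond with: "
    "\"I'm sorry, I don't know.\""
)
question = "Where will employees use this for internal needs?"  # example question
query = f"{question}\n\n{fallback}"

# Hypothetical LlamaIndex call (assumes `index` was built elsewhere):
# response = index.query(query)
print(query)
```

Since the instruction rides along with every query, no prompt-template changes are needed; the trade-off is that it consumes a few extra tokens per request.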
Thanks, I will try prompts. I also have a question about the context: using the LlamaIndex API, how can I see exactly which context was sent to the ChatGPT API? I'm asking because sometimes I see tokens being used (which means context was found and sent), but ChatGPT still can't find an answer. Thanks!
The easiest way is to check response.source_nodes, to see which nodes were sent to the LLM

Seeing the complete prompts (your query + prompt template + node text) is a little more complicated, but if you need that too I can explain further
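Inspecting `response.source_nodes` might look like the sketch below. In real code the nodes come back from a query call; here they are mocked stand-ins so the snippet is self-contained, and the `MockNode` class and sample texts are assumptions.

```python
# Sketch: inspect which nodes were retrieved for a query.
from dataclasses import dataclass

@dataclass
class MockNode:  # stand-in for the objects in response.source_nodes
    score: float
    text: str

# In real code: source_nodes = response.source_nodes
source_nodes = [
    MockNode(0.81, "Employees can use the tool for internal reporting ..."),
    MockNode(0.64, "References: [1] Smith et al., 2020 ..."),
]

for node in source_nodes:
    # Print similarity score and a preview of the chunk text
    print(f"score={node.score:.2f}  text={node.text[:60]}")
```

Scanning the scores and text previews quickly shows whether the retriever pulled relevant chunks or something unrelated (like a bibliography section).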
Great, let me try the source nodes first, thanks!
Now that I've looked at the node, I see why it can't find the answer: for some reason, the retrieved node doesn't contain any of the words from the question. Do you have an idea why, and how to fix it?
So the retrieval isn't really based on words per se, just embeddings/semantics.

Ways to improve this are playing around with chunk size or top k, playing around with how you structure your initial input documents, or possibly using a different type of index (although other index types might have their own issues πŸ˜‰)
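The chunk-size and top-k suggestions above boil down to two knobs. The sketch below only defines the values and comments out the library calls, since the exact LlamaIndex API names vary by version and should be treated as assumptions.

```python
# Two retrieval-quality knobs mentioned above (values are illustrative).
CHUNK_SIZE = 512        # smaller chunks -> more focused embeddings per chunk
SIMILARITY_TOP_K = 3    # retrieve more candidate chunks per query

# Hypothetical usage, assuming an older llama_index-style API:
# index = GPTSimpleVectorIndex.from_documents(docs, chunk_size_limit=CHUNK_SIZE)
# response = index.query(question, similarity_top_k=SIMILARITY_TOP_K)
print(CHUNK_SIZE, SIMILARITY_TOP_K)
```

Raising top-k makes it more likely a relevant chunk reaches the LLM at the cost of extra tokens; shrinking the chunk size keeps each embedding tied to a narrower piece of text, which can stop unrelated material (like a bibliography) from dominating a chunk's embedding.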
The semantics don't fit well either: for some reason, the retrieved node contains a piece of the bibliography. What's stranger is that I've already asked (almost) the same question before and got a pretty neat result!