Hi all, newbie to LlamaIndexTS here. Looks like a great project.
I have been using the Python version and have a working example there that ingests some of my own docs and answers questions about them using gpt-3.5-turbo. I have installed the Next.js example from https://github.com/run-llama/ts-playground and have it working, ingesting my own documents and responding. The problem is that it also answers queries outside of my private documents. I don't have this issue with very similar code in Python. Am I missing something simple?
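For reference, my query path is roughly along these lines (simplified from the playground app; the function and variable names are my own, and the exact `query` signature may differ slightly between llamaindex versions):

```ts
import { Document, VectorStoreIndex } from "llamaindex";

async function queryMyDocs(myPrivateDocText: string, question: string) {
  // Build an in-memory vector index over my private document text
  // (the real app reads the uploaded file; this is simplified).
  const document = new Document({ text: myPrivateDocText });
  const index = await VectorStoreIndex.fromDocuments([document]);

  // Query the index. This is where I see the model also answering
  // from general knowledge when my documents don't cover the question.
  const queryEngine = index.asQueryEngine();
  const response = await queryEngine.query({ query: question });
  return response.toString();
}
```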
TL;DR: How do I get answers only from my private data?
Also, big up to the devs. The documentation and site are well done.