May be a silly question, but I am just getting into AI. I sent some test prompts through on some data, and it correctly returned a message along the lines of 'I don't have that information' when asked about something not in the data set (currently just using OpenAI). Is it possible to use the custom data in conjunction with the LLM's own knowledge? Or does it have to be all or nothing due to the model training? I ask because I feel the LLM has information that would be supplemental to my data set, and it would be nice to mesh the two.
Hi,
Did you get a chance to try LlamaIndex?
LlamaIndex helps you use your custom data in addition to the LLM's built-in knowledge when answering your queries.

A short description of what LlamaIndex does:
LlamaIndex is a data framework for LLM-based applications to ingest, structure, and access private or domain-specific data.

Highly recommend you check out the docs here: https://docs.llamaindex.ai/en/stable/index.html#
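The basic flow looks something like this (a minimal sketch, assuming the pre-0.10 llama_index import paths that match the docs link above, and a placeholder ./data folder holding your documents; newer versions moved these imports to llama_index.core):

```python
from llama_index import SimpleDirectoryReader, VectorStoreIndex

# Ingest your own documents ("./data" is a placeholder path)
documents = SimpleDirectoryReader("data").load_data()

# Build an in-memory vector index over them
index = VectorStoreIndex.from_documents(documents)

# At query time, relevant chunks are retrieved and passed
# to the LLM as context alongside your question
query_engine = index.as_query_engine()
response = query_engine.query("What does my data say about X?")
print(response)
```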
I did try it out, as stated in my post. I am wondering if I can pass a param of some sort to allow the LLM to supplement the response with its own information rather than just using the custom data passed in.
You can play around with the prompts and provide instructions such as: if the answer is not found in the context, use the best of your own knowledge to answer the query.

https://docs.llamaindex.ai/en/stable/module_guides/models/prompts.html
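For example, you can override the text-QA prompt so the model falls back on its general knowledge when the retrieved context doesn't contain the answer. A sketch, assuming the same pre-0.10 imports and the index built in the earlier snippet:

```python
from llama_index.prompts import PromptTemplate

# Custom QA prompt: {context_str} and {query_str} are the template
# variables LlamaIndex fills in at query time.
qa_prompt = PromptTemplate(
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Answer the query using the context above. If the context does not\n"
    "contain the answer, fall back on your own general knowledge and\n"
    "say that you did so.\n"
    "Query: {query_str}\n"
    "Answer: "
)

# Pass the custom template to the query engine
query_engine = index.as_query_engine(text_qa_template=qa_prompt)
response = query_engine.query("Something only partially covered by my data")
print(response)
```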
Oh nice, thank you!
I will take a look at this