I'm using the OpenAI API to pass queries to the GPT-4 model. I correctly generated a VectorStoreIndex using LlamaIndex, but when querying, the model's knowledge seems limited to the index alone, instead of combining GPT-4's base knowledge with the knowledge from the VectorStoreIndex. How can I combine the two into a hybrid model?
2 comments
There are a few different ways. For example, you can change the underlying prompt to be more flexible, allowing the model to leverage its internal knowledge in addition to the context you provide. https://docs.llamaindex.ai/en/stable/core_modules/model_modules/prompts.html#prompts
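A minimal sketch of that prompt approach. The `{context_str}` and `{query_str}` placeholders are the names LlamaIndex's default QA templates use; the template wording here is an illustrative assumption, not LlamaIndex's default. In LlamaIndex itself you would wrap this string in a `PromptTemplate` and pass it as `text_qa_template` when creating the query engine:

```python
# A more permissive QA prompt: instead of restricting the model to the
# retrieved context, it explicitly invites the model to also draw on
# its own pretraining knowledge. (Wording is a hypothetical example.)
CUSTOM_QA_TEMPLATE = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Using both the context above AND your own general knowledge, "
    "answer the question: {query_str}\n"
)

def build_prompt(context_str: str, query_str: str) -> str:
    """Fill the template the same way a query engine would before
    sending the final prompt to the LLM."""
    return CUSTOM_QA_TEMPLATE.format(
        context_str=context_str, query_str=query_str
    )

# In LlamaIndex this would typically look like (sketch, API may vary
# between versions):
#   from llama_index.core import PromptTemplate
#   qa_template = PromptTemplate(CUSTOM_QA_TEMPLATE)
#   query_engine = index.as_query_engine(text_qa_template=qa_template)
```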
You could also provide the query engine as one of several tools to an agent, letting the model decide per query whether to consult the index or answer from its own knowledge.