The community member is using the OpenAI API to pass queries to the GPT-4 model, and has correctly generated a VectorStoreIndex using LlamaIndex. However, when querying the VectorStoreIndex, the model's answers seem to be limited to the indexed documents, rather than combining base GPT-4 knowledge with the knowledge from the VectorStoreIndex. The community member is asking how to combine the two into a hybrid model.
In the comments, another community member suggests a few ways to address this, such as changing the underlying prompt to be more flexible so the model can draw on its internal knowledge in addition to the provided context. Another community member suggests providing different tools, but there is no explicitly marked answer.
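The prompt-customization suggestion could look something like the sketch below: a QA template that explicitly invites the model to use both the retrieved context and its prior knowledge, instead of the default instruction to answer from context alone. The template text and the `build_prompt` helper here are illustrative assumptions, not the exact prompt from the thread; the commented wiring shows where such a template would plug into a LlamaIndex query engine.

```python
# A relaxed QA prompt that asks the model to combine retrieved context
# with its own general knowledge. The wording is a hypothetical example,
# not the thread's original prompt.
CUSTOM_QA_PROMPT = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Using both the context information above and your own prior "
    "knowledge, answer the query.\n"
    "Query: {query_str}\n"
    "Answer: "
)

def build_prompt(context_str: str, query_str: str) -> str:
    """Fill the template the way a query engine would before calling the LLM."""
    return CUSTOM_QA_PROMPT.format(context_str=context_str, query_str=query_str)

# Hypothetical wiring into a LlamaIndex query engine (requires llama_index
# to be installed and an OpenAI API key configured):
#
# from llama_index.core import PromptTemplate
# query_engine = index.as_query_engine(
#     text_qa_template=PromptTemplate(CUSTOM_QA_PROMPT)
# )
# response = query_engine.query("...")
```

Because the template no longer restricts the model to the provided context, answers can blend GPT-4's base knowledge with the retrieved passages, which is the hybrid behavior the question asks for.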
I'm using the OpenAI API to pass queries to the GPT-4 model. I correctly generated a VectorStoreIndex using LlamaIndex, but when querying it, the knowledge seems limited to the index alone, instead of combining the base GPT-4 knowledge with the knowledge from the VectorStoreIndex. How can I combine them to get a hybrid model?