I'm using the OpenAI API to pass queries to a GPT-4 model. I correctly generated a VectorStoreIndex using LlamaIndex, but when querying it, the answers seem limited to the indexed documents only, instead of combining GPT-4's base knowledge with the knowledge derived from the VectorStoreIndex. How can I combine the two to get a hybrid model?
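
For reference, this is roughly what my setup looks like (a minimal sketch assuming a recent llama_index release; the data path, model name, and question are placeholders):

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, Settings
from llama_index.llms.openai import OpenAI

# Use GPT-4 via the OpenAI API as the underlying LLM (model name is a placeholder)
Settings.llm = OpenAI(model="gpt-4")

# Load my local documents and build the vector index
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Query the index
query_engine = index.as_query_engine()
response = query_engine.query("my question here")
print(response)
```

When the question isn't covered by my documents, the engine doesn't seem to fall back on GPT-4's own knowledge.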