Find answers from the community

Shawn1998
Offline, last seen 3 months ago
Joined September 25, 2024
Hi All: I have an application with two different functions:
one uses OpenAI directly, so it has "from openai import OpenAI", and a different function does RAG with LlamaIndex and has:
from llama_index.llms.openai import OpenAI
from llama_index.core import Settings

I want to use Settings to change the model, but I'm getting runtime errors. Any guidance on how to solve this, please?
9 comments
Hi All, are QueryEngines being deprecated?

I came across this on the documentation page:
https://docs.llamaindex.ai/en/stable/module_guides/querying/structured_outputs/query_engine/
(Deprecated) Query Engines + Pydantic Outputs

@Logan M - any guidance from your team?
4 comments
Shawn1998

Warning

Dear @Logan M, I am using LlamaIndex, both the NL-to-SQL setup:
sql_database = SQLDatabase(engine, include_tables=["product_model"])

and the following:
# construct vector store
vector_store = PineconeVectorStore(pinecone_index=pinecone_index)
storage_context = StorageContext.from_defaults(vector_store=vector_store)

# Load documents from the specified directory
documents = SimpleDirectoryReader("./knowledge").load_data()

# Build the index over the documents (vectors stored in Pinecone)
index = VectorStoreIndex.from_documents(
documents, storage_context=storage_context
)

# Create a query engine from the index
reasoning_model = current_app.config['REASONING_MODEL']
Settings.llm = llamaOpenAI(temperature=0, model=reasoning_model)
query_engine = index.as_query_engine()

I'm now getting this warning, which I didn't see earlier.

UserWarning: Valid config keys have changed in V2:
  • 'allow_population_by_field_name' has been renamed to 'populate_by_name'
  • 'smart_union' has been removed
Any guidance on what's causing this warning and how to fix it, please?
1 comment
Hi, which versions of LlamaIndex are compatible with PineconeVectorStore? My code was working fine, but now I'm getting:

'PineconeVectorStore' object has no attribute '__pydantic_private__'. Did you mean: '__pydantic_complete__'?

I am currently using llama-index-vector-stores-pinecone 0.1.8

Would appreciate any guidance on fixing this, please.
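That `__pydantic_private__` AttributeError usually means the installed Pinecone integration was built against a different Pydantic major than the running llama-index-core (the attribute exists only on Pydantic v2 models). A quick local check of what is actually installed (a sketch; the package names are real, the diagnosis above is an assumption based on the Pydantic v1/v2 split):

```python
from importlib import metadata

def installed(pkg: str) -> str:
    """Return the installed version string, or 'not installed'."""
    try:
        return metadata.version(pkg)
    except metadata.PackageNotFoundError:
        return "not installed"

for pkg in ("pydantic", "llama-index-core", "llama-index-vector-stores-pinecone"):
    print(pkg, installed(pkg))
```

If the integration reports an old 0.1.x alongside a much newer llama-index-core, upgrading both in lockstep (`pip install -U llama-index-core llama-index-vector-stores-pinecone`) typically resolves the mismatch.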
3 comments