What's the issue? How did you define the output class?
We migrated to pydantic v2, so we need to be a little careful:
from typing import List
from pydantic import BaseModel, Field

class Song(BaseModel):
    """Data model for a song."""

    title: str = Field(..., description="The song name.")
    writer: str = Field(..., description="The song writer.")
    singer: str = Field(..., description="The song singer.")
query_engine = loaded_index.as_chat_engine(
    chat_mode="context",
    filters=filters,
    system_prompt=system_prompt,
    similarity_top_k=5,
    memory=memory,
    llm=llm,
    output_cls=Song,
)
Hmm, actually the chat engine does not take an output class?
Especially the context chat engine; others might.
what you probably want is
sllm = llm.as_structured_llm(Song)

query_engine = loaded_index.as_chat_engine(
    chat_mode="context",
    filters=filters,
    system_prompt=system_prompt,
    similarity_top_k=5,
    memory=memory,
    llm=sllm,
)
ValidationError: 1 validation error for StructuredLLM
output_cls
subclass of BaseModel expected (type=type_error.subclass; expected_class=BaseModel)
i tried the above approach before, but it did not work for me
but beyond that, what version of llama-index-core do you have? (pip show llama-index-core)
Name: llama-index-core
Version: 0.10.63
then you'll want to do from pydantic.v1 import ... instead; v0.11.x of core moved to pydantic v2, but 0.10.x still expects v1-style models
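fwiw, here's a quick sketch of that version rule as code. pydantic_module_for is a hypothetical helper just for illustration, not part of llama-index:

```python
def pydantic_module_for(core_version: str) -> str:
    """Illustrative helper: pick the namespace to import BaseModel/Field
    from, given the installed llama-index-core version string."""
    major, minor = (int(part) for part in core_version.split(".")[:2])
    # core >= 0.11 is native pydantic v2; 0.10.x still expects v1-style models
    return "pydantic" if (major, minor) >= (0, 11) else "pydantic.v1"
```

so pydantic_module_for("0.10.63") gives "pydantic.v1", while any 0.11.x gives plain "pydantic".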
ValueError: Single '}' encountered in format string
from pydantic.v1 import BaseModel, Field
🤔 sounds like another issue tbh
how can i better help you with debugging?
yes i couldn't get it to run
If you can provide a google colab that replicates the issue, happy to dive in further
still facing the issue in Google Colab
i believe it is with your chat_engine function;
it only works with query_engine
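for reference, on the query-engine side structured output is usually wired up by passing the output class directly. a rough sketch only, assuming a loaded_index and a Movies model like the ones in this thread (not runnable here without real credentials and an index):

```python
# sketch: assumes loaded_index, llm, and the Movies model from this thread
query_engine = loaded_index.as_query_engine(
    output_cls=Movies,      # query engines accept an output class directly
    response_mode="compact",
    similarity_top_k=5,
    llm=llm,
)
response = query_engine.query("Please generate related movies to Titanic")
# on success, response.response holds a Movies instance
```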
Something in the prompt is causing a formatting bug. Not sure what the issue is tbh, unless I can replicate it
!pip install llama-index-vector-stores-pinecone==0.1.7 pinecone-client==3.2.2 llama-index-embeddings-openai
!pip install llama-index==0.10.36
from typing import List

import openai
from llama_index.core import VectorStoreIndex
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.llms.openai import OpenAI as llama_index_openai
from llama_index.vector_stores.pinecone import PineconeVectorStore
from pinecone import Pinecone
from pydantic.v1 import BaseModel, Field

class Movie(BaseModel):
    """Object representing a single movie."""

    name: str = Field(..., description="Name of the movie.")
    year: int = Field(..., description="Year of the movie.")

class Movies(BaseModel):
    """Object representing a list of movies."""

    movies: List[Movie] = Field(..., description="List of movies.")

openai.api_key = "XXX"
embed_model = OpenAIEmbedding(model="text-embedding-3-small")
llm = llama_index_openai(temperature=0.1, model="gpt-4o")

pc = Pinecone(api_key="XXX")
pinecone_index = pc.Index("quick_start")
vector_store = PineconeVectorStore(pinecone_index=pinecone_index)
loaded_index = VectorStoreIndex.from_vector_store(
    vector_store=vector_store, embed_model=embed_model
)

sllm = llm.as_structured_llm(Movies)
query_engine = loaded_index.as_chat_engine(
    chat_mode="context",
    similarity_top_k=5,
    llm=sllm,
)

prompt = '''
Please generate related movies to Titanic
'''
response = query_engine.chat(prompt)
this is not working, but it is supposed to work
let me know if you still face issues reproducing the problem