Updated 4 months ago

Hello, we are facing issues with Pydantic output with the chat engine. It all stopped working today!

query_engine = loaded_index.as_chat_engine(
    chat_mode="context",
    filters=filters,
    system_prompt=system_prompt,
    similarity_top_k=5,
    memory=memory,
    llm=llm,
    output_cls=output_format,
)
31 comments
What's the issue? How did you define the output class?

We migrated to pydantic v2, so you need to be a little careful
from typing import List
from pydantic import BaseModel, Field

class Song(BaseModel):
    """Data model for a song."""
    title: str = Field(..., description="The song name.")
    writer: str = Field(..., description="The song writer.")
    singer: str = Field(..., description="The song singer.")

query_engine = loaded_index.as_chat_engine(
    chat_mode="context",
    filters=filters,
    system_prompt=system_prompt,
    similarity_top_k=5,
    memory=memory,
    llm=llm,
    output_cls=Song,
)
something like that
mmmm actually, chat engine does not take an output class?
especially the context chat engine, others might
what you probably want is

Plain Text
sllm = llm.as_structured_llm(Song)

query_engine = loaded_index.as_chat_engine(
                chat_mode="context",
                filters=filters,
                system_prompt=system_prompt,
                similarity_top_k=5,
                memory=memory,
                llm=sllm
              )
ValidationError: 1 validation error for StructuredLLM
output_cls
subclass of BaseModel expected (type=type_error.subclass; expected_class=BaseModel)
I tried the above approach before, but it did not work for me
Had a small typo
don't pass output_cls
but beyond that, what version of llama-index-core do you have? pip show llama-index-core ?
Name: llama-index-core
Version: 0.10.63
then you'll want to do from pydantic.v1 import ...
v0.11.x of core moved to pydantic v2
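Concretely, with pydantic 2.x installed alongside llama-index-core 0.10.x, the usual workaround is to build your model against the v1 compatibility layer that pydantic 2.x ships as `pydantic.v1`. A minimal sketch, reusing the `Song` model from this thread (the field values are made up for illustration):

```python
# pydantic 2.x re-exports its old v1 API under pydantic.v1.
# llama-index-core 0.10.x still builds its own models against that v1 API,
# so any output class you hand to it should use the same import.
from pydantic.v1 import BaseModel, Field

class Song(BaseModel):
    """Data model for a song."""
    title: str = Field(..., description="The song name.")
    writer: str = Field(..., description="The song writer.")

song = Song(title="Yesterday", writer="Lennon/McCartney")
print(song.title)
```

Once core is upgraded to 0.11.x, the import goes back to plain `from pydantic import BaseModel, Field`.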
ValueError: Single '}' encountered in format string
from pydantic.v1 import BaseModel, Field
🤔 sounds like another issue tbh
how can i better help you with debugging?
Yes, I couldn't get it to run
If you can provide a google colab that replicates the issue, happy to dive in further
still facing the issue with google colab
Attachment
Screenshot_2024-08-23_at_6.36.19_PM.png
here is the log file
I believe the issue is with your chat_engine function.
it only works with query_engine
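A sketch of that query-engine path for structured outputs, assuming the same `loaded_index` and `llm` from this thread. The llama-index calls are commented out because they need a live index and API key; the pydantic parsing below them runs standalone:

```python
from typing import List
from pydantic.v1 import BaseModel, Field  # pydantic.v1 on llama-index-core 0.10.x

class Movie(BaseModel):
    """Object representing a single movie."""
    name: str = Field(..., description="Name of the movie.")
    year: int = Field(..., description="Year of the movie.")

class Movies(BaseModel):
    """Object representing a list of movies."""
    movies: List[Movie] = Field(..., description="List of movies.")

# Unlike the context chat engine, a query engine accepts output_cls directly:
# query_engine = loaded_index.as_query_engine(
#     output_cls=Movies,
#     similarity_top_k=5,
#     llm=llm,
# )
# response = query_engine.query("Please generate related movies to Titanic")
# response.response should then be a Movies instance.

# The structured payload the engine returns parses like any v1 model:
payload = {"movies": [{"name": "Poseidon", "year": 2006}]}
movies = Movies.parse_obj(payload)
print(movies.movies[0].name)
```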
Something in the prompt is causing formatting to have bug. Not sure what the issue is tbh, unless I could replicate it
!pip install llama-index-vector-stores-pinecone==0.1.7 pinecone-client==3.2.2 llama-index-embeddings-openai
!pip install llama-index==0.10.36

from llama_index.llms.openai import OpenAI as llama_index_openai
import openai
from llama_index.embeddings.openai import OpenAIEmbedding
from pinecone import Pinecone
from llama_index.vector_stores.pinecone import PineconeVectorStore
from llama_index.core import VectorStoreIndex
from typing import List
from pydantic.v1 import BaseModel, Field


class Movie(BaseModel):
    """Object representing a single movie."""

    name: str = Field(..., description="Name of the movie.")
    year: int = Field(..., description="Year of the movie.")


class Movies(BaseModel):
    """Object representing a list of movies."""

    movies: List[Movie] = Field(..., description="List of movies.")


openai.api_key = "XXX"
embed_model = OpenAIEmbedding(model="text-embedding-3-small")

llm = llama_index_openai(temperature=0.1, model="gpt-4o")
pc = Pinecone(api_key="XXX")
pinecone_index = pc.Index("quick_start")
vector_store = PineconeVectorStore(pinecone_index=pinecone_index)
loaded_index = VectorStoreIndex.from_vector_store(vector_store=vector_store, embed_model=embed_model)
sllm = llm.as_structured_llm(Movies)
query_engine = loaded_index.as_chat_engine(
    chat_mode="context",
    similarity_top_k=5,
    llm=sllm,
)
prompt = '''
Please generate related movies to Titanic
'''
response = query_engine.chat(prompt)
This is not working, but it is supposed to work
Let me know if you still face issues reproducing the problem