When I register a function tool whose argument is a Pydantic model:

```python
class TestObj(BaseModel):
    id: str = Field(description="Unique identifier.")

def test_fn(obj: TestObj):
    return 1

FunctionTool.from_defaults(test_fn)
```

it fails with:

```
RuntimeError: no validator found for <class 'app.leader_agent.TestObj'>, see arbitrary_types_allowed in Config
```
Can IngestionPipeline.run use the stores from a StorageContext? The vector_store param on IngestionPipeline takes a BasePydanticVectorStore, but storage_context.vector_store is my SimpleVectorStore, and it's throwing a dict-invalid error:

```python
IngestionPipeline(
    transformations=[...],
    docstore=storage_context.docstore,  # loads/uses the storage context's docstore
    # errors:
    vector_store=storage_context.vector_store,
).run(documents=documents)
```
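One shape of workaround, sketched with toy stand-in classes (these are not the llama_index API): drop the `vector_store=` kwarg, run the pipeline standalone, and add the resulting nodes to the store explicitly afterwards.

```python
# Toy stand-ins illustrating the "run, then add" pattern; the real
# objects would be the ingestion pipeline and your vector store.
class ToyVectorStore:
    def __init__(self):
        self.nodes = []

    def add(self, nodes):
        self.nodes.extend(nodes)

class ToyPipeline:
    def __init__(self, transformations):
        self.transformations = transformations

    def run(self, documents):
        # apply each transformation to every node, in order
        nodes = documents
        for transform in self.transformations:
            nodes = [transform(n) for n in nodes]
        return nodes

pipeline = ToyPipeline(transformations=[str.lower])
nodes = pipeline.run(documents=["Doc A", "Doc B"])

store = ToyVectorStore()
store.add(nodes)  # explicit insert instead of the vector_store= kwarg
```

This sidesteps the type check entirely, at the cost of doing the store insert yourself.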
(I update the docstore after getting the document nodes, since there's no storage_context arg to SimpleDirectoryReader.) I'm passing vector_store here, but should I also update the docstore? Will vector_stores[vector_store.name] automatically get the chunks/nodes from the transformations, or only if one of the transformation steps adds embeddings to each node? (And should I call add_index_struct and read it back later by rebuilding an index from the index struct?)

Separate question: I pasted the documentation's extractor example as-is:

```python
class CustomExtractor(BaseExtractor):
    def extract(self, nodes):
        metadata_list = [
            {
                "custom": (
                    node.metadata["document_title"]
                    + "\n"
                    + node.metadata["excerpt_keywords"]
                )
            }
            for node in nodes
        ]
        return metadata_list
```

but instantiating it raises:

```
TypeError: Can't instantiate abstract class CustomExtractor with abstract method aextract
```
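The error reads like the base class declares `aextract` (the async variant) as abstract, so a subclass that only defines `extract` is still abstract. A minimal stdlib reconstruction of the failure mode and the fix, using a toy base class rather than the real BaseExtractor:

```python
import asyncio
from abc import ABC, abstractmethod

class ToyBaseExtractor(ABC):
    # the abstract method is the *async* one, mirroring the error message
    @abstractmethod
    async def aextract(self, nodes): ...

    def extract(self, nodes):
        # sync convenience wrapper over the async implementation
        return asyncio.run(self.aextract(nodes))

class CustomExtractor(ToyBaseExtractor):
    # implementing aextract (not extract) makes the class concrete
    async def aextract(self, nodes):
        return [{"custom": node["document_title"]} for node in nodes]

metadata_list = CustomExtractor().extract([{"document_title": "T1"}])
```

So the likely fix for the pasted example is to rename `extract` to `async def aextract`.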
Also: StorageContext.from_defaults(persist_dir) gives me FileNotFoundError: [Errno 2] No such file or directory: .../docstore.json. Shouldn't that be created if it doesn't exist (first time) and otherwise loaded in?

And is there a way to pass stop_sequence to the Anthropic LLM? I get:

```
TypeError: Messages.create() got an unexpected keyword argument 'stop_sequence'
```
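For what it's worth, the underlying Anthropic Messages API spells this parameter `stop_sequences` (plural, a list of strings), not `stop_sequence`. Whether your wrapper version forwards it via `additional_kwargs` is an assumption worth checking against your installed version; a hedged sketch (model name is only illustrative):

```python
# Assumption: the wrapper passes additional_kwargs through to
# Messages.create(); verify against your installed llama_index version.
from llama_index.llms.anthropic import Anthropic

llm = Anthropic(
    model="claude-3-opus-20240229",
    additional_kwargs={"stop_sequences": ["\n\nHuman:"]},  # note the plural
)
```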
Does the Anthropic LLM respect the system_prompt kwarg, i.e.

```python
llm = Anthropic(system_prompt=system_prompt)
```

or do I need to put it in memory myself:

```python
self.memory.put(ChatMessage(content=system_prompt, role=MessageRole.SYSTEM))
```
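In case the kwarg turns out to be a no-op in a given version, a fallback is to prepend the system message to the chat history yourself. A toy sketch with stand-in types (the real ones would be ChatMessage / MessageRole):

```python
from dataclasses import dataclass

@dataclass
class Msg:
    role: str
    content: str

def with_system(system_prompt, history):
    """Return history with exactly one system message, always first."""
    rest = [m for m in history if m.role != "system"]
    return [Msg("system", system_prompt)] + rest

msgs = with_system("Answer tersely.", [Msg("user", "hi")])
```

Doing it at call time (rather than storing the system message in memory) avoids duplicating it on every turn.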
I also get:

```
TypeError: Messages.create() got an unexpected keyword argument 'tools'
```

Is there any guidance on tool calling with llama's Anthropic wrapper? (This comes up from chat_repl.) Has anyone else bumped into that?
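An unexpected `tools` kwarg from Messages.create() usually suggests the installed SDK (or wrapper) predates tool-use support, so a version gate before enabling tools can make the failure explicit. A generic sketch; the threshold below is illustrative, not an authoritative cutoff:

```python
def at_least(installed: str, minimum: tuple) -> bool:
    """True if a dotted version string meets a (major, minor) minimum."""
    parts = tuple(int(p) for p in installed.split(".")[: len(minimum)])
    return parts >= minimum

# e.g. gate tool use on the installed SDK version; at runtime the
# string would come from importlib.metadata.version("anthropic")
```

If the check fails, upgrading the SDK and the wrapper together is the likely fix.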