Find answers from the community

mato
Is it possible to chain multiple OpenAI agents?

It seems that it is possible but I don't know how.

https://docs.llamaindex.ai/en/stable/module_guides/deploying/agents/tools/root.html

In the docs: Note: since our agent abstractions inherit from BaseQueryEngine, these tools can also wrap other agents.


Chaining OpenAI agents, though, gives me this error:
Got output: Error: Error code: 400 - {'error': {'message': "'$.messages[0].content' is invalid. Please check the API reference: https://platform.openai.com/docs/api-reference.", 'type': 'invalid_request_error', 'param': None, 'code': None}}
========================
3 comments
How can we specify the arguments of PandasQueryEngine?
6 comments
Hi guys, wondering one thing.

I have these pipelines, but they insert new data every time they are run (so after 3 runs, with retrieval top_k = 3, all three retrieved chunks are the same text)...

Why?

pipelines = {
    "QA": IngestionPipeline(
        transformations=[
            SentenceSplitter(paragraph_separator="\n\n\n", chunk_size=300, chunk_overlap=20),
            TitleExtractor(),
            OpenAIEmbedding(model="text-embedding-3-large"),
        ],
        vector_store=self.vector_store,
        cache=IngestionCache(),
    ),
    "Klubista": IngestionPipeline(
        transformations=[
            SentenceSplitter(chunk_size=400, chunk_overlap=50),
            TitleExtractor(),
            OpenAIEmbedding(model="text-embedding-3-large"),
        ],
        vector_store=self.vector_store,
        cache=IngestionCache(),
    ),
    "PrevadzkovyPoriadok": IngestionPipeline(
        transformations=[
            SentenceSplitter(chunk_size=400, chunk_overlap=50),
            TitleExtractor(),
            OpenAIEmbedding(model="text-embedding-3-large"),
        ],
        vector_store=self.vector_store,
        cache=IngestionCache(),
    ),
    "OtherDocs": IngestionPipeline(
        transformations=[
            SentenceSplitter(chunk_size=400, chunk_overlap=50),
            TitleExtractor(),
            OpenAIEmbedding(model="text-embedding-3-large"),
        ],
        vector_store=self.vector_store,
        cache=IngestionCache(),
    ),
}
33 comments
Does anyone have ideas on how to use QueryPipeline with Routers? I haven't found any example.

I have something like this:
qp = QueryPipeline(verbose=True)
qp.add_modules(
    {
        "input": InputComponent(),
        "retriever": retriever,
        "summarizer": summarizer,
    }
)

qp.add_link("input", "retriever")
qp.add_link("retriever", "summarizer", dest_key="nodes")
qp.add_link("input", "summarizer", dest_key="query_str")


[0.9.47] - 2024-02-11
Last patch before v0.10!

New Features
add conditional links to query pipeline (#10520)
refactor conditional links + add to cookbook (#10544)
agent + query pipeline cleanups (#10563)
7 comments
Hi all. Does anyone have an idea how to use a QueryPipeline as a tool (for use in a Router)?
5 comments
I can see this code in https://github.com/run-llama/llama_index/pull/14088, but when I use it I get:

File "/home//anaconda3/envs//lib/python3.11/site-packages/llama_index/core/agent/function_calling/step.py", line 158, in from_tools
return cls(
^^^^
File "/home//anaconda3/envs//lib/python3.11/site-packages/llama_index/core/agent/function_calling/step.py", line 102, in __init__
raise ValueError(
ValueError: Model name models/gemini-1.5-flash-latest does not support function calling API.
7 comments
What is the best way to work with CSVs?
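It depends on the questions. For structured or numeric queries, putting the CSV in a pandas DataFrame (and, if you want natural-language access, layering PandasQueryEngine on top) usually beats chunk-and-embed; for prose-heavy CSVs, ingesting each row as a document works. A minimal pandas-only sketch with invented data:

```python
# Sketch: a CSV in pandas as the substrate for either direct queries
# or row-wise Document ingestion. Data below is invented.
from io import StringIO
import pandas as pd

csv_text = """name,category,price
widget,tools,9.99
gadget,tools,19.99
manual,docs,4.99
"""

df = pd.read_csv(StringIO(csv_text))

# Structured questions are often best answered directly in pandas...
tools_total = df.loc[df["category"] == "tools", "price"].sum()
print(round(tools_total, 2))  # 29.98

# ...while each row can also become a text Document for retrieval-style use:
rows_as_text = [", ".join(f"{c}={v}" for c, v in row.items()) for row in df.to_dict("records")]
```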
4 comments