
Hi all - can anyone (Logan M!) please explain the difference between SQL Auto Vector Query Engine and SQL Query Join Engine? It looks like Join is more recent, and more flexible (in that any other engine can be used, whereas Auto Vector only supports SQL + Vector DB) - is that right?
... further I'm trying to use SQLTableRetrieverQueryEngine in conjunction with either of these, but I believe that only NLSQLTableQueryEngine is supported: is that correct?
And lastly: should I be thinking of using Data Agents perhaps instead?
My aim is to use an SQL DB (with many tables) and Vector DB as data sources
That's right! The auto vector version came first, but is actually deprecated. Realized there was no reason to restrict it to vector stuff haha
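For reference, a minimal sketch of the join-engine route (imports follow the 0.8-era llama_index layout and may differ by version; the SQLite URL, table name, and other_engine are placeholder assumptions - the second tool can wrap any query engine, not just a vector one):
Python
from sqlalchemy import create_engine
from llama_index import SQLDatabase
from llama_index.indices.struct_store.sql_query import NLSQLTableQueryEngine
from llama_index.query_engine import SQLJoinQueryEngine
from llama_index.tools import QueryEngineTool

# Text-to-SQL engine over one table (DB URL and table name are placeholders)
sql_database = SQLDatabase(create_engine("sqlite:///example.db"), include_tables=["city_stats"])
sql_engine = NLSQLTableQueryEngine(sql_database=sql_database, tables=["city_stats"])

sql_tool = QueryEngineTool.from_defaults(
    query_engine=sql_engine,
    description="Translates natural language questions into SQL over the city_stats table",
)
# other_engine is assumed to be defined elsewhere; it can be any query engine
# (vector, keyword, another SQL engine, ...)
other_tool = QueryEngineTool.from_defaults(
    query_engine=other_engine,
    description="Answers semantic questions over the unstructured documents",
)

join_engine = SQLJoinQueryEngine(sql_tool, other_tool)
response = join_engine.query("Which city has the highest population, and what is it known for?")
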
A data agent would also work nicely for this. Each query engine becomes a tool (i.e. a SQL query engine, a vector engine, etc.), and the agent will use as many tools as needed to accomplish a task
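Roughly, the data-agent route looks like this (a sketch only - sql_engine, vector_engine and llm are placeholders for whatever engines and LLM you already have):
Python
from llama_index.agent import ReActAgent
from llama_index.tools import QueryEngineTool

# Each query engine becomes a tool; the agent decides which tools to call
query_engine_tools = [
    QueryEngineTool.from_defaults(
        query_engine=sql_engine,
        name="sql_tool",
        description="Useful for questions answerable from the SQL database",
    ),
    QueryEngineTool.from_defaults(
        query_engine=vector_engine,
        name="vector_tool",
        description="Useful for semantic questions over the document collection",
    ),
]

agent = ReActAgent.from_tools(query_engine_tools, llm=llm, verbose=True)
response = agent.chat("Summarise what the documents say about the largest city in the database")
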
Excellent thanks Logan - I'll look into using a data agent, although lack of "support" for Vertex AI slightly worries me...! Do you think I could get SQLTableRetrieverQueryEngine working well with SQL Query Join Engine without too much hassle, or is it better to focus on a data agent?
Looking further into it, it looks like explicit support for Vertex AI doesn't matter so much, as the ReAct agent will work with any LLM 🎯
We support Google PaLM if that's what you are looking for!

Or you can use langchain integrations in llamaindex as well πŸ‘
Basically, we have an OpenAI agent that uses their function-calling API. And then all other LLMs use the ReAct agent

In our findings though, the OpenAI version is a little more reliable 👍
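For completeness, the function-calling variant is roughly the following (OpenAI models only; query_engine_tools as in the sketch above, and the model name is just an example):
Python
from llama_index.agent import OpenAIAgent
from llama_index.llms import OpenAI

# Uses the OpenAI function-calling API; non-OpenAI LLMs fall back to the ReAct agent
agent = OpenAIAgent.from_tools(
    query_engine_tools,
    llm=OpenAI(model="gpt-3.5-turbo-0613"),
    verbose=True,
)
response = agent.chat("Which table has the most rows?")
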
When you say you support PaLM, you mean via a langchain LLM wrapper, right? i.e. from langchain.llms import VertexAI
nah we have a dedicated PaLM LLM
ahhhhh... hadn't seen that! A new arrival?
Mmm within the last 3 weeks I think, so pretty new πŸ‘
Note for posterity (and correct me if I'm wrong, Logan): that PaLM implementation uses the "public" PaLM API, not Vertex AI's "enterprise"-oriented offering
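For reference, the dedicated PaLM LLM is used roughly like this (it talks to the public, API-key-based PaLM API rather than the Vertex AI enterprise endpoint, per the note above; the key is a placeholder):
Python
from llama_index.llms import PaLM

# Public PaLM API (API key based), not Vertex AI
llm = PaLM(api_key="YOUR_PALM_API_KEY")
print(llm.complete("Hello, PaLM"))
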
Hey Logan, so the following isn't happy:
Plain Text
from langchain.llms import VertexAI
from llama_index.agent import ReActAgent

# query_engine_tools defined elsewhere; passing the raw langchain LLM fails:
llm = VertexAI(**VERTEXAI_KWARGS)
agent = ReActAgent.from_tools(query_engine_tools, llm=llm, verbose=True)
agent.query("How long did it take to build Rome?")

*** AttributeError: 'VertexAI' object has no attribute 'chat'

... no matter whether I call agent.query() or agent.chat()
I can see that VertexAI has no chat() member method... maybe I need to wrap within a CustomLLM? I can't use the PaLM abstraction you guys offer: need to use Vertex AI
Ah, langchain LLMs need to be wrapped

In the service context it wraps it automatically, but I guess for an agent you have to do it

Plain Text
from llama_index.llms import LangChainLLM

llm = LangChainLLM(VertexAI(...))
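Putting the pieces together, a minimal end-to-end sketch (assuming the VERTEXAI_KWARGS and query_engine_tools from the snippet above):
Python
from langchain.llms import VertexAI
from llama_index.agent import ReActAgent
from llama_index.llms import LangChainLLM

# Wrap the langchain LLM so the agent sees llama-index's chat interface
llm = LangChainLLM(VertexAI(**VERTEXAI_KWARGS))

agent = ReActAgent.from_tools(query_engine_tools, llm=llm, verbose=True)
response = agent.chat("How long did it take to build Rome?")
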
Aha - excellent, thank you that's got it working