hello llamaindex people.
Does anyone know a quick and dirty way to specify an LLM for the Google search tools to use? I can't seem to find a way; I've been digging since yesterday.

I am specifying it like this:

```python
google_spec = GoogleSearchToolSpec(key=GOOGLE_SEARCH_API_KEY, engine=search_engine, num=5)
google_tools = LoadAndSearchToolSpec.from_defaults(google_spec.to_tool_list()[0]).to_tool_list()
```

Then I add it to the agent's tools list:

```python
tools=[vector_query_tool, *google_tools]
```

It seems the google_tools spec has separate load and read function components, and the read component uses an LLM (I might be misunderstanding here). Is there a way to specify directly which LLM to use?
5 comments
It's the load-and-search tool spec that uses an embedding model and an LLM

You set the global defaults on the Settings object

I think it's missing an easier way to pass it in, though
Here's the question though: if I set Settings.llm to Haiku, let's say,

but then separately define a gpt4_llm and pass it to the OpenAIAgent as the parameter llm=gpt4_llm,
would it respect that? Or would Settings.llm override the parameter?
Passing one in to a local constructor will always override the global settings 👍
that's great. thanks
need to lower the cost per token 😅