The import statement

```python
from llama_index.llms.gemini import Gemini
```

is correct. It imports the `Gemini` class from the `llama_index.llms.gemini` module. You can use this class to create an instance of the Gemini model and use its methods for tasks such as generating text completions or holding a chat conversation:

```python
from llama_index.llms.gemini import Gemini

llm = Gemini(model_name="models/gemini-ultra", api_key="YOUR_API_KEY")
resp = llm.complete("Write a poem about a magic backpack")
print(resp)
```
In this example, an instance of the `Gemini` class is created with a specific model name and an API key, and the `complete` method is then called on that instance with a prompt to generate a text completion. Replace `"YOUR_API_KEY"` with your actual API key.

The same model can also drive a ReAct agent equipped with tools:

```python
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import FunctionTool
from llama_index.llms.gemini import Gemini

# Define a sample tool
def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result integer"""
    return a * b

# Create a Tool instance for the multiply function
multiply_tool = FunctionTool.from_defaults(fn=multiply)

# Initialize the language model
llm = Gemini(model_name="models/gemini-ultra", api_key="YOUR_API_KEY", transport="rest")

# Initialize the ReAct agent with the multiply tool
agent = ReActAgent.from_tools([multiply_tool], llm=llm, verbose=True)

# Query the agent to calculate 85 multiplied by 9
resp = agent.query("Calculate 85 multiplied by 9")

# Print the response from the agent
print(resp.response)
```
Here, the `multiply` function is wrapped in a `FunctionTool` instance that the agent can call to answer the query. Again, replace `"YOUR_API_KEY"` with your actual API key.
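The agent example above depends on LlamaIndex, but the underlying ReAct idea — the model picks a tool, the framework executes it, and the observation feeds back into the final answer — can be sketched in plain Python. The `tools` registry and the hard-coded tool "decision" below are illustrative stand-ins, not part of the LlamaIndex API:

```python
# Toy sketch of a ReAct-style tool call: in a real agent the LLM chooses
# the tool and its arguments from its reasoning trace; here that choice
# is hard-coded so the loop's mechanics are visible.

def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result integer"""
    return a * b

# Registry mapping tool names to callables (stand-in for FunctionTool instances)
tools = {"multiply": multiply}

def run_agent(query: str) -> str:
    # Step 1: the "model" decides which tool to call and with what arguments
    tool_name, args = "multiply", {"a": 85, "b": 9}
    # Step 2: the framework executes the chosen tool
    observation = tools[tool_name](**args)
    # Step 3: the observation becomes the final answer
    return f"85 multiplied by 9 is {observation}"

print(run_agent("Calculate 85 multiplied by 9"))
```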
`FunctionTool` and `LlamaParse` are two different components in the LlamaIndex library, each serving a different purpose.

`FunctionTool` is a class that lets you convert any user-defined function into a tool that an agent can use. It takes a function as an argument and can auto-infer the function's schema. This is useful when you want a tool that performs a specific operation, such as multiplying or adding two numbers. Here is an example of how to use it:

```python
from llama_index.core.tools import FunctionTool

def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result integer"""
    return a * b

multiply_tool = FunctionTool.from_defaults(fn=multiply)
```
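The schema auto-inference mentioned above draws on the function's signature, type hints, and docstring. A rough standard-library sketch of that mechanism (an illustration of the idea, not LlamaIndex's actual implementation — `infer_schema` is a hypothetical helper):

```python
import inspect
from typing import get_type_hints

def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result integer"""
    return a * b

def infer_schema(fn):
    """Build a simple tool schema from a function's signature and docstring."""
    hints = get_type_hints(fn)
    # Map each parameter name to the name of its annotated type
    params = {
        name: hints.get(name, object).__name__
        for name in inspect.signature(fn).parameters
    }
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn),
        "parameters": params,
        "returns": hints.get("return", object).__name__,
    }

print(infer_schema(multiply))
# → {'name': 'multiply', 'description': 'Multiply two integers and return the result integer',
#    'parameters': {'a': 'int', 'b': 'int'}, 'returns': 'int'}
```

This is why the docstring in the tool examples matters: the agent uses the generated description to decide when to call the tool.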
`LlamaParse`, on the other hand, is not mentioned in the provided knowledge sources. You may be referring to a different component or concept in the LlamaIndex ecosystem. If you can provide more context or details about `LlamaParse`, I can give a more accurate comparison and help you decide whether you need `FunctionTool`, `LlamaParse`, or possibly both.