is there a way to make it access that function and execute it?
Yes! You can create a tool from any function. It's important to write a good docstring, especially for complex functions.

https://gpt-index.readthedocs.io/en/latest/core_modules/agent_modules/agents/usage_pattern.html
Awesome, do I assume it identifies the function by its docstring?
It looks at the function signature and the docstring, yes 👍
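(As a quick illustration of that, here's a sketch using the FunctionTool pattern discussed further down in this thread; multiply is just a made-up example function:)

Python
from llama_index.tools import FunctionTool

def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b

tool = FunctionTool.from_defaults(fn=multiply)
# The tool's name and description are derived from the function name,
# signature, and docstring
print(tool.metadata.name)
print(tool.metadata.description)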
Awesome! Thank you
Quick question, and sorry for the bother, but if I wanted to use a HuggingFace model (looking all over the docs rn, still am), how exactly do I switch the local predictor and the embed model?
I assume I also need to download the model and point it to a local file, or...
Yeah, basically you can use the HuggingFace LLM or embed model, and it will download the model you point it to.

The LLMs sometimes require careful prompt customization, so make sure to read the model card. Happy to help with that if you get stuck

LLM
https://gpt-index.readthedocs.io/en/latest/core_modules/model_modules/llms/usage_custom.html#example-using-a-huggingface-llm

Embedding
https://gpt-index.readthedocs.io/en/latest/core_modules/model_modules/embeddings/usage_pattern.html#embedding-model-integrations
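Roughly, a minimal sketch of wiring both in (assuming a legacy llama_index / gpt-index release where HuggingFaceLLM, HuggingFaceEmbedding, and ServiceContext are available; the model names below are just examples):

Python
from llama_index import ServiceContext, VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms import HuggingFaceLLM
from llama_index.embeddings import HuggingFaceEmbedding

# Local LLM; weights are pulled from the Hugging Face hub on first use
llm = HuggingFaceLLM(
    model_name="StabilityAI/stablelm-tuned-alpha-3b",   # example model name
    tokenizer_name="StabilityAI/stablelm-tuned-alpha-3b",
    context_window=2048,
    max_new_tokens=256,
    device_map="auto",
)

# Local embedding model (example model name)
embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

# Swap both into the service context so indexes and query engines use them
service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)

documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents, service_context=service_context)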
Hi, sorry to bother, but where is the updated documentation for this? I can't find it
Apologies, I meant for creating custom functions in Python and giving the LLM the ability to execute them
I thought it used to be in Llama Hub; it's the ability to define a function by its docstring and function signature and utilize it
I want to test NLP queries using llama index to update SQL
Oh that's just FunctionTool

Python
from llama_index.agent import OpenAIAgent
from llama_index.tools import FunctionTool

def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

# The tool's name and description are inferred from the function signature and docstring
tool = FunctionTool.from_defaults(fn=add)

agent = OpenAIAgent.from_tools([tool])
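A quick usage sketch on top of that (assuming an OpenAI API key is configured; the query string is just an illustration):

Python
response = agent.chat("What is 7 + 12? Use the add tool.")
print(response)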