Hello <@334536717648265216>,

I am looking to integrate a prompt framework for my Agent [Function Calling Agent Worker]. It would be great if you could suggest a suitable option.

I have started working with Promptify but encountered version compatibility issues between LlamaIndex [0.10.27] and Promptify [2.0.23].

Specifically:
  • LlamaIndex core requires openai>=1.1.0, while Promptify supports only openai 0.27.0.
  • LlamaIndex core requires 4.66.1 <= tqdm <= 5.0.0, but Promptify supports only tqdm 4.65.0.
Our agent code is ready, but I need a robust prompt library or framework to integrate with LlamaIndex. Your suggestions would be greatly appreciated.
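One quick way to see the conflicting pins on your machine is to query the installed package versions with the standard library (a dependency-free sketch; the package names come from the versions quoted above):

```python
from importlib.metadata import version, PackageNotFoundError
from typing import Optional

def installed_version(pkg: str) -> Optional[str]:
    """Return the installed version of *pkg*, or None if it is missing."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return None

# The packages whose pins conflict between LlamaIndex 0.10.x and Promptify,
# per the error messages above.
for pkg in ("llama-index", "promptify", "openai", "tqdm"):
    v = installed_version(pkg)
    print(f"{pkg}: {v if v else 'not installed'}")
```

Comparing this output against the requirements above confirms which package is out of range before you try any workaround.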
Please stand by @pavan._01 as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
LlamaIndex itself provides robust prompt abstractions that capture all the common interaction patterns with LLMs. It supports a set of default prompt templates that work well out of the box. You can also provide your own prompt templates to further customize the behavior of the framework.

For example, you can define a prompt template using the ChatPromptTemplate class in LlamaIndex, as shown in this example:

Python
from llama_index.core import ChatPromptTemplate
from llama_index.core.llms import ChatMessage

prompt = ChatPromptTemplate(
    message_templates=[
        ChatMessage(
            role="system",
            content=(
                "You are an expert assistant for summarizing and extracting insights from sales call transcripts."
            ),
        ),
        ChatMessage(
            role="user",
            content=(
                "Here is the transcript: \n"
                "------\n"
                "{transcript}\n"
                "------"
            ),
        ),
    ]
)
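If it helps to see the underlying pattern without installing anything, the same template idea can be sketched in plain Python. Note this `MiniChatTemplate` is a toy stand-in for illustration, not LlamaIndex's actual API:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class MiniChatTemplate:
    """Toy stand-in for ChatPromptTemplate: (role, content) pairs with {placeholders}."""
    message_templates: List[Tuple[str, str]]

    def format_messages(self, **kwargs) -> List[Tuple[str, str]]:
        # Substitute keyword arguments into each message body.
        return [(role, content.format(**kwargs)) for role, content in self.message_templates]

prompt = MiniChatTemplate(
    message_templates=[
        ("system", "You are an expert assistant for summarizing sales call transcripts."),
        ("user", "Here is the transcript:\n------\n{transcript}\n------"),
    ]
)

messages = prompt.format_messages(transcript="Customer asked about pricing tiers.")
```

In LlamaIndex the equivalent call is `prompt.format_messages(transcript=...)`, which returns ChatMessage objects ready to pass to an LLM.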


You can also specify a system prompt to give the agent additional instruction or personality, as shown in this example:

Python
from llama_index.agent.openai import OpenAIAgent
from llama_index.llms.openai import OpenAI
from llama_index.core.prompts.system import SHAKESPEARE_WRITING_ASSISTANT

llm = OpenAI(model="gpt-3.5-turbo-0613")

# multiply_tool and add_tool are FunctionTool instances defined elsewhere in the agent code
agent = OpenAIAgent.from_tools(
    [multiply_tool, add_tool],
    llm=llm,
    verbose=True,
    system_prompt=SHAKESPEARE_WRITING_ASSISTANT,
)
For more complex workflows, LlamaIndex provides robust abstractions for creating sequential prompt chains, as well as general DAGs to orchestrate prompts with any other component. You can learn more about this in the Query Pipelines guide.
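The chaining idea itself is simple: each step's output becomes the next prompt's input. A dependency-free sketch (the `summarize`/`extract_actions` stand-ins are hypothetical; a real pipeline would call the LLM at each step):

```python
from typing import Callable, List

def run_chain(steps: List[Callable[[str], str]], initial: str) -> str:
    """Feed each step's output into the next, like a sequential prompt chain."""
    text = initial
    for step in steps:
        text = step(text)
    return text

# Stand-ins for LLM calls; each would normally format a prompt and query the model.
summarize = lambda t: f"Summary of: {t}"
extract_actions = lambda t: f"Action items from: {t}"

result = run_chain([summarize, extract_actions], "full call transcript")
```

Query Pipelines generalize this from a linear chain to a DAG, so one component's output can fan out to several downstream prompts.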

I hope this helps! If you have any other questions, feel free to ask.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
Hey @Logan M, @WhiteFang_Jr.
I get it if you're busy, but a quick one or two-line reply would mean a lot to me.
Hey!
You'll have to check whether Promptify can work with the openai and tqdm versions that LlamaIndex requires.
Sure, thank you 😀