Find answers from the community

cK5150
Joined September 25, 2024
Take this tool

    tools = [
        Tool(
            name="GPT Index",
            func=lambda q: str(index.query(q)),
            description="Always use this tool. The input to this tool should be a complete english sentence.",
            return_direct=True,
        ),
    ]

and integrate it into a llama tool
15 comments
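The core of the tool above is the `func=lambda q: str(index.query(q))` wrapper. A minimal sketch of that pattern, with a hypothetical stub standing in for a real llama_index index (a real setup would build one from documents and pass the `Tool` to a LangChain agent):

```python
# Stub standing in for a real llama_index index (hypothetical; a real
# setup would build one with actual documents instead).
class StubIndex:
    def query(self, q):
        return f"stub answer to: {q}"

index = StubIndex()

# The essence of the Tool above: wrap the index query in a callable that
# always returns a plain string, which is what a LangChain agent expects.
gpt_index_tool = lambda q: str(index.query(q))

result = gpt_index_tool("What is llama_index?")
print(result)  # → stub answer to: What is llama_index?
```

The `str(...)` call matters because the index's query result is a response object, not a string, and agent tools are expected to return text.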
How does llama_index handle the default langchain prompt when using initialize_agent?
17 comments
How to add logging when calling a query through an agent
34 comments
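One common approach to the question above: llama_index reports its internal steps through Python's standard `logging` module, so attaching a handler is enough to trace a query. A minimal sketch (the logger name and the debug line are illustrative; a real run would call the agent instead):

```python
import io
import logging

# Capture log output in a buffer so it can be inspected or echoed to the
# console; pointing the handler at sys.stdout works the same way.
log_stream = io.StringIO()
handler = logging.StreamHandler(log_stream)
handler.setFormatter(logging.Formatter("%(levelname)s %(name)s: %(message)s"))

logger = logging.getLogger("agent_query")  # hypothetical logger name
logger.setLevel(logging.DEBUG)
logger.addHandler(handler)

# In a real agent run this line would be replaced by the agent call;
# here we just emit one trace record to show the plumbing.
logger.debug("running query: %s", "What did the author do growing up?")

captured = log_stream.getvalue()
print(captured)
```

Setting the level to `DEBUG` on the root logger instead would also surface llama_index's own internal messages.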
Given this prompt, how do I print the contents of context_str to the console?

    query_str = "What did the author do growing up?"
    QA_PROMPT_TMPL = (
        "We have provided context information below. \n"
        "---------------------\n"
        "{context_str}"
        "\n---------------------\n"
        "Given this information, please answer the question: {query_str}\n"
    )
    QA_PROMPT = QuestionAnswerPrompt(QA_PROMPT_TMPL)
19 comments
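One way to see what lands in `context_str` is to format the template by hand: in a real run, llama_index substitutes the retrieved node text for `{context_str}`, so printing the filled template shows exactly what the model receives. A minimal sketch, with a placeholder where the retrieved text would go:

```python
# The template from the question, filled in manually. The context value
# below is purely illustrative; llama_index would insert retrieved node
# text at that position.
QA_PROMPT_TMPL = (
    "We have provided context information below. \n"
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    "Given this information, please answer the question: {query_str}\n"
)

context_str = "(text of the retrieved nodes goes here)"
query_str = "What did the author do growing up?"

filled = QA_PROMPT_TMPL.format(context_str=context_str, query_str=query_str)
print(filled)
```

Printing `filled` to the console shows both the context block and the question exactly as they would be sent to the LLM.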
What is llama_chat_agent?
21 comments
cK5150

Responses

Are two queries and two responses involved in this? The first query being the question the user asks, passed to the first prompt. The bot gives a response, then that response is used by llama to generate a query for the second prompt, and that response is returned to the user? Thank you for the clarification.
9 comments
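On the flow asked about above: there is one user query, but the LLM can be called more than once. The first text chunk goes through the question-answer prompt, and each later chunk goes through a refine prompt together with the previous answer. A rough sketch of that loop, with a stub in place of the LLM (prompt wording and function names are illustrative, not the library's actual templates):

```python
# Stub LLM; a real setup would call an actual model here. It echoes the
# first line of its prompt so we can see which prompt produced the answer.
def llm(prompt):
    return f"answer derived from <<{prompt.splitlines()[0]}>>"

def synthesize(query, chunks):
    # First chunk: plain question-answer prompt.
    answer = llm(f"context: {chunks[0]}\nquestion: {query}")
    # Every later chunk: refine prompt that also sees the previous answer.
    for chunk in chunks[1:]:
        answer = llm(
            f"new context: {chunk}\n"
            f"existing answer: {answer}\n"
            f"refine the answer to: {query}"
        )
    return answer

result = synthesize("What did the author do growing up?", ["chunk A", "chunk B"])
print(result)  # → answer derived from <<new context: chunk B>>
```

So the "second query" is not generated from the bot's response as new user input; the same user question is re-asked against each chunk, carrying the previous answer along for refinement.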