Find answers from the community

Why is the node metadata in my LlamaIndex always /8284/hsk2/82849? Especially for CJK fonts?
1 comment
How do I display citations and sources in LlamaIndex?

The CitationQueryEngine doesn't seem to be enough.
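For reference, the baseline CitationQueryEngine setup looks like the sketch below: it splits retrieved nodes into numbered citation chunks and exposes the cited passages via response.source_nodes. This assumes a recent 0.10-style install and builds a throwaway index just for illustration:

Plain Text
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.core.query_engine import CitationQueryEngine

# build (or load) an index as usual
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# wrap the index in a citation-aware query engine
query_engine = CitationQueryEngine.from_args(
    index,
    similarity_top_k=3,
    citation_chunk_size=512,  # size of the numbered citation chunks
)

response = query_engine.query("What does the document say about X?")
print(response)  # answer text with [1], [2] style citation markers
for source in response.source_nodes:
    print(source.node.get_text())  # the cited source passages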
2 comments

Types

It has better type hints
1 comment
Plain Text
from llama_index.agent.openai import OpenAIAgent


I noticed this doesn't work on the stable version of LlamaIndex: https://docs.llamaindex.ai/en/stable/examples/agent/openai_agent_with_query_engine.html
Where can I get the OpenAIAgent class?

I tried
Plain Text
from llama_index.core.agent.openai import OpenAIAgent


That didn't work either.
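In 0.10+ the OpenAI agent lives in the separate llama-index-agent-openai integration package (bundled with the llama-index meta-package, but not with llama-index-core alone), so the first import path should work once that package is installed. A minimal sketch, assuming an OpenAI key in the environment; the multiply tool and model name are just placeholders:

Plain Text
# pip install llama-index-agent-openai llama-index-llms-openai
from llama_index.agent.openai import OpenAIAgent
from llama_index.llms.openai import OpenAI
from llama_index.core.tools import FunctionTool

def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

agent = OpenAIAgent.from_tools(
    [FunctionTool.from_defaults(fn=multiply)],
    llm=OpenAI(model="gpt-4o-mini"),
    verbose=True,
)
print(agent.chat("What is 7 * 6?"))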
6 comments
Is there a way to have the OpenAIAgent behave like the OpenAI Assistant, where the response consists of a tool call, then a message, then another tool call, and so on until it judges the response good enough? As opposed to the default in LlamaIndex's OpenAIAgent, where the agent calls tool after tool and only sends a message once it thinks the answer is good.
1 comment
I'm eyeing this part of the docs and testing locally: https://docs.llamaindex.ai/en/stable/examples/agent/agent_runner/agent_runner.html

My questions are:
  • How can I get streaming of the step / response?
  • I saw the type hints, but the output is the same for task.arunstep and task.astreamstep
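For context, a minimal sketch of driving a task step by step with the lower-level API from that page, assuming agent is an OpenAIAgent / AgentRunner built with .from_tools(...); how the streaming variants (stream_step / astream_step) surface tokens is exactly the open question, so that part is left out:

Plain Text
# assumes `agent` was created via OpenAIAgent.from_tools(...)
task = agent.create_task("Compare the revenue growth of Uber and Lyft in 2021")

# run the task one step at a time
while True:
    step_output = agent.run_step(task.task_id)
    print(step_output.output)  # intermediate agent output for this step
    if step_output.is_last:
        break

# assemble the final response once the last step has completed
response = agent.finalize_response(task.task_id)
print(str(response))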
8 comments
Also, is there a small-to-big indexing / retriever?
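One small-to-big pattern is the sentence-window setup: index small sentence nodes, then swap in the larger surrounding window before synthesis (the other common variant is a RecursiveRetriever over IndexNode chunks of several sizes). A sketch with in-memory defaults:

Plain Text
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.core.node_parser import SentenceWindowNodeParser
from llama_index.core.postprocessor import MetadataReplacementPostProcessor

documents = SimpleDirectoryReader("./data").load_data()

# small nodes: single sentences, each carrying its surrounding window in metadata
parser = SentenceWindowNodeParser.from_defaults(
    window_size=3,
    window_metadata_key="window",
    original_text_metadata_key="original_text",
)
nodes = parser.get_nodes_from_documents(documents)
index = VectorStoreIndex(nodes)

# big context: replace each retrieved sentence with its window before synthesis
query_engine = index.as_query_engine(
    similarity_top_k=5,
    node_postprocessors=[
        MetadataReplacementPostProcessor(target_metadata_key="window")
    ],
)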
10 comments
Why do I keep getting an IndexError when using LLMRerank?
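From what I've seen, LLMRerank parses the raw "Doc: N, Relevance: M" text the LLM returns, so a malformed or truncated answer can surface as an IndexError; a stronger model and a smaller choice_batch_size usually help, though that diagnosis is an assumption without the traceback. A usage sketch:

Plain Text
from llama_index.core.postprocessor import LLMRerank
from llama_index.llms.openai import OpenAI

# assumes `index` was built as usual, e.g. VectorStoreIndex.from_documents(...)
reranker = LLMRerank(
    llm=OpenAI(model="gpt-4o-mini"),
    choice_batch_size=5,  # fewer candidates per LLM call is easier to parse reliably
    top_n=3,
)

query_engine = index.as_query_engine(
    similarity_top_k=10,
    node_postprocessors=[reranker],
)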
2 comments
Is there a way to do a hybrid retriever / BM25 retriever without storing to and loading from the filesystem (docstore / objectstore)?
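BM25Retriever can be built straight from in-memory nodes (no persisted docstore), and QueryFusionRetriever can fuse it with a vector retriever for the hybrid case. A sketch, assuming the llama-index-retrievers-bm25 package is installed:

Plain Text
# pip install llama-index-retrievers-bm25
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.core.node_parser import SentenceSplitter
from llama_index.core.retrievers import QueryFusionRetriever
from llama_index.retrievers.bm25 import BM25Retriever

documents = SimpleDirectoryReader("./data").load_data()
nodes = SentenceSplitter(chunk_size=512).get_nodes_from_documents(documents)

index = VectorStoreIndex(nodes)  # in-memory, nothing written to disk

bm25_retriever = BM25Retriever.from_defaults(nodes=nodes, similarity_top_k=5)
vector_retriever = index.as_retriever(similarity_top_k=5)

hybrid_retriever = QueryFusionRetriever(
    [vector_retriever, bm25_retriever],
    similarity_top_k=5,
    num_queries=1,  # skip extra query generation
)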
7 comments
How can I incorporate this tool into a ReAct / OpenAI agent?
https://docs.llamaindex.ai/en/stable/api_reference/tools/duckduckgo/?h=duck

Where do I import it from? Do I need to install additional things first?
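That tool ships as a separate integration package; assuming the spec class is DuckDuckGoSearchToolSpec as in the linked API reference, a sketch looks like this:

Plain Text
# pip install llama-index-tools-duckduckgo
from llama_index.tools.duckduckgo import DuckDuckGoSearchToolSpec
from llama_index.agent.openai import OpenAIAgent
from llama_index.llms.openai import OpenAI

ddg_tools = DuckDuckGoSearchToolSpec().to_tool_list()

agent = OpenAIAgent.from_tools(
    ddg_tools,
    llm=OpenAI(model="gpt-4o-mini"),
    verbose=True,
)
print(agent.chat("What is the latest LlamaIndex release?"))

The same to_tool_list() output should drop into ReActAgent.from_tools(...) as well.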
1 comment
So I'm using the Unstructured node parser and got this error
1 comment
I'm using OpenAIAgent stepwise, and I'm wondering how I can stream the original OpenAI output that is saved in the chat history.
Right now I can only manage to stream the ToolOutput class and the ChatMessage class... the ToolOutput class is kind of unusable to be fed back into the agent
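For the non-stepwise path, stream_chat exposes the raw token stream of the final LLM message; whether that same generator is reachable from the stepwise TaskStepOutput is something I haven't confirmed, so this sketch only covers the simpler case:

Plain Text
# assumes `agent` is an OpenAIAgent built via .from_tools(...)
streaming_response = agent.stream_chat("Summarize the latest tool results")

# tokens of the final LLM message as they arrive
for token in streaming_response.response_gen:
    print(token, end="", flush=True)

# what ends up persisted in the chat history afterwards
for msg in agent.chat_history:
    print(msg.role, msg.content)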
2 comments
Also, is there a way to have some kind of middleware before and after the tool calls made by the agent? I want to publish what the bot is doing and which tool it chose.
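One workaround that doesn't need a dedicated middleware hook is wrapping each tool function before handing it to FunctionTool, so something gets published before and after every call; the publish() helper and search_docs tool here are hypothetical:

Plain Text
import functools
from llama_index.core.tools import FunctionTool

def publish(event: str) -> None:
    # hypothetical: push to your websocket / queue / UI here
    print(event)

def with_middleware(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        publish(f"calling {fn.__name__} with {args} {kwargs}")
        result = fn(*args, **kwargs)
        publish(f"{fn.__name__} returned {result!r}")
        return result
    return wrapper

def search_docs(query: str) -> str:
    """Search the internal docs for the query."""
    return "..."  # real implementation goes here

tool = FunctionTool.from_defaults(fn=with_middleware(search_docs), name="search_docs")

The callback system (CallbackManager) is the more built-in route, but the wrapper keeps full control over what gets published and where.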
2 comments
Is there a way to attach a user_id or thread_id, or even an API key, to the tool calls made by OpenAIAgent?
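Since the agent only passes along the model-generated arguments, one pattern is to bind the per-request context into the tool with a closure (or functools.partial) when the agent is constructed for that request; user_id, api_key, and call_backend below are illustrative placeholders, not LlamaIndex APIs:

Plain Text
from llama_index.core.tools import FunctionTool
from llama_index.agent.openai import OpenAIAgent

def call_backend(query: str, user_id: str, api_key: str) -> str:
    # hypothetical backend call using the bound credentials
    return f"orders for {user_id} matching {query!r}"

def make_lookup_tool(user_id: str, api_key: str) -> FunctionTool:
    # user_id / api_key are captured in the closure, invisible to the LLM
    def lookup_orders(query: str) -> str:
        """Look up the current user's orders matching the query."""
        return call_backend(query, user_id=user_id, api_key=api_key)
    return FunctionTool.from_defaults(fn=lookup_orders)

# build a fresh agent per request / thread with the bound context
agent = OpenAIAgent.from_tools(
    [make_lookup_tool("user-123", "sk-...")],
    verbose=True,
)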
7 comments