Marco Castellari
Joined September 25, 2024
Never mind, I managed to force the query every time using chat_mode="openai" and function_call="query_engine_tool".
1 comment
The query part of my code actually looks like this:

chat_engine = index_ready.as_chat_engine(
    verbose=True,
    system_prompt=system_prompt,
    condense_question_prompt=custom_prompt,
    chat_history=history,
    similarity_top_k=3,
)
response = chat_engine.chat(data["question"])

When I run a query, sometimes it prints this on the console:

=== Calling Function ===
Calling function: query_engine_tool with args: {
  "input": "my query text condensed"
}
Got output: Some text.
========================

and shows the correct answer. But sometimes it doesn't print anything on the console and shows an unrelated answer that is not based on the documents I inserted. I'm going crazy with this. Does anyone know what could be wrong?
2 comments
https://gpt-index.readthedocs.io/en/v0.8.51/examples/discover_llamaindex/document_management/Discord_Thread_Management.html
At the end of this page there's a "weight" property, but it doesn't seem to work. Does anyone know about that?
3 comments
Every time I use storage_context.persist(), will it erase my current data in /storage and save new data, or will it just append the new content?
5 comments
How do I reduce the number of requests made to the OpenAI API?
7 comments
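One common approach is to cache repeated questions so identical queries don't trigger a new API call. A minimal sketch in plain Python (this is not a LlamaIndex feature; `ask_llm` is a hypothetical stand-in for whatever function actually hits the OpenAI API):

```python
class CachedChat:
    """Wraps the paid API call in a simple in-memory cache."""

    def __init__(self, ask_llm):
        self.ask_llm = ask_llm   # the function that makes the real API call
        self.cache = {}          # normalized question -> cached answer
        self.calls = 0           # counts real API calls, for illustration

    def chat(self, question):
        key = question.strip().lower()
        if key not in self.cache:
            self.calls += 1
            self.cache[key] = self.ask_llm(question)
        return self.cache[key]

# Usage with a stub in place of the real API:
bot = CachedChat(lambda q: f"answer to: {q}")
bot.chat("What are your hours?")
bot.chat("What are your hours?")   # served from cache, no second request
print(bot.calls)  # 1
```

In production you'd likely persist the cache (e.g. to Redis or disk) and add an expiry, but the idea is the same: only genuinely new questions reach the API.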
Hello, we're trying to use our support chat history (from our database) to feed llama_index. Right now I've exported all the data to a JSON file and put it in the "data" folder (millions of lines). But I was wondering: how will the AI know which lines are the customer's questions and which are the support agent's answers? Is there a way to add a label to each phrase (QUESTION:, ANSWER:)?
5 comments
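One way to do this is to label each line before indexing, so the text itself carries the role. A minimal sketch, assuming each exported record has "role" and "text" fields (adjust to your real schema; these field names are an assumption, not from the export):

```python
import json

def label_conversation(records):
    """Turn raw chat rows into 'QUESTION: / ANSWER:' labeled lines
    so the model can tell who said what."""
    lines = []
    for rec in records:
        prefix = "QUESTION:" if rec["role"] == "customer" else "ANSWER:"
        lines.append(f"{prefix} {rec['text']}")
    return "\n".join(lines)

raw = json.loads("""[
  {"role": "customer", "text": "How do I reset my password?"},
  {"role": "agent", "text": "Use the 'Forgot password' link on the login page."}
]""")
print(label_conversation(raw))
```

The labeled text can then be written out as documents for the index; keeping each question/answer pair together in one chunk usually helps retrieval.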
Hello guys, I'll feed my index with documents from my chat logs and my FAQ data. Is it possible to set a priority for them at save time or at retrieval time? Obviously the FAQ information should be used before the chat logs, because it will always be more precise.
2 comments
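One retrieval-time option is to re-rank results after the similarity search, boosting hits whose source is the FAQ. A minimal sketch in plain Python (not a LlamaIndex API; the 0.2 boost value and the "source" metadata key are assumptions for illustration):

```python
# Per-source score boost: FAQ hits get a bump, chat-log hits don't.
SOURCE_BOOST = {"faq": 0.2, "chat_log": 0.0}

def rerank(results):
    """results: list of dicts with a similarity 'score' and a 'source' tag.
    Returns them sorted by boosted score, highest first."""
    return sorted(
        results,
        key=lambda r: r["score"] + SOURCE_BOOST.get(r["source"], 0.0),
        reverse=True,
    )

hits = [
    {"text": "from chat log", "source": "chat_log", "score": 0.82},
    {"text": "from FAQ", "source": "faq", "score": 0.78},
]
top = rerank(hits)[0]
print(top["source"])  # faq (0.78 + 0.2 outranks 0.82)
```

The same idea can be plugged into a retrieval pipeline as a post-processing step; the boost size controls how strongly FAQ answers dominate similar-scoring chat-log hits.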