I'm trying to import `QueryType`:

```python
from llama_index.schema import QueryType
```

but Pylance flags the import with `"QueryType" is unknown import symbol (reportGeneralTypeIssues)`.
With `from llama_index.llms import OpenAI`, how do I set the log level to "debug"? I've seen `openai.log = "debug"` suggested, but it doesn't work for my setup:

```python
from llama_index.llms import OpenAI
import openai

openai.log = "debug"

app = FastAPI()
loader = SitemapReader()
llm = OpenAI(temperature=0.1, model="gpt-3.5-turbo")
service_context = ServiceContext.from_defaults(llm=llm)
```

Since settings like `openai.log = "debug"` belong to the OpenAI client, I wonder what they mean in a llama-index context or setup. I'm also using `SimpleWebPageReader`, and I believe that the number of URL sources might play a big role here, correct?

```python
def query(question: Union[str, None] = None):
    documents = SimpleWebPageReader(html_to_text=True).load_data(
        ["https://docs.foobar.com/some-knowledge"]
    )
    index = SummaryIndex.from_documents(documents)
    query_engine = index.as_query_engine()
    answer = query_engine.query(question)
    return {"answer": str(answer)}
```
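For debug output, the standard-library `logging` module is the usual route in llama-index setups, rather than the `openai.log` attribute; a minimal sketch:

```python
import logging
import sys

# llama-index logs through the standard logging module, so routing
# DEBUG-level records to stdout surfaces its internal activity.
logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)
logging.getLogger("llama_index").setLevel(logging.DEBUG)
```

Placed before the FastAPI app and index are created, this prints retrieval and LLM-call details to the console.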
Do `SimpleWebPageReader.load_data` and the query `question` go out as a single request to OpenAI, and not two or more?

```python
res = query_engine.query("My foobar?")

def query(question: Union[str, None] = None):
    ...
    return {"answer": res}
```

This fails with `ERROR: Exception in ASGI application`, which I believe is related to the `query` function's return type, as tested by hard-coding "some random text" as `res`. A plain `print(res)` does work, though.
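The ASGI exception is consistent with FastAPI failing to JSON-serialize the return value: `query_engine.query()` returns a response object, not a `str`. A sketch of the distinction, using a stand-in class since llama-index itself is not imported here:

```python
import json

class FakeResponse:
    """Stand-in for llama-index's response object (illustration only)."""
    def __init__(self, text):
        self.response = text
    def __str__(self):
        return self.response

res = FakeResponse("some random text")

# json.dumps({"answer": res}) raises TypeError (FakeResponse is not
# serializable), which FastAPI surfaces as "Exception in ASGI application".
# Converting first makes the payload serializable:
payload = {"answer": str(res)}
print(json.dumps(payload))  # → {"answer": "some random text"}
```

That matches the observation that hard-coding a plain string works: a `str` serializes cleanly, the response object does not.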
I'm using `SimpleWebPageReader` and `TreeIndex`, more specifically `from_documents`, where the documents come from `SimpleWebPageReader`'s `load_data`. Also, the query engine uses `update_prompts` with a custom template as follows:

```python
query_engine.update_prompts(
    {"response_synthesizer:text_qa_template": qa_template}
)
```
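For context, the `qa_template` slotted in above is a prompt with `{context_str}` and `{query_str}` placeholders. A plain format string is used below as a stand-in for llama-index's template class, so the wording and helper names are illustrative:

```python
# Illustrative template text; llama-index fills {context_str} with retrieved
# chunks and {query_str} with the user's question before calling the LLM.
QA_TEMPLATE_STR = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Answer the query using only the context above.\n"
    "Query: {query_str}\n"
    "Answer: "
)

prompt = QA_TEMPLATE_STR.format(
    context_str="Foobar docs say X.",
    query_str="My foobar?",
)
```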
```python
...
index = SummaryIndex.from_documents(documents)
query_engine = index.as_query_engine()
answer = query_engine.query(question)
...
```
```shell
cat /opt/homebrew/lib/python3.11/site-packages/llama_hub/web/sitemap/__init__.py
pip3 install llama-hub
```
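Rather than hard-coding the Homebrew `site-packages` path, `importlib` can locate an installed module's `__init__.py`; a sketch using the standard-library `json` package as a stand-in for `llama_hub.web.sitemap`:

```python
import importlib.util

# find_spec returns the module's spec without importing it;
# spec.origin is the path to its source file (__init__.py for a package).
spec = importlib.util.find_spec("json")
print(spec.origin)
```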