
Updated last year

Does LlamaIndex Q&A use only the information provided?

At a glance

The post asks whether LlamaIndex's Q&A functionality uses only the information provided. The answer is yes: LlamaIndex is designed to work with the data you provide, whether it is private or domain-specific. It supports various forms of question answering, including semantic search, summarization, and querying over documents or structured data, and it can combine queries across multiple sources.

Additionally, LlamaIndex can be integrated with OpenAI's models to enhance its capabilities, for example by using OpenAI's function calling API to improve the robustness of the sub-question query engine. An example of using OpenAI with LlamaIndex is included below.

@kapa.ai does LlamaIndex Q&A use only the information provided?
Please stand by @dhruva as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
Yes, LlamaIndex's Q&A functionality is designed to work with the data you provide. It is a data framework for Large Language Model (LLM)-based applications to ingest, structure, and access private or domain-specific data. This means it can work with your data, which may be private or specific to the problem you're trying to solve, and could be behind APIs, in SQL databases, or trapped in PDFs and slide decks.

LlamaIndex supports many forms of question and answering, including semantic search, summarization, and querying over documents or structured data. It can also combine queries across an arbitrary number of sources and route the question to the best source.
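
For instance, here is a minimal sketch of Q&A over your own documents, using the same legacy llama_index API as the example later in this thread (the "data" directory and the query are illustrative assumptions):

Python
from llama_index import SimpleDirectoryReader, VectorStoreIndex

# Ingest your own files; only their content is indexed for retrieval
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# The query engine retrieves relevant chunks from your documents and
# asks the LLM to synthesize an answer grounded in that context
query_engine = index.as_query_engine()
response = query_engine.query("What topics do these documents cover?")
print(response)

Note that the LLM still phrases the answer, but the context it draws on comes from the retrieved chunks of your documents.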

For more detailed information, you can refer to the LlamaIndex documentation.
----------------------
Note: you can ask me a follow-up question by @-mentioning me again :speech_balloon:
----------------------
@kapa.ai how to also include OpenAI information
Let me check the knowledge base for you. One moment! :books:
----------------------
LlamaIndex can be integrated with OpenAI's models to enhance its capabilities. One way is to use OpenAI's function calling API to improve the robustness of the sub-question query engine (see the sketch after the example below).

Here's an example of how to use OpenAI with LlamaIndex (completed here with an illustrative Biography schema and a local "data" directory so that it runs end to end):

Python
from typing import List

from pydantic import BaseModel
from llama_index import SimpleDirectoryReader, ServiceContext, VectorStoreIndex
from llama_index.llms import OpenAI

# Illustrative schema for structured answers; the fields are assumptions
class Biography(BaseModel):
    name: str
    best_known_for: List[str]
    extra_info: str

# Use an OpenAI model as the LLM behind the index
llm = OpenAI(model="gpt-3.5-turbo", temperature=0.1)
service_context = ServiceContext.from_defaults(llm=llm)

# Load your own documents (here, from a local "data" directory)
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(
    documents, service_context=service_context
)

# output_cls makes the engine return answers shaped as Biography objects
query_engine = index.as_query_engine(
    output_cls=Biography, response_mode="compact"
)

response = query_engine.query("Who is Paul Graham?")


In this example, the OpenAI class creates a language model (llm) with the specified model and temperature. The llm is wrapped in a ServiceContext, which is passed in when building the VectorStoreIndex from your documents. The index then produces a query_engine, and the output_cls=Biography argument shapes each answer into a Biography object rather than free-form text.
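
For the sub-question query engine mentioned above, here is a hedged sketch of wiring it to OpenAI's function calling, reusing the llm and index from the example (the tool name, description, and question are illustrative assumptions; imports follow the same legacy llama_index API):

Python
from llama_index.query_engine import SubQuestionQueryEngine
from llama_index.question_gen.openai_generator import OpenAIQuestionGenerator
from llama_index.tools import QueryEngineTool, ToolMetadata

# Generate sub-questions via OpenAI's function calling API, which is
# more robust than parsing sub-questions out of free-form LLM text
question_gen = OpenAIQuestionGenerator.from_defaults(llm=llm)

sub_question_engine = SubQuestionQueryEngine.from_defaults(
    question_gen=question_gen,
    query_engine_tools=[
        QueryEngineTool(
            query_engine=index.as_query_engine(),
            metadata=ToolMetadata(
                name="essays",  # illustrative tool name
                description="Documents about Paul Graham",
            ),
        )
    ],
)

response = sub_question_engine.query(
    "Compare Paul Graham's views on startups and on programming."
)

The question generator decomposes the query into sub-questions, routes each to the matching tool, and synthesizes the partial answers into one response.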

You can find more details in the LlamaIndex documentation and the GitHub examples.
----------------------
Note: you can ask me a follow-up question by @-mentioning me again :speech_balloon:
----------------------