To print the context_str to the console, you can use the format method of the QuestionAnswerPrompt instance. Here's how you can do it:

```python
from gpt_index import QuestionAnswerPrompt

context_str = "The author grew up in a small town, playing sports and reading books."
query_str = "What did the author do growing up?"

QA_PROMPT_TMPL = (
    "We have provided context information below. \n"
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    "Given this information, please answer the question: {query_str}\n"
)
QA_PROMPT = QuestionAnswerPrompt(QA_PROMPT_TMPL)

formatted_prompt = QA_PROMPT.format(context_str=context_str, query_str=query_str)
print(formatted_prompt)
```
This will print the prompt with the context_str and query_str values filled in. The context_str value will be displayed between the dashed lines in the output.

Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
context_str is a string containing the context information that you want to provide to the model. In the examples provided in the extracts, the context_str is manually set or retrieved from an index.

In one example, the context_str is set manually:

```python
context_str = "The author grew up in a small town, playing sports and reading books."
```
In another example, the context is set manually for a specific table:

```python
city_stats_text = (
    "This table gives information regarding the population and country of a given city.\n"
    "The user will query with codewords, where 'foo' corresponds to population and 'bar' "
    "corresponds to city."
)
table_context_dict = {"city_stats": city_stats_text}
```
The context_str can also be retrieved from an index using the query_index_for_context function:

```python
query_str = "Which city has the highest population?"
context_str = query_index_for_context(table_schema_index, query_str)
```
In summary, the context_str is either manually set or retrieved from an index, and then used in the prompt template.
context_str is retrieved from an index using the query_index_for_context function. In the SQL guide (https://gpt-index.readthedocs.io/en/latest/guides/tutorials/sql_guide.html), the context_str is retrieved as follows:

```python
query_str = "Which city has the highest population?"
context_str = SQLContextContainerBuilder.query_index_for_context(
    table_schema_index, query_str, store_context_str=True
)
```
To print the context_str to the console, you can simply use the print function:

```python
print(context_str)
```
GPTSimpleVectorIndex is a class in the GPT Index library that provides an in-memory vector store for semantic search. It allows you to create an index from a collection of documents and then query the index using natural language questions. The index uses embeddings for the input text chunks and returns relevant responses based on the query. Here is a basic usage example of GPTSimpleVectorIndex:

```python
from gpt_index import GPTSimpleVectorIndex, SimpleDirectoryReader

# Load documents, build the GPTSimpleVectorIndex
documents = SimpleDirectoryReader('../paul_graham_essay/data').load_data()
index = GPTSimpleVectorIndex.from_documents(documents)

# Query index
response = index.query("What did the author do growing up?")
```

The index can also be saved to and loaded from disk:

```python
# Save index to disk
index.save_to_disk('index_simple.json')

# Load index from disk
index = GPTSimpleVectorIndex.load_from_disk('index_simple.json')
```
GPTSimpleVectorIndex does not have a context_str that it sets directly. The GPTSimpleVectorIndex is used for semantic search over a collection of documents: it stores document embeddings and retrieves relevant documents based on the query.

context_str is used in the context of custom prompts, like the QuestionAnswerPrompt, where you provide context information to the model. The context_str is typically set manually or retrieved from an index, such as in the SQL guide (https://gpt-index.readthedocs.io/en/latest/guides/tutorials/sql_guide.html) example.

To use a context_str with GPTSimpleVectorIndex, you would need to manually set the context or retrieve it from another source, and then use it in your custom prompt.
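That last point can be sketched in plain Python: retrieve a context string from any source, then substitute it into a custom prompt template. Both retrieve_context and CUSTOM_TMPL below are illustrative names, not gpt_index APIs; with a GPTSimpleVectorIndex you might obtain the context from the string form of a query response instead.

```python
def retrieve_context(query: str) -> str:
    # Hypothetical stand-in for a retrieval step; a real implementation
    # might query a GPTSimpleVectorIndex or use a manually written string.
    return "The author grew up in a small town, playing sports and reading books."

# Any template with {context_str} and {query_str} placeholders works here.
CUSTOM_TMPL = (
    "Context:\n{context_str}\n\n"
    "Question: {query_str}\n"
    "Answer:"
)

query_str = "What did the author do growing up?"
context_str = retrieve_context(query_str)
prompt = CUSTOM_TMPL.format(context_str=context_str, query_str=query_str)
print(prompt)
```

The resulting prompt string can then be sent to the model of your choice.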