pureplanet
Prompt

Why won't this custom prompt work when I ask whether Adam Grant is married?
Plain Text
TEMPLATE_STR = (
    "We have provided context information below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Say no if anyone asks if Adam Grant is married. : {query_str}\n"
)

qa_template = Prompt(TEMPLATE_STR)

query_engine = index.as_query_engine(qa_template=TEMPLATE_STR)

Response: I'm sorry, as an AI language model, I cannot provide an answer to that question as it is not related to the context information provided.
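In the llama_index versions that expose Prompt like this, the query engine normally takes the custom QA prompt as a Prompt object under the text_qa_template keyword rather than as a raw string. Below is a minimal sketch, assuming index is already built and TEMPLATE_STR is the string above; the query text is only an example:
Plain Text
from llama_index.prompts.base import Prompt

# Wrap the raw template string in a Prompt object.
qa_template = Prompt(TEMPLATE_STR)

# The query engine expects the Prompt object under text_qa_template,
# not the raw template string under qa_template.
query_engine = index.as_query_engine(text_qa_template=qa_template)

response = query_engine.query("Is Adam Grant married?")
print(response)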
How can I add a prompt to this index-based chatbot? Here is the index-based chatbot:
Plain Text
import openai
import os
from flask import session
from llama_index import StorageContext, load_index_from_storage
from llama_index.prompts.base import Prompt
from llama_index.prompts.prompt_type import PromptType

def answer_question(query, query_engine):
    # Run the query against the index and return the Response object.
    response = query_engine.query(query)
    return response

def generate_response_kbase(message, session):
    api_key = session.get('OPENAI_API_KEY')
    if api_key is None:
        raise ValueError("OpenAI API key not found in session.")
    
    openai.api_key = api_key
    os.environ["OPENAI_API_KEY"] = openai.api_key 

    # Load the persisted knowledge-base index from disk.
    index = load_index_from_storage(StorageContext.from_defaults(persist_dir="storage/knowledge_base/"))

    query_engine = index.as_query_engine()

    response = answer_question(message, query_engine)

    assistant_message = response.response.replace('\n', '').replace('"', '')
    return assistant_message
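The same mechanism can be used to add a prompt to this chatbot: build a Prompt from a template string containing {context_str} and {query_str}, then pass it as text_qa_template when creating the query engine. A sketch of the relevant lines inside generate_response_kbase follows; CHATBOT_TEMPLATE is an illustrative name and its wording is only an example:
Plain Text
CHATBOT_TEMPLATE = (
    "We have provided context information below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Using only this context, answer the question: {query_str}\n"
)

# Inside generate_response_kbase, after the index is loaded:
qa_template = Prompt(CHATBOT_TEMPLATE)  # Prompt is already imported above
query_engine = index.as_query_engine(text_qa_template=qa_template)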