Chat engine help

For some reason, I cannot get the low-level chat engines to work anymore. I tried CondensePlusContextChatEngine and CondenseQuestionChatEngine, and neither one retrieves any information. I made sure to try setting both the retriever and the query_engine. I know it's getting the prompt and memory, but it's not searching the indexed data.
Python
import os
import asyncio

from qdrant_client import QdrantClient
from llama_index.core import Settings, StorageContext, VectorStoreIndex
from llama_index.core.chat_engine import CondensePlusContextChatEngine
from llama_index.core.memory import ChatMemoryBuffer
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.llms.openai import OpenAI
from llama_index.vector_stores.qdrant import QdrantVectorStore

# HistoryChatMessage, Role, and fetch_context_and_content are defined
# elsewhere in the bot and are not shown here.

client = QdrantClient(os.getenv('QDRANT_URL'), api_key=os.getenv('QDRANT_API'))
vector_store = QdrantVectorStore(client=client, collection_name="openpilot-data")
Settings.llm = OpenAI(model="gpt-4-turbo-preview", max_tokens=1000)
embed_model = OpenAIEmbedding(model="text-embedding-3-small")
# note: storage_context is created here but never used below
storage_context = StorageContext.from_defaults(vector_store=vector_store)
index = VectorStoreIndex.from_vector_store(vector_store, embed_model=embed_model)

async def process_message_with_llm(message, client):
    content = message.content.replace(client.user.mention, '').strip()
    if content:
        try:
            async with message.channel.typing():
                memory = ChatMemoryBuffer.from_defaults(token_limit=8192)
                context = await fetch_context_and_content(message, client, content)
                memory.set(context + [HistoryChatMessage(f"{content}", Role.USER)])
                chat_engine = CondensePlusContextChatEngine.from_defaults(
                    retriever=index.as_retriever(),
                    memory=memory,
                    context_prompt=(
                        "prompt"
                    )
                )
                # chat() is synchronous, so run it off the event loop
                chat_response = await asyncio.to_thread(chat_engine.chat, content)
                # assumed completion: the original snippet was cut off here
                await message.channel.send(chat_response.response)
        except Exception as e:
            await message.channel.send(f"Error: {e}")
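For anyone comparing against the intended behavior: both of these engines first condense the chat history and the new message into a standalone question, then run retrieval with that condensed question, and only then answer using the retrieved context. A minimal pure-Python sketch of that flow (the `condense` and `retrieve` functions below are stand-in stubs for illustration, not LlamaIndex calls):

```python
# Sketch of the condense-then-retrieve flow used by
# CondenseQuestionChatEngine / CondensePlusContextChatEngine.

def condense(history, new_message):
    # A real engine asks the LLM to rewrite the message as a
    # standalone question using the history; here we just prepend it.
    context = " ".join(m for m in history)
    return f"{context} {new_message}".strip()

def retrieve(corpus, question):
    # Stand-in for index.as_retriever().retrieve(question):
    # keep documents sharing any word with the condensed question.
    words = set(question.lower().split())
    return [d for d in corpus if words & set(d.lower().split())]

history = ["openpilot supports lane keeping"]
corpus = [
    "openpilot is an open source driver assistance system",
    "qdrant stores embeddings for similarity search",
]
question = condense(history, "what is it?")
docs = retrieve(corpus, question)
```

If retrieval is failing, checking each stage of this pipeline separately (the condensed question, then the raw retriever output) usually narrows down where the break is.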
1 comment
Having looked more into this, I don't understand why it's not working. I looked at the Python source on GitHub, and it looks right to me…