
Updated 5 months ago

At a glance

The post describes various errors the community member encountered while running a Python script, including 'list' object has no attribute 'query_str', name 'chat_engine' is not defined, and 'AgentChatResponse' object has no attribute 'format'. The comments point out that a few things in the code are not being used properly and suggest fixes, such as consuming the stream returned by stream_chat() and passing the right arguments to the chat engine. The community members discuss the code and work together to resolve the errors.

(I get errors similar to these)
Plain Text
User: Hey
An error occurred: 'list' object has no attribute 'query_str'
(.venv) gavsgav@buntu-white:~/AthenaIndex/FlaskAPP$ /home/gavsgav/AthenaIndex/.venv/bin/python /home/gavsgav/AthenaIndex/StreamTest.py
User: Hey
An error occurred: name 'chat_engine' is not defined
(.venv) gavsgav@buntu-white:~/AthenaIndex/FlaskAPP$ /home/gavsgav/AthenaIndex/.venv/bin/python /home/gavsgav/AthenaIndex/StreamTest.py
Traceback (most recent call last):
  File "/home/gavsgav/AthenaIndex/StreamTest.py", line 28, in <module>
    chat_engine = index.as_chat_engine(chat_prompt=message, service_context=service_context, chat_mode="condense_plus_context", memory=memory, verbose=False)
NameError: name 'message' is not defined
(.venv) gavsgav@buntu-white:~/AthenaIndex/FlaskAPP$ /home/gavsgav/AthenaIndex/.venv/bin/python /home/gavsgav/AthenaIndex/StreamTest.py
User: Hey
An error occurred: 'list' object has no attribute 'query_str'
(.venv) gavsgav@buntu-white:~/AthenaIndex/FlaskAPP$ /home/gavsgav/AthenaIndex/.venv/bin/python /home/gavsgav/AthenaIndex/StreamTest.py
User: Hey
An error occurred: 'AgentChatResponse' object has no attribute 'format'
(.venv) gavsgav@buntu-white:~/AthenaIndex/FlaskAPP$ /home/gavsgav/AthenaIndex/.venv/bin/python /home/gavsgav/AthenaIndex/StreamTest.py
User: Hey.
An error occurred: 'AgentChatResponse' object has no attribute 'format'
14 comments
Can you share your code? It just seems like a few things aren't being used properly
Plain Text
import os
import sys
from llama_index import VectorStoreIndex, ServiceContext
from llama_index.prompts import PromptTemplate  
from llama_index.llms import Ollama, ChatMessage, MessageRole
from llama_index.chat_engine.condense_question import (
    CondenseQuestionChatEngine,
)
from llama_index import SimpleDirectoryReader
from pathlib import Path
from llama_index.embeddings import OptimumEmbedding, HuggingFaceEmbedding

OptimumEmbedding.create_and_save_optimum_model(
    "BAAI/bge-small-en-v1.5", "./bge_onnx"
)
embed_model = OptimumEmbedding(folder_name="./bge_onnx")

prompt_path = Path("~/AthenaIndex/prompts/").expanduser()

documents = SimpleDirectoryReader(prompt_path).load_data()

index = VectorStoreIndex.from_documents(documents)
service_context = ServiceContext.from_defaults(embed_model=embed_model, llm=Ollama(model="dolphin2.2-mistral:7b-q8_0", base_url="http://10.252.0.216:11434"))


file_path = Path("~/AthenaIndex/personas/Athena.txt").expanduser()
persona = open(file_path, "r").read()

while True:
    text_input = input("User: ")
    if text_input == "exit":
        break
    messages = [
        ChatMessage(
            role="system", content=persona, metadata={"persona": os.path.basename(file_path)}
        ),
        ChatMessage(role=MessageRole.USER, content=text_input),
    ]
    chat_engine = index.as_chat_engine(chat_mode="context", service_context=service_context)
    resp = chat_engine.stream_chat(messages)
    print(resp)

Not the best. πŸ˜„
It's all over the place atm because I've been trying to add/take away different things trying to get it to work
nah it seems pretty good!

So with this code, what is the error you get now? The only thing I noticed is that you are using stream_chat() but not streaming the result

I would do

Plain Text
resp = chat_engine.stream_chat(messages)
for token in resp.response_gen:
  print(token, end="")
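The reason print(resp) alone doesn't show the streamed text can be sketched with a plain generator. This stand-in response_gen is hypothetical, just to illustrate the pattern; the real streaming response object wraps a token generator the same way, and printing the wrapper doesn't consume it:

```python
# Hypothetical stand-in for a streaming response's token generator:
# printing the generator object shows nothing useful; you have to
# iterate it to pull the tokens out, as in the fix above.
def response_gen():
    for token in ["Hello", ", ", "world", "!"]:
        yield token

# Consume the stream token-by-token and stitch the output together.
out = "".join(token for token in response_gen())
print(out)  # prints: Hello, world!
```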
That seems to have worked. Thank you! I just need to figure out the memory now. (I do really appreciate you getting back to me.)
It was the query str error
Ok, maybe not sorted... I added
Plain Text
resp = chat_engine.stream_chat(messages[0].content)
for token in resp.response_gen:
    print(token, end="")

I added [0].content for it to work, but now the agent is talking to itself πŸ˜„
Without the [0].content it gives me the error
stream chat expects a ChatMessage object -- I would do resp = chat_engine.stream_chat([messages[0]])
this is a chat engine, I'm thinking of another API
Yea ok, for this, you'll want to do two things
resp = chat_engine.stream_chat("hello")

This needs to be a string. The chat engine will keep track of the chat history itself (which you can access with chat_engine.chat_history)

If you want to override the chat history, you can pass in a list of ChatMessage objects like

Plain Text
resp = chat_engine.stream_chat("Hello", chat_history=chat_history)
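The history bookkeeping can be sketched without llama_index at all. The FakeChatEngine below is hypothetical, purely to show the pattern: the persona sits in the history once as a system message, stream_chat() takes a plain string, and the engine appends both the user turn and its reply to the history it keeps (a real engine would use llama_index's ChatMessage objects and call the LLM):

```python
# Hypothetical stand-in showing how a chat engine tracks history itself,
# mirroring chat_engine.chat_history and the optional chat_history override.
class FakeChatEngine:
    def __init__(self, system_prompt):
        # The persona lives in the history once, as a system message.
        self.chat_history = [{"role": "system", "content": system_prompt}]

    def stream_chat(self, message, chat_history=None):
        # Optionally override the accumulated history, mirroring
        # stream_chat("Hello", chat_history=chat_history) above.
        if chat_history is not None:
            self.chat_history = list(chat_history)
        self.chat_history.append({"role": "user", "content": message})
        reply = f"echo: {message}"  # a real engine would call the LLM here
        self.chat_history.append({"role": "assistant", "content": reply})
        return reply

engine = FakeChatEngine("You are Athena.")
engine.stream_chat("Hey")
engine.stream_chat("How are you?")
# History now holds: 1 system + 2 user + 2 assistant = 5 messages
print(len(engine.chat_history))  # prints: 5
```

The point of the sketch: you only pass the new user string each turn, so there is no need to rebuild the messages list yourself unless you want to replace the history wholesale.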


Hopefully that makes sense?
I think so. I'll have a play and call back if I have any issues, if that's ok?