ethan0807
Joined September 25, 2024
Just ran into the same issue. No idea why. 😢
4 comments
After updating to the current version of llama_index, I get the error "'NodeWithScore' object has no attribute 'extra_info'". What is the correct attribute now?
2 comments
After updating to the current version of llama_index, I get the error "'TextNode' object has no attribute 'source_text'". What is the correct attribute now?
4 comments
After updating to the current version of llama_index, I get the error "'NodeWithScore' object has no attribute 'node_text'". What is the correct attribute now?
2 comments
Do the CRUD methods on VectorStoreIndex not support a PGVectorStore?
10 comments
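For what it's worth, the index-level CRUD calls (`insert`, `delete_ref_doc`) are supposed to be forwarded to the underlying store. A hedged sketch of wiring a `PGVectorStore` into a `VectorStoreIndex`; this assumes the `llama-index-vector-stores-postgres` package, a running Postgres with pgvector, and placeholder connection values:

```python
# Hedged sketch, not tested against a live database; connection values are
# placeholders, and embed_dim must match your embedding model.
from llama_index.core import Document, StorageContext, VectorStoreIndex
from llama_index.vector_stores.postgres import PGVectorStore

vector_store = PGVectorStore.from_params(
    database="vectordb", host="localhost", port="5432",
    user="postgres", password="postgres",  # placeholder credentials
    table_name="demo", embed_dim=1536,
)
storage_context = StorageContext.from_defaults(vector_store=vector_store)
index = VectorStoreIndex.from_documents([], storage_context=storage_context)

doc = Document(text="hello", id_="doc-1")
index.insert(doc)              # create
index.delete_ref_doc("doc-1")  # delete; update is typically delete + insert
```

Whether every store implements deletion fully is a separate question, which may be what the thread is about.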
I think the following behavior is generally undesirable and confusing. Currently llama_index defaults to OpenAI if no LLM is set explicitly. Ok. However, if a ValueError is thrown while resolving to OpenAI, it then falls back to LlamaCPP, which by default is hard-coded to download a quantized llama2 7b model from TheBloke via Hugging Face. I would not expect an OpenAI config issue to lead to an entire random LLM being downloaded onto my system. Is this really the intended behavior? Here is the code from llms -> utils.py:

def resolve_llm(llm: Optional[LLMType] = None) -> LLM:
    """Resolve LLM from string or LLM instance."""
    if llm == "default":
        # return default OpenAI model. If it fails, return LlamaCPP
        try:
            llm = OpenAI()
        except ValueError as e:
            llm = "local"
            print(
                "******\n"
                "Could not load OpenAI model. Using default LlamaCPP=llama2-13b-chat. "
                "If you intended to use OpenAI, please check your OPENAI_API_KEY.\n"
                "Original error:\n"
                f"{e!s}"
                "\n******"
            )
5 comments
For some reason Postgres (pgvector) no longer shows up under the vector stores menu on the docs page. It is still listed on the vector store page, though. Just to be sure, support for it isn't going anywhere, right?
2 comments