Updated 4 months ago
At a glance
A community member is running Redis locally on their Mac and receiving an AttributeError: 'Redis' object has no attribute 'module_list' error when running their pipeline. They suspect this is related to LlamaIndex trying to validate something that doesn't exist. The community members discuss updating Redis to a higher version, but encounter compatibility issues with RediSearch. They provide their code and try various approaches, including stopping Redis and installing Redis-Stack-Server. Eventually, they solve the issue by removing the SentenceSplitter parameters and realizing that documents is the operative word, not document. The community members also discuss using Redis with LlamaIndex, including examples for document stores, index stores, and chat stores, but there is no information on using Redis with LlamaIndex in TypeScript.
I'm running redis locally on my mac and receiving this error: AttributeError: 'Redis' object has no attribute 'module_list' when I run my pipeline. This seems like something llamaindex is trying to validate but it does not exist where it is searching. Any thoughts on how to remediate?
30 comments
That's kind of weird. What version of redis do you have?
pip show redis
Name: redis
Version: 3.5.3
Hmm maybe try updating? I had like v5 when I checked earlier
I feel like you need at least v4
will do, will let you know
Name: redis
Version: 5.0.1
and I am getting this error: ValueError: Redis failed to connect: You must add the RediSearch (>= 2.4) module from Redis Stack. Please refer to Redis Stack docs: https://redis.io/docs/stack/
but redisearch is not compatible with redis v5 I don't think
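Worth noting: `pip show redis` reports the redis-py client version, while RediSearch is a module loaded into the Redis server (shipped with Redis Stack), so the two version numbers are independent and don't conflict. A minimal sketch of a client-side gate, assuming (based only on this thread, where 3.5.3 fails and 5.0.1 works) that `Redis.module_list()` is available from the 4.x client line onward:

```python
def client_supports_module_list(version: str) -> bool:
    """Guess whether a redis-py client version exposes Redis.module_list().

    Heuristic from this thread: redis-py 3.5.3 raises AttributeError while
    5.0.1 works, so gate on major version >= 4. Not an official
    compatibility table.
    """
    major = int(version.split(".")[0])
    return major >= 4

print(client_supports_module_list("3.5.3"))  # False: the version in the error
print(client_supports_module_list("5.0.1"))  # True: the version that worked
```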
I am not running redis in docker
and here is the code:

```python
from llama_index import SimpleDirectoryReader
from llama_index import Document
from llama_index.embeddings import HuggingFaceEmbedding
from llama_index.ingestion import (
    DocstoreStrategy,
    IngestionPipeline,
    IngestionCache,
)
from llama_index.text_splitter import SentenceSplitter
from llama_index.ingestion.cache import RedisCache
from llama_index.storage.docstore import RedisDocumentStore
from llama_index.vector_stores import RedisVectorStore

document = SimpleDirectoryReader(
    input_dir="/Users/Derek/PycharmProjects/xxxx/output_files"
).load_data()
document = Document(text="\n\n".join([doc.text for doc in document]))

embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

pipeline = IngestionPipeline(
    transformations=[
        SentenceSplitter(
            include_metadata=False,
            include_prev_next_rel=False,
            chunk_size=25,
            chunk_overlap=0,
        ),
        embed_model,
    ],
    docstore=RedisDocumentStore.from_host_and_port(
        "localhost", 6379, namespace="document_store"
    ),
    vector_store=RedisVectorStore(
        index_name="redis_vector_store",
        index_prefix="vector_store",
        redis_url="redis://localhost:6379",
    ),
    cache=IngestionCache(
        cache=RedisCache.from_host_and_port("localhost", 6379),
        collection="redis_cache",
    ),
    docstore_strategy=DocstoreStrategy.UPSERTS,
)

nodes = pipeline.run(document=document)
```

Nothing really different from the reference documentation, just no docker container(s) running
I'm past the error now; I got past it by stopping redis
after I installed redis-stack-server
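For reference, the usual local setup per the Redis Stack docs the error links to is to run the bundled image, which ships RediSearch and exposes the RedisInsight UI on port 8001 (matching the 8001 listener in the netstat output below). This is a setup sketch, assuming Docker is available:

```shell
# Run Redis Stack (Redis server + RediSearch module + RedisInsight UI).
# Stop any plain `redis-server` already bound to 6379 first, or the ports clash.
docker run -d --name redis-stack \
  -p 6379:6379 \
  -p 8001:8001 \
  redis/redis-stack:latest

# Verify the search module is actually loaded:
docker exec redis-stack redis-cli MODULE LIST
```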
I have confirmed that the ports are listening:

```
tcp46  0  0  *.8001           *.*  LISTEN
tcp46  0  0  *.6379           *.*  LISTEN
tcp4   0  0  127.0.0.1.63342  *.*  LISTEN
tcp4   0  0  127.0.0.1.6942   *.*  LISTEN
tcp6   0  0  *.49172          *.*  LISTEN
tcp4   0  0  *.49172          *.*  LISTEN
tcp6   0  0  *.5000           *.*  LISTEN
tcp4   0  0  *.5000           *.*  LISTEN
tcp6   0  0  *.7000           *.*  LISTEN
tcp4   0  0  *.7000           *.*  LISTEN
```
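A stack-agnostic way to confirm those ports are actually accepting connections is a plain TCP probe with the standard library (a sketch; `redis-cli -p 6379 ping` is the more direct check for Redis itself):

```python
import socket

def port_open(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. port_open("127.0.0.1", 6379) for Redis, 8001 for RedisInsight
```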
and I even downloaded docker to my mac and ran the cmd from the docs, but I am unable to ingest any docs into redis. I feel this is a limitation of my understanding of how the code needs to interact with redis.
Hmm I usually just run the redis docker container and then pip install redis, and it usually just works lol
hmmm, did you already have redis and redis-stack installed on your local?
@Logan M so I have verified my setup and I can add a document to the redis-stack store, but when I run the above code, nothing is added to the store. Have you run this outside of a Jupyter notebook?
@Logan M I have solved this. I removed the SentenceSplitter params from my code and realized that documents is the operative keyword: it can't be document
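The silent failure makes sense if `IngestionPipeline.run()` accepts `**kwargs`: an unrecognized `document=` keyword gets swallowed, so the pipeline sees zero documents and writes nothing to the store. A stdlib-only sketch of that failure mode (a stand-in, not LlamaIndex's actual implementation):

```python
def run(documents=None, **kwargs):
    """Stand-in for IngestionPipeline.run(): only `documents` is read;
    any other keyword (like `document`) is silently absorbed by **kwargs."""
    docs = documents or []
    return [f"node:{d}" for d in docs]  # stand-in for parsing into nodes

print(run(document="my doc"))     # [] -- wrong keyword, silently ignored
print(run(documents=["my doc"]))  # ['node:my doc'] -- correct keyword
```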
@kapa.ai how to use redis in llamats
Let me check the knowledge base for you. One moment! :books:
----------------------
LlamaIndex supports Redis as an alternative backend for document stores, index stores, and chat stores. Here's how you can use Redis in different contexts:

1. Document Store: You can use Redis as a document store backend that persists data as Node objects are ingested. Here's a code snippet showing how to do this:
```python
from llama_index import StorageContext, VectorStoreIndex
from llama_index.node_parser import SentenceSplitter
from llama_index.storage.docstore import RedisDocumentStore

# create parser and parse documents into nodes
parser = SentenceSplitter()
nodes = parser.get_nodes_from_documents(documents)

# create (or load) docstore and add nodes
docstore = RedisDocumentStore.from_host_and_port(
    host="127.0.0.1", port="6379", namespace="llama_index"
)
docstore.add_documents(nodes)

# create storage context
storage_context = StorageContext.from_defaults(docstore=docstore)

# build index
index = VectorStoreIndex(nodes, storage_context=storage_context)
```

2. Index Store: Redis can also be used as an index store backend. Here's how:
```python
from llama_index import StorageContext, VectorStoreIndex
from llama_index.storage.index_store import RedisIndexStore

# create (or load) index store
index_store = RedisIndexStore.from_host_and_port(
    host="127.0.0.1", port="6379", namespace="llama_index"
)

# create storage context
storage_context = StorageContext.from_defaults(index_store=index_store)

# build index
index = VectorStoreIndex(nodes, storage_context=storage_context)
```

3. Chat Store: Redis can be used to store your chat history remotely. Here's how:
```python
from llama_index.storage.chat_store import RedisChatStore
from llama_index.memory import ChatMemoryBuffer

chat_store = RedisChatStore(redis_url="redis://localhost:6379", ttl=300)

chat_memory = ChatMemoryBuffer.from_defaults(
    token_limit=3000,
    chat_store=chat_store,
    chat_store_key="user1",
)
```

Source (<https://docs.llamaindex.ai/en/latest/module_guides/storing/chat_stores.html>)

Remember to replace "127.0.0.1" and "6379" with your Redis host and port respectively, and "llama_index" with your desired namespace.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
@kapa.ai in llamaIndex ts
Let me check the knowledge base for you. One moment! :books:
----------------------
I'm sorry, but your question seems incomplete. Could you please provide more details or clarify what you're asking about LlamaIndex? I'm here to help!
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
@kapa.ai how to use redis in llamaIndex.ts
Let me check the knowledge base for you. One moment! :books:
----------------------
I'm sorry for any confusion, but the information provided in the extracts is about using Redis with LlamaIndex in Python. There's no mention or information about a TypeScript version (llamaIndex.ts).

LlamaIndex is a Python library, and the examples provided are Python code snippets. If you're looking to use LlamaIndex with TypeScript, you might need to check if there's a TypeScript version or binding available for LlamaIndex, or consider using a Python-to-JavaScript transpiler. However, these options are not mentioned in the provided extracts.

For accurate information, I recommend checking the official LlamaIndex documentation or their GitHub repository.
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------