Find answers from the community

Updated last year

Redis Cloud

At a glance

The community member is experiencing an error when creating a vector index in Redis, specifically "redis.exceptions.ResponseError: Vector index initial capacity 20000 exceeded server limit (1704 with the given parameters)". They are following an example from the GPT-Index documentation and are unsure what additional parameter to pass.

Other community members suggest that the issue may be related to limitations in the Redis Cloud service, and recommend trying to deploy a local Redis instance instead. One community member contacted Redis Labs support and received a response indicating that the free Redis Cloud subscription has a memory limit of 30MB, and that upgrading to a paid plan may be necessary to get the required memory.

There is no explicitly marked answer in the comments, but the Redis Labs support response provides some guidance on the potential cause of the issue and a possible solution.

Hello everyone, I have an error that I'm sure is stupid.

I'm creating a vector index in Redis and I'm getting this error: redis.exceptions.ResponseError: Vector index initial capacity 20000 exceeded server limit (1704 with the given parameters)

Do I need to pass some additional parameter in this code?

Plain Text
vector_store = RedisVectorStore(
    index_name="test",
    index_prefix="test_p",
    redis_url="redis://<data-my-services-redis>",
)

I am following this example: https://gpt-index.readthedocs.io/en/latest/examples/vector_stores/RedisIndexDemo.html
7 comments
Seems like Redis Cloud has some sort of limitation. Are you getting this on single-file ingestion as well?
The setup code in the above example also sets the overwrite value. Did you try with that?

Plain Text
vector_store = RedisVectorStore(
    index_name="pg_essays",
    index_prefix="llama",
    redis_url="redis://localhost:6379",
    # Drop and recreate the index if one already exists under this name.
    overwrite=True,
)
First, thank you for your response @WhiteFang_Jr

I think the Redis Cloud limitation is on size, but if you pay you can have as much as you want. I now have 100MB, which is more than enough.

If I follow the example by deploying a local Redis, it works for me, but with Redis Cloud it doesn't. I think it could be some parameter or something in the configuration.

The overwrite=True option applies to the index that is created; it does not seem to have anything to do with the error.

Thanks again.
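For context, the capacity in the error message corresponds to the vector field's initial capacity (RediSearch's INITIAL_CAP attribute), which preallocates index memory at creation time. Below is a minimal sketch of lowering it from the constructor, assuming this RedisVectorStore version forwards an index_args dict to the vector field definition; both the index_args parameter and the "initial_cap" key are assumptions here, not confirmed against the library.

Plain Text
from llama_index.vector_stores import RedisVectorStore

vector_store = RedisVectorStore(
    index_name="test",
    index_prefix="test_p",
    redis_url="redis://<data-my-services-redis>",
    overwrite=True,
    # Assumption: index_args is forwarded to the vector field, so a
    # smaller initial capacity stays under the server's memory limit.
    index_args={"algorithm": "HNSW", "initial_cap": 1000},
)

If the installed version does not accept index_args, the same attribute can be set by creating the index directly with redis-py before ingesting documents.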
I have the same issue. It works fine if I have a local Redis container running, but when I try to hit my Redis Labs instance, this error occurs. I have contacted Redis Labs support to see if they can help.
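As a quick way to compare the two environments, redis-py can report the server's memory cap and the index settings; on the free Redis Cloud tier, maxmemory should come back at around 30MB. A sketch, with the URL and index name as placeholders from this thread:

Plain Text
import redis

# Placeholder URL; substitute the Redis Cloud endpoint and password.
r = redis.from_url("redis://localhost:6379")

# How much memory this database is allowed to use.
print(r.info("memory").get("maxmemory_human"))

# If the index exists, FT.INFO reports its parameters and size.
print(r.ft("test").info())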
Hope this helps, from Redis Labs support:
Amit Levy (Redis Labs)

Oct 19, 2023, 00:28 PDT

Hello Craig,

This is Amit from the Redis Customer Success team.
Fixed Subscriptions come with an index size memory limit, which is set at 10% of the total size of the database. The Free subscription provides a maximum memory size of 30MB.
You can explore the available offerings and choose the best for your needs here.

Regards,
Amit

Redis - Customer Success


Craig Merchant
1:11 PM
to Support

Hi Amit,

Thanks, that appears to be correct. When I run the Redis container on an EC2 t2.micro instance I get this error. When I run on a t2.large with 8GB of memory, it works as expected.

Are you suggesting that if I move away from the free tier to a paid plan, I would get the 8GB of memory required?

Thanks,
Craig
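As a back-of-envelope check of why the default capacity fails on small instances: with the 30MB free database and the 10% index budget mentioned above, the index gets roughly 3MB, while preallocating 20,000 vectors needs far more. The 1536 dimensions below are an assumption (OpenAI ada-002 embeddings, as the linked example uses):

Plain Text
db_size_mb = 30                      # free-tier database size
index_budget_mb = 0.10 * db_size_mb  # 10% index limit -> 3 MB

dim = 1536                           # assumed embedding dimension
bytes_per_vector = dim * 4           # float32 -> 6,144 bytes

# Raw storage for an initial capacity of 20,000 vectors, before any
# HNSW graph overhead is counted:
needed_mb = 20_000 * bytes_per_vector / (1024 ** 2)
print(f"budget ~{index_budget_mb:.0f} MB, preallocation ~{needed_mb:.0f} MB")

That is roughly 117MB of preallocation against a 3MB budget, which is consistent with the index building fine on a machine with 8GB of memory but failing on the free tier.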
Thanks for your response @Craig