Hello Team,
I have a requirement to consume and process 50,000 messages in parallel from a Redis stream. I couldn't find any guidelines on capacity planning for such a large volume of message processing with Redis. In our scenario, there will be 50,000 Docker containers running in Kubernetes, processing the messages in parallel (mostly from a single consumer group).
Any guidelines for supporting such a scenario with Redis would be very helpful. I am also looking for guidance on setting up Redis in Kubernetes to support such a workload, including capacity-planning considerations and benchmarks for the environment.
Thanks for your advice and help!
Regards, Sowmyan
Comment From: anandr781
While this does not answer the question: what was the outcome of this experiment? Did you verify it? If the messages are possibly heterogeneous, please avoid using the same stream.
Comment From: madolson
This GitHub repository is reserved for issues; please reach out to the communities listed at https://redis.io/community for help with running Redis in production.