We are working on a live video streaming use case using EdgeX and a Redis server. We were getting this error:
Client id=11 addr=10.168.3.0:51414 fd=9 name= age=266 idle=35 flags=P db=0 sub=0 psub=1 multi=-1 qbuf=0 qbuf-free=0 obl=122 oll=1 omem=114868232 events=r cmd=psubscribe scheduled to be closed ASAP for overcoming of output buffer limits
After some research we found that we should increase the client-output-buffer-limit pubsub setting in the redis.conf file, so I increased the limit to 1024mb 120mb 180sec. We are still getting the same error. The video file is 30mb (300 frames, 640x480 pixels).
Please let us know how we can resolve this issue. Thanks in advance.
Comment From: itamarhaber
Hi @neerajasjawali
After making the changes to the redis.conf file, did you restart the server so the changes take effect?
Alternatively, you can use the CONFIG SET command to apply new configuration at runtime, verify the values with CONFIG GET, and persist the changes with CONFIG REWRITE.
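For example, with redis-py (a sketch assuming a local Redis on the default port; the limit values are illustrative, not a recommendation):

```python
import redis

r = redis.Redis()  # assumes a local Redis server on the default port

# Apply new pub/sub output buffer limits at runtime (illustrative values)
r.config_set('client-output-buffer-limit', 'pubsub 1024mb 128mb 180')

# Verify the active setting
print(r.config_get('client-output-buffer-limit'))

# Persist the running configuration back to redis.conf
r.config_rewrite()
```

Note that CONFIG REWRITE only works if the server was started with a config file.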
Comment From: neerajasjawali
Hi @itamarhaber Yes, the changes took effect. I changed the Redis configuration using the CONFIG SET command and cross-verified it using the CONFIG GET command.
Comment From: itamarhaber
So, IIUC, you're pushing a 30mb file as a Pub/Sub message? How often are these messages published?
Comment From: neerajasjawali
The file is being published in a loop, but after publishing it 10-15 times, Redis throws the error.
Comment From: itamarhaber
Understood. IMO, what's happening here is that the subscriber can't consume the messages at the rate they are being published. As a result, the output buffer fills up, and eventually the connection is closed when the limit is hit.
Here's a minimal reproduction (minus golang and EdgeX) of the phenomenon:

```python
import redis

# Subscriber: print the size of each message, handled on a background thread
def s_handler(msg):
    print(len(msg['data']))

s = redis.Redis()
sp = s.pubsub()
sp.subscribe(**{'channel': s_handler})
t = sp.run_in_thread(sleep_time=0.001)

# Publisher: push ~30mb messages as fast as possible
p = redis.Redis()
while True:
    p.publish('channel', '0' * 30 * pow(10, 6))
```
This predictably ends with the following log line after a few iterations:
44407:M 14 Dec 2022 17:52:11.785 # Client id=25 addr=[::1]:62956 laddr=[::1]:6379 fd=10 name= age=2 idle=0 flags=P db=0 sub=1 psub=0 ssub=0 multi=-1 qbuf=0 qbuf-free=16890 argv-mem=0 multi-mem=0 rbs=16384 rbp=16384 obl=0 oll=37 omem=1080006008 tot-mem=1080040112 events=rw cmd=subscribe user=default redir=-1 resp=2 scheduled to be closed ASAP for overcoming of output buffer limits.
I'm not convinced that your use case (real-time video streaming) and the chosen implementation are a good match. AFAIK, Redis' Pub/Sub is usually used for broadcasting significantly smaller payloads, whereas this feels like abusing the feature :)
Increasing the output buffer won't help. You need the subscriber to keep up with the publisher, so if possible, try scaling/optimizing it. Note, however, that even if a single subscriber can consume fast enough, having multiple subscribers would ultimately saturate the server's network interface (assuming enough RAM for many biggish output buffers).
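To see why raising the limit only postpones the disconnect, here's a back-of-the-envelope simulation (pure Python; the publish and consume rates are hypothetical, chosen only to illustrate a publisher that outpaces its subscriber):

```python
# Illustrative numbers, not measured rates: if the publisher outpaces the
# subscriber, the output buffer grows without bound, so any finite limit is
# eventually hit -- raising it only delays the disconnect.

MSG_BYTES = 30 * 10**6     # ~30mb per published message (from the issue)
PUBLISH_PER_SEC = 10       # hypothetical publish rate
CONSUME_PER_SEC = 8        # hypothetical (slower) consume rate
HARD_LIMIT = 1024 * 2**20  # the raised 1024mb hard limit

backlog = 0
seconds = 0
while backlog <= HARD_LIMIT:
    # Each second, the backlog grows by the publish/consume deficit
    backlog += (PUBLISH_PER_SEC - CONSUME_PER_SEC) * MSG_BYTES
    seconds += 1

print(f"hard limit exceeded after ~{seconds}s "
      f"(backlog {backlog / 2**20:.0f} MiB)")
```

With these made-up rates the 1024mb limit is exceeded in well under a minute; any sustained deficit produces the same outcome sooner or later.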
I'm sorry I can't help more. I'm transferring this issue to core repository in case someone has an answer I'm missing.
Comment From: neerajasjawali
I understand your point. Thank you for your support.
Comment From: madolson
Is there still some issue here @itamarhaber? It seems like this has been addressed.
Comment From: neerajasjawali
Hi @itamarhaber , @madolson I got a hint that the issue is caused because the message bus is using Redis pub/sub instead of MQTT. Could that be one of the possible causes of the OOM error?
Thanks in advance
Comment From: madolson
Yes, you are getting the OOM error because you are sending Redis messages using pub/sub and not consuming them quickly enough. I'm not sure how you would be able to switch to MQTT.
Comment From: neerajasjawali
Hi @madolson Could you tell me what the Redis pub/sub memory limitations are? And can we save the Redis data into RAM instead of saving it on disk?
Thanks in advance
Comment From: madolson
@neerajasjawali The easier question first: Redis only saves into RAM, not to disk.
Comment From: madolson
For memory, it's controlled by `client-output-buffer-limit pubsub`, which defaults to a 32mb hard limit or 8mb sustained for 60 seconds.
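To put those defaults in context for the ~30mb payloads in this thread (assuming Redis's mb suffix means 2^20 bytes, which is how it parses memory sizes), a quick sanity check:

```python
# Default pub/sub limits: client-output-buffer-limit pubsub 32mb 8mb 60
HARD_LIMIT = 32 * 2**20  # exceeding this disconnects the client immediately
SOFT_LIMIT = 8 * 2**20   # staying above this for 60s also disconnects it
MSG_BYTES = 30 * 10**6   # ~30mb video payload from this issue

# A single pending message already exceeds the soft limit, and two pending
# messages exceed the hard limit.
pending_to_disconnect = HARD_LIMIT // MSG_BYTES + 1
print(f"one message is {MSG_BYTES / 2**20:.1f} MiB; "
      f"{pending_to_disconnect} pending messages exceed the 32 MiB hard limit")
```

In other words, with the defaults a subscriber only has to fall two messages behind before Redis kills its connection.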
Comment From: neerajasjawali
Hi @madolson I have configured client-output-buffer-limit pubsub to 1024mb 64mb 180sec. I'm still facing the OOM error.
Comment From: madolson
Are the log messages less frequent now? It seems like your clients aren't pulling data from Redis as fast as it is being generated, which results in a backlog on the client's output buffer and eventually in the client being killed.