This bug appears to go away if you change MALLOC=libc to MALLOC=jemalloc.
Reproduction script:
(git clone -q https://github.com/antirez/redis -b 5.0.7 --depth=1 ~/redis.tmp && \
trap 'rm -rf ~/redis.tmp' EXIT && \
cd ~/redis.tmp && \
make -s MALLOC=libc -j16 && \
ulimit -n 1048576 && { \
src/redis-server & { \
sleep 0.1 && \
echo "config set maxclients 1048544" | src/redis-cli; \
killall redis-server -w -q; \
}; \
})
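For context, the maxclients value in the script is not arbitrary: as far as I know Redis reserves 32 file descriptors for internal use (CONFIG_MIN_RESERVED_FDS in the source), so 1048544 should be the largest value the server accepts under the `ulimit -n 1048576` set above. A quick sanity check of that arithmetic:

```shell
# Redis keeps 32 fds for internal use (CONFIG_MIN_RESERVED_FDS), so the
# largest usable maxclients under "ulimit -n 1048576" should be:
echo $(( 1048576 - 32 ))   # prints 1048544
```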
Bug report:
=== REDIS BUG REPORT START: Cut & paste starting from here ===
3762:M 08 Mar 2020 13:58:38.621 # Redis 5.0.7 crashed by signal: 11
3762:M 08 Mar 2020 13:58:38.621 # Crashed running the instruction at: 0x563faac09f06
3762:M 08 Mar 2020 13:58:38.621 # Accessing address: 0x7fcc62adf110
3762:M 08 Mar 2020 13:58:38.621 # Failed assertion:
------ STACK TRACE ------
EIP:
src/redis-server *:6379(aeProcessEvents+0x156)[0x563faac09f06]
Backtrace:
src/redis-server *:6379(logStackTrace+0x5a)[0x563faac56c8a]
src/redis-server *:6379(sigsegvHandler+0xb1)[0x563faac57441]
/lib/x86_64-linux-gnu/libpthread.so.0(+0x12890)[0x7fcc62302890]
src/redis-server *:6379(aeProcessEvents+0x156)[0x563faac09f06]
src/redis-server *:6379(aeMain+0x2b)[0x563faac0a2eb]
src/redis-server *:6379(main+0x505)[0x563faac06ea5]
/lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xe7)[0x7fcc61f20b97]
src/redis-server *:6379(_start+0x2a)[0x563faac070ea]
------ INFO OUTPUT ------
# Server
redis_version:5.0.7
redis_git_sha1:4891612b
redis_git_dirty:0
redis_build_id:55b73976110b6832
redis_mode:standalone
os:Linux 4.15.0-1054-aws x86_64
arch_bits:64
multiplexing_api:epoll
atomicvar_api:atomic-builtin
gcc_version:7.4.0
process_id:3762
run_id:0a42c8fddaa1435fa4d14b322746462b672cbbec
tcp_port:6379
uptime_in_seconds:0
uptime_in_days:0
hz:10
configured_hz:10
lru_clock:6642814
executable:${HOME}/redis.tmp/src/redis-server
config_file:

# Clients
connected_clients:1
client_recent_max_input_buffer:2
client_recent_max_output_buffer:3286400
blocked_clients:0

# Memory
used_memory:54921400
used_memory_human:52.38M
used_memory_rss:5439488
used_memory_rss_human:5.19M
used_memory_peak:54921400
used_memory_peak_human:52.38M
used_memory_peak_perc:1308.12%
used_memory_overhead:911614
used_memory_startup:861920
used_memory_dataset:54009786
used_memory_dataset_perc:99.91%
allocator_allocated:4198512
allocator_active:5401600
allocator_resident:5401600
total_system_memory:66008809472
total_system_memory_human:61.48G
used_memory_lua:37888
used_memory_lua_human:37.00K
used_memory_scripts:0
used_memory_scripts_human:0B
number_of_cached_scripts:0
maxmemory:0
maxmemory_human:0B
maxmemory_policy:noeviction
allocator_frag_ratio:1.29
allocator_frag_bytes:1203088
allocator_rss_ratio:1.00
allocator_rss_bytes:0
rss_overhead_ratio:1.01
rss_overhead_bytes:37888
mem_fragmentation_ratio:1.30
mem_fragmentation_bytes:1240976
mem_not_counted_for_evict:0
mem_replication_backlog:0
mem_clients_slaves:0
mem_clients_normal:49694
mem_aof_buffer:0
mem_allocator:libc
active_defrag_running:0
lazyfree_pending_objects:0

# Persistence
loading:0
rdb_changes_since_last_save:0
rdb_bgsave_in_progress:0
rdb_last_save_time:1583701118
rdb_last_bgsave_status:ok
rdb_last_bgsave_time_sec:-1
rdb_current_bgsave_time_sec:-1
rdb_last_cow_size:0
aof_enabled:0
aof_rewrite_in_progress:0
aof_rewrite_scheduled:0
aof_last_rewrite_time_sec:-1
aof_current_rewrite_time_sec:-1
aof_last_bgrewrite_status:ok
aof_last_write_status:ok
aof_last_cow_size:0

# Stats
total_connections_received:1
total_commands_processed:2
instantaneous_ops_per_sec:0
total_net_input_bytes:72
total_net_output_bytes:11468
instantaneous_input_kbps:0.01
instantaneous_output_kbps:0.00
rejected_connections:0
sync_full:0
sync_partial_ok:0
sync_partial_err:0
expired_keys:0
expired_stale_perc:0.00
expired_time_cap_reached_count:0
evicted_keys:0
keyspace_hits:0
keyspace_misses:0
pubsub_channels:0
pubsub_patterns:0
latest_fork_usec:0
migrate_cached_sockets:0
slave_expires_tracked_keys:0
active_defrag_hits:0
active_defrag_misses:0
active_defrag_key_hits:0
active_defrag_key_misses:0

# Replication
role:master
connected_slaves:0
master_replid:62138f763c03eeadbb5f5b2c7f81c4c9b92c05de
master_replid2:0000000000000000000000000000000000000000
master_repl_offset:0
second_repl_offset:-1
repl_backlog_active:0
repl_backlog_size:1048576
repl_backlog_first_byte_offset:0
repl_backlog_histlen:0

# CPU
used_cpu_sys:0.015817
used_cpu_user:0.003954
used_cpu_sys_children:0.000000
used_cpu_user_children:0.000000

# Commandstats
cmdstat_config:calls=1,usec=15396,usec_per_call=15396.00
cmdstat_command:calls=1,usec=504,usec_per_call=504.00

# Cluster
cluster_enabled:0

# Keyspace
------ CLIENT LIST OUTPUT ------
id=3 addr=127.0.0.1:42770 fd=8 name= age=0 idle=0 flags=N db=0 sub=0 psub=0 multi=-1 qbuf=0 qbuf-free=32768 obl=5 oll=0 omem=0 events=r cmd=config
------ REGISTERS ------
3762:M 08 Mar 2020 13:58:38.621 #
RAX:0000000000000000 RBX:0000000000000001
RCX:0000000000000000 RDX:0000563faacbee6c
RDI:0000563faacbee58 RSI:0000000000000037
RBP:00007fcc62adf110 RSP:00007fff804e2ea0
R8 :0000563facd4b385 R9 :0000000000000000
R10:0000000000000000 R11:0000563faacc2c5f
R12:0000563facd00da0 R13:0000000000000000
R14:0000000000000001 R15:0000000000000008
RIP:0000563faac09f06 EFL:0000000000010246
CSGSFS:002b000000000033
3762:M 08 Mar 2020 13:58:38.621 # (00007fff804e2eaf) -> 0000563faac06ea5
3762:M 08 Mar 2020 13:58:38.621 # (00007fff804e2eae) -> 0000000000000eb2
3762:M 08 Mar 2020 13:58:38.621 # (00007fff804e2ead) -> 0000563faac0a2eb
3762:M 08 Mar 2020 13:58:38.621 # (00007fff804e2eac) -> 0000000000000000
3762:M 08 Mar 2020 13:58:38.621 # (00007fff804e2eab) -> 0000000000000008
3762:M 08 Mar 2020 13:58:38.621 # (00007fff804e2eaa) -> 0000000000000000
3762:M 08 Mar 2020 13:58:38.621 # (00007fff804e2ea9) -> 00007fff804e3068
3762:M 08 Mar 2020 13:58:38.621 # (00007fff804e2ea8) -> 0000000000000000
3762:M 08 Mar 2020 13:58:38.621 # (00007fff804e2ea7) -> 0000563facd00da0
3762:M 08 Mar 2020 13:58:38.621 # (00007fff804e2ea6) -> 0000000000000000
3762:M 08 Mar 2020 13:58:38.621 # (00007fff804e2ea5) -> 690281aa206d3500
3762:M 08 Mar 2020 13:58:38.621 # (00007fff804e2ea4) -> 0000000000000000
3762:M 08 Mar 2020 13:58:38.621 # (00007fff804e2ea3) -> 0000000000093d2f
3762:M 08 Mar 2020 13:58:38.621 # (00007fff804e2ea2) -> 000000005e655c7e
3762:M 08 Mar 2020 13:58:38.621 # (00007fff804e2ea1) -> 690281aa0000000b
3762:M 08 Mar 2020 13:58:38.621 # (00007fff804e2ea0) -> 000000005e655c7e
------ FAST MEMORY TEST ------
3762:M 08 Mar 2020 13:58:38.621 # Bio thread for job type #0 terminated
3762:M 08 Mar 2020 13:58:38.621 # Bio thread for job type #1 terminated
3762:M 08 Mar 2020 13:58:38.622 # Bio thread for job type #2 terminated
*** Preparing to test memory region 563faaf04000 (94208 bytes)
*** Preparing to test memory region 563facc9e000 (974848 bytes)
*** Preparing to test memory region 7fcc5d186000 (8392704 bytes)
*** Preparing to test memory region 7fcc5d987000 (33558528 bytes)
*** Preparing to test memory region 7fcc5f988000 (12587008 bytes)
*** Preparing to test memory region 7fcc6058a000 (8388608 bytes)
*** Preparing to test memory region 7fcc60d8b000 (8388608 bytes)
*** Preparing to test memory region 7fcc6158c000 (8388608 bytes)
*** Preparing to test memory region 7fcc622ec000 (16384 bytes)
*** Preparing to test memory region 7fcc6250b000 (16384 bytes)
*** Preparing to test memory region 7fcc62cca000 (16384 bytes)
*** Preparing to test memory region 7fcc62cda000 (4096 bytes)
.O.O.O.O.O.O.O.O.O.O.O.O
Fast memory test PASSED, however your memory can still be broken. Please run a memory test for several hours if possible.
------ DUMPING CODE AROUND EIP ------
Symbol: aeProcessEvents (base: 0x563faac09db0)
Module: src/redis-server *:6379 (base 0x563faabdf000)
$ xxd -r -p /tmp/dump.hex /tmp/dump.bin
$ objdump --adjust-vma=0x563faac09db0 -D -b binary -m i386:x86-64 /tmp/dump.bin
3762:M 08 Mar 2020 13:58:38.881 # dump of function (hexdump of 470 bytes): 4157415641554154555331db4883ec3864488b042528000000488944242831c040f6c603897424080f846a01000089f04989fc83e006833fff0f84f102000083f8020f84f102000031c9f6442408044d8b6c24380f95c183e901418b542404498b7508418b7d00e864c6ffff85c089c30f8e8a010000498b7d088d40ff498b6c2420488d0440488d770c4989e94889e94c8d1c86eb0e662e0f1f8400000000004883c60c8b1789d083e0014189c24183ca02f6c204410f45c24189c24183ca02f6c208410f45c24189c24183ca0283e2108b5704410f45c24883c1084889f78941fc8951f84939f375b6498b4424484885c07414f644240808740a4c89e7ffd0498b6c24204989e94531ed0f1f4400004b8d04e9486328448b70044989ef48c1e50549036c24188b450089c24421f2a8040f8589000000f6c2010f84a8020000488b55184489f14489fe4c89e7ff55084489f083e0028545007419488b4510483b4508740f488b55184489f14489fe4c89e7ffd04983c5014439eb0f8e9f0000004d8b4c2420eb880f1f84000000000001eb660f1f44000089d8488b5c24286448331c25280000000f85b90200004883c4385b5d415c415d415e415fc30f1f00f6c202750b83e20174aa488b4508eb95488b55184489
=== REDIS BUG REPORT END. Make sure to include from START to END. ===
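In case it helps triage, here is a sketch of how the crash address can be mapped back into the binary using the bases printed in the report (assumes `xxd` and `objdump` are installed; the hex string below is just the first 16 bytes of the "dump of function" blob above):

```shell
# The faulting RIP (0x563faac09f06) minus the reported module base
# (0x563faabdf000) gives the offset inside redis-server:
printf 'offset: 0x%x\n' $(( 0x563faac09f06 - 0x563faabdf000 ))   # offset: 0x2af06
# addr2line -f -e src/redis-server 0x2af06   # should resolve inside aeProcessEvents

# The hexdump can be converted back to binary and disassembled as the
# report itself suggests (shown here with only the first 16 bytes):
printf '4157415641554154555331db4883ec38' | xxd -r -p > /tmp/dump.bin
objdump --adjust-vma=0x563faac09db0 -D -b binary -m i386:x86-64 /tmp/dump.bin
```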
Comment From: antirez
Thanks, on it.
Comment From: antirez
Reproduced, after having to set up my work environment again here from home (I'm in Italy, COVID-19 is posing a few issues). So now I can fix it.
Comment From: mehrdadn
Oh wow, thanks! But please take care of yourself first! There's nothing on my end that depends on this bug being fixed anytime soon, just stay safe!
Comment From: antirez
Sure! Working from home; we have been isolated from the outside for a few days now, and will keep it this way. In Italy you can no longer go outside without a good reason.
Comment From: antirez
Bug fixed, thanks.