The following code does not work as expected:

try (FileOutputStream out = new FileOutputStream("test2")) {
    DataBufferUtils.read(Path.of("test1"), new NettyDataBufferFactory(new PooledByteBufAllocator()), 1024, StandardOpenOption.READ)
        .transformDeferredContextual((f, ctx) -> {
            System.out.println("1: " + ctx.getOrDefault("key", "EMPTY"));
            return f;
        })
        .transform(f -> DataBufferUtils.write(f, out))
        .transformDeferredContextual((f, ctx) -> {
            System.out.println("2: " + ctx.getOrDefault("key", "EMPTY"));
            return f;
        })
        .contextWrite(Context.of("key", "TEST"))
        .subscribe();
}

Expected result:

2: TEST
1: TEST

Actual result:

2: TEST
1: EMPTY

Indeed, if you put Hooks.enableContextLossTracking(); Hooks.onOperatorDebug(); before that code, you can see that the context is lost at DataBufferUtils.write, where a new flux is created that does not pass the subscriber's context on to the original flux.
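To illustrate the mechanism, here is a minimal stand-alone model in plain Java (not Reactor itself; all names are hypothetical): a subscriber's Context travels upstream at subscription time, and a "bridge" stage that subscribes to its upstream with a fresh, empty context, as the report attributes to the flux created inside DataBufferUtils.write, loses it. The model also shows that re-attaching the context on the upstream side of the bridge restores it.

```java
import java.util.HashMap;
import java.util.Map;

public class ContextLossDemo {
    /** A source is "subscribed" with the context assembled downstream. */
    interface Source { void subscribe(Map<String, String> context); }

    /** Logs what this stage sees in the upward-flowing context, then forwards it.
     *  Mirrors the transformDeferredContextual stages "1" and "2" in the report. */
    static Source peek(Source upstream, String label, StringBuilder log) {
        return ctx -> {
            log.append(label).append(": ")
               .append(ctx.getOrDefault("key", "EMPTY")).append('\n');
            upstream.subscribe(ctx);
        };
    }

    /** Subscribes upstream with a fresh, empty context instead of forwarding
     *  the one it received -- the behavior reported for DataBufferUtils.write. */
    static Source droppingBridge(Source upstream) {
        return ctx -> upstream.subscribe(Map.of());
    }

    /** Re-attaches a key/value to whatever context passes through, like a
     *  second contextWrite placed upstream of the bridge. */
    static Source attach(Source upstream, String k, String v) {
        return ctx -> {
            Map<String, String> merged = new HashMap<>(ctx);
            merged.put(k, v);
            upstream.subscribe(merged);
        };
    }

    /** Builds the reported chain; if reattach is true, the context is
     *  re-applied just upstream of the dropping bridge. */
    static String run(boolean reattach) {
        StringBuilder log = new StringBuilder();
        Source one = peek(ctx -> {}, "1", log);
        Source upstreamOfBridge = reattach ? attach(one, "key", "TEST") : one;
        Source chain = peek(droppingBridge(upstreamOfBridge), "2", log);
        chain.subscribe(Map.of("key", "TEST")); // models contextWrite(...).subscribe()
        return log.toString();
    }

    public static void main(String[] args) {
        System.out.print(run(false)); // 2: TEST / 1: EMPTY  (the reported behavior)
        System.out.print(run(true));  // 2: TEST / 1: TEST   (context restored)
    }
}
```

In Reactor terms the second run would correspond to adding another contextWrite(Context.of("key", "TEST")) between the first transformDeferredContextual and the write transform; that is a sketch of a possible workaround, not verified against the actual DataBufferUtils.write implementation.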

Environment:

- Spring Boot 2.5.5
- Spring Framework 5.3.10
- Project Reactor 3.4.10

Comment From: lucianiz

This issue still happens in Spring Framework 6.x. I recently updated to Spring Boot 3.1, and this is causing problems. @poutsma fyi