After migrating from Spring Boot 2.1.8 to Spring Boot 3.0 with Java 17, the MDC context map is no longer being populated with context variables. My application relies on certain context variables (specifically X-B3-TraceId and X-B3-SpanId), which are used in the logback pattern appender. The problem appears to be related to Logback 1.4.6, the version used by Spring Boot 3.0, which does not seem to be loading the MDC context. I would greatly appreciate any help or suggestions to resolve this issue.

Comment From: philwebb

Without more information it's very hard to say what the issue is. If it's related to the Logback version then the issue will need to be reported against Logback itself.

Could you please provide a minimal sample that shows the problem?

Comment From: nkbandarus

Previously, with Spring Boot 2.1.8, the traceId and spanId were captured successfully using the following encoder pattern in logback.xml:

<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
        <pattern>%d{YYYY:MM:dd HH:mm:ss.SSSZ} 
            [%X{X-B3-TraceId:-},%X{X-B3-SpanId:-}]
            [${HOSTNAME}] [${PID:- }]
        </pattern>
    </encoder>
</appender>

The traceId and spanId were captured through Spring Cloud Sleuth. However, during the migration I came across documentation indicating that there will be no Sleuth-compatible release for Spring Framework 6 and Spring Boot 3. The documentation can be found here: Migration to new 1.10.0 Observation API.

In Spring Boot 3, Micrometer Tracing serves as the tracing facade. I am now looking for guidance on how to capture the trace and span ids (previously X-B3-TraceId and X-B3-SpanId) with Micrometer in this updated environment.

Comment From: wilkinsona

Have you followed the suggestion in the reference documentation:

You can include the current trace and span id in the logs by setting the logging.pattern.level property to %5p [${spring.application.name:},%X{traceId:-},%X{spanId:-}].

This should be used with the default logging configuration. If you're using your own custom logging configuration you should use %X{traceId:-} and %X{spanId:-} directly in that configuration.
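
For example, the custom appender shown earlier might be adapted along these lines. This is only a sketch: it assumes the Micrometer tracing bridge populates the default traceId and spanId MDC keys, uses yyyy for the calendar year, and keeps the rest of the original pattern unchanged:

<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
        <pattern>%d{yyyy:MM:dd HH:mm:ss.SSSZ}
            [%X{traceId:-},%X{spanId:-}]
            [${HOSTNAME}] [${PID:- }]
        </pattern>
    </encoder>
</appender>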

Comment From: nkbandarus

As mine is a Kafka consumer application, setting the following on the Kafka ConcurrentKafkaListenerContainerFactory fixed the issue: factory.getContainerProperties().setObservationEnabled(true);

In addition, the following Micrometer observability dependencies were added (see the configuration sketch after this list):

        <dependency>
            <groupId>io.micrometer</groupId>
            <artifactId>micrometer-tracing</artifactId>
        </dependency>
        <dependency>
            <groupId>io.micrometer</groupId>
            <artifactId>micrometer-tracing-bridge-brave</artifactId>
        </dependency>
        <dependency>
            <groupId>io.zipkin.reporter2</groupId>
            <artifactId>zipkin-reporter-brave</artifactId>
        </dependency>
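
For completeness, here is a minimal sketch of the container factory configuration described above. The class name, bean names, bootstrap servers, and group id are illustrative placeholders, not taken from the actual application:

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

@Configuration
public class KafkaConsumerConfig {

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // illustrative
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "demo-group");              // illustrative
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory(
            ConsumerFactory<String, String> consumerFactory) {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory);
        // Propagate the observation (and therefore the traceId/spanId MDC entries)
        // into the listener threads so the logback pattern can pick them up.
        factory.getContainerProperties().setObservationEnabled(true);
        return factory;
    }
}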

Comment From: wilkinsona

Glad to hear you figured it out. Thanks for letting us know.