Expected Behavior
The observability of Spring AI is quite impressive, and it has helped us resolve numerous issues. Thank you so much!
However, function calling, a crucial component of Spring AI, lacks observability support. Our agent application contains a significant number of function-calling invocations, and we would like to see them as spans in the trace. This visibility would help us understand how the LLM orchestrates these function calls and let us inspect the input and output of each invocation (important data for evaluating LLM correctness).
We have conducted some preliminary research and believe that adding observability instrumentation to the org.springframework.ai.chat.model.AbstractToolCallSupport#executeFunctions method would be a good solution. If you plan to support this but don't currently have the bandwidth, we would be more than happy to contribute.
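To make the idea concrete, here is a rough sketch of the kind of instrumentation we have in mind. The observation name, key names, and wrapper method are our own assumptions for illustration, not Spring AI's actual implementation; only the Micrometer Observation API calls are real.

```java
import io.micrometer.observation.Observation;
import io.micrometer.observation.ObservationRegistry;

import java.util.function.Function;

public class ObservedFunctionCalling {

    // Hypothetical wrapper: runs a tool/function call inside a Micrometer
    // Observation so tracing backends can render it as a span with the
    // function name and input attached as key-values.
    public static <I, O> O executeObserved(ObservationRegistry registry,
                                           String functionName,
                                           Function<I, O> function,
                                           I input) {
        return Observation.createNotStarted("spring.ai.function.call", registry)
                .lowCardinalityKeyValue("spring.ai.function.name", functionName)
                // Raw inputs are high-cardinality, so they belong on spans
                // only, never on metrics; shown here purely for illustration.
                .highCardinalityKeyValue("spring.ai.function.input", String.valueOf(input))
                .observe(() -> function.apply(input));
    }

    public static void main(String[] args) {
        // With no handlers registered the observation is a no-op,
        // but the wrapped function still executes normally.
        ObservationRegistry registry = ObservationRegistry.create();
        String result = executeObserved(registry, "currentWeather",
                (String city) -> "Sunny in " + city, "Berlin");
        System.out.println(result);
    }
}
```

In a real integration the registry would be the application's ObservationRegistry bean, so the span automatically joins the surrounding chat-model trace.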
Current Behavior
Function calling lacks observability support.
Context
No additional context. If you need to know more, feel free to ask. :)
Comment From: chickenlj
I have encountered exactly the same issue when using Spring AI. It would be great if the function-calling branch could be displayed in the tracing result.
Comment From: Cirilla-zmh
By the way, we are also attempting to use opentelemetry-java-instrumentation to export trace data generated by Spring AI to an OTLP-compatible observability backend. However, we have not yet found a suitable implementation approach. Are there any best practices or similar solutions you could recommend? I would greatly appreciate any guidance.
Additionally, I have already raised this issue in the OpenTelemetry community: https://github.com/open-telemetry/opentelemetry-java-instrumentation/issues/12878
Comment From: xiaohai-78
Observability for function calls would help me pinpoint performance bottlenecks more accurately. Looking forward to this feature!
Comment From: ThomasVitale
@Cirilla-zmh thanks for submitting this issue. I've started working on an implementation for this.
It's worth noting that Spring AI is instrumented the same way as the rest of the Spring portfolio, based on the Micrometer Observation API. You can export traces and metrics via OTLP to an OpenTelemetry backend by adding the necessary Micrometer dependencies.
Using opentelemetry-java-instrumentation would be an alternative to Spring's own instrumentation. You would get the metrics and traces supported by the OpenTelemetry Java Instrumentation, but not the ones defined by Spring itself (including Spring AI).
You can find full observability examples for models and vector stores here: https://github.com/ThomasVitale/llm-apps-java-spring-ai/tree/main/observability
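For reference, a minimal Maven setup for the Micrometer-based OTLP export mentioned above usually looks something like this (these are the standard Micrometer/OpenTelemetry coordinates, but double-check versions against your Spring Boot release):

```xml
<!-- Bridges Micrometer Observations to OpenTelemetry traces -->
<dependency>
    <groupId>io.micrometer</groupId>
    <artifactId>micrometer-tracing-bridge-otel</artifactId>
</dependency>
<!-- Exports the resulting spans over OTLP -->
<dependency>
    <groupId>io.opentelemetry</groupId>
    <artifactId>opentelemetry-exporter-otlp</artifactId>
</dependency>
<!-- Exports Micrometer metrics over OTLP -->
<dependency>
    <groupId>io.micrometer</groupId>
    <artifactId>micrometer-registry-otlp</artifactId>
</dependency>
```

The exporters are then pointed at your collector through Spring Boot's management properties, e.g. management.otlp.tracing.endpoint in application.properties.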
Comment From: Cirilla-zmh
I've started working on an implementation for this.
Cool! Thank you, and I look forward to seeing it soon. ;)
As for best practices when using OpenTelemetry, let's discuss those under the OpenTelemetry project.