Overview
The class OpenAiChatModel:
- extends the abstract class AbstractFunctionCallSupport (implementing the methods callWithFunctionSupport and callWithFunctionSupportStream)
- implements the two interfaces ChatModel and StreamingChatModel (defining the call and stream methods).
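In skeleton form, the structure described above looks roughly as follows. This is a sketch for orientation only: generic type arguments, fields and constructors are omitted, and the actual declarations live in the Spring AI sources.

// Structure sketch only; type parameters and method bodies are omitted.
public class OpenAiChatModel
        extends AbstractFunctionCallSupport<...>   // inherits callWithFunctionSupport / callWithFunctionSupportStream
        implements ChatModel, StreamingChatModel { // requires call(Prompt) and stream(Prompt)

    @Override
    public ChatResponse call(Prompt prompt) { /* ... */ }

    @Override
    public Flux<ChatResponse> stream(Prompt prompt) { /* ... */ }
}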
OpenAiChatModel implements the call method relying on the callWithFunctionSupport method, as expected. However, the implementation of the stream method does not rely on callWithFunctionSupportStream as one would expect.
Issue Description
This inconsistency in the implementation of the stream method makes it hard to customize the existing models. When the call method uses callWithFunctionSupport, it follows a standard, extensible path that can be adapted or overridden for custom behavior. The models' own implementations of the stream method, however, bypass callWithFunctionSupportStream, resulting in a fragmented and non-uniform codebase. This discrepancy hampers the ability to make consistent enhancements or apply uniform customizations across the different models.
Current Implementation
The model classes implement the call method relying on the callWithFunctionSupport method, as expected. However, the implementation of the stream method does not rely on callWithFunctionSupportStream as one would expect.
OpenAiChatModel
@Override
public ChatResponse call(Prompt prompt) {
    ChatCompletionRequest request = createRequest(prompt, false);
    return this.retryTemplate.execute(ctx -> {
        // call() goes through the shared function-calling support
        ResponseEntity<ChatCompletion> completionEntity = this.callWithFunctionSupport(request);
        // other code
    });
}

@Override
public Flux<ChatResponse> stream(Prompt prompt) {
    ChatCompletionRequest request = createRequest(prompt, true);
    return this.retryTemplate.execute(ctx -> {
        // stream() calls the OpenAI streaming API directly, bypassing callWithFunctionSupportStream
        Flux<OpenAiApi.ChatCompletionChunk> completionChunks = this.openAiApi.chatCompletionStream(request);
        // other code
    });
}
Suggested Change
Refactor the stream method in each of the model classes to use the callWithFunctionSupportStream method, similar to how the call method uses callWithFunctionSupport.
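For illustration, here is a minimal sketch of what the refactored stream method could look like, mirroring the existing call implementation shown above. The element type of the Flux returned by callWithFunctionSupportStream is an assumption here; the real type is dictated by the generics of AbstractFunctionCallSupport and may require adapting. The sketch is only meant to show where the delegation would happen.

@Override
public Flux<ChatResponse> stream(Prompt prompt) {
    ChatCompletionRequest request = createRequest(prompt, true);
    return this.retryTemplate.execute(ctx -> {
        // Route the streaming request through the shared function-calling path,
        // mirroring how call(Prompt) delegates to callWithFunctionSupport(request).
        Flux<OpenAiApi.ChatCompletionChunk> completionChunks = this.callWithFunctionSupportStream(request);
        // other code
    });
}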
Note
This problem may also be present in other chat models implementing AbstractFunctionCallSupport.
Comment From: markpollack
We are making some changes in this area and will review your comments alongside these in-flight changes. Thanks for reporting.
Comment From: tzolov
Hi @RikJux, in the past two months we refactored the function support for chat models to improve their consistency, observability support, and ability to proxy. Can you please confirm whether the issue you've reported is still relevant?
Comment From: tzolov
I believe that after M4 this issue has been addressed. Now both call() and stream() rely on a common set of utilities for function calling. In the case of OpenAiChatModel:
- call() -> https://github.com/spring-projects/spring-ai/blob/c05714881303e9b43ab710ec900cef11f3dbd9e4/models/spring-ai-openai/src/main/java/org/springframework/ai/openai/OpenAiChatModel.java#L266-L272
- stream() -> https://github.com/spring-projects/spring-ai/blob/c05714881303e9b43ab710ec900cef11f3dbd9e4/models/spring-ai-openai/src/main/java/org/springframework/ai/openai/OpenAiChatModel.java#L337-L342
I'm closing this as resolved. @RikJux feel free to re-open it if you still see unresolved problems.