Add logging at the debug level for the chat models and the embedding models in OpenAiApi, along the lines of the sketch below.
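
A minimal sketch of the idea for the blocking calls, assuming an SLF4J logger; the helper class, method name, and placement are illustrative assumptions, not existing OpenAiApi code:

    import java.util.function.Supplier;

    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    // Illustrative helper only (not part of OpenAiApi): wraps a blocking chat or
    // embedding call and logs the request and response at debug level, guarded so
    // there is no overhead when debug logging is disabled.
    final class ApiDebugLogging {

        private static final Logger logger = LoggerFactory.getLogger(ApiDebugLogging.class);

        static <T> T logged(String operation, Object request, Supplier<T> call) {
            if (logger.isDebugEnabled()) {
                logger.debug("{} request: {}", operation, request);
            }
            T response = call.get();
            if (logger.isDebugEnabled()) {
                logger.debug("{} response: {}", operation, response);
            }
            return response;
        }
    }

A call site could then wrap the existing REST call, e.g. `logged("chatCompletion", request, () -> ...)`, without changing its return type.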

This should also be done for the streaming endpoints, using the aggregation technique from PromptChatMemoryAdvisor, e.g.

    @Override
    public Flux<ChatResponse> adviseResponse(Flux<ChatResponse> fluxChatResponse, Map<String, Object> context) {

        // MessageAggregator passes the streamed chunks through to the caller and
        // invokes the callback once with the fully aggregated ChatResponse.
        return new MessageAggregator().aggregate(fluxChatResponse, chatResponse -> {
            // Collect the assistant messages from the aggregated response.
            List<Message> assistantMessages = chatResponse.getResults()
                .stream()
                .map(g -> (Message) g.getOutput())
                .toList();

            // Add them to the chat memory for this conversation.
            this.getChatMemoryStore().add(this.doGetConversationId(context), assistantMessages);
        });
    }
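
The same aggregation could back debug logging of the streamed responses. A minimal sketch, assuming the MessageAggregator and ChatResponse types used above; the class name, method name, and import paths are illustrative assumptions rather than existing code:

    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;
    import org.springframework.ai.chat.model.ChatResponse;
    import org.springframework.ai.chat.model.MessageAggregator;

    import reactor.core.publisher.Flux;

    // Illustrative only: logs the aggregated streamed response at debug level
    // without altering the stream returned to the caller.
    final class StreamDebugLogging {

        private static final Logger logger = LoggerFactory.getLogger(StreamDebugLogging.class);

        static Flux<ChatResponse> logged(Flux<ChatResponse> chatResponseFlux) {
            if (!logger.isDebugEnabled()) {
                // Skip the aggregation work entirely when debug logging is off.
                return chatResponseFlux;
            }
            // The callback receives the single aggregated ChatResponse once the
            // stream completes; the individual chunks still flow through unchanged.
            return new MessageAggregator().aggregate(chatResponseFlux,
                    aggregated -> logger.debug("Streamed chat response (aggregated): {}", aggregated));
        }
    }

Guarding on isDebugEnabled keeps the aggregation work out of the hot path when debug logging is switched off.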

Comment From: ThomasVitale

I shared some thoughts about this in https://github.com/spring-projects/spring-ai/issues/512#issuecomment-2185096414

Comment From: markpollack

Logging will be part of #512, so closing this issue.