Bug description
I am using the chat client as shown below. The prompt includes both a user message and a system message.
ChatResponse response = chatClient.call(
        new Prompt(
                List.of(userMessage, systemMessage),
                OpenAiChatOptions.builder()
                        .withModel(openAiDto.getModel())
                        .withTemperature(openAiDto.getTemperature())
                        .withMaxTokens(1024)
                        .withSeed(42)
                        .build()));
When I try to read the number of tokens used in the prompt, it always returns 0. With a single message (userMessage) in the prompt, the correct count is returned; only when both messages (userMessage and systemMessage) are present does the following call return 0.
response.getMetadata().getUsage().getPromptTokens()
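For completeness, the full usage metadata can be read as below. This is a minimal sketch; getGenerationTokens() and getTotalTokens() are the companion accessors on the Spring AI 0.8.0 Usage interface (org.springframework.ai.chat.metadata.Usage).

Usage usage = response.getMetadata().getUsage();
System.out.println("prompt tokens: " + usage.getPromptTokens());         // always 0 with two messages
System.out.println("generation tokens: " + usage.getGenerationTokens());
System.out.println("total tokens: " + usage.getTotalTokens());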
Environment
Spring AI version 0.8.0. I am using Ollama (0.1.27) as the model server, through its OpenAI-compatible endpoint.
Steps to reproduce
The issue occurs when the prompt combines a UserMessage with a system message created from a SystemPromptTemplate; see the reproduction sketch below.
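A minimal reproduction sketch, assuming Spring AI 0.8.0 package locations and placeholder message contents (the actual prompt text does not matter):

import java.util.List;

import org.springframework.ai.chat.ChatResponse;
import org.springframework.ai.chat.messages.Message;
import org.springframework.ai.chat.messages.UserMessage;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.chat.prompt.SystemPromptTemplate;

// Build a user message and a system message from a template.
UserMessage userMessage = new UserMessage("What is the capital of France?");
Message systemMessage = new SystemPromptTemplate("You are a helpful assistant.").createMessage();

// Send both messages in a single prompt.
ChatResponse response = chatClient.call(new Prompt(List.of(userMessage, systemMessage)));

// Expected: the real prompt token count. Actual: always 0 when both messages are present.
System.out.println(response.getMetadata().getUsage().getPromptTokens());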
Expected behavior
It should return the correct number of tokens used in the prompt.
Comment From: mohsenetc
Hi, I already fixed this issue in https://github.com/spring-projects/spring-ai/pull/519