First of all, it would be nice to have support for OpenRouter in the Chat Model API. Adoption of this service is already very high and a lot of people are using it, especially since it lets you use OpenAI's o1 model without any tier restrictions.
Referencing OpenAI models via OpenRouter through the OpenAI Chat Model API works fine. However, I get the following error when I try to use the OpenAI Chat Model API to call the Claude 3.5 model via OpenRouter:
Caused by: com.fasterxml.jackson.databind.exc.InvalidFormatException: Cannot deserialize value of type org.springframework.ai.openai.api.OpenAiApi$ChatCompletionFinishReason from String "end_turn": not one of the values accepted for Enum class: [stop, function_call, length, content_filter, tool_call, tool_calls]
at [Source: REDACTED (StreamReadFeature.INCLUDE_SOURCE_IN_LOCATION disabled); line: 283, column: 198] (through reference chain: org.springframework.ai.openai.api.OpenAiApi$ChatCompletion["choices"]->java.util.ArrayList[0]->org.springframework.ai.openai.api.OpenAiApi$ChatCompletion$Choice["finish_reason"])
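For context, here is a minimal sketch of the kind of call that triggers the error. The constructor, option names, and model id are assumptions based on recent Spring AI milestones and OpenRouter's model catalog, so adjust them to your versions:

// Point the OpenAI client at OpenRouter and request a Claude model.
OpenAiApi api = new OpenAiApi("https://openrouter.ai/api", System.getenv("OPENROUTER_API_KEY"));
OpenAiChatModel chatModel = new OpenAiChatModel(api,
        OpenAiChatOptions.builder().withModel("anthropic/claude-3.5-sonnet").build());
// Claude completes with finish_reason "end_turn", which the enum cannot deserialize.
ChatResponse response = chatModel.call(new Prompt("Hello"));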
The question is: can this be fixed by extending the OpenAI Chat Model API? If not, can I create my own response parser implementation and plug it into ChatModel? How would I do that?
Comment From: longyiwu
var openAiApi = new OpenAiApi(apiBase, apiKey, RestClient.builder()
        .messageConverters(converters -> converters.stream()
                .filter(c -> c instanceof MappingJackson2HttpMessageConverter)
                .findAny()
                .ifPresent(c -> {
                    // Register a lenient finish_reason deserializer on the Jackson
                    // converter that parses the chat completion responses.
                    SimpleModule module = new SimpleModule();
                    module.addDeserializer(OpenAiApi.ChatCompletionFinishReason.class,
                            new FinishReasonDeserializer());
                    ((MappingJackson2HttpMessageConverter) c).getObjectMapper().registerModule(module);
                }))
        .requestFactory(clientHttpRequestFactory), WebClient.builder());
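The customized OpenAiApi can then be plugged into the chat model as usual. A sketch, assuming the OpenAiChatModel(OpenAiApi, OpenAiChatOptions) constructor from recent milestones:

OpenAiChatModel chatModel = new OpenAiChatModel(openAiApi,
        OpenAiChatOptions.builder().withModel("anthropic/claude-3.5-sonnet").build());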
FinishReasonDeserializer:
public class FinishReasonDeserializer extends JsonDeserializer<OpenAiApi.ChatCompletionFinishReason> {

    @Override
    public OpenAiApi.ChatCompletionFinishReason deserialize(JsonParser jsonParser,
            DeserializationContext deserializationContext) throws IOException, JacksonException {
        String str = jsonParser.getCodec().readValue(jsonParser, String.class);
        for (OpenAiApi.ChatCompletionFinishReason reason : OpenAiApi.ChatCompletionFinishReason.values()) {
            if (reason.name().equalsIgnoreCase(str)) {
                return reason;
            }
        }
        // Unknown values such as "end_turn" fall back to STOP instead of failing.
        return OpenAiApi.ChatCompletionFinishReason.STOP;
    }
}
I hope this helps.
Comment From: tboeghk
Thank you @longyiwu for your excellent fix. However, I struggled to wire it properly into my Spring Boot application and came up with a solution that is simpler in terms of wiring: configuring the global ObjectMapper:
@Configuration
public class OpenRouterConfiguration {

    // Note: this bean replaces Spring Boot's auto-configured ObjectMapper
    // for the whole application.
    @Bean
    public ObjectMapper objectMapper() {
        final SimpleModule openrouter = new SimpleModule()
                .addDeserializer(OpenAiApi.ChatCompletionFinishReason.class,
                        new FinishReasonDeserializer());
        return new ObjectMapper().registerModules(openrouter);
    }
}
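If you want to keep the rest of Boot's Jackson auto-configuration intact (spring.jackson.* properties and so on), a builder customizer is an alternative. A sketch, assuming Spring Boot's Jackson2ObjectMapperBuilderCustomizer is available; the bean name is hypothetical:

@Bean
public Jackson2ObjectMapperBuilderCustomizer openRouterFinishReason() {
    // Adds the deserializer without discarding Boot's other Jackson defaults.
    return builder -> builder.deserializerByType(
            OpenAiApi.ChatCompletionFinishReason.class, new FinishReasonDeserializer());
}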
import java.io.IOException;

import com.fasterxml.jackson.core.JacksonException;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonDeserializer;
import org.springframework.ai.openai.api.OpenAiApi;

public class FinishReasonDeserializer extends JsonDeserializer<OpenAiApi.ChatCompletionFinishReason> {

    @Override
    public OpenAiApi.ChatCompletionFinishReason deserialize(JsonParser jsonParser,
            DeserializationContext deserializationContext) throws IOException, JacksonException {
        // Match known finish reasons case-insensitively ("stop" vs. STOP).
        String str = jsonParser.getCodec().readValue(jsonParser, String.class);
        for (OpenAiApi.ChatCompletionFinishReason reason : OpenAiApi.ChatCompletionFinishReason.values()) {
            if (reason.name().equalsIgnoreCase(str)) {
                return reason;
            }
        }
        // Unknown values (e.g. "end_turn" from Claude) fall back to STOP.
        return OpenAiApi.ChatCompletionFinishReason.STOP;
    }
}
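To illustrate the fallback behaviour, here is a standalone sketch (not part of the original comment):

public static void main(String[] args) throws Exception {
    ObjectMapper mapper = new ObjectMapper().registerModule(new SimpleModule()
            .addDeserializer(OpenAiApi.ChatCompletionFinishReason.class, new FinishReasonDeserializer()));
    // "end_turn" matches no enum constant, so the deserializer falls back to STOP.
    System.out.println(mapper.readValue("\"end_turn\"", OpenAiApi.ChatCompletionFinishReason.class));
}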
Using this fix I was able to use Google Gemini models via the OpenRouter API in Spring AI. The additional configuration needed was (just in case anyone else stumbles across this :-)):
spring:
  ai:
    openai:
      api-key: OPENROUTER_API_KEY
      base-url: https://openrouter.ai/api
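To actually pick a Gemini model, the chat option can point at an OpenRouter model id; the id below is illustrative, so check OpenRouter's catalog for the one you want:

spring:
  ai:
    openai:
      chat:
        options:
          model: google/gemini-flash-1.5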