First of all, it would be nice to have support for OpenRouter in the Chat Model API.

I think adoption of this tool is already very high and a lot of people are using it, especially since it lets you use OpenAI's o1 model without any tier restrictions.

Referencing OpenAI models via OpenRouter using the OpenAI Chat Model API works fine.
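For reference, this is roughly the setup that works for me today. It is only a minimal sketch: the base URL and model id are assumptions about a typical OpenRouter configuration, so adjust them to your own environment.

```properties
# Point the OpenAI Chat Model auto-configuration at OpenRouter instead of api.openai.com.
# Base URL and model id are illustrative assumptions, not taken from the Spring AI docs.
spring.ai.openai.base-url=https://openrouter.ai/api
spring.ai.openai.api-key=${OPENROUTER_API_KEY}
spring.ai.openai.chat.options.model=openai/gpt-4o
```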

However, I get the following error when I try to use the OpenAI Chat Model API to refer to the Claude 3.5 model via OpenRouter:

Caused by: com.fasterxml.jackson.databind.exc.InvalidFormatException: Cannot deserialize value of type `org.springframework.ai.openai.api.OpenAiApi$ChatCompletionFinishReason` from String "end_turn": not one of the values accepted for Enum class: [stop, function_call, length, content_filter, tool_call, tool_calls]
 at [Source: REDACTED (`StreamReadFeature.INCLUDE_SOURCE_IN_LOCATION` disabled); line: 283, column: 198] (through reference chain: org.springframework.ai.openai.api.OpenAiApi$ChatCompletion["choices"]->java.util.ArrayList[0]->org.springframework.ai.openai.api.OpenAiApi$ChatCompletion$Choice["finish_reason"])
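For context, the failure comes from strict enum deserialization: OpenRouter forwards Anthropic's `end_turn` finish reason, which is not one of the values the `ChatCompletionFinishReason` enum declares. The snippet below is a self-contained sketch (the enum here is a stand-in with the accepted values from the stack trace, not the actual Spring AI type) that reproduces the same failure and shows one lenient Jackson setting that could be a direction for a fix:

```java
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

public class FinishReasonDemo {

    // Stand-in for OpenAiApi$ChatCompletionFinishReason, limited to the values
    // listed as accepted in the error above.
    enum FinishReason {
        @JsonProperty("stop") STOP,
        @JsonProperty("function_call") FUNCTION_CALL,
        @JsonProperty("length") LENGTH,
        @JsonProperty("content_filter") CONTENT_FILTER,
        @JsonProperty("tool_call") TOOL_CALL,
        @JsonProperty("tool_calls") TOOL_CALLS
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper strict = new ObjectMapper();

        // "stop" is one of the accepted values and deserializes fine.
        System.out.println(strict.readValue("\"stop\"", FinishReason.class));

        // "end_turn" (what Claude returns via OpenRouter) throws
        // InvalidFormatException, matching the error in this report.
        try {
            strict.readValue("\"end_turn\"", FinishReason.class);
        } catch (Exception e) {
            System.out.println("Strict mapper: " + e.getClass().getSimpleName());
        }

        // One possible direction: tolerate unknown finish reasons instead of failing.
        ObjectMapper lenient = new ObjectMapper()
                .enable(DeserializationFeature.READ_UNKNOWN_ENUM_VALUES_AS_NULL);
        System.out.println("Lenient mapper: "
                + lenient.readValue("\"end_turn\"", FinishReason.class)); // prints null
    }
}
```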

The question is: can this be fixed by extending the OpenAI Chat Model API?

Comment From: philwebb

The Spring AI project issue tracker can be found at https://github.com/spring-projects/spring-ai/issues. We don't have any AI code in Spring Boot.