Current Behavior
OpenAI supports the store option, which can be used as part of the OpenAI evals and model distillation processes. In addition, the metadata field can be used to help filter messages for these processes (among other things). Currently the OpenAiApi.ChatCompletionRequest class supports these option fields; however, there does not appear to be a way to set either of them using the ChatClient or ChatModel interfaces.
Expected Behavior
It would be desirable for the OpenAiChatOptions class to support the store and metadata options.
E.g.:
var promptOptions = OpenAiChatOptions.builder().store(true).metadata(Map.of("use", "training")).build();
chatClient.prompt().messages(userMessage).options(promptOptions).call().content();
Context
We are currently attempting to use stored prompts in OpenAI to evaluate prompt accuracy, as well as to create cheaper/streamlined models through the distillation and fine-tuning processes. Spring AI does not currently appear to offer a convenient way to set the store and metadata options using the ChatClient or ChatModel interfaces.
One alternative is to use the low-level OpenAiApi.ChatCompletionRequest structure to build the prompt JSON and other libraries for the HTTP calls to OpenAI, but this forgoes many of the value adds (e.g. Advisors) available in ChatClient.
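As a sketch of that low-level route (assuming direct HTTP to the OpenAI /v1/chat/completions endpoint; the helper class and method names below are hypothetical, not part of Spring AI or the OpenAI SDK), the store and metadata fields can simply be set on the raw request body:

```java
import java.util.Map;
import java.util.stream.Collectors;

public class RawRequestSketch {

    // Hypothetical helper: builds a minimal /v1/chat/completions body
    // with the store and metadata fields set by hand.
    static String buildBody(String model, String userMessage,
                            boolean store, Map<String, String> metadata) {
        String metadataJson = metadata.entrySet().stream()
                .map(e -> "\"" + e.getKey() + "\": \"" + e.getValue() + "\"")
                .collect(Collectors.joining(", ", "{", "}"));
        return "{"
                + "\"model\": \"" + model + "\", "
                + "\"store\": " + store + ", "
                + "\"metadata\": " + metadataJson + ", "
                + "\"messages\": [{\"role\": \"user\", \"content\": \"" + userMessage + "\"}]"
                + "}";
    }

    public static void main(String[] args) {
        String body = buildBody("gpt-4o-mini", "Hello", true, Map.of("use", "training"));
        System.out.println(body);
        // This body could then be POSTed with java.net.http.HttpClient,
        // but doing so bypasses ChatClient features such as Advisors.
    }
}
```

This works, but it reimplements request construction that OpenAiChatModel already does, which is exactly the duplication the issue is asking to avoid.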
Comment From: saluzafa
Hi @gm2552,
Have you found a workaround for this by any chance?
Cheers!
Comment From: gm2552
Hi, @saluzafa. If I want to use the ChatClient interface, not yet.
Comment From: saluzafa
I was able to create a "dirty" workaround for this issue.
I copied the entire OpenAiChatModel source code into a new file named CustomOpenAiChatModel and added the following line:
request = ModelOptionsUtils.merge(Map.of("store", true), request, ChatCompletionRequest.class);
This line was added right before the return request; statement in the createRequest method. It ensures the store parameter is set to true for every request.
After that, I switched to using CustomOpenAiChatModel as the chat model instead of OpenAiChatModel.
Here's the modified file for reference: OpenAiChatModel.java
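Conceptually, that ModelOptionsUtils.merge call overlays a map of extra fields onto the request object before it is serialized. A minimal stdlib-only illustration of that overlay (the class and method names here are hypothetical; Spring AI does this via Jackson against ChatCompletionRequest):

```java
import java.util.HashMap;
import java.util.Map;

public class MergeSketch {

    // Hypothetical stand-in for ModelOptionsUtils.merge: entries from
    // "extra" overwrite or extend the request's existing fields.
    static Map<String, Object> merge(Map<String, Object> extra, Map<String, Object> request) {
        Map<String, Object> merged = new HashMap<>(request);
        merged.putAll(extra);
        return merged;
    }

    public static void main(String[] args) {
        Map<String, Object> request = Map.of("model", "gpt-4o-mini", "temperature", 0.7);
        Map<String, Object> merged = merge(Map.of("store", true), request);
        System.out.println(merged); // request fields plus store=true
    }
}
```

The downside of the copied-class workaround is that it hardcodes store=true for every request and must be kept in sync with upstream OpenAiChatModel changes by hand.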
Comment From: markpollack
I believe the way we designed it is that you can create a subclass of OpenAiChatOptions to add what you want; the way the properties are picked off it is JSON 'duck typing' style. You could even pass in a custom implementation of ChatOptions.
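The "duck typing" idea is that serialization picks up whatever fields the concrete options object actually carries, so a subclass can contribute fields the base class doesn't declare. A stdlib reflection sketch of that mechanism (all class names hypothetical; Spring AI relies on Jackson rather than raw reflection):

```java
import java.lang.reflect.Field;
import java.util.LinkedHashMap;
import java.util.Map;

public class DuckTypingSketch {

    // Hypothetical base options class.
    static class BaseOptions {
        String model = "gpt-4o-mini";
    }

    // Hypothetical subclass adding a field the base class doesn't know about.
    static class StoredOptions extends BaseOptions {
        boolean store = true;
    }

    // Walks the whole class hierarchy, so subclass fields are picked up
    // alongside base fields -- the "duck typing" effect serialization gives you.
    static Map<String, Object> toFieldMap(Object options) throws IllegalAccessException {
        Map<String, Object> out = new LinkedHashMap<>();
        for (Class<?> c = options.getClass(); c != Object.class; c = c.getSuperclass()) {
            for (Field f : c.getDeclaredFields()) {
                f.setAccessible(true);
                out.put(f.getName(), f.get(options));
            }
        }
        return out;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(toFieldMap(new StoredOptions()));
        // Note: if the runtime options are first copied into a plain
        // BaseOptions instance, the subclass field is lost before
        // serialization -- which is the failure mode reported below.
    }
}
```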
Comment From: gm2552
That's great to know, @markpollack, as I've gotten another request to support "label" options in the Vertex implementation. That might be a temporary workaround, but we'll see (the Vertex API might be a bit of a different beast).
Comment From: gm2552
@markpollack I tried the subclassing workaround, and it didn't work. It looks like the copying is done in this method of OpenAiChatModel, and the returned object is an instance of OpenAiChatOptions rather than my subclass:
updatedRuntimeOptions = (OpenAiChatOptions)ModelOptionsUtils.copyToTarget((FunctionCallingOptions)prompt.getOptions(), FunctionCallingOptions.class, OpenAiChatOptions.class);
Comment From: markpollack
added docs.
will have to investigate what you found @gm2552 - that wasn't the intention :(
merged in b08b9e8cf4cf8bd33994b7753b0c1eb4ee1bb8e0