Naming of OllamaOptions
The naming of `OllamaOptions` should follow Spring conventions and be consistent with the other `ChatOptions` classes.
Comment From: ThomasVitale
The challenge with Ollama is that it doesn't distinguish between different models when it comes to options: they're all in the same bucket, and there's no documented split. For that reason, the `OllamaOptions` class follows the Ollama strategy and is not model-specific. Instead, it implements both the `ChatOptions` and `EmbeddingOptions` interfaces (in the future, perhaps also `ImageOptions`) and is used in both scenarios. Some additional information on this topic: https://github.com/spring-projects/spring-ai/issues/230 and https://github.com/ollama/ollama/issues/2349
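A minimal sketch of that design, using toy stand-in interfaces (the real Spring AI `ChatOptions` and `EmbeddingOptions` interfaces declare more methods; the names here are illustrative only):

```java
// Toy stand-ins for Spring AI's ChatOptions / EmbeddingOptions interfaces;
// the method names are assumptions for illustration, not the real API.
interface ChatOptions { Double getTemperature(); }
interface EmbeddingOptions { String getModel(); }

// One options class in the Ollama style: a single bucket of settings
// that implements both interfaces and serves both chat and embedding calls.
class OllamaStyleOptions implements ChatOptions, EmbeddingOptions {
    private final String model;
    private final Double temperature;

    OllamaStyleOptions(String model, Double temperature) {
        this.model = model;
        this.temperature = temperature;
    }

    @Override public String getModel() { return model; }
    @Override public Double getTemperature() { return temperature; }
}

public class SharedOptionsDemo {
    public static void main(String[] args) {
        OllamaStyleOptions opts = new OllamaStyleOptions("llama3", 0.7);
        ChatOptions chat = opts;            // same object, viewed as chat options
        EmbeddingOptions embedding = opts;  // same object, viewed as embedding options
        System.out.println(chat.getTemperature() + " " + embedding.getModel());
    }
}
```

The point is that one instance can be handed to either a chat call or an embedding call without any conversion, which is exactly what the single-bucket Ollama option set allows.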
Comment From: johnsonr
Fair enough. Although isn't the `format` property, at least, only relevant to LLM generation?
Comment From: markpollack
There are probably many other examples that could be put into a bucket, but I'm not going to go through the laundry list of options and definitively categorize them. I don't know what half of them do, and I've found the docs on the topic to be lacking.
Comment From: markpollack
You can create your own options class that collects only the specific fields you want. The merging logic uses 'duck typing', so you can pass in anything that has the correct shape (property names).
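A toy sketch of that duck-typing idea, matching options by getter-derived property names via reflection. This is not Spring AI's actual merging code, and `MyOllamaChatOptions` is a hypothetical caller-defined class; the sketch only shows why any object with the right property names works:

```java
import java.lang.reflect.Method;
import java.util.HashMap;
import java.util.Map;

// Hypothetical caller-defined options class: only the fields we care about,
// with getters whose names match the target option properties.
class MyOllamaChatOptions {
    private final Double temperature = 0.7;
    private final Integer topK = 40;
    public Double getTemperature() { return temperature; }
    public Integer getTopK() { return topK; }
}

public class DuckTypingDemo {
    // Toy merge-by-property-name: read every zero-argument getter on the
    // source object and collect its value under the derived property name.
    static Map<String, Object> toPropertyMap(Object source) throws Exception {
        Map<String, Object> props = new HashMap<>();
        for (Method m : source.getClass().getDeclaredMethods()) {
            if (m.getName().startsWith("get") && m.getParameterCount() == 0) {
                String name = Character.toLowerCase(m.getName().charAt(3))
                        + m.getName().substring(4);
                props.put(name, m.invoke(source));
            }
        }
        return props;
    }

    public static void main(String[] args) throws Exception {
        Map<String, Object> props = toPropertyMap(new MyOllamaChatOptions());
        System.out.println(props); // property names, not the class, are what matters
    }
}
```

Because matching happens on property names rather than on a required class, any custom options object whose getters line up with the target fields can be dropped in.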
Comment From: markpollack
Will look into moving response-format into the common options. See https://github.com/spring-projects/spring-ai/issues/1271