The Ollama documentation used to include a template parameter for the chat completion endpoint. However, that parameter only exists for the generation endpoint; it was probably a copy/paste mistake in the documentation. I submitted a fix to the Ollama project, which has since been merged (https://github.com/ollama/ollama/pull/3515).

This pull request removes the template parameter from the ChatRequest class and the OllamaOptions class, since it is not part of the Ollama Chat Completion API. I have also updated the related documentation. The Ollama server will not fail if the parameter is included, but it has no effect.
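To illustrate the difference between the two endpoints, here is a minimal sketch based on the Ollama REST API docs (assumes a local Ollama server on the default port and a pulled `llama2` model; model name and prompt are placeholders):

```shell
# /api/generate accepts a "template" field to override the model's prompt template
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?",
  "template": "{{ .Prompt }}"
}'

# /api/chat has no "template" field in its request schema;
# sending one is not an error, but the server silently ignores it
curl http://localhost:11434/api/chat -d '{
  "model": "llama2",
  "messages": [{"role": "user", "content": "Why is the sky blue?"}]
}'
```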

Comment From: ThomasVitale

@tzolov I created this PR as a follow-up to my comment from yesterday in https://github.com/spring-projects/spring-ai/pull/554#issuecomment-2041062684.

Unrelated question about contributing to Spring AI: for small pull requests like this one, should I also open a related issue, or is submitting a PR enough?

Comment From: tzolov

Thanks @ThomasVitale. I can confirm that the Ollama ChatRequest API indeed doesn't have the template parameter.

> Unrelated question about contributing to Spring AI: for small pull requests like this one, should I also open a related issue, or is submitting a PR enough?

Agreed, for issues like this you can file a direct PR.

Btw, the Ollama options and documentation look too chaotic for my taste: https://github.com/ollama/ollama/issues/2349