Among other things, it should be possible to specify a function in the OllamaChatOptions.

Comment From: Grogdunn

The Ollama API doesn't support function calling at the moment, or did I miss an update?

Comment From: ThomasVitale

Ollama doesn't support function calling at the moment, but there are a few related feature requests to introduce that feature, mentioned in https://github.com/ollama/ollama/issues/4386

Comment From: Grogdunn

OK, let's see when they will support functions :crossed_fingers:

Comment From: tchoteau

Hello, Mistral 0.3 supports function calling with Ollama: Spring-ai Support invoking functions in models that support it via Ollama

Comment From: jidaojiuyou

Ollama now supports function calling. We need it! reference

Comment From: ThomasVitale

That one is a custom wrapper implementing a workaround where "tool" messages are handled as "assistant" messages, tricking Ollama into accepting them (see here). I wonder if a similar trick would work in Spring AI.

Ollama doesn't yet support function calling via its Chat Completion API, even though the models themselves do. There is a raw mode available, as mentioned by @tchoteau, that allows working with functions. But that would require a different implementation and design in Spring AI compared to all the other chat completion integrations.
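The workaround mentioned above (relabelling "tool" messages as "assistant" messages so Ollama accepts them) could be sketched roughly as follows. This is purely illustrative, not Spring AI code; the `Message` record and `remapRoles` helper are hypothetical names:

```java
import java.util.List;

public class ToolRoleWorkaround {

    // Minimal stand-in for a chat message; not a Spring AI type.
    record Message(String role, String content) {}

    // Before sending the conversation to Ollama, rewrite every "tool"
    // message as an "assistant" message so the API accepts it.
    static List<Message> remapRoles(List<Message> messages) {
        return messages.stream()
                .map(m -> "tool".equals(m.role())
                        ? new Message("assistant", m.content())
                        : m)
                .toList();
    }

    public static void main(String[] args) {
        var remapped = remapRoles(List.of(
                new Message("user", "What's the weather in Paris?"),
                new Message("tool", "{\"temperature\": 21}")));
        remapped.forEach(m -> System.out.println(m.role() + ": " + m.content()));
    }
}
```

The obvious downside is that the model loses the distinction between its own previous answers and actual tool results, which is why native support in the Ollama API would be preferable.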

Comment From: Ricard-Kollcaku

It looks like they added support for function calling in Ollama for the models that support tools, like mistral or llama3-groq-tool-use. This PR was merged a few days ago, and you can already make calls in Ollama using function calling, as the comments here show.

It would be nice to add support for function calling for Ollama in Spring AI.

Comment From: markpollack

Hi. Fantastic, thanks for taking the time to update the issue @Ricard-Kollcaku! Not sure we can get it in time for M2, but we'll take a look once we review the final list of issues.

Comment From: ThomasVitale

It's worth distinguishing these three different options for function calling in Ollama:

  1. Raw mode. This option supports function calling already. I don't see it as a good fit for Spring AI to support, since it bypasses the Ollama APIs entirely and operates at a lower level. Example: https://github.com/ollama/ollama/issues/1729#issuecomment-1937763369
  2. Ollama API. In the past few weeks, several changes have been delivered to the Ollama project to support function calling through the Ollama API. I'm currently working on a PR for supporting this also in Spring AI.
  3. OpenAI-Compatible API. This option supports function calling already. It will work with the Spring AI OpenAI integration as soon as this bug in Ollama gets fixed: https://github.com/ollama/ollama/issues/5796.
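For option 3, once the linked bug is fixed, using the existing Spring AI OpenAI integration against a local Ollama server should mostly be a matter of configuration. A hedged sketch (property names from the Spring AI OpenAI starter; the API key value is just a placeholder, since Ollama doesn't validate it):

```properties
# Point the Spring AI OpenAI integration at a local Ollama server
spring.ai.openai.base-url=http://localhost:11434
spring.ai.openai.api-key=ollama
# Pick a model that has been trained for function calling
spring.ai.openai.chat.options.model=mistral-nemo
```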

One more important thing to mention is that there are only a few models in the Ollama Library that support function calling via options 2 and 3 (whereas more models might work with option 1). For example, you can use function calling with mistral-nemo but not with llama3, because the latter has not been trained for function calling.
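For reference, a chat request using function calling through the Ollama API (option 2) looks roughly like the following. The exact schema should be checked against the Ollama documentation; the weather tool here is purely illustrative:

```json
{
  "model": "mistral-nemo",
  "messages": [
    { "role": "user", "content": "What is the weather today in Paris?" }
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a location",
        "parameters": {
          "type": "object",
          "properties": {
            "location": {
              "type": "string",
              "description": "The location, e.g. Paris"
            }
          },
          "required": ["location"]
        }
      }
    }
  ]
}
```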

Comment From: ThomasVitale

This first PR adds function calling support at the API level: https://github.com/spring-projects/spring-ai/pull/1103. A second PR will extend the function calling support at the ChatModel level.