I'm writing a blog post on Ollama and Spring AI, and I wonder whether this is a bug or a misconfiguration on my end. When I send a request directly to Ollama using llava:34B (llava-34B-request.json), the answer is always the same: it recognizes the text on the image, "Hello generative AI Meetup". But when I use Spring AI, the answer is far from correct, so I suspect Spring AI applies some default parameters that differ from Ollama's defaults. I would expect the same behaviour.
The Spring AI code I use is: https://github.com/mikrethor/spring-ai-llava
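If the two calls really do diverge only on defaults, pinning the generation options explicitly on the Spring AI side should make the requests comparable. A minimal sketch, assuming the Spring AI Ollama starter and its standard `spring.ai.ollama.*` configuration properties; the temperature/top-k/num-predict values below are illustrative placeholders, not the actual defaults of either system, and should be set to whatever the direct Ollama request uses:

```yaml
spring:
  ai:
    ollama:
      base-url: http://localhost:11434   # same endpoint as the direct call
      chat:
        options:
          model: llava:34b
          # Illustrative values -- copy the exact parameters from
          # llava-34B-request.json so both requests match parameter-for-parameter.
          temperature: 0.8
          top-k: 40
          num-predict: 128
```

With both sides sending identical parameters, any remaining difference in output would point to a genuine bug rather than configuration drift.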
Comment From: f3rnandomoreno
Maybe you could try an integration test like this one and check whether you still get the same error:
models/spring-ai-ollama/src/test/java/org/springframework/ai/ollama/OllamaChatClientIT.java
https://github.com/spring-projects/spring-ai/blob/1c93ae50a805dcb54b05a6fbbc11667b3aac6562/models/spring-ai-ollama/src/test/java/org/springframework/ai/ollama/OllamaChatClientIT.java
Comment From: dashaun
Hey @mikrethor! I found this issue while diagnosing a different bug around Ollama + Llava.
The code in your example references a snapshot.
1.0.0-M1 has been released since May, and it includes a ton of changes.
I think this issue should be closed; the use case in your example code will work as expected with 1.0.0-M1.
I have a similarly trivial example repository, using 1.0.0-M1 with llava, here:
https://github.com/javagrunt-com/com.javagrunt.service.cleanchecker
And the repo includes a link to a YouTube video demonstration.
Upgrading is totally worth it!
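For anyone following along, the upgrade is mostly a matter of swapping the snapshot coordinate for the milestone release. A sketch of the Maven change, assuming the Ollama Spring Boot starter artifact (check the Spring AI documentation for the exact artifact name in your version; milestone builds are not on Maven Central):

```xml
<dependency>
  <groupId>org.springframework.ai</groupId>
  <artifactId>spring-ai-ollama-spring-boot-starter</artifactId>
  <version>1.0.0-M1</version>
</dependency>

<!-- Milestone releases are published to the Spring milestone repository -->
<repositories>
  <repository>
    <id>spring-milestones</id>
    <url>https://repo.spring.io/milestone</url>
  </repository>
</repositories>
```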
Comment From: markpollack
Thanks @dashaun. Closing the issue. Please reopen if there is anything we missed.
Comment From: mikrethor
Just to confirm: I tested with 1.0.0-M4 and it works as expected.