For anything other than bug reports and feature requests (performance, refactoring, etc), just go ahead and file the issue. Please provide as many details as possible.

If you have a question or a support request, please open a new discussion on GitHub Discussions or ask a question on StackOverflow.

Please do not create issues on the Issue Tracker for questions or support requests. We would like to keep the issue tracker exclusively for bug reports and feature requests.

Comment From: zjarlin

I found that manually switching the model doesn't work here. It still uses the model specified for Ollama in the YAML configuration.

Comment From: ThomasVitale

You can configure a model at runtime using the Options API. You can find more information here: https://docs.spring.io/spring-ai/reference/api/chat/ollama-chat.html#chat-options.

Full example with ChatClient and Ollama: https://github.com/ThomasVitale/llm-apps-java-spring-ai/blob/main/01-chat-models/chat-models-ollama/src/main/java/com/thomasvitale/ai/spring/ChatController.java#L43

// Override the configured default model for this single request.
chatClient
    .prompt("What's the capital of Denmark?")
    .options(ChatOptionsBuilder.builder()
        .withModel("llama3.2")
        .build())
    .call()
    .content();

Comment From: zjarlin

Thank you!

Comment From: zjarlin

Can you change the model configuration on the ChatModel directly, without using ChatClient?

Comment From: ThomasVitale

Yes, here's an example: https://github.com/ThomasVitale/llm-apps-java-spring-ai/blob/main/01-chat-models/chat-models-ollama/src/main/java/com/thomasvitale/ai/spring/model/ChatModelController.java#L40
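
The idea there is to pass the options on the Prompt when calling the ChatModel directly, without going through ChatClient. A rough sketch (assuming chatModel is the auto-configured OllamaChatModel injected into your class; accessor names such as getContent() can differ between Spring AI versions):

    import org.springframework.ai.chat.prompt.Prompt;
    import org.springframework.ai.ollama.api.OllamaOptions;

    // Override the model for this single request by attaching options to the Prompt.
    String answer = chatModel.call(new Prompt(
            "What's the capital of Denmark?",
            OllamaOptions.builder()
                .withModel("llama3.2")
                .build()))
        .getResult()
        .getOutput()
        .getContent();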

Comment From: zjarlin

I mean, change the options settings in the model object here 🙈. I cannot set this attribute like this:

    model.defaultOptions = OllamaOptions.builder()
        .withModel(modelName)
        .build()

How can I change the options property of the ChatModel before building the ChatClient instance?
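
In other words, the goal is to construct the model with its own default options before building the ChatClient. Only a sketch of what that might look like; the OllamaChatModel constructor taking an OllamaApi and default OllamaOptions is an assumption and differs between Spring AI versions:

    import org.springframework.ai.chat.client.ChatClient;
    import org.springframework.ai.ollama.OllamaChatModel;
    import org.springframework.ai.ollama.api.OllamaApi;
    import org.springframework.ai.ollama.api.OllamaOptions;

    // Build the model with explicit default options instead of the ones from application.yml,
    // then create the ChatClient from that model so every call uses the chosen model by default.
    OllamaApi ollamaApi = new OllamaApi("http://localhost:11434");
    OllamaChatModel chatModel = new OllamaChatModel(ollamaApi,
            OllamaOptions.builder()
                .withModel(modelName)
                .build());
    ChatClient chatClient = ChatClient.create(chatModel);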