feat(ollama): Add streaming support for function calls and improve OllamaApi
- Implement streaming tool call support in OllamaApi and OllamaChatModel
- Add OllamaApiHelper to manage merging of streaming chat response chunks (see the sketch after this list)
- Remove @Disabled annotations for streaming function call tests
- Update documentation to reflect new streaming function call capabilities
- Add a new default constructor for ChatResponse
- Update Ollama chat documentation to clarify streaming support requirements
- Deprecate the withContent(), withImages(), and withToolCalls() methods
- Replace them with content(), images(), and toolCalls()
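To illustrate the role of OllamaApiHelper named above, here is a minimal, hypothetical Reactor sketch of chunk merging: text and tool calls arrive split across partial streamed responses and have to be accumulated into one complete response before a function can be invoked. The `ChatChunk` record and the merge logic below are simplified stand-ins, not the actual Spring AI implementation, which operates on `OllamaApi.ChatResponse`.

```java
import java.util.ArrayList;
import java.util.List;

import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

// Hypothetical, simplified stand-in for a streamed Ollama chat chunk.
record ChatChunk(String content, List<String> toolCallFragments, boolean done) {}

final class StreamingMergeSketch {

	// Fold the stream of partial chunks into a single aggregated chunk,
	// concatenating text content and collecting tool-call fragments.
	static Mono<ChatChunk> merge(Flux<ChatChunk> chunks) {
		return chunks.reduce(new ChatChunk("", new ArrayList<>(), false), (acc, next) -> {
			List<String> toolCalls = new ArrayList<>(acc.toolCallFragments());
			if (next.toolCallFragments() != null) {
				toolCalls.addAll(next.toolCallFragments());
			}
			String content = acc.content() + (next.content() != null ? next.content() : "");
			return new ChatChunk(content, toolCalls, acc.done() || next.done());
		});
	}
}
```

The renamed builder methods listed above keep the same fluent style, e.g. `content("...")` where `withContent("...")` was used before.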
Add token and duration aggregation for Ollama chat responses
- Modify OllamaChatModel to support accumulating tokens and durations across multiple responses
- Update ChatResponse metadata generation to aggregate usage and duration metrics (see the sketch after this list)
- Add tests to verify metadata aggregation behavior
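As a rough sketch of what this accumulation amounts to, the counters reported by Ollama (prompt_eval_count, eval_count, total_duration in nanoseconds) are summed across all responses of one streaming call. The record types below are hypothetical simplifications, not the actual Spring AI metadata classes.

```java
import java.time.Duration;
import java.util.List;

// Hypothetical per-response counters, mirroring the Ollama REST fields
// prompt_eval_count, eval_count, and total_duration (nanoseconds).
record ChunkStats(Integer promptEvalCount, Integer evalCount, Long totalDurationNanos) {}

// Running totals accumulated across all responses of one streaming call.
record UsageTotals(long promptTokens, long generationTokens, long totalDurationNanos) {

	UsageTotals add(ChunkStats stats) {
		return new UsageTotals(
				promptTokens + (stats.promptEvalCount() != null ? stats.promptEvalCount() : 0),
				generationTokens + (stats.evalCount() != null ? stats.evalCount() : 0),
				totalDurationNanos + (stats.totalDurationNanos() != null ? stats.totalDurationNanos() : 0L));
	}

	Duration totalDuration() {
		return Duration.ofNanos(totalDurationNanos);
	}
}

class AggregationSketch {

	static UsageTotals aggregate(List<ChunkStats> responses) {
		UsageTotals totals = new UsageTotals(0, 0, 0);
		for (ChunkStats stats : responses) {
			totals = totals.add(stats);
		}
		return totals;
	}
}
```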
Refactor Ollama duration fields and tests
- Replace Duration fields in OllamaApi.ChatResponse with Long values representing durations in nanoseconds, matching the raw values returned by the Ollama API and preserving precision.
- Update methods to convert the Long nanosecond values to Duration objects (getTotalDuration, getLoadDuration, getEvalDuration, getPromptEvalDuration); see the sketch after this list.
- Adjust merge logic in OllamaApiHelper to sum Long values for duration fields.
- Modify test cases in OllamaChatModelTests to align with Long duration representation and Duration.ofNanos conversions.
- Add new test class OllamaDurationFieldsTests to validate JSON deserialization and Duration conversion for duration fields.
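A minimal sketch of this representation, assuming Jackson for deserialization; the real OllamaApi.ChatResponse is a larger record, but the pattern is the same: raw Long nanoseconds in the JSON-mapped components, Duration exposed through accessor methods.

```java
import java.time.Duration;

import com.fasterxml.jackson.annotation.JsonProperty;

// Simplified illustration: durations arrive as raw nanosecond longs and are
// exposed as java.time.Duration via accessors (getLoadDuration and
// getPromptEvalDuration follow the same pattern).
record DurationFieldsSketch(
		@JsonProperty("total_duration") Long totalDuration,
		@JsonProperty("load_duration") Long loadDuration,
		@JsonProperty("eval_duration") Long evalDuration,
		@JsonProperty("prompt_eval_duration") Long promptEvalDuration) {

	Duration getTotalDuration() {
		return this.totalDuration != null ? Duration.ofNanos(this.totalDuration) : null;
	}

	Duration getEvalDuration() {
		return this.evalDuration != null ? Duration.ofNanos(this.evalDuration) : null;
	}
}
```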
Resolves #1847
Related to #1800
Resolves #1796
Related to #1307
Comment From: ilayaperumalg
LGTM, squashed, rebased and merged as c4ca5826