This commit adds tool-context support to the chat options classes of the various AI model implementations and enhances the function calling capabilities.

The tool context allows passing additional contextual information to function callbacks.

  • Add toolContext field to chat options classes
  • Update builder classes to support setting toolContext
  • Enhance FunctionCallback interface to support context-aware function calls
  • Update AbstractFunctionCallback to implement BiFunction instead of Function
  • Modify FunctionCallbackWrapper to support both Function and BiFunction and to use the new SchemaType location
  • Add support for BiFunction in TypeResolverHelper
  • Update ChatClient interface and DefaultChatClient implementation to support new function calling methods with Function, BiFunction and FunctionCallback arguments
  • Refactor AbstractToolCallSupport to pass tool context to function execution
  • Update all affected ChatOptions with tool context support
  • Simplify OpenAiChatClientMultipleFunctionCallsIT test
  • Add tests for function calling with tool context
  • Add new test cases for function callbacks with context in various integration tests
  • Modify existing tests to incorporate new context-aware function calling capabilities
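The callback-interface change described in the bullets above can be sketched roughly as follows. The types here are simplified stand-ins, not the actual Spring AI classes: a context-aware callback is a `BiFunction` over the request and a context map, and a plain `Function` can be adapted into that shape by ignoring the context argument:

[source,java]
----
import java.util.Map;
import java.util.function.BiFunction;
import java.util.function.Function;

// Simplified stand-ins, not the real Spring AI types.
public class ContextAwareCallbackSketch {

    record Request(String location) {}
    record Response(double temperature) {}

    // Adapt a plain Function into the BiFunction shape a wrapper
    // supporting both variants would use internally.
    static BiFunction<Request, Map<String, Object>, Response> adapt(
            Function<Request, Response> fn) {
        return (request, toolContext) -> fn.apply(request);
    }

    public static void main(String[] args) {
        BiFunction<Request, Map<String, Object>, Response> contextAware =
                (request, toolContext) -> {
                    // Context entries travel alongside the model-generated request.
                    String userId = (String) toolContext.get("userId");
                    double temp = request.location().contains("Paris") ? 15 : 10;
                    System.out.println("userId=" + userId + " temp=" + temp);
                    return new Response(temp);
                };

        contextAware.apply(new Request("Paris"), Map.of("userId", "user456"));
        // prints "userId=user456 temp=15.0"
    }
}
----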

Resolves #864, #1303, #991

Comment From: markpollack

I added this to `openai-chat-functions.adoc` for now.

==== How to Use Tool Context

You can set the tool context when building your chat options and use a `BiFunction` for your callback:

[source,java]
----
BiFunction<MockWeatherService.Request, Map<String, Object>, MockWeatherService.Response> weatherFunction = 
    (request, toolContext) -> {
        String sessionId = (String) toolContext.get("sessionId");
        String userId = (String) toolContext.get("userId");

        // Use sessionId and userId in your function logic
        double temperature = 0;
        if (request.location().contains("Paris")) {
            temperature = 15;
        }
        else if (request.location().contains("Tokyo")) {
            temperature = 10;
        }
        else if (request.location().contains("San Francisco")) {
            temperature = 30;
        }

        return new MockWeatherService.Response(temperature, 15, 20, 2, 53, 45, MockWeatherService.Unit.C);
    };

OpenAiChatOptions options = OpenAiChatOptions.builder()
    .withModel(OpenAiApi.ChatModel.GPT_4_O.getValue())
    .withFunctionCallbacks(List.of(FunctionCallbackWrapper.builder(weatherFunction)
        .withName("getCurrentWeather")
        .withDescription("Get the weather in location")
        .build()))
    .withToolContext(Map.of("sessionId", "123", "userId", "user456"))
    .build();
----

In this example, the `weatherFunction` is defined as a `BiFunction` that takes both the request and the tool context as parameters, so you can access the context values directly within the function logic.

You can then use these options when making a call to the chat model:

[source,java]
----
UserMessage userMessage = new UserMessage("What's the weather like in San Francisco, Tokyo, and Paris?");
ChatResponse response = chatModel.call(new Prompt(List.of(userMessage), options));
----
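Because the callback is a plain `BiFunction`, you can also exercise it directly in a unit test, with no model round trip. A minimal self-contained sketch (the `Request` and `Response` records are stand-ins for the `MockWeatherService` types used above):

[source,java]
----
import java.util.Map;
import java.util.function.BiFunction;

public class WeatherFunctionTest {

    record Request(String location) {}
    record Response(double temperature) {}

    // Same dispatch logic as the documentation example above.
    static final BiFunction<Request, Map<String, Object>, Response> WEATHER =
            (request, toolContext) -> {
                String sessionId = (String) toolContext.get("sessionId");
                double temperature = 0;
                if (request.location().contains("Paris")) {
                    temperature = 15;
                }
                else if (request.location().contains("Tokyo")) {
                    temperature = 10;
                }
                else if (request.location().contains("San Francisco")) {
                    temperature = 30;
                }
                System.out.println(sessionId + ": " + temperature);
                return new Response(temperature);
            };

    public static void main(String[] args) {
        WEATHER.apply(new Request("Tokyo"), Map.of("sessionId", "123"));
        // prints "123: 10.0"
    }
}
----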

Comment From: markpollack

merged in 9c10a08bef839ef045cc23881db6810b62a29ae9