Change the MessageType to FUNCTION in FunctionMessage
Comment From: tzolov
@yarisvt good catch.
But the fact is that this message type has never actually been used. Spring AI handles the tool/function interactions with the model internally, without exposing them at the chat client level.
The only reason we might want to keep this message type is if it is needed in the future for message history (e.g. conversation memory).
@markpollack what do you think?
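For context, the change proposed in the title boils down to making FunctionMessage report MessageType.FUNCTION. A minimal illustrative sketch, assuming a simplified AbstractMessage(MessageType, String) constructor rather than the exact class in the repository:

public class FunctionMessage extends AbstractMessage {

    public FunctionMessage(String content) {
        // Report FUNCTION so consumers (e.g. a future conversation memory)
        // can tell tool/function responses apart from system or user messages.
        super(MessageType.FUNCTION, content);
    }
}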
Comment From: yarisvt
Hi @tzolov, I was playing around with OpenAI and created a custom function that retrieves some data from a JSON file, used as a database to match against user input. I found that it worked correctly when I used MessageType.FUNCTION, but not when using MessageType.SYSTEM.
E.g.
database.json:
[
  {
    "idCar": "<private-id>",
    "carModel": "Kia Sorento 1.6 T-GDi AT6 Hybrid ComfortLine 5zits 5d 169kW"
  },
  {
    "idCar": "<private-id>",
    "carModel": "Kia Sorento 1.6 T-GDi AT6 Hybrid DynamicLine 7zits 5d 169kW"
  },
  {
    "idCar": "<private-id>",
    "carModel": "Kia Sorento 1.6 T-GDi AT6 Hybrid DynamicPlusLine 7z 5d 169kW"
  },
  {
    "idCar": "<private-id>",
    "carModel": "Kia Sorento 1.6 T-GDi AT6 PHEV 4WD Edition 5d 195kW"
  }
]
System prompt:
You are an employee of a car leasing company whose daily task is to make lease offers based on a quote from a car dealer.
Your job is to determine which car model is on the car dealer quote. Before you determine what car model is on the car dealer quote, you will
call the function 'carModelFunction' to get an overview of available car models. The overview is in format:
{format}.
User prompt:
What car model is present on this quote from a car dealer?
The quote is written in {language}.
QUOTE:
{quote}
{format}
with format being:
public record CarModel(String idCar, String carModel) {}
I have set it up like this (I left the Prompt formatting out):
// Call the model with both messages and enable the registered function so it
// can be invoked by name during the chat completion.
ChatResponse response = chatClient.call(new Prompt(List.of(systemMessage, userMessage),
        OpenAiChatOptions.builder()
                .withModel("gpt-4-turbo-preview")
                .withFunction("carModelFunction")
                .build()));
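The prompt formatting that was left out could look roughly like the sketch below. This is only a guess at the omitted code, assuming Spring AI's BeanOutputParser, SystemPromptTemplate and PromptTemplate are used, with systemPromptText, userPromptText and quoteText as placeholder variables for the texts shown above:

BeanOutputParser<CarModel> outputParser = new BeanOutputParser<>(CarModel.class);

// Fill the {format} placeholder with the JSON schema derived from the CarModel record.
Message systemMessage = new SystemPromptTemplate(systemPromptText)
        .createMessage(Map.of("format", outputParser.getFormat()));

// Fill {language}, {quote} and {format} in the user prompt.
Message userMessage = new PromptTemplate(userPromptText)
        .createMessage(Map.of(
                "language", "Dutch",
                "quote", quoteText,
                "format", outputParser.getFormat()));

// For the MessageType.FUNCTION variant described below, the same rendered text
// would be wrapped in a function message instead of a system message.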
The carModelFunction function just returns the content of database.json.
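How such a function bean could be registered is sketched below, following the function-calling approach from the Spring AI reference; this is an assumption about the setup (the bean name carModelFunction matches the name used in the options above, and the request record is hypothetical):

import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.function.Function;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Description;

@Configuration
class CarModelFunctionConfig {

    // The model sends its (empty or minimal) arguments as this record.
    public record CarModelRequest(String query) {}

    @Bean
    @Description("Returns the overview of available car models as JSON")
    public Function<CarModelRequest, String> carModelFunction() {
        // Return the raw content of database.json; this text is sent back to
        // the model as the function (tool) response.
        return request -> {
            try {
                return Files.readString(Path.of("database.json"));
            }
            catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        };
    }
}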
I use the following quote:
Dit is een Kia Sorento PHEV Edition 1.6 T-GDi Plug-in Hybrid AT6 AWD (A). Een hele mooie auto
When MessageType.SYSTEM is used, I get the following response:
{
  "carModel": "Sorento",
  "idCar": ""
}
Whereas when MessageType.FUNCTION is used, I get the following response:
{
  "carModel": "Kia Sorento 1.6 T-GDi AT6 PHEV 4WD Edition 5d 195kW",
  "idCar": "<private-id>"
}
So, when MessageType.FUNCTION is used, I get the actual data that is present in database.json, while I get a ChatGPT-generated answer when MessageType.SYSTEM is used.
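For completeness, a JSON answer like the one above can be mapped back onto the record with the same parser; a tiny sketch, assuming the response and outputParser variables from the earlier sketches:

// Parse the model's JSON answer into the CarModel record.
CarModel model = outputParser.parse(response.getResult().getOutput().getContent());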
Comment From: tzolov
@yarisvt , thank you for the feedback.
Are you using the OpenAI function calling as documented in our reference (https://docs.spring.io/spring-ai/reference/0.8-SNAPSHOT/api/clients/functions/openai-chat-functions.html), or some custom implementation?
Comment From: yarisvt
@tzolov I indeed use the OpenAI function calling as documented in the reference.
Comment From: tzolov
@yarisvt, strange, I'm not sure why our integration tests (ITs) didn't catch this.
I will merge your PR, but could you please provide a sample to reproduce it so I can update our ITs?
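A reproduction along these lines might work for the IT; this is only a sketch reusing the setup from the snippets above (chatClient, systemMessage, userMessage and the carModelFunction name are taken from there, assertThat is AssertJ), not the actual test in the repository:

@Test
void functionResponseIsUsedInTheAnswer() {
    ChatResponse response = chatClient.call(new Prompt(List.of(systemMessage, userMessage),
            OpenAiChatOptions.builder()
                    .withModel("gpt-4-turbo-preview")
                    .withFunction("carModelFunction")
                    .build()));

    String answer = response.getResult().getOutput().getContent();

    // The full model name only exists in database.json (the broken path
    // returned just "Sorento"), so seeing it proves the function result was
    // actually fed back to the model.
    assertThat(answer).contains("Kia Sorento 1.6 T-GDi AT6 PHEV 4WD Edition 5d 195kW");
}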
Comment From: yarisvt
@tzolov Example: https://github.com/yarisvt/spring-ai-function-example
Comment From: yarisvt
I have tested some more with the above example, and I actually get the same result whether MessageType.SYSTEM or MessageType.FUNCTION is used. However, when I use version 0.8.0 I get the incorrect result. So it looks like something else in the 0.8.1-SNAPSHOT version actually fixed the problem.