Expected Behavior
For some customers there is no need to pass the function result back to the LLM again.
E.g.: extract some information from a blob of text and save it in a structured form. That can be accomplished with a function call that takes the "data structure" as input, but the output is not needed, so the "second call" to the LLM could be skipped entirely.
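A minimal sketch of what this could look like, assuming a hypothetical `ExtractedContact` record, `ContactRepository` sink, and `saveContact` bean name (all illustrative, not part of the current API):

```java
import java.util.function.Consumer;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Description;

@Configuration
class ExtractionFunctions {

    // Hypothetical structured payload that the model fills in from the text blob.
    record ExtractedContact(String name, String email, String phone) {}

    // Hypothetical sink for the extracted data.
    interface ContactRepository {
        void save(ExtractedContact contact);
    }

    // The side effect (persisting the data) is the whole point; there is nothing
    // useful to send back to the model, so a Consumer would be enough.
    @Bean
    @Description("Save contact details extracted from the provided text")
    Consumer<ExtractedContact> saveContact(ContactRepository repository) {
        return repository::save;
    }
}
```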
Current Behavior
Every function must return a value, which is passed back to the LLM to produce a nice final output.
Context
As a workaround we implement some functions with "spying" capability (as in tests) and use them to grab the "first call" data, completely ignoring the result of the second call (see the sketch below).
We are trying to add this feature to spring-ai; I think it can be useful to others, or at least everyone can decide for themselves.
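For reference, the "spying" workaround looks roughly like this (names are illustrative): the function satisfies the required return value with a dummy string, while the caller only reads the captured request and discards the model's follow-up answer.

```java
import java.util.concurrent.atomic.AtomicReference;
import java.util.function.Function;

// Hypothetical structured payload.
record SaveRequest(String name, String email) {}

// A "spy" function: it satisfies the requirement of returning a value to the model,
// but the caller only cares about the input captured on the first call.
class SpyingSaveFunction implements Function<SaveRequest, String> {

    private final AtomicReference<SaveRequest> captured = new AtomicReference<>();

    @Override
    public String apply(SaveRequest request) {
        captured.set(request);  // grab the "first call" data
        return "OK";            // throwaway value; the model's follow-up answer is ignored
    }

    SaveRequest capturedRequest() {
        return captured.get();
    }
}
```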
Comment From: tzolov
@Grogdunn, are you referring to providing support for Function<SomeRequest, Void>? E.g. a function with a Void response?
If so, shouldn't the implementation detect the Void response type and handle it accordingly, rather than masking it via a fake string response and additional property flags?
Comment From: Grogdunn
Ok! Good point
Comment From: Grogdunn
@tzolov OK, done: if you use a Function with a Void result or a Consumer, the second round trip is skipped.
EDIT: if more functions are called (when the model supports it), the second round trip is still performed even if only one of the called functions returns a value.
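If I read that behavior correctly, a rough illustration (hypothetical names) of when the round trip is skipped versus kept:

```java
import java.util.function.Consumer;
import java.util.function.Function;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Description;

@Configuration
class MixedFunctionsConfig {

    record ArchiveRequest(String orderId) {}
    record StockQuery(String sku) {}

    // Void-like: when this is the only function the model calls,
    // the follow-up request to the model can be skipped.
    @Bean
    @Description("Archive the order with the given id")
    Consumer<ArchiveRequest> archiveOrder() {
        return request -> { /* persist the archive flag somewhere */ };
    }

    // Value-returning: its result has to be reported back to the model,
    // so if the model calls it (alone or together with archiveOrder),
    // the second round trip still happens.
    @Bean
    @Description("Return the current stock level for a SKU")
    Function<StockQuery, Integer> checkStock() {
        return query -> 42;
    }
}
```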
Comment From: markpollack
The PR #656 has some discussion, but perhaps this can be solved via clever prompt engineering. A prompt such as "Respond only with required function calls and a single word 'yes' or 'no'. Do not provide explanations or additional text." may work, since you want to suppress the final response.
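A sketch of that approach, assuming the fluent ChatClient API and a function bean registered under the hypothetical name "saveContact" (the exact method for enabling a function by name varies between Spring AI versions):

```java
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.chat.model.ChatModel;

class SuppressedReplyExample {

    String extractAndSave(ChatModel chatModel, String documentText) {
        return ChatClient.builder(chatModel)
                .build()
                .prompt()
                .system("""
                        Respond only with required function calls and a single word
                        "yes" or "no". Do not provide explanations or additional text.
                        """)
                .user(documentText)
                .functions("saveContact") // hypothetical bean name
                .call()
                .content();
        // The returned text should just be "yes" or "no" and can be ignored.
    }
}
```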
The contract with AI models around function calling is driven by the model itself, so one does need to reply to finish the conversation.
The lower-level API that lets you control the conversation is demonstrated here. Perhaps after the first call you can simply not reply. I don't know what state this will leave things in, but you can experiment with these two approaches.
Comment From: Grogdunn
For me this issue can be closed (as well as the pull request).
There are 2 workarounds to handle this:
1. using the low-level API
2. using org.springframework.ai.converter.BeanOutputConverter with an appropriate prompt
Option 1 is less prone to hallucination but "disrupts" the standard AI workflow/contract. Option 2 can have some hallucination, but it is clear and respects the contract with the AI (a rough sketch of option 2 follows below).
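For completeness, a rough sketch of option 2, assuming the fluent ChatClient API; the record and prompt text are illustrative:

```java
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.chat.model.ChatModel;
import org.springframework.ai.converter.BeanOutputConverter;

class StructuredExtractionExample {

    // Hypothetical target structure.
    record ExtractedContact(String name, String email, String phone) {}

    ExtractedContact extract(ChatModel chatModel, String documentText) {
        var converter = new BeanOutputConverter<>(ExtractedContact.class);

        // Ask for the structured payload directly instead of going through a
        // function call; the converter's format instructions constrain the reply.
        String reply = ChatClient.builder(chatModel)
                .build()
                .prompt()
                .user(u -> u.text("""
                        Extract the contact details from the following text.
                        {format}
                        Text: {text}
                        """)
                        .param("format", converter.getFormat())
                        .param("text", documentText))
                .call()
                .content();

        return converter.convert(reply);
    }
}
```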
So feel free to close this issue.