Fixes #814
I'm aware of PR #829, but I would like to propose a more complete solution here.
This PR:
- Adds support for the `stream_options` chat completion request property (the request shape is sketched after this list).
- Prevents sending `stream_options` when the `call` method is used instead of the `stream` method (otherwise the OpenAI API returns an error).
- Adds an integration test comparing the token usage obtained with the `call` and `stream` methods.
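
For reference, the wire-level change is small: the streaming request gains a `stream_options` object. Below is a minimal, self-contained sketch using plain `java.net.http` rather than the Spring AI client, just to illustrate the request shape (the model name and prompt are placeholders):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class StreamUsageDemo {

    public static void main(String[] args) throws Exception {
        // stream_options is only valid together with "stream": true; sending it
        // on a blocking (non-streaming) request makes the OpenAI API return an error.
        String body = """
                {
                  "model": "gpt-4o-mini",
                  "messages": [{"role": "user", "content": "Hello"}],
                  "stream": true,
                  "stream_options": {"include_usage": true}
                }""";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.openai.com/v1/chat/completions"))
                .header("Authorization", "Bearer " + System.getenv("OPENAI_API_KEY"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        // With include_usage enabled, an extra chunk arrives before "data: [DONE]"
        // whose "usage" field holds the token counts for the whole request;
        // every other chunk carries "usage": null.
        HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofLines())
                .body()
                .forEach(System.out::println);
    }
}
```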
Comment From: tzolov
Thanks @didalgolab,
I wonder whether the `include_usage` option shouldn't be enabled by default, in which case the question of opting out becomes relevant.
We can introduce a shortcut `withStreamUsage(boolean)` that will set `StreamOptions.INCLUDE_USAGE`.
I'll try to do this while merging your PR.
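
For illustration, the shortcut could look something like this on the `OpenAiChatOptions` builder (just a sketch, not the final implementation; the `options.streamOptions` field name here is assumed):

```java
// Sketch of the proposed shortcut: map a boolean onto the streamOptions field.
public Builder withStreamUsage(boolean enableStreamUsage) {
    this.options.streamOptions = enableStreamUsage
            ? ChatCompletionRequest.StreamOptions.INCLUDE_USAGE
            : null;
    return this;
}
```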
Comment From: didalgolab
I wondered about that too. I'd love to make it enabled by default! However, in that case there is an issue with using `OpenAiApi` against different providers (e.g., Groq, OpenRouter...). As far as I know, none of them support `stream_options`, although some stream token usage out of the box without any opt-in. Enabling the `include_usage` option by default would make the default chat options incompatible with those providers' endpoints, requiring an opt-out in the code each time. If you are okay with that, I am too.
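
For example, assuming the `withStreamUsage(boolean)` shortcut you proposed above, every configuration that points the OpenAI client at such a provider would need something like:

```java
// Hypothetical per-provider opt-out if include_usage were on by default
// (the model name is just an example for a non-OpenAI endpoint).
OpenAiChatOptions options = OpenAiChatOptions.builder()
        .withModel("llama3-70b-8192")
        .withStreamUsage(false) // opt out of the hypothetical default
        .build();
```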
Or perhaps `OpenAiChatModel` could detect whether the default baseUrl is used and enable or disable `stream_options` in the default options accordingly, although I must admit I generally dislike conditional logic like that.
Feel free to share any thoughts or suggestions about this.
Adding a shortcut `withStreamUsage(boolean)` is an excellent idea.
Comment From: tzolov
@didalgolab, I've added the Builder shortcut along with field-less boolean getters/setters to provide a `stream-usage` property.
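
"Field-less" here means the boolean is derived from the existing `streamOptions` field rather than stored separately; roughly like this (a sketch, not the exact merged code, assuming Jackson's `@JsonIgnore` keeps the synthetic property out of the serialized request):

```java
// No dedicated boolean field: the flag is derived from / mapped onto streamOptions.
@JsonIgnore
public Boolean getStreamUsage() {
    return this.streamOptions != null;
}

@JsonIgnore
public void setStreamUsage(Boolean enableStreamUsage) {
    this.streamOptions = (enableStreamUsage != null && enableStreamUsage)
            ? ChatCompletionRequest.StreamOptions.INCLUDE_USAGE
            : null;
}
```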
Updated the docs and tests, and made some small amendments.
Extended, rebased, squashed, and merged at 20e4b5609a35c204b0e93ac6fcd38ae821a86d1d.