- Fix the Azure OpenAI chat model's function calling to report accumulated token usage
- Apply the fix to both the call() and stream() operations
- For the streaming operation, buffer the usage from the last response when the "include usage" stream option is enabled
- Add tests
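
The accumulation described above can be sketched as follows. Note this is a simplified stand-in, not the actual Spring AI or Azure SDK code: the `Usage` record and the helper methods are hypothetical, chosen only to illustrate summing usage across function-calling round trips and buffering the final streamed chunk's usage.

```java
import java.util.Arrays;
import java.util.List;

public class UsageAccumulation {

    // Hypothetical stand-in for a token-usage record.
    record Usage(int promptTokens, int completionTokens) {
        int totalTokens() { return promptTokens + completionTokens; }
        Usage add(Usage other) {
            return new Usage(promptTokens + other.promptTokens,
                    completionTokens + other.completionTokens);
        }
    }

    // call(): function calling issues multiple requests (tool-call
    // round trips plus the final answer); sum the usage of each.
    static Usage accumulate(List<Usage> perRequestUsage) {
        return perRequestUsage.stream().reduce(new Usage(0, 0), Usage::add);
    }

    // stream(): with "include usage" enabled, only the final chunk of a
    // streamed response carries usage, so buffer the last non-null value.
    static Usage lastChunkUsage(List<Usage> chunkUsages) {
        Usage buffered = null;
        for (Usage u : chunkUsages) {
            if (u != null) {
                buffered = u; // overwrite; the final chunk wins
            }
        }
        return buffered;
    }

    public static void main(String[] args) {
        // Two round trips: the tool-call response, then the final answer.
        Usage total = accumulate(List.of(new Usage(100, 20), new Usage(150, 30)));
        System.out.println(total.totalTokens()); // 300

        // Intermediate chunks carry no usage; only the last one does.
        Usage streamed = lastChunkUsage(
                Arrays.asList(null, null, new Usage(80, 15)));
        System.out.println(streamed.totalTokens()); // 95
    }
}
```

The key design point is that the two paths differ: the blocking path sums usage across every request it makes, while the streaming path only ever sees usage on the terminal chunk and must hold onto it until the stream completes.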
Comment From: tzolov
Rebased and merged at 7bbd3ef5931664236abd6da78ed452b4ba021d0c.