Python: fix(openai): guard against null delta in streaming chunks from non-co…#5734

Open
Serjbory wants to merge 1 commit into microsoft:main from Serjbory:fix/5732

Conversation

@Serjbory
Contributor

Motivation and Context

Fixes #5732.

When using OpenAIChatCompletionClient with streaming, certain OpenAI-compatible providers (e.g. Azure OpenAI with specific configurations) send "delta": null on finish chunks instead of the spec-compliant "delta": {}. This causes an AttributeError: 'NoneType' object has no attribute 'content' crash in _parse_text_from_openai, terminating the stream mid-response.
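The failure mode can be reproduced in isolation with simplified stand-ins for the SDK's streaming types (the classes and parser below are hypothetical simplifications for illustration, not the actual SDK or client code):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ChoiceDelta:
    # Spec-compliant finish chunks carry an empty delta object like this.
    content: Optional[str] = None


@dataclass
class Choice:
    delta: Optional[ChoiceDelta]
    finish_reason: Optional[str] = None


def parse_text(choice):
    # Mirrors the unguarded access pattern: assumes delta is never None.
    return choice.delta.content


# Spec-compliant finish chunk: empty delta object, no crash.
parse_text(Choice(delta=ChoiceDelta(), finish_reason="stop"))

# Non-compliant finish chunk: delta is null, so .content dereferences None.
try:
    parse_text(Choice(delta=None, finish_reason="stop"))
except AttributeError as exc:
    print(exc)  # 'NoneType' object has no attribute 'content'
```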

Description

Added a runtime guard in _parse_response_update_from_openai that checks for a None delta before attempting to parse text, tool calls, or reasoning content from the streaming chunk.

Key design decisions:

  • The guard is placed at the call site (inside the for choice in chunk.choices loop) rather than in individual parsing methods, so all downstream content-parsing is protected by a single check.
  • finish_reason is extracted before the guard, ensuring terminal stream state is always captured even from non-compliant chunks.
  • Uses getattr(choice, "delta", None) instead of choice.delta to avoid a Pyright reportUnnecessaryComparison error, since the OpenAI SDK types delta as non-optional despite it being None at runtime from non-compliant providers.
  • reasoning_details access was updated to use the local delta variable instead of re-accessing choice.delta.
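Taken together, the decisions above imply a guard shaped roughly like this. This is a minimal sketch of the control flow, not the patched method itself: the function name, chunk shape, and text-only parsing are simplified assumptions.

```python
from types import SimpleNamespace


def parse_update(chunk):
    """Sketch of the guarded streaming-chunk parsing loop."""
    texts = []
    finish_reason = None
    for choice in chunk.choices:
        # finish_reason is read before the guard, so terminal stream
        # state is captured even from non-compliant chunks.
        if choice.finish_reason is not None:
            finish_reason = choice.finish_reason
        # getattr instead of choice.delta: the SDK types delta as
        # non-optional, so a plain attribute access followed by an
        # `is None` check trips Pyright's reportUnnecessaryComparison.
        delta = getattr(choice, "delta", None)
        if delta is None:
            continue  # one check protects all downstream parsing
        if getattr(delta, "content", None):
            texts.append(delta.content)
    return "".join(texts), finish_reason


# Non-compliant finish chunk: no crash, finish_reason still preserved.
chunk = SimpleNamespace(choices=[SimpleNamespace(delta=None, finish_reason="stop")])
print(parse_update(chunk))  # ('', 'stop')
```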

Added 5 regression tests covering:

  • Core crash scenario (delta=None with finish_reason="stop")
  • finish_reason preservation across different stop reasons ("length")
  • Spec-compliant empty delta (ChoiceDelta()) is not skipped
  • Usage data coexisting with a null delta
  • Tool-call parsing skipped when delta is None
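The first and third scenarios in that list can be expressed as a self-contained sketch. The real tests exercise OpenAIChatCompletionClient directly; here a stand-in parser keeps the example runnable:

```python
from types import SimpleNamespace


def parse_update(chunk):
    # Stand-in for the patched parser: capture finish_reason, then guard.
    finish_reason, texts = None, []
    for choice in chunk.choices:
        if choice.finish_reason is not None:
            finish_reason = choice.finish_reason
        delta = getattr(choice, "delta", None)
        if delta is None:
            continue
        if getattr(delta, "content", None):
            texts.append(delta.content)
    return "".join(texts), finish_reason


def test_null_delta_finish_chunk_does_not_crash():
    # Core crash scenario: delta=None with finish_reason="stop".
    chunk = SimpleNamespace(
        choices=[SimpleNamespace(delta=None, finish_reason="stop")]
    )
    text, finish_reason = parse_update(chunk)
    assert text == ""
    assert finish_reason == "stop"


def test_empty_delta_is_not_skipped():
    # Spec-compliant empty delta still flows through the parser.
    chunk = SimpleNamespace(
        choices=[SimpleNamespace(delta=SimpleNamespace(content="hi"), finish_reason=None)]
    )
    assert parse_update(chunk) == ("hi", None)


test_null_delta_finish_chunk_does_not_crash()
test_empty_delta_is_not_skipped()
```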

Contribution Checklist

  • The code builds clean without any errors or warnings
  • The PR follows the Contribution Guidelines
  • All unit tests pass, and I have added new tests where possible
  • Is this a breaking change? If yes, add "[BREAKING]" prefix to the title of the PR.

Copilot AI review requested due to automatic review settings May 10, 2026 14:57
@github-actions github-actions Bot changed the title fix(openai): guard against null delta in streaming chunks from non-co… Python: fix(openai): guard against null delta in streaming chunks from non-co… May 10, 2026
Contributor

Copilot AI left a comment

Pull request overview

Fixes a streaming crash in the Python OpenAIChatCompletionClient when OpenAI-compatible providers emit non-spec finish chunks with "delta": null, by adding a single guard in the streaming chunk parser and covering the scenario with regression tests.

Changes:

  • Add a delta is None guard in _parse_response_update_from_openai to prevent downstream parsing from dereferencing None while still capturing finish_reason.
  • Update reasoning-details parsing to use the guarded local delta reference.
  • Add regression tests for null-delta finish chunks (including finish reason preservation, usage coexistence, and tool-call/text parsing behavior).

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated no comments.

Files reviewed:

  • python/packages/openai/agent_framework_openai/_chat_completion_client.py: Adds a streaming-time delta=None guard and adjusts reasoning-details access to avoid AttributeError on non-compliant provider chunks.
  • python/packages/openai/tests/openai/test_openai_chat_completion_client.py: Adds regression tests reproducing and preventing the null-delta streaming crash, including finish-reason and usage behaviors.



Development

Successfully merging this pull request may close these issues.

Python: [Bug]: OpenAIChatCompletionClient exception

3 participants