Python: make FoundryChatClient an async context manager#5464
he-yufeng wants to merge 4 commits into `microsoft:main`
Conversation
eavanvalkenburg left a comment
So I need to think about this one: currently no chat client has `__aenter__`, so this would introduce it. I'm not opposed, but I need to think about this generally. I will close this for now (I removed the `async with` from the sample you referred to).
this is from a different PR, please remove here
Force-pushed from `d5a4cef` to `834b8db`.
@eavanvalkenburg you're right: that change was from #5462 leaking through because this branch was based on it. Rebased onto current `main`. On the broader question of whether chat clients should have `__aenter__` generally, this change only adds it where the client owns an `AIProjectClient` that needs closing.

@microsoft-github-policy-service agree

Friendly ping @eavanvalkenburg.
Force-pushed from `834b8db` to `bd00576`.
Pull request overview
This PR fixes FoundryChatClient usage in async with blocks by implementing the async context manager protocol on RawFoundryChatClient, ensuring the internally-created AIProjectClient is properly closed on exit (addressing #5428 and avoiding resource leaks).
Changes:
- Added `close()` with ownership tracking (`_should_close_client`) to only close internally-created `AIProjectClient` instances.
- Implemented `__aenter__`/`__aexit__` on `RawFoundryChatClient` to support `async with FoundryChatClient(...)`.
- Updated imports to use `Self` for accurate context manager typing across Python versions.
Comments suppressed due to low confidence (1)
python/packages/foundry/agent_framework_foundry/_chat_client.py:199
- The constructor uses `if not project_client:` even though `project_client` is typed as `AIProjectClient | None` and earlier logic uses explicit `is None` checks. Using truthiness here is less precise and could mis-handle any falsy sentinel object; prefer `if project_client is None:` for consistency and correctness with the annotated type.
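The distinction the reviewer raises can be shown with a falsy-but-not-None object. A small sketch (the sentinel class is hypothetical, purely for illustration):

```python
class FalsySentinel:
    """Hypothetical client-like object that is falsy but not None."""

    def __bool__(self) -> bool:
        return False


provided = FalsySentinel()

# Truthiness check: wrongly treats the provided object as missing.
missing_by_truthiness = not provided          # True
# Identity check: correctly recognizes that a value was provided.
missing_by_identity = provided is None        # False

assert missing_by_truthiness is True
assert missing_by_identity is False
```

For an argument annotated `AIProjectClient | None`, only `None` means "not provided", which is exactly what the identity check expresses.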
```python
self._should_close_client = False
if not project_client:
    if not project_endpoint:
        raise ValueError(
            "Azure AI project endpoint is required. Set via 'project_endpoint' parameter "
            "or 'FOUNDRY_PROJECT_ENDPOINT' environment variable, "
            "or pass in an AIProjectClient."
        )
    if not credential:
        raise ValueError("Azure credential is required when using project_endpoint without a project_client.")
    project_client_kwargs: dict[str, Any] = {
        "endpoint": project_endpoint,
        "credential": credential,  # type: ignore[arg-type]
        "user_agent": get_user_agent(),
    }
    if allow_preview is not None:
        project_client_kwargs["allow_preview"] = allow_preview
    project_client = AIProjectClient(**project_client_kwargs)
    self._should_close_client = True

openai_kwargs: dict[str, Any] = {}
if default_headers:
    openai_kwargs["default_headers"] = default_headers

super().__init__(
    model=resolved_model,
    async_client=project_client.get_openai_client(**openai_kwargs),
    default_headers=default_headers,
    instruction_role=instruction_role,
    compaction_strategy=compaction_strategy,
    tokenizer=tokenizer,
    additional_properties=additional_properties,
)
self.project_client = project_client
```

```python
async def close(self) -> None:
    """Close the project client if we created it."""
    if self._should_close_client:
        await self.project_client.close()

async def __aenter__(self) -> Self:
    """Enter the async context manager."""
    return self

async def __aexit__(
    self,
    exc_type: type[BaseException] | None,
    exc_val: BaseException | None,
    exc_tb: Any,
) -> None:
    """Close the project client on exit (only when owned)."""
    await self.close()
Force-pushed from `74b4e99` to `28e1121`.
Pushed.
Force-pushed from `2cd6366` to `be40170`.
Rebased this onto the current `main` and re-ran local checks on Windows.
Force-pushed from `be40170` to `fb59037`.
Rebased onto current `main` and re-validated on Windows.
Fixes microsoft#5428. `built_in_chat_clients` and similar samples wrap the client in `async with client:`, which works for `OpenAIChatClient` and the Azure variants but fails on `FoundryChatClient` with `TypeError: object does not support the asynchronous context manager protocol`. The chat client holds an `AIProjectClient` internally (self-created when the caller passes `project_endpoint` + `credential`), so it also needs a lifecycle hook; otherwise the project client leaks.

Mirror the pattern already used in `FoundryChatAgent`:
- Track `_should_close_client` in `__init__`, flipped on only when we construct the project client ourselves.
- Add `async def close()` that awaits `project_client.close()` only when owned.
- Implement `__aenter__` (no-op, returns self) and `__aexit__` (awaits `close()`).

`FoundryChatClient` inherits both via the existing class hierarchy, so the sample starts working without any call-site changes.
Force-pushed from `fb59037` to `49a8c3a`.
Rebased onto the latest `main` again and re-ran the Foundry package checks on head $head.
Fixes #5428.
The `built_in_chat_clients` sample (and any other code path that does `async with client:`) wraps the chat client in an async context manager. This works for `OpenAIChatClient` and the Azure variants but fails on `FoundryChatClient` with:
```
TypeError: 'FoundryChatClient' object does not support the asynchronous context manager protocol (missed __aexit__ method)
```
`FoundryChatClient` owns an `AIProjectClient` internally — self-created when the caller passes `project_endpoint` + `credential`, otherwise injected — so beyond the missing protocol methods, leaving without closing the self-created `project_client` also leaks it.
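The failure mode is easy to reproduce with any class that lacks the protocol methods. A minimal sketch (`NoProtocol` is a hypothetical stand-in for the pre-fix client):

```python
import asyncio


class NoProtocol:
    """A class with no __aenter__/__aexit__, like FoundryChatClient before the fix."""


async def main() -> None:
    try:
        async with NoProtocol():
            pass
    except TypeError:
        # Python 3.11+ rejects async with via TypeError, as in the traceback above.
        print("async with rejected: missing __aenter__/__aexit__")
    except AttributeError:
        # Older Python versions raise AttributeError for the missing methods instead.
        print("async with rejected: missing __aenter__/__aexit__")


asyncio.run(main())
```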
Fix
Mirror the pattern already used in `FoundryChatAgent` (`_agent.py`):
- Track `_should_close_client` in `__init__`, set only when we construct the project client ourselves.
- Add `async def close()` that awaits `project_client.close()` only when owned.
- Implement `__aenter__` (returns self) and `__aexit__` (awaits `close()`).

`FoundryChatClient` inherits these via the existing class hierarchy, so the sample starts working without any call-site changes and without touching the public API.
Test plan