Conversation
The operation wrapped by ExtendedOperation may define other fields or methods that the user may wish to use.
* chore(python): use black==22.3.0 Source-Link: googleapis/synthtool@6fab84a Post-Processor: gcr.io/cloud-devrel-public-resources/owlbot-python:latest@sha256:7cffbc10910c3ab1b852c05114a08d374c195a81cdec1d4a67a1d129331d0bfe * ci: use black 22.3.0 * 🦉 Updates from OwlBot post-processor See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com> Co-authored-by: Anthonios Partheniou <partheniou@google.com>
) Source-Link: googleapis/synthtool@7804ade Post-Processor: gcr.io/cloud-devrel-public-resources/owlbot-python:latest@sha256:eede5672562a32821444a8e803fb984a6f61f2237ea3de229d2de24453f4ae7d
Source-Link: googleapis/synthtool@06e8279 Post-Processor: gcr.io/cloud-devrel-public-resources/owlbot-python:latest@sha256:b3500c053313dc34e07b1632ba9e4e589f4f77036a7cf39e1fe8906811ae0fce Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
Source-Link: googleapis/synthtool@eb78c98 Post-Processor: gcr.io/cloud-devrel-public-resources/owlbot-python:latest@sha256:8a5d3f6a2e43ed8293f34e06a2f56931d1e88a2694c3bb11b15df4eb256ad163 Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
* chore(deps): update all dependencies * 🦉 Updates from OwlBot post-processor See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
Source-Link: googleapis/synthtool@f15cc72 Post-Processor: gcr.io/cloud-devrel-public-resources/owlbot-python:latest@sha256:bc5eed3804aec2f05fad42aacf973821d9500c174015341f721a984a0825b6fd
* test: use == instead of is when comparing equality * use not instead of ==
…374) Source-Link: googleapis/synthtool@6b4d5a6 Post-Processor: gcr.io/cloud-devrel-public-resources/owlbot-python:latest@sha256:f792ee1320e03eda2d13a5281a2989f7ed8a9e50b73ef6da97fac7e1e850b149 Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
Source-Link: googleapis/synthtool@453a5d9 Post-Processor: gcr.io/cloud-devrel-public-resources/owlbot-python:latest@sha256:81ed5ecdfc7cac5b699ba4537376f3563f6f04122c4ec9e735d3b3dc1d43dd32
feat: adds support for audience in client_options.
fix(deps): require googleapis-common-protos >= 1.56.2. Also adds upper limits for extras.
* chore: allow releases from older version branches * 🦉 Updates from OwlBot post-processor See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
`UNAVAILABLE` instead of `UNVAILABLE`
* chore: test minimum dependencies in python 3.7 * remove duplicate entry
This fix removes the `nox.options.sessions` that were left in the noxfile when those sessions were removed in #873.
Require Python 3.9+. Require protobuf 4.25.8+. Simplify code accordingly. Fixes #15019. This also required removing 3.7 and 3.8 as required checks in the repo config, under branch protection rules.
… into migration.python-api-core.migration.2026-02-13_19-20-54.migrate
Summary of Changes

Hello @parthea, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed! This pull request executes a significant structural migration of the google-api-core library.
Code Review
This pull request migrates the google-api-core library into a sub-package structure. The migration appears comprehensive, including source code, tests, CI configurations, and documentation. My review focuses on improving the robustness of time-based logic, enhancing error handling for REST responses, and ensuring adherence to internal rules regarding dictionary serialization. Specifically, I recommend using monotonic time for throttling, handling potential JSON parsing errors in async REST transports, and programmatically sorting log dictionaries before serialization.
```python
def _process_chunk(self, chunk: str):
    if self._level == 0:
        if chunk[0] != "[":
```
```python
with self._entry_lock:
    cutoff_time = datetime.datetime.now() - self._time_window

    # drop the entries that are too old, as they are no longer relevant
    while self._past_entries and self._past_entries[0] < cutoff_time:
        self._past_entries.popleft()

    if len(self._past_entries) < self._access_limit:
        self._past_entries.append(datetime.datetime.now())
        return 0.0  # no waiting was needed

    to_wait = (self._past_entries[0] - cutoff_time).total_seconds()
    time.sleep(to_wait)

    self._past_entries.append(datetime.datetime.now())
    return to_wait
```
Using datetime.datetime.now() for interval calculations can be problematic if the system clock is adjusted. It is recommended to use time.monotonic() for measuring elapsed time and calculating wait durations to ensure robustness against clock jumps.
Suggested change:

```python
def __enter__(self):
    with self._entry_lock:
        cutoff_time = time.monotonic() - self._time_window.total_seconds()
        # drop the entries that are too old, as they are no longer relevant
        while self._past_entries and self._past_entries[0] < cutoff_time:
            self._past_entries.popleft()
        if len(self._past_entries) < self._access_limit:
            self._past_entries.append(time.monotonic())
            return 0.0  # no waiting was needed
        to_wait = self._past_entries[0] - cutoff_time
        time.sleep(to_wait)
        self._past_entries.append(time.monotonic())
        return to_wait
```
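To illustrate why `time.monotonic()` is the safer clock here: unlike the wall clock, it can never jump backward, so elapsed-time arithmetic stays correct across NTP adjustments. A minimal standalone sliding-window throttle in the same spirit as the suggestion (the class and method names are illustrative, not the library's API) might look like:

```python
import time
from collections import deque


class SlidingWindowThrottle:
    """Allow at most `limit` entries per `window` seconds, using a monotonic clock."""

    def __init__(self, limit: int, window: float):
        self._limit = limit
        self._window = window
        self._past = deque()  # monotonic timestamps of recent entries

    def acquire(self) -> float:
        """Block until an entry is allowed; return the number of seconds waited."""
        cutoff = time.monotonic() - self._window
        # Drop timestamps that have aged out of the window.
        while self._past and self._past[0] < cutoff:
            self._past.popleft()
        if len(self._past) < self._limit:
            self._past.append(time.monotonic())
            return 0.0  # no waiting was needed
        # The oldest entry leaves the window after (self._past[0] - cutoff) seconds.
        to_wait = self._past[0] - cutoff
        time.sleep(to_wait)
        self._past.append(time.monotonic())
        return to_wait
```

With `limit=2, window=0.2`, the first two `acquire()` calls return `0.0` immediately, and a third call in quick succession sleeps until the oldest timestamp falls out of the window.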
```python
value = getattr(record, field_name, None)
if value is not None:
    log_obj[field_name] = value
return json.dumps(log_obj)
```
Per the general rules, dictionaries should be programmatically sorted before being serialized to ensure consistent key ordering in the output.
Suggested change:

```python
return json.dumps(dict(sorted(log_obj.items())))
```
References

- To ensure dictionary keys remain sorted without manual effort, programmatically sort the dictionary before returning it (e.g., using `dict(sorted(metadata.items()))`) instead of relying on manual ordering in the code.
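As an aside, `json.dumps` can also sort keys itself at serialization time. A small sketch comparing the two approaches (the `log_obj` contents here are made up for illustration):

```python
import json

log_obj = {"severity": "INFO", "httpRequest": {"status": 200}, "message": "done"}

# Sort only the top-level keys, as the suggestion does:
top_sorted = json.dumps(dict(sorted(log_obj.items())))

# Alternatively, json.dumps can sort keys at every nesting level:
fully_sorted = json.dumps(log_obj, sort_keys=True)
```

Both produce deterministic output for flat dictionaries; `sort_keys=True` additionally sorts keys inside nested objects like `httpRequest`.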
```python
# Return the response
api_response = operations_pb2.ListOperationsResponse()
json_format.Parse(response.content, api_response, ignore_unknown_fields=False)
```
Setting ignore_unknown_fields=False can cause the client to break if the server adds new fields to the response. It is generally safer to set this to True for forward compatibility, which is consistent with other parts of this library.
Suggested change:

```python
json_format.Parse(response.content, api_response, ignore_unknown_fields=True)
```
```python
payload = json.loads(content.decode("utf-8"))
request_url = "{host}{uri}".format(host=self._host, uri=uri)
raise core_exceptions.format_http_response_error(response, method, request_url, payload)  # type: ignore
```
The response content might not be valid JSON (e.g., if an intermediate proxy returns an HTML error page). Parsing it directly with json.loads can raise a JSONDecodeError. It is safer to handle this exception and provide a fallback payload.
Suggested change:

```python
decoded_content = content.decode("utf-8")
try:
    payload = json.loads(decoded_content)
except json.JSONDecodeError:
    payload = {"error": {"message": decoded_content or "unknown error"}}
request_url = "{host}{uri}".format(host=self._host, uri=uri)
raise core_exceptions.format_http_response_error(response, method, request_url, payload)  # type: ignore
```
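The fallback can be exercised with the standard library alone. This sketch mirrors the suggested logic as a standalone helper (the function name and the exact error-payload shape are illustrative assumptions, not a documented contract):

```python
import json


def parse_error_payload(content: bytes) -> dict:
    """Decode an HTTP error body, falling back to a synthetic payload for non-JSON content."""
    decoded = content.decode("utf-8")
    try:
        return json.loads(decoded)
    except json.JSONDecodeError:
        # E.g. a proxy returned an HTML error page instead of a JSON error.
        return {"error": {"message": decoded or "unknown error"}}
```

A valid JSON body parses as before, while an HTML proxy page or an empty body still yields a usable payload instead of an unhandled `JSONDecodeError`.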
|
This PR is ready for review but not for merging.
See #10892.
This PR should be merged with a merge-commit, not a squash-commit, in order to preserve the git history.