From 875ef7911b2b8c3cc8c3931d779fe521a33b890e Mon Sep 17 00:00:00 2001
From: WhiteSource Renovate
Date: Thu, 24 Jun 2021 17:26:27 +0200
Subject: [PATCH 1/7] chore(deps): update dependency
 google-cloud-bigquery-datatransfer to v3.2.0 (#168)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

[![WhiteSource Renovate](https://app.renovatebot.com/images/banner.svg)](https://renovatebot.com)

This PR contains the following updates:

| Package | Change | Age | Adoption | Passing | Confidence |
|---|---|---|---|---|---|
| [google-cloud-bigquery-datatransfer](https://togithub.com/googleapis/python-bigquery-datatransfer) | `==3.1.1` -> `==3.2.0` | [![age](https://badges.renovateapi.com/packages/pypi/google-cloud-bigquery-datatransfer/3.2.0/age-slim)](https://docs.renovatebot.com/merge-confidence/) | [![adoption](https://badges.renovateapi.com/packages/pypi/google-cloud-bigquery-datatransfer/3.2.0/adoption-slim)](https://docs.renovatebot.com/merge-confidence/) | [![passing](https://badges.renovateapi.com/packages/pypi/google-cloud-bigquery-datatransfer/3.2.0/compatibility-slim/3.1.1)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://badges.renovateapi.com/packages/pypi/google-cloud-bigquery-datatransfer/3.2.0/confidence-slim/3.1.1)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes
googleapis/python-bigquery-datatransfer

### [`v3.2.0`](https://togithub.com/googleapis/python-bigquery-datatransfer/blob/master/CHANGELOG.md#​320-httpswwwgithubcomgoogleapispython-bigquery-datatransfercomparev311v320-2021-06-22)

[Compare Source](https://togithub.com/googleapis/python-bigquery-datatransfer/compare/v3.1.1...v3.2.0)

##### Features

- support self-signed JWT flow for service accounts ([046c368](https://www.github.com/googleapis/python-bigquery-datatransfer/commit/046c368cf5a75a210b0ecc7e6e1eee9bcd907669))

##### Bug Fixes

- add async client to %name\_%version/init.py ([046c368](https://www.github.com/googleapis/python-bigquery-datatransfer/commit/046c368cf5a75a210b0ecc7e6e1eee9bcd907669))

##### Documentation

- omit mention of Python 2.7 in 'CONTRIBUTING.rst' ([#​1127](https://www.github.com/googleapis/python-bigquery-datatransfer/issues/1127)) ([#​164](https://www.github.com/googleapis/python-bigquery-datatransfer/issues/164)) ([2741e4f](https://www.github.com/googleapis/python-bigquery-datatransfer/commit/2741e4fb1d9074494872fafcec96d870b14b671d)), closes [#​1126](https://www.github.com/googleapis/python-bigquery-datatransfer/issues/1126)

##### [3.1.1](https://www.github.com/googleapis/python-bigquery-datatransfer/compare/v3.1.0...v3.1.1) (2021-04-07)

##### Bug Fixes

- require proto-plus>=1.15.0 ([91910f1](https://www.github.com/googleapis/python-bigquery-datatransfer/commit/91910f1ea01c5324fa63a7d85a034d08aeaae3f9))
- use correct retry deadline ([#​121](https://www.github.com/googleapis/python-bigquery-datatransfer/issues/121)) ([91910f1](https://www.github.com/googleapis/python-bigquery-datatransfer/commit/91910f1ea01c5324fa63a7d85a034d08aeaae3f9))
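The change in this first patch is a one-line pin bump in the samples requirements. As an editorial aside (not part of the PR), a quick way to confirm which version of the distribution is actually installed in a given environment is the standard library's `importlib.metadata` (Python 3.8+); the distribution name below is the real PyPI name from the table above:

```python
from importlib import metadata
from typing import Optional


def installed_version(dist_name: str) -> Optional[str]:
    """Return the installed version of a distribution, or None if absent."""
    try:
        return metadata.version(dist_name)
    except metadata.PackageNotFoundError:
        return None


# In a samples environment installed from the bumped requirements file,
# this would be expected to report "3.2.0".
print(installed_version("google-cloud-bigquery-datatransfer"))
```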
---

### Configuration

📅 **Schedule**: At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update again.

---

 - [ ] If you want to rebase/retry this PR, check this box.

---

This PR has been generated by [WhiteSource Renovate](https://renovate.whitesourcesoftware.com). View repository job log [here](https://app.renovatebot.com/dashboard#github/googleapis/python-bigquery-datatransfer).
---
 samples/snippets/requirements.txt | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/samples/snippets/requirements.txt b/samples/snippets/requirements.txt
index 27259498..c3cb098d 100644
--- a/samples/snippets/requirements.txt
+++ b/samples/snippets/requirements.txt
@@ -1 +1 @@
-google-cloud-bigquery-datatransfer==3.1.1
+google-cloud-bigquery-datatransfer==3.2.0

From b3b951c085b360693dedb7eed34adc94883b4063 Mon Sep 17 00:00:00 2001
From: "gcf-owl-bot[bot]" <78513119+gcf-owl-bot[bot]@users.noreply.github.com>
Date: Sat, 26 Jun 2021 11:26:21 +0000
Subject: [PATCH 2/7] chore(python): simplify nox steps in CONTRIBUTING.rst
 (#169)

Source-Link: https://github.com/googleapis/synthtool/commit/26558bae8976a985d73c2d98c31d8612273f907d
Post-Processor: gcr.io/repo-automation-bots/owlbot-python:latest@sha256:99d90d097e4a4710cc8658ee0b5b963f4426d0e424819787c3ac1405c9a26719
---
 .github/.OwlBot.lock.yaml | 2 +-
 CONTRIBUTING.rst          | 14 ++++++--------
 2 files changed, 7 insertions(+), 9 deletions(-)

diff --git a/.github/.OwlBot.lock.yaml b/.github/.OwlBot.lock.yaml
index 0954585f..e2b39f94 100644
--- a/.github/.OwlBot.lock.yaml
+++ b/.github/.OwlBot.lock.yaml
@@ -1,3 +1,3 @@
 docker:
   image: gcr.io/repo-automation-bots/owlbot-python:latest
-  digest: sha256:df50e8d462f86d6bcb42f27ecad55bb12c404f1c65de9c6fe4c4d25120080bd6
+  digest: 
sha256:99d90d097e4a4710cc8658ee0b5b963f4426d0e424819787c3ac1405c9a26719 diff --git a/CONTRIBUTING.rst b/CONTRIBUTING.rst index 5165217c..a09a0802 100644 --- a/CONTRIBUTING.rst +++ b/CONTRIBUTING.rst @@ -68,14 +68,12 @@ Using ``nox`` We use `nox `__ to instrument our tests. - To test your changes, run unit tests with ``nox``:: + $ nox -s unit - $ nox -s unit-3.8 - $ ... +- To run a single unit test:: -- Args to pytest can be passed through the nox command separated by a `--`. For - example, to run a single test:: + $ nox -s unit-3.9 -- -k - $ nox -s unit-3.8 -- -k .. note:: @@ -142,7 +140,7 @@ Running System Tests - To run system tests, you can execute:: # Run all system tests - $ nox -s system-3.8 + $ nox -s system # Run a single system test $ nox -s system-3.8 -- -k @@ -215,8 +213,8 @@ Supported versions can be found in our ``noxfile.py`` `config`_. .. _config: https://github.com/googleapis/python-bigquery-datatransfer/blob/master/noxfile.py -We also explicitly decided to support Python 3 beginning with version -3.6. Reasons for this include: +We also explicitly decided to support Python 3 beginning with version 3.6. 
+Reasons for this include: - Encouraging use of newest versions of Python 3 - Taking the lead of `prominent`_ open-source `projects`_ From cd4494f0dc7304469e7d4a0ed6e13d716b3cacbf Mon Sep 17 00:00:00 2001 From: "gcf-owl-bot[bot]" <78513119+gcf-owl-bot[bot]@users.noreply.github.com> Date: Wed, 30 Jun 2021 17:29:00 -0600 Subject: [PATCH 3/7] feat: add always_use_jwt_access (#171) * chore: use gapic-generator-python 0.50.3 Committer: @busunkim96 PiperOrigin-RevId: 382142900 Source-Link: https://github.com/googleapis/googleapis/commit/513440fda515f3c799c22a30e3906dcda325004e Source-Link: https://github.com/googleapis/googleapis-gen/commit/7b1e2c31233f79a704ec21ca410bf661d6bc68d0 --- .coveragerc | 1 - .../data_transfer_service/async_client.py | 6 + .../services/data_transfer_service/client.py | 6 + .../data_transfer_service/transports/base.py | 42 ++--- .../data_transfer_service/transports/grpc.py | 10 +- .../transports/grpc_asyncio.py | 10 +- owlbot.py | 15 +- setup.py | 2 +- testing/constraints-3.6.txt | 2 +- .../test_data_transfer_service.py | 144 ++++++------------ 10 files changed, 96 insertions(+), 142 deletions(-) diff --git a/.coveragerc b/.coveragerc index 38dd96f7..63daceaf 100644 --- a/.coveragerc +++ b/.coveragerc @@ -2,7 +2,6 @@ branch = True [report] -fail_under = 100 show_missing = True omit = google/cloud/bigquery_datatransfer/__init__.py diff --git a/google/cloud/bigquery_datatransfer_v1/services/data_transfer_service/async_client.py b/google/cloud/bigquery_datatransfer_v1/services/data_transfer_service/async_client.py index d0454623..7f502338 100644 --- a/google/cloud/bigquery_datatransfer_v1/services/data_transfer_service/async_client.py +++ b/google/cloud/bigquery_datatransfer_v1/services/data_transfer_service/async_client.py @@ -18,6 +18,7 @@ import re from typing import Dict, Sequence, Tuple, Type, Union import pkg_resources +import warnings import google.api_core.client_options as ClientOptions # type: ignore from google.api_core import exceptions 
as core_exceptions # type: ignore @@ -861,6 +862,11 @@ async def schedule_transfer_runs( for a time range. """ + warnings.warn( + "DataTransferServiceAsyncClient.schedule_transfer_runs is deprecated", + DeprecationWarning, + ) + # Create or coerce a protobuf request object. # Sanity check: If we got a request object, we should *not* have # gotten any keyword arguments that map to the request. diff --git a/google/cloud/bigquery_datatransfer_v1/services/data_transfer_service/client.py b/google/cloud/bigquery_datatransfer_v1/services/data_transfer_service/client.py index b0dab00f..099e611b 100644 --- a/google/cloud/bigquery_datatransfer_v1/services/data_transfer_service/client.py +++ b/google/cloud/bigquery_datatransfer_v1/services/data_transfer_service/client.py @@ -19,6 +19,7 @@ import re from typing import Callable, Dict, Optional, Sequence, Tuple, Type, Union import pkg_resources +import warnings from google.api_core import client_options as client_options_lib # type: ignore from google.api_core import exceptions as core_exceptions # type: ignore @@ -1020,6 +1021,11 @@ def schedule_transfer_runs( for a time range. """ + warnings.warn( + "DataTransferServiceClient.schedule_transfer_runs is deprecated", + DeprecationWarning, + ) + # Create or coerce a protobuf request object. # Sanity check: If we got a request object, we should *not* have # gotten any keyword arguments that map to the request. 
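The hunks above add a `warnings.warn(..., DeprecationWarning)` call at the top of `schedule_transfer_runs` in both the sync and async clients. A small self-contained sketch (the stub function below is hypothetical, not the real client) of how that warning behaves for callers — note that `DeprecationWarning` is filtered out by default except in code run directly under `__main__`, so tests typically opt in explicitly:

```python
import warnings


def schedule_transfer_runs_stub():
    """Hypothetical stand-in for the client method, emitting the same warning."""
    warnings.warn(
        "DataTransferServiceClient.schedule_transfer_runs is deprecated",
        DeprecationWarning,
    )
    return "runs scheduled"


# Record warnings so we can observe the DeprecationWarning, which the
# default filters would otherwise hide in library code.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    result = schedule_transfer_runs_stub()

assert result == "runs scheduled"
assert any(issubclass(w.category, DeprecationWarning) for w in caught)
```

This is also roughly the shape a downstream test suite would use to verify the deprecation is emitted without failing on the call itself.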
diff --git a/google/cloud/bigquery_datatransfer_v1/services/data_transfer_service/transports/base.py b/google/cloud/bigquery_datatransfer_v1/services/data_transfer_service/transports/base.py index 830861c5..d4a0ff10 100644 --- a/google/cloud/bigquery_datatransfer_v1/services/data_transfer_service/transports/base.py +++ b/google/cloud/bigquery_datatransfer_v1/services/data_transfer_service/transports/base.py @@ -24,6 +24,7 @@ from google.api_core import gapic_v1 # type: ignore from google.api_core import retry as retries # type: ignore from google.auth import credentials as ga_credentials # type: ignore +from google.oauth2 import service_account # type: ignore from google.cloud.bigquery_datatransfer_v1.types import datatransfer from google.cloud.bigquery_datatransfer_v1.types import transfer @@ -47,8 +48,6 @@ except pkg_resources.DistributionNotFound: # pragma: NO COVER _GOOGLE_AUTH_VERSION = None -_API_CORE_VERSION = google.api_core.__version__ - class DataTransferServiceTransport(abc.ABC): """Abstract transport class for DataTransferService.""" @@ -66,6 +65,7 @@ def __init__( scopes: Optional[Sequence[str]] = None, quota_project_id: Optional[str] = None, client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO, + always_use_jwt_access: Optional[bool] = False, **kwargs, ) -> None: """Instantiate the transport. @@ -89,6 +89,8 @@ def __init__( API requests. If ``None``, then default info will be used. Generally, you only need to set this if you're developing your own client library. + always_use_jwt_access (Optional[bool]): Whether self signed JWT should + be used for service account credentials. """ # Save the hostname. Default to port 443 (HTTPS) if none is specified. if ":" not in host: @@ -98,7 +100,7 @@ def __init__( scopes_kwargs = self._get_scopes_kwargs(self._host, scopes) # Save the scopes. - self._scopes = scopes or self.AUTH_SCOPES + self._scopes = scopes # If no credentials are provided, then determine the appropriate # defaults. 
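The `self._scopes = scopes or self.AUTH_SCOPES` → `self._scopes = scopes` change above reflects that scope defaulting moves out of the transport: later hunks pass `scopes` and `default_scopes=cls.AUTH_SCOPES` separately to `grpc_helpers.create_channel`, and google-api-core ≥ 1.26.0 resolves the precedence. A minimal sketch of that precedence rule as I understand it (illustration only, not the library's actual implementation):

```python
from typing import Optional, Sequence

# The library's default scope for this service (from AUTH_SCOPES in the source).
AUTH_SCOPES = ("https://www.googleapis.com/auth/cloud-platform",)


def effective_scopes(
    user_scopes: Optional[Sequence[str]],
    default_scopes: Sequence[str] = AUTH_SCOPES,
) -> Sequence[str]:
    # Explicit user-supplied scopes win; otherwise fall back to the
    # library defaults, mirroring scopes / default_scopes resolution.
    return user_scopes if user_scopes is not None else default_scopes


assert effective_scopes(None) == AUTH_SCOPES
assert effective_scopes(["https://www.googleapis.com/auth/bigquery"]) == [
    "https://www.googleapis.com/auth/bigquery"
]
```

Keeping `self._scopes` as exactly what the caller passed (possibly `None`) is what lets the channel helper distinguish "caller chose scopes" from "use the defaults."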
@@ -117,13 +119,20 @@ def __init__( **scopes_kwargs, quota_project_id=quota_project_id ) + # If the credentials is service account credentials, then always try to use self signed JWT. + if ( + always_use_jwt_access + and isinstance(credentials, service_account.Credentials) + and hasattr(service_account.Credentials, "with_always_use_jwt_access") + ): + credentials = credentials.with_always_use_jwt_access(True) + # Save the credentials. self._credentials = credentials - # TODO(busunkim): These two class methods are in the base transport + # TODO(busunkim): This method is in the base transport # to avoid duplicating code across the transport classes. These functions - # should be deleted once the minimum required versions of google-api-core - # and google-auth are increased. + # should be deleted once the minimum required versions of google-auth is increased. # TODO: Remove this function once google-auth >= 1.25.0 is required @classmethod @@ -144,27 +153,6 @@ def _get_scopes_kwargs( return scopes_kwargs - # TODO: Remove this function once google-api-core >= 1.26.0 is required - @classmethod - def _get_self_signed_jwt_kwargs( - cls, host: str, scopes: Optional[Sequence[str]] - ) -> Dict[str, Union[Optional[Sequence[str]], str]]: - """Returns kwargs to pass to grpc_helpers.create_channel depending on the google-api-core version""" - - self_signed_jwt_kwargs: Dict[str, Union[Optional[Sequence[str]], str]] = {} - - if _API_CORE_VERSION and ( - packaging.version.parse(_API_CORE_VERSION) - >= packaging.version.parse("1.26.0") - ): - self_signed_jwt_kwargs["default_scopes"] = cls.AUTH_SCOPES - self_signed_jwt_kwargs["scopes"] = scopes - self_signed_jwt_kwargs["default_host"] = cls.DEFAULT_HOST - else: - self_signed_jwt_kwargs["scopes"] = scopes or cls.AUTH_SCOPES - - return self_signed_jwt_kwargs - def _prep_wrapped_messages(self, client_info): # Precompute the wrapped methods. 
self._wrapped_methods = { diff --git a/google/cloud/bigquery_datatransfer_v1/services/data_transfer_service/transports/grpc.py b/google/cloud/bigquery_datatransfer_v1/services/data_transfer_service/transports/grpc.py index 9181ae48..6f431d2a 100644 --- a/google/cloud/bigquery_datatransfer_v1/services/data_transfer_service/transports/grpc.py +++ b/google/cloud/bigquery_datatransfer_v1/services/data_transfer_service/transports/grpc.py @@ -62,6 +62,7 @@ def __init__( client_cert_source_for_mtls: Callable[[], Tuple[bytes, bytes]] = None, quota_project_id: Optional[str] = None, client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO, + always_use_jwt_access: Optional[bool] = False, ) -> None: """Instantiate the transport. @@ -102,6 +103,8 @@ def __init__( API requests. If ``None``, then default info will be used. Generally, you only need to set this if you're developing your own client library. + always_use_jwt_access (Optional[bool]): Whether self signed JWT should + be used for service account credentials. Raises: google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport @@ -154,6 +157,7 @@ def __init__( scopes=scopes, quota_project_id=quota_project_id, client_info=client_info, + always_use_jwt_access=always_use_jwt_access, ) if not self._grpc_channel: @@ -209,14 +213,14 @@ def create_channel( and ``credentials_file`` are passed. 
""" - self_signed_jwt_kwargs = cls._get_self_signed_jwt_kwargs(host, scopes) - return grpc_helpers.create_channel( host, credentials=credentials, credentials_file=credentials_file, quota_project_id=quota_project_id, - **self_signed_jwt_kwargs, + default_scopes=cls.AUTH_SCOPES, + scopes=scopes, + default_host=cls.DEFAULT_HOST, **kwargs, ) diff --git a/google/cloud/bigquery_datatransfer_v1/services/data_transfer_service/transports/grpc_asyncio.py b/google/cloud/bigquery_datatransfer_v1/services/data_transfer_service/transports/grpc_asyncio.py index 2cd986db..a5f7d8f4 100644 --- a/google/cloud/bigquery_datatransfer_v1/services/data_transfer_service/transports/grpc_asyncio.py +++ b/google/cloud/bigquery_datatransfer_v1/services/data_transfer_service/transports/grpc_asyncio.py @@ -83,14 +83,14 @@ def create_channel( aio.Channel: A gRPC AsyncIO channel object. """ - self_signed_jwt_kwargs = cls._get_self_signed_jwt_kwargs(host, scopes) - return grpc_helpers_async.create_channel( host, credentials=credentials, credentials_file=credentials_file, quota_project_id=quota_project_id, - **self_signed_jwt_kwargs, + default_scopes=cls.AUTH_SCOPES, + scopes=scopes, + default_host=cls.DEFAULT_HOST, **kwargs, ) @@ -108,6 +108,7 @@ def __init__( client_cert_source_for_mtls: Callable[[], Tuple[bytes, bytes]] = None, quota_project_id=None, client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO, + always_use_jwt_access: Optional[bool] = False, ) -> None: """Instantiate the transport. @@ -149,6 +150,8 @@ def __init__( API requests. If ``None``, then default info will be used. Generally, you only need to set this if you're developing your own client library. + always_use_jwt_access (Optional[bool]): Whether self signed JWT should + be used for service account credentials. 
Raises: google.auth.exceptions.MutualTlsChannelError: If mutual TLS transport @@ -200,6 +203,7 @@ def __init__( scopes=scopes, quota_project_id=quota_project_id, client_info=client_info, + always_use_jwt_access=always_use_jwt_access, ) if not self._grpc_channel: diff --git a/owlbot.py b/owlbot.py index 2281a7a8..6a5c03b1 100644 --- a/owlbot.py +++ b/owlbot.py @@ -27,17 +27,12 @@ # ---------------------------------------------------------------------------- for library in s.get_staging_dirs("v1"): - # Comment out broken assertion in unit test - # https://github.com/googleapis/gapic-generator-python/issues/897 + # Fix incorrect DeprecationWarning + # Fixed in https://github.com/googleapis/gapic-generator-python/pull/943 s.replace( - library / "tests/**/*.py", - "assert args\[0\]\.start_time == timestamp_pb2\.Timestamp\(seconds=751\)", - "# assert args[0].start_time == timestamp_pb2.Timestamp(seconds=751)" - ) - s.replace( - library / "tests/**/*.py", - "assert args\[0\]\.end_time == timestamp_pb2\.Timestamp\(seconds=751\)", - "# assert args[0].end_time == timestamp_pb2.Timestamp(seconds=751)" + "google/**/*client.py", + "warnings\.DeprecationWarning", + "DeprecationWarning" ) s.move(library, excludes=["*.tar.gz", "docs/index.rst", "README.rst", "setup.py"]) diff --git a/setup.py b/setup.py index 043dd179..d1ac14ca 100644 --- a/setup.py +++ b/setup.py @@ -29,7 +29,7 @@ # 'Development Status :: 5 - Production/Stable' release_status = "Development Status :: 5 - Production/Stable" dependencies = ( - "google-api-core[grpc] >= 1.22.2, < 2.0.0dev", + "google-api-core[grpc] >= 1.26.0, <2.0.0dev", "proto-plus >= 1.15.0", "packaging >= 14.3", ) diff --git a/testing/constraints-3.6.txt b/testing/constraints-3.6.txt index e6739c3e..d3b6361c 100644 --- a/testing/constraints-3.6.txt +++ b/testing/constraints-3.6.txt @@ -5,7 +5,7 @@ # # e.g., if setup.py has "foo >= 1.14.0, < 2.0.0dev", # Then this file should have foo==1.14.0 -google-api-core==1.22.2 +google-api-core==1.26.0 
proto-plus==1.15.0 libcst==0.2.5 packaging==14.3 diff --git a/tests/unit/gapic/bigquery_datatransfer_v1/test_data_transfer_service.py b/tests/unit/gapic/bigquery_datatransfer_v1/test_data_transfer_service.py index cf70ccc9..be3b2c25 100644 --- a/tests/unit/gapic/bigquery_datatransfer_v1/test_data_transfer_service.py +++ b/tests/unit/gapic/bigquery_datatransfer_v1/test_data_transfer_service.py @@ -41,9 +41,6 @@ from google.cloud.bigquery_datatransfer_v1.services.data_transfer_service import ( transports, ) -from google.cloud.bigquery_datatransfer_v1.services.data_transfer_service.transports.base import ( - _API_CORE_VERSION, -) from google.cloud.bigquery_datatransfer_v1.services.data_transfer_service.transports.base import ( _GOOGLE_AUTH_VERSION, ) @@ -58,8 +55,9 @@ import google.auth -# TODO(busunkim): Once google-api-core >= 1.26.0 is required: -# - Delete all the api-core and auth "less than" test cases +# TODO(busunkim): Once google-auth >= 1.25.0 is required transitively +# through google-api-core: +# - Delete the auth "less than" test cases # - Delete these pytest markers (Make the "greater than or equal to" tests the default). 
requires_google_auth_lt_1_25_0 = pytest.mark.skipif( packaging.version.parse(_GOOGLE_AUTH_VERSION) >= packaging.version.parse("1.25.0"), @@ -70,16 +68,6 @@ reason="This test requires google-auth >= 1.25.0", ) -requires_api_core_lt_1_26_0 = pytest.mark.skipif( - packaging.version.parse(_API_CORE_VERSION) >= packaging.version.parse("1.26.0"), - reason="This test requires google-api-core < 1.26.0", -) - -requires_api_core_gte_1_26_0 = pytest.mark.skipif( - packaging.version.parse(_API_CORE_VERSION) < packaging.version.parse("1.26.0"), - reason="This test requires google-api-core >= 1.26.0", -) - def client_cert_source_callback(): return b"cert bytes", b"key bytes" @@ -143,6 +131,36 @@ def test_data_transfer_service_client_from_service_account_info(client_class): assert client.transport._host == "bigquerydatatransfer.googleapis.com:443" +@pytest.mark.parametrize( + "client_class", [DataTransferServiceClient, DataTransferServiceAsyncClient,] +) +def test_data_transfer_service_client_service_account_always_use_jwt(client_class): + with mock.patch.object( + service_account.Credentials, "with_always_use_jwt_access", create=True + ) as use_jwt: + creds = service_account.Credentials(None, None, None) + client = client_class(credentials=creds) + use_jwt.assert_not_called() + + +@pytest.mark.parametrize( + "transport_class,transport_name", + [ + (transports.DataTransferServiceGrpcTransport, "grpc"), + (transports.DataTransferServiceGrpcAsyncIOTransport, "grpc_asyncio"), + ], +) +def test_data_transfer_service_client_service_account_always_use_jwt_true( + transport_class, transport_name +): + with mock.patch.object( + service_account.Credentials, "with_always_use_jwt_access", create=True + ) as use_jwt: + creds = service_account.Credentials(None, None, None) + transport = transport_class(credentials=creds, always_use_jwt_access=True) + use_jwt.assert_called_once_with(True) + + @pytest.mark.parametrize( "client_class", [DataTransferServiceClient, DataTransferServiceAsyncClient,] 
) @@ -2776,8 +2794,12 @@ def test_schedule_transfer_runs_flattened(): assert len(call.mock_calls) == 1 _, args, _ = call.mock_calls[0] assert args[0].parent == "parent_value" - # assert args[0].start_time == timestamp_pb2.Timestamp(seconds=751) - # assert args[0].end_time == timestamp_pb2.Timestamp(seconds=751) + assert TimestampRule().to_proto(args[0].start_time) == timestamp_pb2.Timestamp( + seconds=751 + ) + assert TimestampRule().to_proto(args[0].end_time) == timestamp_pb2.Timestamp( + seconds=751 + ) def test_schedule_transfer_runs_flattened_error(): @@ -2825,8 +2847,12 @@ async def test_schedule_transfer_runs_flattened_async(): assert len(call.mock_calls) _, args, _ = call.mock_calls[0] assert args[0].parent == "parent_value" - # assert args[0].start_time == timestamp_pb2.Timestamp(seconds=751) - # assert args[0].end_time == timestamp_pb2.Timestamp(seconds=751) + assert TimestampRule().to_proto(args[0].start_time) == timestamp_pb2.Timestamp( + seconds=751 + ) + assert TimestampRule().to_proto(args[0].end_time) == timestamp_pb2.Timestamp( + seconds=751 + ) @pytest.mark.asyncio @@ -4684,7 +4710,6 @@ def test_data_transfer_service_transport_auth_adc_old_google_auth(transport_clas (transports.DataTransferServiceGrpcAsyncIOTransport, grpc_helpers_async), ], ) -@requires_api_core_gte_1_26_0 def test_data_transfer_service_transport_create_channel(transport_class, grpc_helpers): # If credentials and host are not provided, the transport class should use # ADC credentials. 
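The new `always_use_jwt_access` unit tests above patch `with_always_use_jwt_access` onto the credentials class with `mock.patch.object(..., create=True)`, because older google-auth releases don't define that method. A condensed, self-contained sketch of the same pattern (class and helper names here are hypothetical stand-ins, not the library's):

```python
from unittest import mock


class FakeCredentials:
    """Stand-in for service_account.Credentials (hypothetical)."""


def maybe_enable_jwt(creds, always_use_jwt_access=False):
    # Mirrors the transport's guard: only call the method when the
    # installed auth library actually provides it.
    if always_use_jwt_access and hasattr(creds, "with_always_use_jwt_access"):
        return creds.with_always_use_jwt_access(True)
    return creds


# create=True lets the test patch an attribute the class does not define,
# simulating a google-auth version that supports self-signed JWTs.
with mock.patch.object(
    FakeCredentials, "with_always_use_jwt_access", create=True
) as use_jwt:
    maybe_enable_jwt(FakeCredentials(), always_use_jwt_access=True)
    use_jwt.assert_called_once_with(True)

# Outside the patch the attribute is gone again, so the guard is a no-op.
creds = FakeCredentials()
assert maybe_enable_jwt(creds, always_use_jwt_access=True) is creds
```

The `assert_not_called()` variant in the first test above checks the complementary case: constructing a client without opting in never touches the method.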
@@ -4713,79 +4738,6 @@ def test_data_transfer_service_transport_create_channel(transport_class, grpc_he ) -@pytest.mark.parametrize( - "transport_class,grpc_helpers", - [ - (transports.DataTransferServiceGrpcTransport, grpc_helpers), - (transports.DataTransferServiceGrpcAsyncIOTransport, grpc_helpers_async), - ], -) -@requires_api_core_lt_1_26_0 -def test_data_transfer_service_transport_create_channel_old_api_core( - transport_class, grpc_helpers -): - # If credentials and host are not provided, the transport class should use - # ADC credentials. - with mock.patch.object( - google.auth, "default", autospec=True - ) as adc, mock.patch.object( - grpc_helpers, "create_channel", autospec=True - ) as create_channel: - creds = ga_credentials.AnonymousCredentials() - adc.return_value = (creds, None) - transport_class(quota_project_id="octopus") - - create_channel.assert_called_with( - "bigquerydatatransfer.googleapis.com:443", - credentials=creds, - credentials_file=None, - quota_project_id="octopus", - scopes=("https://www.googleapis.com/auth/cloud-platform",), - ssl_credentials=None, - options=[ - ("grpc.max_send_message_length", -1), - ("grpc.max_receive_message_length", -1), - ], - ) - - -@pytest.mark.parametrize( - "transport_class,grpc_helpers", - [ - (transports.DataTransferServiceGrpcTransport, grpc_helpers), - (transports.DataTransferServiceGrpcAsyncIOTransport, grpc_helpers_async), - ], -) -@requires_api_core_lt_1_26_0 -def test_data_transfer_service_transport_create_channel_user_scopes( - transport_class, grpc_helpers -): - # If credentials and host are not provided, the transport class should use - # ADC credentials. 
- with mock.patch.object( - google.auth, "default", autospec=True - ) as adc, mock.patch.object( - grpc_helpers, "create_channel", autospec=True - ) as create_channel: - creds = ga_credentials.AnonymousCredentials() - adc.return_value = (creds, None) - - transport_class(quota_project_id="octopus", scopes=["1", "2"]) - - create_channel.assert_called_with( - "bigquerydatatransfer.googleapis.com:443", - credentials=creds, - credentials_file=None, - quota_project_id="octopus", - scopes=["1", "2"], - ssl_credentials=None, - options=[ - ("grpc.max_send_message_length", -1), - ("grpc.max_receive_message_length", -1), - ], - ) - - @pytest.mark.parametrize( "transport_class", [ @@ -4810,7 +4762,7 @@ def test_data_transfer_service_grpc_transport_client_cert_source_for_mtls( "squid.clam.whelk:443", credentials=cred, credentials_file=None, - scopes=("https://www.googleapis.com/auth/cloud-platform",), + scopes=None, ssl_credentials=mock_ssl_channel_creds, quota_project_id=None, options=[ @@ -4919,7 +4871,7 @@ def test_data_transfer_service_transport_channel_mtls_with_client_cert_source( "mtls.squid.clam.whelk:443", credentials=cred, credentials_file=None, - scopes=("https://www.googleapis.com/auth/cloud-platform",), + scopes=None, ssl_credentials=mock_ssl_cred, quota_project_id=None, options=[ @@ -4966,7 +4918,7 @@ def test_data_transfer_service_transport_channel_mtls_with_adc(transport_class): "mtls.squid.clam.whelk:443", credentials=mock_cred, credentials_file=None, - scopes=("https://www.googleapis.com/auth/cloud-platform",), + scopes=None, ssl_credentials=mock_ssl_cred, quota_project_id=None, options=[ From ea018c9f4a1f9c360dbe9f08650250ea8c505f29 Mon Sep 17 00:00:00 2001 From: Tim Swast Date: Thu, 1 Jul 2021 14:15:32 -0500 Subject: [PATCH 4/7] docs: add sample to include run notification (#173) * docs: add sample to include run notification * blacken samples * fix sample lint --- owlbot.py | 6 ++ samples/snippets/conftest.py | 55 +++++++++++++++++-- 
samples/snippets/manage_transfer_configs.py | 4 +- .../snippets/manage_transfer_configs_test.py | 4 +- samples/snippets/noxfile.py | 39 +++++++------ samples/snippets/requirements-test.txt | 1 + samples/snippets/run_notification.py | 44 +++++++++++++++ samples/snippets/run_notification_test.py | 26 +++++++++ 8 files changed, 151 insertions(+), 28 deletions(-) create mode 100644 samples/snippets/run_notification.py create mode 100644 samples/snippets/run_notification_test.py diff --git a/owlbot.py b/owlbot.py index 6a5c03b1..08dd3376 100644 --- a/owlbot.py +++ b/owlbot.py @@ -14,12 +14,16 @@ """This script is used to synthesize generated parts of this library.""" +import pathlib + import synthtool as s from synthtool import gcp from synthtool.languages import python +REPO_ROOT = pathlib.Path(__file__).parent.absolute() + common = gcp.CommonTemplates() # ---------------------------------------------------------------------------- @@ -53,3 +57,5 @@ python.py_samples(skip_readmes=True) s.shell.run(["nox", "-s", "blacken"], hide_output=False) +for noxfile in REPO_ROOT.glob("samples/**/noxfile.py"): + s.shell.run(["nox", "-s", "blacken"], cwd=noxfile.parent, hide_output=False) diff --git a/samples/snippets/conftest.py b/samples/snippets/conftest.py index 998d5ea7..f708ff48 100644 --- a/samples/snippets/conftest.py +++ b/samples/snippets/conftest.py @@ -14,6 +14,7 @@ import datetime import os +import random import uuid from google.api_core import client_options @@ -21,9 +22,40 @@ import google.auth from google.cloud import bigquery from google.cloud import bigquery_datatransfer +from google.cloud import pubsub_v1 import pytest +RESOURCE_PREFIX = "python_bigquery_datatransfer_samples_snippets" +RESOURCE_DATE_FORMAT = "%Y%m%d%H%M%S" +RESOURCE_DATE_LENGTH = 4 + 2 + 2 + 2 + 2 + 2 + + +def resource_prefix() -> str: + timestamp = datetime.datetime.utcnow().strftime(RESOURCE_DATE_FORMAT) + random_string = hex(random.randrange(1000000))[2:] + return 
f"{RESOURCE_PREFIX}_{timestamp}_{random_string}" + + +def resource_name_to_date(resource_name: str): + start_date = len(RESOURCE_PREFIX) + 1 + date_string = resource_name[start_date : start_date + RESOURCE_DATE_LENGTH] + parsed_date = datetime.datetime.strptime(date_string, RESOURCE_DATE_FORMAT) + return parsed_date + + +@pytest.fixture(scope="session", autouse=True) +def cleanup_pubsub_topics(pubsub_client: pubsub_v1.PublisherClient, project_id): + yesterday = datetime.datetime.utcnow() - datetime.timedelta(days=1) + for topic in pubsub_client.list_topics(project=f"projects/{project_id}"): + topic_id = topic.name.split("/")[-1] + if ( + topic_id.startswith(RESOURCE_PREFIX) + and resource_name_to_date(topic_id) < yesterday + ): + pubsub_client.delete_topic(topic=topic.name) + + def temp_suffix(): now = datetime.datetime.now() return f"{now.strftime('%Y%m%d%H%M%S')}_{uuid.uuid4().hex[:8]}" @@ -35,6 +67,21 @@ def bigquery_client(default_credentials): return bigquery.Client(credentials=credentials, project=project_id) +@pytest.fixture(scope="session") +def pubsub_client(default_credentials): + credentials, _ = default_credentials + return pubsub_v1.PublisherClient(credentials=credentials) + + +@pytest.fixture(scope="session") +def pubsub_topic(pubsub_client: pubsub_v1.PublisherClient, project_id): + topic_id = resource_prefix() + topic_path = pubsub_v1.PublisherClient.topic_path(project_id, topic_id) + pubsub_client.create_topic(name=topic_path) + yield topic_path + pubsub_client.delete_topic(topic=topic_path) + + @pytest.fixture(scope="session") def dataset_id(bigquery_client, project_id): dataset_id = f"bqdts_{temp_suffix()}" @@ -56,10 +103,10 @@ def project_id(): @pytest.fixture(scope="session") def service_account_name(default_credentials): credentials, _ = default_credentials - # Note: this property is not available when running with user account - # credentials, but only service account credentials are used in our test - # infrastructure. 
- return credentials.service_account_email + # The service_account_email attribute is not available when running with + # user account credentials, but should be available when running from our + # continuous integration tests. + return getattr(credentials, "service_account_email", None) @pytest.fixture(scope="session") diff --git a/samples/snippets/manage_transfer_configs.py b/samples/snippets/manage_transfer_configs.py index 6b4abd78..5f775f10 100644 --- a/samples/snippets/manage_transfer_configs.py +++ b/samples/snippets/manage_transfer_configs.py @@ -135,9 +135,7 @@ def schedule_backfill(override_values={}): ) response = transfer_client.schedule_transfer_runs( - parent=transfer_config_name, - start_time=start_time, - end_time=end_time, + parent=transfer_config_name, start_time=start_time, end_time=end_time, ) print("Started transfer runs:") diff --git a/samples/snippets/manage_transfer_configs_test.py b/samples/snippets/manage_transfer_configs_test.py index de31c713..52d16dc2 100644 --- a/samples/snippets/manage_transfer_configs_test.py +++ b/samples/snippets/manage_transfer_configs_test.py @@ -52,9 +52,7 @@ def test_update_credentials_with_service_account( def test_schedule_backfill(capsys, transfer_config_name): runs = manage_transfer_configs.schedule_backfill( - { - "transfer_config_name": transfer_config_name, - } + {"transfer_config_name": transfer_config_name} ) out, _ = capsys.readouterr() assert "Started transfer runs:" in out diff --git a/samples/snippets/noxfile.py b/samples/snippets/noxfile.py index 5ff9e1db..160fe728 100644 --- a/samples/snippets/noxfile.py +++ b/samples/snippets/noxfile.py @@ -38,17 +38,15 @@ TEST_CONFIG = { # You can opt out from the test for specific Python versions. 
- 'ignored_versions': ["2.7"], - + "ignored_versions": ["2.7"], # Old samples are opted out of enforcing Python type hints # All new samples should feature them - 'enforce_type_hints': False, - + "enforce_type_hints": False, # An envvar key for determining the project id to use. Change it # to 'BUILD_SPECIFIC_GCLOUD_PROJECT' if you want to opt in using a # build specific Cloud project. You can also use your own string # to use your own Cloud project. - 'gcloud_project_env': 'GOOGLE_CLOUD_PROJECT', + "gcloud_project_env": "GOOGLE_CLOUD_PROJECT", # 'gcloud_project_env': 'BUILD_SPECIFIC_GCLOUD_PROJECT', # If you need to use a specific version of pip, # change pip_version_override to the string representation @@ -56,13 +54,13 @@ "pip_version_override": None, # A dictionary you want to inject into your test. Don't put any # secrets here. These values will override predefined values. - 'envs': {}, + "envs": {}, } try: # Ensure we can import noxfile_config in the project's directory. - sys.path.append('.') + sys.path.append(".") from noxfile_config import TEST_CONFIG_OVERRIDE except ImportError as e: print("No user noxfile_config found: detail: {}".format(e)) @@ -77,12 +75,12 @@ def get_pytest_env_vars() -> Dict[str, str]: ret = {} # Override the GCLOUD_PROJECT and the alias. - env_key = TEST_CONFIG['gcloud_project_env'] + env_key = TEST_CONFIG["gcloud_project_env"] # This should error out if not set. - ret['GOOGLE_CLOUD_PROJECT'] = os.environ[env_key] + ret["GOOGLE_CLOUD_PROJECT"] = os.environ[env_key] # Apply user supplied envs. - ret.update(TEST_CONFIG['envs']) + ret.update(TEST_CONFIG["envs"]) return ret @@ -91,7 +89,7 @@ def get_pytest_env_vars() -> Dict[str, str]: ALL_VERSIONS = ["2.7", "3.6", "3.7", "3.8", "3.9"] # Any default versions that should be ignored. 
-IGNORED_VERSIONS = TEST_CONFIG['ignored_versions'] +IGNORED_VERSIONS = TEST_CONFIG["ignored_versions"] TESTED_VERSIONS = sorted([v for v in ALL_VERSIONS if v not in IGNORED_VERSIONS]) @@ -140,7 +138,7 @@ def _determine_local_import_names(start_dir: str) -> List[str]: @nox.session def lint(session: nox.sessions.Session) -> None: - if not TEST_CONFIG['enforce_type_hints']: + if not TEST_CONFIG["enforce_type_hints"]: session.install("flake8", "flake8-import-order") else: session.install("flake8", "flake8-import-order", "flake8-annotations") @@ -149,9 +147,11 @@ def lint(session: nox.sessions.Session) -> None: args = FLAKE8_COMMON_ARGS + [ "--application-import-names", ",".join(local_names), - "." + ".", ] session.run("flake8", *args) + + # # Black # @@ -164,6 +164,7 @@ def blacken(session: nox.sessions.Session) -> None: session.run("black", *python_files) + # # Sample Tests # @@ -172,7 +173,9 @@ def blacken(session: nox.sessions.Session) -> None: PYTEST_COMMON_ARGS = ["--junitxml=sponge_log.xml"] -def _session_tests(session: nox.sessions.Session, post_install: Callable = None) -> None: +def _session_tests( + session: nox.sessions.Session, post_install: Callable = None +) -> None: if TEST_CONFIG["pip_version_override"]: pip_version = TEST_CONFIG["pip_version_override"] session.install(f"pip=={pip_version}") @@ -202,7 +205,7 @@ def _session_tests(session: nox.sessions.Session, post_install: Callable = None) # on travis where slow and flaky tests are excluded. 
# See http://doc.pytest.org/en/latest/_modules/_pytest/main.html success_codes=[0, 5], - env=get_pytest_env_vars() + env=get_pytest_env_vars(), ) @@ -212,9 +215,9 @@ def py(session: nox.sessions.Session) -> None: if session.python in TESTED_VERSIONS: _session_tests(session) else: - session.skip("SKIPPED: {} tests are disabled for this sample.".format( - session.python - )) + session.skip( + "SKIPPED: {} tests are disabled for this sample.".format(session.python) + ) # diff --git a/samples/snippets/requirements-test.txt b/samples/snippets/requirements-test.txt index e4d022b0..f0a3f547 100644 --- a/samples/snippets/requirements-test.txt +++ b/samples/snippets/requirements-test.txt @@ -1,3 +1,4 @@ google-cloud-bigquery==2.20.0 +google-cloud-pubsub==2.6.0 pytest==6.2.4 mock==4.0.3 diff --git a/samples/snippets/run_notification.py b/samples/snippets/run_notification.py new file mode 100644 index 00000000..44f1bf12 --- /dev/null +++ b/samples/snippets/run_notification.py @@ -0,0 +1,44 @@ +# Copyright 2021 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+ + +def run_notification(transfer_config_name, pubsub_topic): + orig_transfer_config_name = transfer_config_name + orig_pubsub_topic = pubsub_topic + # [START bigquerydatatransfer_run_notification] + transfer_config_name = "projects/1234/locations/us/transferConfigs/abcd" + pubsub_topic = "projects/PROJECT-ID/topics/TOPIC-ID" + # [END bigquerydatatransfer_run_notification] + transfer_config_name = orig_transfer_config_name + pubsub_topic = orig_pubsub_topic + + # [START bigquerydatatransfer_run_notification] + from google.cloud import bigquery_datatransfer + from google.protobuf import field_mask_pb2 + + transfer_client = bigquery_datatransfer.DataTransferServiceClient() + + transfer_config = bigquery_datatransfer.TransferConfig(name=transfer_config_name) + transfer_config.notification_pubsub_topic = pubsub_topic + update_mask = field_mask_pb2.FieldMask(paths=["notification_pubsub_topic"]) + + transfer_config = transfer_client.update_transfer_config( + {"transfer_config": transfer_config, "update_mask": update_mask} + ) + + print(f"Updated config: '{transfer_config.name}'") + print(f"Notification Pub/Sub topic: '{transfer_config.notification_pubsub_topic}'") + # [END bigquerydatatransfer_run_notification] + # Return the config name for testing purposes, so that it can be deleted. + return transfer_config diff --git a/samples/snippets/run_notification_test.py b/samples/snippets/run_notification_test.py new file mode 100644 index 00000000..4c41e689 --- /dev/null +++ b/samples/snippets/run_notification_test.py @@ -0,0 +1,26 @@ +# Copyright 2021 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +from . import run_notification + + +def test_run_notification(capsys, transfer_config_name, pubsub_topic): + run_notification.run_notification( + transfer_config_name=transfer_config_name, pubsub_topic=pubsub_topic, + ) + out, _ = capsys.readouterr() + assert "Updated config:" in out + assert transfer_config_name in out + assert "Notification Pub/Sub topic:" in out + assert pubsub_topic in out From 6dd0a70c3ed1c50e383910a793d6d21fb4887b50 Mon Sep 17 00:00:00 2001 From: "gcf-owl-bot[bot]" <78513119+gcf-owl-bot[bot]@users.noreply.github.com> Date: Mon, 12 Jul 2021 09:58:22 +0000 Subject: [PATCH 5/7] chore: use gapic-generator-python 0.50.4 (#174) PiperOrigin-RevId: 383929493 Source-Link: https://github.com/googleapis/googleapis/commit/dd1ef2849564ed64028b5c6ab0d098f5158a9165 Source-Link: https://github.com/googleapis/googleapis-gen/commit/507440292af237802f032d66830ed9c4f3435355 --- samples/snippets/manage_transfer_configs.py | 4 +++- samples/snippets/noxfile.py | 2 +- samples/snippets/run_notification_test.py | 3 ++- 3 files changed, 6 insertions(+), 3 deletions(-) diff --git a/samples/snippets/manage_transfer_configs.py b/samples/snippets/manage_transfer_configs.py index 5f775f10..6b4abd78 100644 --- a/samples/snippets/manage_transfer_configs.py +++ b/samples/snippets/manage_transfer_configs.py @@ -135,7 +135,9 @@ def schedule_backfill(override_values={}): ) response = transfer_client.schedule_transfer_runs( - parent=transfer_config_name, start_time=start_time, end_time=end_time, + parent=transfer_config_name, + start_time=start_time, + 
end_time=end_time, ) print("Started transfer runs:") diff --git a/samples/snippets/noxfile.py b/samples/snippets/noxfile.py index 160fe728..b3c8658a 100644 --- a/samples/snippets/noxfile.py +++ b/samples/snippets/noxfile.py @@ -226,7 +226,7 @@ def py(session: nox.sessions.Session) -> None: def _get_repo_root() -> Optional[str]: - """ Returns the root folder of the project. """ + """Returns the root folder of the project.""" # Get root of this repository. Assume we don't have directories nested deeper than 10 items. p = Path(os.getcwd()) for i in range(10): diff --git a/samples/snippets/run_notification_test.py b/samples/snippets/run_notification_test.py index 4c41e689..02f24266 100644 --- a/samples/snippets/run_notification_test.py +++ b/samples/snippets/run_notification_test.py @@ -17,7 +17,8 @@ def test_run_notification(capsys, transfer_config_name, pubsub_topic): run_notification.run_notification( - transfer_config_name=transfer_config_name, pubsub_topic=pubsub_topic, + transfer_config_name=transfer_config_name, + pubsub_topic=pubsub_topic, ) out, _ = capsys.readouterr() assert "Updated config:" in out From b450f9911ff09c32f6a6b743272fc5960834c391 Mon Sep 17 00:00:00 2001 From: WhiteSource Renovate Date: Mon, 12 Jul 2021 20:52:09 +0200 Subject: [PATCH 6/7] chore(deps): update dependency google-cloud-pubsub to v2.6.1 (#175) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit [![WhiteSource Renovate](https://app.renovatebot.com/images/banner.svg)](https://renovatebot.com) This PR contains the following updates: | Package | Change | Age | Adoption | Passing | Confidence | |---|---|---|---|---|---| | [google-cloud-pubsub](https://togithub.com/googleapis/python-pubsub) | `==2.6.0` -> `==2.6.1` | [![age](https://badges.renovateapi.com/packages/pypi/google-cloud-pubsub/2.6.1/age-slim)](https://docs.renovatebot.com/merge-confidence/) | 
[![adoption](https://badges.renovateapi.com/packages/pypi/google-cloud-pubsub/2.6.1/adoption-slim)](https://docs.renovatebot.com/merge-confidence/) | [![passing](https://badges.renovateapi.com/packages/pypi/google-cloud-pubsub/2.6.1/compatibility-slim/2.6.0)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://badges.renovateapi.com/packages/pypi/google-cloud-pubsub/2.6.1/confidence-slim/2.6.0)](https://docs.renovatebot.com/merge-confidence/) | --- ### Release Notes
googleapis/python-pubsub ### [`v2.6.1`](https://togithub.com/googleapis/python-pubsub/blob/master/CHANGELOG.md#​261) [Compare Source](https://togithub.com/googleapis/python-pubsub/compare/v2.6.0...v2.6.1) 07-05-2021 10:33 PDT ##### Dependencies - Fix possible crash by requiring `grpcio >= 1.38.1`. ([#​414](https://togithub.com/googleapis/python-pubsub/issues/414)) ([7037a28](https://togithub.com/googleapis/python-pubsub/pull/435/commits/7037a28090aa4efa01808231721716bca80bb0b7)) ##### Documentation - Adjust samples for publishing with error handler and flow control. ([#​433](https://togithub.com/googleapis/python-pubsub/pull/433)) ##### Internal / Testing Changes - Fix flaky sync pull sample test. ([#​434](https://togithub.com/googleapis/python-pubsub/pull/434)) - Mitigate flaky snippets tests. ([#​432](https://togithub.com/googleapis/python-pubsub/pull/432))
--- ### Configuration 📅 **Schedule**: At any time (no schedule defined). 🚦 **Automerge**: Disabled by config. Please merge this manually once you are satisfied. ♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox. 🔕 **Ignore**: Close this PR and you won't be reminded about this update again. --- - [ ] If you want to rebase/retry this PR, check this box. --- This PR has been generated by [WhiteSource Renovate](https://renovate.whitesourcesoftware.com). View repository job log [here](https://app.renovatebot.com/dashboard#github/googleapis/python-bigquery-datatransfer). --- samples/snippets/requirements-test.txt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/samples/snippets/requirements-test.txt b/samples/snippets/requirements-test.txt index f0a3f547..e763d1e2 100644 --- a/samples/snippets/requirements-test.txt +++ b/samples/snippets/requirements-test.txt @@ -1,4 +1,4 @@ google-cloud-bigquery==2.20.0 -google-cloud-pubsub==2.6.0 +google-cloud-pubsub==2.6.1 pytest==6.2.4 mock==4.0.3 From 738d0b26738b07c72c924c6d18ec52b3d075d78e Mon Sep 17 00:00:00 2001 From: "release-please[bot]" <55107282+release-please[bot]@users.noreply.github.com> Date: Mon, 12 Jul 2021 18:56:10 +0000 Subject: [PATCH 7/7] chore: release 3.3.0 (#172) :robot: I have created a release \*beep\* \*boop\* --- ## [3.3.0](https://www.github.com/googleapis/python-bigquery-datatransfer/compare/v3.2.0...v3.3.0) (2021-07-12) ### Features * add always_use_jwt_access ([#171](https://www.github.com/googleapis/python-bigquery-datatransfer/issues/171)) ([cd4494f](https://www.github.com/googleapis/python-bigquery-datatransfer/commit/cd4494f0dc7304469e7d4a0ed6e13d716b3cacbf)) ### Documentation * add sample to include run notification ([#173](https://www.github.com/googleapis/python-bigquery-datatransfer/issues/173)) ([ea018c9](https://www.github.com/googleapis/python-bigquery-datatransfer/commit/ea018c9f4a1f9c360dbe9f08650250ea8c505f29)) --- This PR was generated 
with [Release Please](https://github.com/googleapis/release-please). See [documentation](https://github.com/googleapis/release-please#release-please). --- CHANGELOG.md | 12 ++++++++++++ setup.py | 2 +- 2 files changed, 13 insertions(+), 1 deletion(-) diff --git a/CHANGELOG.md b/CHANGELOG.md index ce1b11ae..6af254d8 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -4,6 +4,18 @@ [1]: https://pypi.org/project/google-cloud-bigquery-datatransfer/#history +## [3.3.0](https://www.github.com/googleapis/python-bigquery-datatransfer/compare/v3.2.0...v3.3.0) (2021-07-12) + + +### Features + +* add always_use_jwt_access ([#171](https://www.github.com/googleapis/python-bigquery-datatransfer/issues/171)) ([cd4494f](https://www.github.com/googleapis/python-bigquery-datatransfer/commit/cd4494f0dc7304469e7d4a0ed6e13d716b3cacbf)) + + +### Documentation + +* add sample to include run notification ([#173](https://www.github.com/googleapis/python-bigquery-datatransfer/issues/173)) ([ea018c9](https://www.github.com/googleapis/python-bigquery-datatransfer/commit/ea018c9f4a1f9c360dbe9f08650250ea8c505f29)) + ## [3.2.0](https://www.github.com/googleapis/python-bigquery-datatransfer/compare/v3.1.1...v3.2.0) (2021-06-22) diff --git a/setup.py b/setup.py index d1ac14ca..32f5cfb3 100644 --- a/setup.py +++ b/setup.py @@ -22,7 +22,7 @@ name = "google-cloud-bigquery-datatransfer" description = "BigQuery Data Transfer API client library" -version = "3.2.0" +version = "3.3.0" # Should be one of: # 'Development Status :: 3 - Alpha' # 'Development Status :: 4 - Beta'
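
The headline change in this series is the new `run_notification` sample, which performs a partial update of a transfer config: only the fields named in the `FieldMask` (`notification_pubsub_topic` here) are written, and every other field on the config is left untouched server-side. A library-free sketch of those masking semantics using plain dicts (the `apply_field_mask` helper is hypothetical, for illustration only — the real call is `update_transfer_config` with a `google.protobuf.field_mask_pb2.FieldMask`, as shown in the sample):

```python
def apply_field_mask(existing: dict, updates: dict, paths: list) -> dict:
    """Copy `existing`, replacing only the fields named in `paths`."""
    merged = dict(existing)
    for path in paths:
        merged[path] = updates[path]
    return merged


# Stored server-side state (values are made up for illustration).
stored = {
    "name": "projects/1234/locations/us/transferConfigs/abcd",
    "display_name": "My Transfer",
    "notification_pubsub_topic": "",
}

# The request config only needs the masked field populated; unmasked
# fields (like the empty display_name) are ignored, not overwritten.
request = {
    "name": stored["name"],
    "display_name": "",
    "notification_pubsub_topic": "projects/PROJECT-ID/topics/TOPIC-ID",
}

updated = apply_field_mask(stored, request, ["notification_pubsub_topic"])
print(updated["notification_pubsub_topic"])
# → projects/PROJECT-ID/topics/TOPIC-ID
```

This mirrors why the sample builds a `TransferConfig` containing only `name` and `notification_pubsub_topic`: with the mask restricted to a single path, no other setting on the config can be clobbered by the update.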