`__
+ for more information.
+ bitrate_bps (int):
+ Required. The video bitrate in bits per
+ second. Must be between 1 and 1,000,000,000.
+ pixel_format (str):
+ Pixel format to use. The default is ``"yuv420p"``.
+
+ Supported pixel formats:
+
+ - 'yuv420p' pixel format.
+ - 'yuv422p' pixel format.
+ - 'yuv444p' pixel format.
+ - 'yuv420p10' 10-bit HDR pixel format.
+ - 'yuv422p10' 10-bit HDR pixel format.
+ - 'yuv444p10' 10-bit HDR pixel format.
+ - 'yuv420p12' 12-bit HDR pixel format.
+ - 'yuv422p12' 12-bit HDR pixel format.
+ - 'yuv444p12' 12-bit HDR pixel format.
+ rate_control_mode (str):
+ Specify the ``rate_control_mode``. The default is ``"vbr"``.
+
+ Supported rate control modes:
+
+ - 'vbr' - variable bitrate
+ - 'crf' - constant rate factor
+ crf_level (int):
+ Target CRF level. Must be between 10 and 36,
+ where 10 is the highest quality and 36 is the
+ most efficient compression. The default is 21.
+ gop_frame_count (int):
+ Select the GOP size based on the specified
+ frame count. Must be greater than zero.
+ gop_duration (google.protobuf.duration_pb2.Duration):
+ Select the GOP size based on the specified duration. The
+ default is ``"3s"``. Note that ``gopDuration`` must be less
+ than or equal to ```segmentDuration`` <#SegmentSettings>`__,
+ and ```segmentDuration`` <#SegmentSettings>`__ must be
+ divisible by ``gopDuration``.
+ profile (str):
+ Enforces the specified codec profile. The following profiles
+ are supported:
+
+ - ``profile0`` (default)
+ - ``profile1``
+ - ``profile2``
+ - ``profile3``
+
+ The available options are
+ `WebM-compatible `__. Note that certain values for this field
+ may cause the transcoder to override other fields you set in
+ the ``Vp9CodecSettings`` message.
+ """
+
+ width_pixels = proto.Field(proto.INT32, number=1,)
+ height_pixels = proto.Field(proto.INT32, number=2,)
+ frame_rate = proto.Field(proto.DOUBLE, number=3,)
+ bitrate_bps = proto.Field(proto.INT32, number=4,)
+ pixel_format = proto.Field(proto.STRING, number=5,)
+ rate_control_mode = proto.Field(proto.STRING, number=6,)
+ crf_level = proto.Field(proto.INT32, number=7,)
+ gop_frame_count = proto.Field(proto.INT32, number=8, oneof="gop_mode",)
+ gop_duration = proto.Field(
+ proto.MESSAGE, number=9, oneof="gop_mode", message=duration_pb2.Duration,
+ )
+ profile = proto.Field(proto.STRING, number=10,)
+
+ h264 = proto.Field(
+ proto.MESSAGE, number=1, oneof="codec_settings", message=H264CodecSettings,
+ )
+ h265 = proto.Field(
+ proto.MESSAGE, number=2, oneof="codec_settings", message=H265CodecSettings,
+ )
+ vp9 = proto.Field(
+ proto.MESSAGE, number=3, oneof="codec_settings", message=Vp9CodecSettings,
+ )
+
+
+class AudioStream(proto.Message):
+ r"""Audio stream resource.
+ Attributes:
+ codec (str):
+ The codec for this audio stream. The default is ``"aac"``.
+
+ Supported audio codecs:
+
+ - 'aac'
+ - 'aac-he'
+ - 'aac-he-v2'
+ - 'mp3'
+ - 'ac3'
+ - 'eac3'
+ bitrate_bps (int):
+ Required. Audio bitrate in bits per second.
+ Must be between 1 and 10,000,000.
+ channel_count (int):
+ Number of audio channels. Must be between 1
+ and 6. The default is 2.
+ channel_layout (Sequence[str]):
+ A list of channel names specifying the layout of the audio
+ channels. This only affects the metadata embedded in the
+ container headers, if supported by the specified format. The
+ default is ``["fl", "fr"]``.
+
+ Supported channel names:
+
+ - 'fl' - Front left channel
+ - 'fr' - Front right channel
+ - 'sl' - Side left channel
+ - 'sr' - Side right channel
+ - 'fc' - Front center channel
+ - 'lfe' - Low frequency
+ mapping (Sequence[google.cloud.video.transcoder_v1.types.AudioStream.AudioMapping]):
+ The mapping for the ``Job.edit_list`` atoms with audio
+ ``EditAtom.inputs``.
+ sample_rate_hertz (int):
+ The audio sample rate in Hertz. The default
+ is 48000 Hertz.
+ """
+
+ class AudioMapping(proto.Message):
+ r"""The mapping for the ``Job.edit_list`` atoms with audio
+ ``EditAtom.inputs``.
+
+ Attributes:
+ atom_key (str):
+ Required. The ``EditAtom.key`` that references the atom with
+ audio inputs in the ``Job.edit_list``.
+ input_key (str):
+ Required. The ``Input.key`` that identifies the input file.
+ input_track (int):
+ Required. The zero-based index of the track
+ in the input file.
+ input_channel (int):
+ Required. The zero-based index of the channel
+ in the input audio stream.
+ output_channel (int):
+ Required. The zero-based index of the channel
+ in the output audio stream.
+ gain_db (float):
+ Audio volume control in dB. Negative values
+ decrease volume, positive values increase. The
+ default is 0.
+ """
+
+ atom_key = proto.Field(proto.STRING, number=1,)
+ input_key = proto.Field(proto.STRING, number=2,)
+ input_track = proto.Field(proto.INT32, number=3,)
+ input_channel = proto.Field(proto.INT32, number=4,)
+ output_channel = proto.Field(proto.INT32, number=5,)
+ gain_db = proto.Field(proto.DOUBLE, number=6,)
+
+ codec = proto.Field(proto.STRING, number=1,)
+ bitrate_bps = proto.Field(proto.INT32, number=2,)
+ channel_count = proto.Field(proto.INT32, number=3,)
+ channel_layout = proto.RepeatedField(proto.STRING, number=4,)
+ mapping = proto.RepeatedField(proto.MESSAGE, number=5, message=AudioMapping,)
+ sample_rate_hertz = proto.Field(proto.INT32, number=6,)
+
+
+class TextStream(proto.Message):
+ r"""Encoding of a text stream. For example, closed captions or
+ subtitles.
+
+ Attributes:
+ codec (str):
+ The codec for this text stream. The default is ``"webvtt"``.
+
+ Supported text codecs:
+
+ - 'srt'
+ - 'ttml'
+ - 'cea608'
+ - 'cea708'
+ - 'webvtt'
+ mapping (Sequence[google.cloud.video.transcoder_v1.types.TextStream.TextMapping]):
+ The mapping for the ``Job.edit_list`` atoms with text
+ ``EditAtom.inputs``.
+ """
+
+ class TextMapping(proto.Message):
+ r"""The mapping for the ``Job.edit_list`` atoms with text
+ ``EditAtom.inputs``.
+
+ Attributes:
+ atom_key (str):
+ Required. The ``EditAtom.key`` that references the atom
+ with text inputs in the ``Job.edit_list``.
+ input_key (str):
+ Required. The ``Input.key`` that identifies the input file.
+ input_track (int):
+ Required. The zero-based index of the track
+ in the input file.
+ """
+
+ atom_key = proto.Field(proto.STRING, number=1,)
+ input_key = proto.Field(proto.STRING, number=2,)
+ input_track = proto.Field(proto.INT32, number=3,)
+
+ codec = proto.Field(proto.STRING, number=1,)
+ mapping = proto.RepeatedField(proto.MESSAGE, number=3, message=TextMapping,)
+
+
+class SegmentSettings(proto.Message):
+ r"""Segment settings for ``"ts"``, ``"fmp4"``, and ``"vtt"``.
+ Attributes:
+ segment_duration (google.protobuf.duration_pb2.Duration):
+ Duration of the segments in seconds. The default is
+ ``"6.0s"``. Note that ``segmentDuration`` must be greater
+ than or equal to ```gopDuration`` <#videostream>`__, and
+ ``segmentDuration`` must be divisible by
+ ```gopDuration`` <#videostream>`__.
+ individual_segments (bool):
+ Required. Create an individual segment file. The default is
+ ``false``.
+ """
+
+ segment_duration = proto.Field(
+ proto.MESSAGE, number=1, message=duration_pb2.Duration,
+ )
+ individual_segments = proto.Field(proto.BOOL, number=3,)
+
+
+class Encryption(proto.Message):
+ r"""Encryption settings.
+ Attributes:
+ key (str):
+ Required. 128 bit encryption key represented
+ as lowercase hexadecimal digits.
+ iv (str):
+ Required. 128 bit Initialization Vector (IV)
+ represented as lowercase hexadecimal digits.
+ aes_128 (google.cloud.video.transcoder_v1.types.Encryption.Aes128Encryption):
+ Configuration for AES-128 encryption.
+ sample_aes (google.cloud.video.transcoder_v1.types.Encryption.SampleAesEncryption):
+ Configuration for SAMPLE-AES encryption.
+ mpeg_cenc (google.cloud.video.transcoder_v1.types.Encryption.MpegCommonEncryption):
+ Configuration for MPEG Common Encryption
+ (MPEG-CENC).
+ """
+
+ class Aes128Encryption(proto.Message):
+ r"""Configuration for AES-128 encryption.
+ Attributes:
+ key_uri (str):
+ Required. URI of the key delivery service.
+ This URI is inserted into the M3U8 header.
+ """
+
+ key_uri = proto.Field(proto.STRING, number=1,)
+
+ class SampleAesEncryption(proto.Message):
+ r"""Configuration for SAMPLE-AES encryption.
+ Attributes:
+ key_uri (str):
+ Required. URI of the key delivery service.
+ This URI is inserted into the M3U8 header.
+ """
+
+ key_uri = proto.Field(proto.STRING, number=1,)
+
+ class MpegCommonEncryption(proto.Message):
+ r"""Configuration for MPEG Common Encryption (MPEG-CENC).
+ Attributes:
+ key_id (str):
+ Required. 128 bit Key ID represented as
+ lowercase hexadecimal digits for use with common
+ encryption.
+ scheme (str):
+ Required. Specify the encryption scheme.
+ Supported encryption schemes:
+
+ - 'cenc'
+ - 'cbcs'
+ """
+
+ key_id = proto.Field(proto.STRING, number=1,)
+ scheme = proto.Field(proto.STRING, number=2,)
+
+ key = proto.Field(proto.STRING, number=1,)
+ iv = proto.Field(proto.STRING, number=2,)
+ aes_128 = proto.Field(
+ proto.MESSAGE, number=3, oneof="encryption_mode", message=Aes128Encryption,
+ )
+ sample_aes = proto.Field(
+ proto.MESSAGE, number=4, oneof="encryption_mode", message=SampleAesEncryption,
+ )
+ mpeg_cenc = proto.Field(
+ proto.MESSAGE, number=5, oneof="encryption_mode", message=MpegCommonEncryption,
+ )
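The ``key`` and ``iv`` fields documented above must be 128-bit values written as lowercase hexadecimal digits, i.e. exactly 32 hex characters. A small illustrative validator for that format (a sketch, not part of the generated library):

```python
import re

# 128 bits = 16 bytes = 32 lowercase hexadecimal digits.
_HEX_128_BIT = re.compile(r"[0-9a-f]{32}")


def is_valid_encryption_value(value: str) -> bool:
    """Check a key or IV against the documented 128-bit lowercase-hex format."""
    return bool(_HEX_128_BIT.fullmatch(value))


print(is_valid_encryption_value("0123456789abcdef0123456789abcdef"))  # -> True
print(is_valid_encryption_value("0123456789ABCDEF0123456789ABCDEF"))  # -> False (uppercase)
print(is_valid_encryption_value("abcdef"))                            # -> False (too short)
```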
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
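The docstrings for ``gop_duration`` and ``SegmentSettings.segment_duration`` above describe a pair of constraints: ``gopDuration`` must be less than or equal to ``segmentDuration``, and ``segmentDuration`` must be divisible by ``gopDuration``. A hypothetical helper sketching that check on plain float seconds (the function name and tolerance are assumptions, not part of the generated library):

```python
def validate_gop_vs_segment(gop_seconds: float, segment_seconds: float) -> None:
    """Raise ValueError if the documented GOP/segment duration constraints fail."""
    if gop_seconds > segment_seconds:
        raise ValueError("gopDuration must be less than or equal to segmentDuration")
    # Durations are floats, so compare the ratio to the nearest integer
    # with a small tolerance instead of using exact modulo arithmetic.
    ratio = segment_seconds / gop_seconds
    if abs(ratio - round(ratio)) > 1e-9:
        raise ValueError("segmentDuration must be divisible by gopDuration")


validate_gop_vs_segment(3.0, 6.0)  # the defaults: 3s GOP, 6s segments -- valid
```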
diff --git a/google/cloud/video/transcoder_v1/types/services.py b/google/cloud/video/transcoder_v1/types/services.py
new file mode 100644
index 0000000..edee4f1
--- /dev/null
+++ b/google/cloud/video/transcoder_v1/types/services.py
@@ -0,0 +1,221 @@
+# -*- coding: utf-8 -*-
+# Copyright 2020 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import proto # type: ignore
+
+from google.cloud.video.transcoder_v1.types import resources
+
+
+__protobuf__ = proto.module(
+ package="google.cloud.video.transcoder.v1",
+ manifest={
+ "CreateJobRequest",
+ "ListJobsRequest",
+ "GetJobRequest",
+ "DeleteJobRequest",
+ "ListJobsResponse",
+ "CreateJobTemplateRequest",
+ "ListJobTemplatesRequest",
+ "GetJobTemplateRequest",
+ "DeleteJobTemplateRequest",
+ "ListJobTemplatesResponse",
+ },
+)
+
+
+class CreateJobRequest(proto.Message):
+ r"""Request message for ``TranscoderService.CreateJob``.
+ Attributes:
+ parent (str):
+ Required. The parent location to create and process this
+ job. Format: ``projects/{project}/locations/{location}``
+ job (google.cloud.video.transcoder_v1.types.Job):
+ Required. Parameters for creating a
+ transcoding job.
+ """
+
+ parent = proto.Field(proto.STRING, number=1,)
+ job = proto.Field(proto.MESSAGE, number=2, message=resources.Job,)
+
+
+class ListJobsRequest(proto.Message):
+ r"""Request message for ``TranscoderService.ListJobs``. The parent
+ location from which to retrieve the collection of jobs.
+
+ Attributes:
+ parent (str):
+ Required. Format:
+ ``projects/{project}/locations/{location}``
+ page_size (int):
+ The maximum number of items to return.
+ page_token (str):
+ The ``next_page_token`` value returned from a previous List
+ request, if any.
+ filter (str):
+ The filter expression, following the syntax
+ outlined in https://google.aip.dev/160.
+ order_by (str):
+ One or more fields to compare and use to sort
+ the output. See
+ https://google.aip.dev/132#ordering.
+ """
+
+ parent = proto.Field(proto.STRING, number=1,)
+ page_size = proto.Field(proto.INT32, number=2,)
+ page_token = proto.Field(proto.STRING, number=3,)
+ filter = proto.Field(proto.STRING, number=4,)
+ order_by = proto.Field(proto.STRING, number=5,)
+
+
+class GetJobRequest(proto.Message):
+ r"""Request message for ``TranscoderService.GetJob``.
+ Attributes:
+ name (str):
+ Required. The name of the job to retrieve. Format:
+ ``projects/{project}/locations/{location}/jobs/{job}``
+ """
+
+ name = proto.Field(proto.STRING, number=1,)
+
+
+class DeleteJobRequest(proto.Message):
+ r"""Request message for ``TranscoderService.DeleteJob``.
+ Attributes:
+ name (str):
+ Required. The name of the job to delete. Format:
+ ``projects/{project}/locations/{location}/jobs/{job}``
+ """
+
+ name = proto.Field(proto.STRING, number=1,)
+
+
+class ListJobsResponse(proto.Message):
+ r"""Response message for ``TranscoderService.ListJobs``.
+ Attributes:
+ jobs (Sequence[google.cloud.video.transcoder_v1.types.Job]):
+ List of jobs in the specified region.
+ next_page_token (str):
+ The pagination token.
+ unreachable (Sequence[str]):
+ List of regions that could not be reached.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ jobs = proto.RepeatedField(proto.MESSAGE, number=1, message=resources.Job,)
+ next_page_token = proto.Field(proto.STRING, number=2,)
+ unreachable = proto.RepeatedField(proto.STRING, number=3,)
+
+
+class CreateJobTemplateRequest(proto.Message):
+ r"""Request message for ``TranscoderService.CreateJobTemplate``.
+ Attributes:
+ parent (str):
+ Required. The parent location to create this job template.
+ Format: ``projects/{project}/locations/{location}``
+ job_template (google.cloud.video.transcoder_v1.types.JobTemplate):
+ Required. Parameters for creating a job
+ template.
+ job_template_id (str):
+ Required. The ID to use for the job template, which will
+ become the final component of the job template's resource
+ name.
+
+ This value should be 4-63 characters, and valid characters
+ must match the regular expression
+ ``[a-zA-Z][a-zA-Z0-9_-]*``.
+ """
+
+ parent = proto.Field(proto.STRING, number=1,)
+ job_template = proto.Field(proto.MESSAGE, number=2, message=resources.JobTemplate,)
+ job_template_id = proto.Field(proto.STRING, number=3,)
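The ``job_template_id`` docstring above gives two requirements: 4-63 characters, and a match against ``[a-zA-Z][a-zA-Z0-9_-]*``. A quick client-side check illustrating both (an assumption-laden sketch; the server performs the authoritative validation):

```python
import re

# Must start with a letter, followed by letters, digits, underscores, or hyphens.
_TEMPLATE_ID = re.compile(r"[a-zA-Z][a-zA-Z0-9_-]*")


def is_valid_job_template_id(template_id: str) -> bool:
    """Check a candidate ID against the documented length and pattern rules."""
    return 4 <= len(template_id) <= 63 and bool(_TEMPLATE_ID.fullmatch(template_id))


print(is_valid_job_template_id("hd-preset"))  # -> True
print(is_valid_job_template_id("1preset"))    # -> False (must start with a letter)
print(is_valid_job_template_id("abc"))        # -> False (shorter than 4 characters)
```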
+
+
+class ListJobTemplatesRequest(proto.Message):
+ r"""Request message for ``TranscoderService.ListJobTemplates``.
+ Attributes:
+ parent (str):
+ Required. The parent location from which to retrieve the
+ collection of job templates. Format:
+ ``projects/{project}/locations/{location}``
+ page_size (int):
+ The maximum number of items to return.
+ page_token (str):
+ The ``next_page_token`` value returned from a previous List
+ request, if any.
+ filter (str):
+ The filter expression, following the syntax
+ outlined in https://google.aip.dev/160.
+ order_by (str):
+ One or more fields to compare and use to sort
+ the output. See
+ https://google.aip.dev/132#ordering.
+ """
+
+ parent = proto.Field(proto.STRING, number=1,)
+ page_size = proto.Field(proto.INT32, number=2,)
+ page_token = proto.Field(proto.STRING, number=3,)
+ filter = proto.Field(proto.STRING, number=4,)
+ order_by = proto.Field(proto.STRING, number=5,)
+
+
+class GetJobTemplateRequest(proto.Message):
+ r"""Request message for ``TranscoderService.GetJobTemplate``.
+ Attributes:
+ name (str):
+ Required. The name of the job template to retrieve. Format:
+ ``projects/{project}/locations/{location}/jobTemplates/{job_template}``
+ """
+
+ name = proto.Field(proto.STRING, number=1,)
+
+
+class DeleteJobTemplateRequest(proto.Message):
+ r"""Request message for ``TranscoderService.DeleteJobTemplate``.
+ Attributes:
+ name (str):
+ Required. The name of the job template to delete. Format:
+ ``projects/{project}/locations/{location}/jobTemplates/{job_template}``
+ """
+
+ name = proto.Field(proto.STRING, number=1,)
+
+
+class ListJobTemplatesResponse(proto.Message):
+ r"""Response message for ``TranscoderService.ListJobTemplates``.
+ Attributes:
+ job_templates (Sequence[google.cloud.video.transcoder_v1.types.JobTemplate]):
+ List of job templates in the specified
+ region.
+ next_page_token (str):
+ The pagination token.
+ unreachable (Sequence[str]):
+ List of regions that could not be reached.
+ """
+
+ @property
+ def raw_page(self):
+ return self
+
+ job_templates = proto.RepeatedField(
+ proto.MESSAGE, number=1, message=resources.JobTemplate,
+ )
+ next_page_token = proto.Field(proto.STRING, number=2,)
+ unreachable = proto.RepeatedField(proto.STRING, number=3,)
+
+
+__all__ = tuple(sorted(__protobuf__.manifest))
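The ``List*`` request and response messages above follow the standard pagination pattern: a client passes each response's ``next_page_token`` back as the next request's ``page_token`` until the token comes back empty. A library-free sketch of that loop, where the hypothetical ``fetch_page`` stands in for a ``TranscoderService.ListJobs`` call:

```python
def fetch_page(page_token=""):
    """Stand-in for TranscoderService.ListJobs; returns (jobs, next_page_token)."""
    pages = {
        "": (["job-a", "job-b"], "token-1"),
        "token-1": (["job-c"], ""),  # empty token marks the final page
    }
    return pages[page_token]


def list_all_jobs():
    jobs, token = [], ""
    while True:
        page, token = fetch_page(token)
        jobs.extend(page)
        if not token:  # no next_page_token: we have seen every page
            return jobs


print(list_all_jobs())  # -> ['job-a', 'job-b', 'job-c']
```

In practice the generated client wraps this loop in the ``pagers`` helpers, so callers can simply iterate over the response object.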
diff --git a/owlbot.py b/owlbot.py
index a74a262..770b5ec 100644
--- a/owlbot.py
+++ b/owlbot.py
@@ -21,7 +21,7 @@
common = gcp.CommonTemplates()
-default_version = "v1beta1"
+default_version = "v1"
for library in s.get_staging_dirs(default_version):
# Work around generator issue https://github.com/googleapis/gapic-generator-python/issues/902
diff --git a/tests/unit/gapic/transcoder_v1/__init__.py b/tests/unit/gapic/transcoder_v1/__init__.py
new file mode 100644
index 0000000..4de6597
--- /dev/null
+++ b/tests/unit/gapic/transcoder_v1/__init__.py
@@ -0,0 +1,15 @@
+# -*- coding: utf-8 -*-
+# Copyright 2020 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
diff --git a/tests/unit/gapic/transcoder_v1/test_transcoder_service.py b/tests/unit/gapic/transcoder_v1/test_transcoder_service.py
new file mode 100644
index 0000000..7e2b3b5
--- /dev/null
+++ b/tests/unit/gapic/transcoder_v1/test_transcoder_service.py
@@ -0,0 +1,3097 @@
+# -*- coding: utf-8 -*-
+# Copyright 2020 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import os
+import mock
+import packaging.version
+
+import grpc
+from grpc.experimental import aio
+import math
+import pytest
+from proto.marshal.rules.dates import DurationRule, TimestampRule
+
+
+from google.api_core import client_options
+from google.api_core import exceptions as core_exceptions
+from google.api_core import gapic_v1
+from google.api_core import grpc_helpers
+from google.api_core import grpc_helpers_async
+from google.auth import credentials as ga_credentials
+from google.auth.exceptions import MutualTLSChannelError
+from google.cloud.video.transcoder_v1.services.transcoder_service import (
+ TranscoderServiceAsyncClient,
+)
+from google.cloud.video.transcoder_v1.services.transcoder_service import (
+ TranscoderServiceClient,
+)
+from google.cloud.video.transcoder_v1.services.transcoder_service import pagers
+from google.cloud.video.transcoder_v1.services.transcoder_service import transports
+from google.cloud.video.transcoder_v1.services.transcoder_service.transports.base import (
+ _GOOGLE_AUTH_VERSION,
+)
+from google.cloud.video.transcoder_v1.types import resources
+from google.cloud.video.transcoder_v1.types import services
+from google.oauth2 import service_account
+from google.protobuf import any_pb2 # type: ignore
+from google.protobuf import duration_pb2 # type: ignore
+from google.protobuf import timestamp_pb2 # type: ignore
+from google.rpc import status_pb2 # type: ignore
+import google.auth
+
+
+# TODO(busunkim): Once google-auth >= 1.25.0 is required transitively
+# through google-api-core:
+# - Delete the auth "less than" test cases
+# - Delete these pytest markers (Make the "greater than or equal to" tests the default).
+requires_google_auth_lt_1_25_0 = pytest.mark.skipif(
+ packaging.version.parse(_GOOGLE_AUTH_VERSION) >= packaging.version.parse("1.25.0"),
+ reason="This test requires google-auth < 1.25.0",
+)
+requires_google_auth_gte_1_25_0 = pytest.mark.skipif(
+ packaging.version.parse(_GOOGLE_AUTH_VERSION) < packaging.version.parse("1.25.0"),
+ reason="This test requires google-auth >= 1.25.0",
+)
+
+
+def client_cert_source_callback():
+ return b"cert bytes", b"key bytes"
+
+
+# If default endpoint is localhost, then default mtls endpoint will be the same.
+# This method modifies the default endpoint so the client can produce a different
+# mtls endpoint for endpoint testing purposes.
+def modify_default_endpoint(client):
+ return (
+ "foo.googleapis.com"
+ if ("localhost" in client.DEFAULT_ENDPOINT)
+ else client.DEFAULT_ENDPOINT
+ )
+
+
+def test__get_default_mtls_endpoint():
+ api_endpoint = "example.googleapis.com"
+ api_mtls_endpoint = "example.mtls.googleapis.com"
+ sandbox_endpoint = "example.sandbox.googleapis.com"
+ sandbox_mtls_endpoint = "example.mtls.sandbox.googleapis.com"
+ non_googleapi = "api.example.com"
+
+ assert TranscoderServiceClient._get_default_mtls_endpoint(None) is None
+ assert (
+ TranscoderServiceClient._get_default_mtls_endpoint(api_endpoint)
+ == api_mtls_endpoint
+ )
+ assert (
+ TranscoderServiceClient._get_default_mtls_endpoint(api_mtls_endpoint)
+ == api_mtls_endpoint
+ )
+ assert (
+ TranscoderServiceClient._get_default_mtls_endpoint(sandbox_endpoint)
+ == sandbox_mtls_endpoint
+ )
+ assert (
+ TranscoderServiceClient._get_default_mtls_endpoint(sandbox_mtls_endpoint)
+ == sandbox_mtls_endpoint
+ )
+ assert (
+ TranscoderServiceClient._get_default_mtls_endpoint(non_googleapi)
+ == non_googleapi
+ )
+
+
+@pytest.mark.parametrize(
+ "client_class", [TranscoderServiceClient, TranscoderServiceAsyncClient,]
+)
+def test_transcoder_service_client_from_service_account_info(client_class):
+ creds = ga_credentials.AnonymousCredentials()
+ with mock.patch.object(
+ service_account.Credentials, "from_service_account_info"
+ ) as factory:
+ factory.return_value = creds
+ info = {"valid": True}
+ client = client_class.from_service_account_info(info)
+ assert client.transport._credentials == creds
+ assert isinstance(client, client_class)
+
+ assert client.transport._host == "transcoder.googleapis.com:443"
+
+
+@pytest.mark.parametrize(
+ "client_class", [TranscoderServiceClient, TranscoderServiceAsyncClient,]
+)
+def test_transcoder_service_client_service_account_always_use_jwt(client_class):
+ with mock.patch.object(
+ service_account.Credentials, "with_always_use_jwt_access", create=True
+ ) as use_jwt:
+ creds = service_account.Credentials(None, None, None)
+ client = client_class(credentials=creds)
+ use_jwt.assert_not_called()
+
+
+@pytest.mark.parametrize(
+ "transport_class,transport_name",
+ [
+ (transports.TranscoderServiceGrpcTransport, "grpc"),
+ (transports.TranscoderServiceGrpcAsyncIOTransport, "grpc_asyncio"),
+ ],
+)
+def test_transcoder_service_client_service_account_always_use_jwt_true(
+ transport_class, transport_name
+):
+ with mock.patch.object(
+ service_account.Credentials, "with_always_use_jwt_access", create=True
+ ) as use_jwt:
+ creds = service_account.Credentials(None, None, None)
+ transport = transport_class(credentials=creds, always_use_jwt_access=True)
+ use_jwt.assert_called_once_with(True)
+
+
+@pytest.mark.parametrize(
+ "client_class", [TranscoderServiceClient, TranscoderServiceAsyncClient,]
+)
+def test_transcoder_service_client_from_service_account_file(client_class):
+ creds = ga_credentials.AnonymousCredentials()
+ with mock.patch.object(
+ service_account.Credentials, "from_service_account_file"
+ ) as factory:
+ factory.return_value = creds
+ client = client_class.from_service_account_file("dummy/file/path.json")
+ assert client.transport._credentials == creds
+ assert isinstance(client, client_class)
+
+ client = client_class.from_service_account_json("dummy/file/path.json")
+ assert client.transport._credentials == creds
+ assert isinstance(client, client_class)
+
+ assert client.transport._host == "transcoder.googleapis.com:443"
+
+
+def test_transcoder_service_client_get_transport_class():
+ transport = TranscoderServiceClient.get_transport_class()
+ available_transports = [
+ transports.TranscoderServiceGrpcTransport,
+ ]
+ assert transport in available_transports
+
+ transport = TranscoderServiceClient.get_transport_class("grpc")
+ assert transport == transports.TranscoderServiceGrpcTransport
+
+
+@pytest.mark.parametrize(
+ "client_class,transport_class,transport_name",
+ [
+ (TranscoderServiceClient, transports.TranscoderServiceGrpcTransport, "grpc"),
+ (
+ TranscoderServiceAsyncClient,
+ transports.TranscoderServiceGrpcAsyncIOTransport,
+ "grpc_asyncio",
+ ),
+ ],
+)
+@mock.patch.object(
+ TranscoderServiceClient,
+ "DEFAULT_ENDPOINT",
+ modify_default_endpoint(TranscoderServiceClient),
+)
+@mock.patch.object(
+ TranscoderServiceAsyncClient,
+ "DEFAULT_ENDPOINT",
+ modify_default_endpoint(TranscoderServiceAsyncClient),
+)
+def test_transcoder_service_client_client_options(
+ client_class, transport_class, transport_name
+):
+ # Check that if channel is provided we won't create a new one.
+ with mock.patch.object(TranscoderServiceClient, "get_transport_class") as gtc:
+ transport = transport_class(credentials=ga_credentials.AnonymousCredentials())
+ client = client_class(transport=transport)
+ gtc.assert_not_called()
+
+ # Check that if channel is provided via str we will create a new one.
+ with mock.patch.object(TranscoderServiceClient, "get_transport_class") as gtc:
+ client = client_class(transport=transport_name)
+ gtc.assert_called()
+
+ # Check the case api_endpoint is provided.
+ options = client_options.ClientOptions(api_endpoint="squid.clam.whelk")
+ with mock.patch.object(transport_class, "__init__") as patched:
+ patched.return_value = None
+ client = client_class(client_options=options)
+ patched.assert_called_once_with(
+ credentials=None,
+ credentials_file=None,
+ host="squid.clam.whelk",
+ scopes=None,
+ client_cert_source_for_mtls=None,
+ quota_project_id=None,
+ client_info=transports.base.DEFAULT_CLIENT_INFO,
+ )
+
+ # Check the case api_endpoint is not provided and GOOGLE_API_USE_MTLS_ENDPOINT is
+ # "never".
+ with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS_ENDPOINT": "never"}):
+ with mock.patch.object(transport_class, "__init__") as patched:
+ patched.return_value = None
+ client = client_class()
+ patched.assert_called_once_with(
+ credentials=None,
+ credentials_file=None,
+ host=client.DEFAULT_ENDPOINT,
+ scopes=None,
+ client_cert_source_for_mtls=None,
+ quota_project_id=None,
+ client_info=transports.base.DEFAULT_CLIENT_INFO,
+ )
+
+ # Check the case api_endpoint is not provided and GOOGLE_API_USE_MTLS_ENDPOINT is
+ # "always".
+ with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS_ENDPOINT": "always"}):
+ with mock.patch.object(transport_class, "__init__") as patched:
+ patched.return_value = None
+ client = client_class()
+ patched.assert_called_once_with(
+ credentials=None,
+ credentials_file=None,
+ host=client.DEFAULT_MTLS_ENDPOINT,
+ scopes=None,
+ client_cert_source_for_mtls=None,
+ quota_project_id=None,
+ client_info=transports.base.DEFAULT_CLIENT_INFO,
+ )
+
+ # Check the case api_endpoint is not provided and GOOGLE_API_USE_MTLS_ENDPOINT has
+ # unsupported value.
+ with mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS_ENDPOINT": "Unsupported"}):
+ with pytest.raises(MutualTLSChannelError):
+ client = client_class()
+
+ # Check the case GOOGLE_API_USE_CLIENT_CERTIFICATE has unsupported value.
+ with mock.patch.dict(
+ os.environ, {"GOOGLE_API_USE_CLIENT_CERTIFICATE": "Unsupported"}
+ ):
+ with pytest.raises(ValueError):
+ client = client_class()
+
+ # Check the case quota_project_id is provided
+ options = client_options.ClientOptions(quota_project_id="octopus")
+ with mock.patch.object(transport_class, "__init__") as patched:
+ patched.return_value = None
+ client = client_class(client_options=options)
+ patched.assert_called_once_with(
+ credentials=None,
+ credentials_file=None,
+ host=client.DEFAULT_ENDPOINT,
+ scopes=None,
+ client_cert_source_for_mtls=None,
+ quota_project_id="octopus",
+ client_info=transports.base.DEFAULT_CLIENT_INFO,
+ )
+
+
+@pytest.mark.parametrize(
+ "client_class,transport_class,transport_name,use_client_cert_env",
+ [
+ (
+ TranscoderServiceClient,
+ transports.TranscoderServiceGrpcTransport,
+ "grpc",
+ "true",
+ ),
+ (
+ TranscoderServiceAsyncClient,
+ transports.TranscoderServiceGrpcAsyncIOTransport,
+ "grpc_asyncio",
+ "true",
+ ),
+ (
+ TranscoderServiceClient,
+ transports.TranscoderServiceGrpcTransport,
+ "grpc",
+ "false",
+ ),
+ (
+ TranscoderServiceAsyncClient,
+ transports.TranscoderServiceGrpcAsyncIOTransport,
+ "grpc_asyncio",
+ "false",
+ ),
+ ],
+)
+@mock.patch.object(
+ TranscoderServiceClient,
+ "DEFAULT_ENDPOINT",
+ modify_default_endpoint(TranscoderServiceClient),
+)
+@mock.patch.object(
+ TranscoderServiceAsyncClient,
+ "DEFAULT_ENDPOINT",
+ modify_default_endpoint(TranscoderServiceAsyncClient),
+)
+@mock.patch.dict(os.environ, {"GOOGLE_API_USE_MTLS_ENDPOINT": "auto"})
+def test_transcoder_service_client_mtls_env_auto(
+ client_class, transport_class, transport_name, use_client_cert_env
+):
+ # This tests the endpoint autoswitch behavior. Endpoint is autoswitched to the default
+ # mtls endpoint, if GOOGLE_API_USE_CLIENT_CERTIFICATE is "true" and client cert exists.
+
+ # Check the case client_cert_source is provided. Whether client cert is used depends on
+ # GOOGLE_API_USE_CLIENT_CERTIFICATE value.
+ with mock.patch.dict(
+ os.environ, {"GOOGLE_API_USE_CLIENT_CERTIFICATE": use_client_cert_env}
+ ):
+ options = client_options.ClientOptions(
+ client_cert_source=client_cert_source_callback
+ )
+ with mock.patch.object(transport_class, "__init__") as patched:
+ patched.return_value = None
+ client = client_class(client_options=options)
+
+ if use_client_cert_env == "false":
+ expected_client_cert_source = None
+ expected_host = client.DEFAULT_ENDPOINT
+ else:
+ expected_client_cert_source = client_cert_source_callback
+ expected_host = client.DEFAULT_MTLS_ENDPOINT
+
+ patched.assert_called_once_with(
+ credentials=None,
+ credentials_file=None,
+ host=expected_host,
+ scopes=None,
+ client_cert_source_for_mtls=expected_client_cert_source,
+ quota_project_id=None,
+ client_info=transports.base.DEFAULT_CLIENT_INFO,
+ )
+
+ # Check the case ADC client cert is provided. Whether client cert is used depends on
+ # GOOGLE_API_USE_CLIENT_CERTIFICATE value.
+ with mock.patch.dict(
+ os.environ, {"GOOGLE_API_USE_CLIENT_CERTIFICATE": use_client_cert_env}
+ ):
+ with mock.patch.object(transport_class, "__init__") as patched:
+ with mock.patch(
+ "google.auth.transport.mtls.has_default_client_cert_source",
+ return_value=True,
+ ):
+ with mock.patch(
+ "google.auth.transport.mtls.default_client_cert_source",
+ return_value=client_cert_source_callback,
+ ):
+ if use_client_cert_env == "false":
+ expected_host = client.DEFAULT_ENDPOINT
+ expected_client_cert_source = None
+ else:
+ expected_host = client.DEFAULT_MTLS_ENDPOINT
+ expected_client_cert_source = client_cert_source_callback
+
+ patched.return_value = None
+ client = client_class()
+ patched.assert_called_once_with(
+ credentials=None,
+ credentials_file=None,
+ host=expected_host,
+ scopes=None,
+ client_cert_source_for_mtls=expected_client_cert_source,
+ quota_project_id=None,
+ client_info=transports.base.DEFAULT_CLIENT_INFO,
+ )
+
+ # Check the case client_cert_source and ADC client cert are not provided.
+ with mock.patch.dict(
+ os.environ, {"GOOGLE_API_USE_CLIENT_CERTIFICATE": use_client_cert_env}
+ ):
+ with mock.patch.object(transport_class, "__init__") as patched:
+ with mock.patch(
+ "google.auth.transport.mtls.has_default_client_cert_source",
+ return_value=False,
+ ):
+ patched.return_value = None
+ client = client_class()
+ patched.assert_called_once_with(
+ credentials=None,
+ credentials_file=None,
+ host=client.DEFAULT_ENDPOINT,
+ scopes=None,
+ client_cert_source_for_mtls=None,
+ quota_project_id=None,
+ client_info=transports.base.DEFAULT_CLIENT_INFO,
+ )
+
+
+@pytest.mark.parametrize(
+ "client_class,transport_class,transport_name",
+ [
+ (TranscoderServiceClient, transports.TranscoderServiceGrpcTransport, "grpc"),
+ (
+ TranscoderServiceAsyncClient,
+ transports.TranscoderServiceGrpcAsyncIOTransport,
+ "grpc_asyncio",
+ ),
+ ],
+)
+def test_transcoder_service_client_client_options_scopes(
+ client_class, transport_class, transport_name
+):
+ # Check the case scopes are provided.
+ options = client_options.ClientOptions(scopes=["1", "2"],)
+ with mock.patch.object(transport_class, "__init__") as patched:
+ patched.return_value = None
+ client = client_class(client_options=options)
+ patched.assert_called_once_with(
+ credentials=None,
+ credentials_file=None,
+ host=client.DEFAULT_ENDPOINT,
+ scopes=["1", "2"],
+ client_cert_source_for_mtls=None,
+ quota_project_id=None,
+ client_info=transports.base.DEFAULT_CLIENT_INFO,
+ )
+
+
+@pytest.mark.parametrize(
+ "client_class,transport_class,transport_name",
+ [
+ (TranscoderServiceClient, transports.TranscoderServiceGrpcTransport, "grpc"),
+ (
+ TranscoderServiceAsyncClient,
+ transports.TranscoderServiceGrpcAsyncIOTransport,
+ "grpc_asyncio",
+ ),
+ ],
+)
+def test_transcoder_service_client_client_options_credentials_file(
+ client_class, transport_class, transport_name
+):
+ # Check the case credentials file is provided.
+ options = client_options.ClientOptions(credentials_file="credentials.json")
+ with mock.patch.object(transport_class, "__init__") as patched:
+ patched.return_value = None
+ client = client_class(client_options=options)
+ patched.assert_called_once_with(
+ credentials=None,
+ credentials_file="credentials.json",
+ host=client.DEFAULT_ENDPOINT,
+ scopes=None,
+ client_cert_source_for_mtls=None,
+ quota_project_id=None,
+ client_info=transports.base.DEFAULT_CLIENT_INFO,
+ )
+
+
+def test_transcoder_service_client_client_options_from_dict():
+ with mock.patch(
+ "google.cloud.video.transcoder_v1.services.transcoder_service.transports.TranscoderServiceGrpcTransport.__init__"
+ ) as grpc_transport:
+ grpc_transport.return_value = None
+ client = TranscoderServiceClient(
+ client_options={"api_endpoint": "squid.clam.whelk"}
+ )
+ grpc_transport.assert_called_once_with(
+ credentials=None,
+ credentials_file=None,
+ host="squid.clam.whelk",
+ scopes=None,
+ client_cert_source_for_mtls=None,
+ quota_project_id=None,
+ client_info=transports.base.DEFAULT_CLIENT_INFO,
+ )
+
+
+def test_create_job(transport: str = "grpc", request_type=services.CreateJobRequest):
+ client = TranscoderServiceClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport=transport,
+ )
+
+ # Everything is optional in proto3 as far as the runtime is concerned,
+ # and we are mocking out the actual API, so just send an empty request.
+ request = request_type()
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.create_job), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = resources.Job(
+ name="name_value",
+ input_uri="input_uri_value",
+ output_uri="output_uri_value",
+ state=resources.Job.ProcessingState.PENDING,
+ ttl_after_completion_days=2670,
+ template_id="template_id_value",
+ )
+ response = client.create_job(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == services.CreateJobRequest()
+
+ # Establish that the response is the type that we expect.
+ assert isinstance(response, resources.Job)
+ assert response.name == "name_value"
+ assert response.input_uri == "input_uri_value"
+ assert response.output_uri == "output_uri_value"
+ assert response.state == resources.Job.ProcessingState.PENDING
+ assert response.ttl_after_completion_days == 2670
+
+
+def test_create_job_from_dict():
+ test_create_job(request_type=dict)
+
+
+def test_create_job_empty_call():
+ # This test is a coverage failsafe to make sure that totally empty calls,
+ # i.e. request == None and no flattened fields passed, work.
+ client = TranscoderServiceClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport="grpc",
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.create_job), "__call__") as call:
+ client.create_job()
+ call.assert_called()
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == services.CreateJobRequest()
+
+
+@pytest.mark.asyncio
+async def test_create_job_async(
+ transport: str = "grpc_asyncio", request_type=services.CreateJobRequest
+):
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport=transport,
+ )
+
+ # Everything is optional in proto3 as far as the runtime is concerned,
+ # and we are mocking out the actual API, so just send an empty request.
+ request = request_type()
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.create_job), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+ resources.Job(
+ name="name_value",
+ input_uri="input_uri_value",
+ output_uri="output_uri_value",
+ state=resources.Job.ProcessingState.PENDING,
+ ttl_after_completion_days=2670,
+ )
+ )
+ response = await client.create_job(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == services.CreateJobRequest()
+
+ # Establish that the response is the type that we expect.
+ assert isinstance(response, resources.Job)
+ assert response.name == "name_value"
+ assert response.input_uri == "input_uri_value"
+ assert response.output_uri == "output_uri_value"
+ assert response.state == resources.Job.ProcessingState.PENDING
+ assert response.ttl_after_completion_days == 2670
+
+
+@pytest.mark.asyncio
+async def test_create_job_async_from_dict():
+ await test_create_job_async(request_type=dict)
+
+
+def test_create_job_field_headers():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Any value that is part of the HTTP/1.1 URI should be sent as
+ # a field header. Set these to a non-empty value.
+ request = services.CreateJobRequest()
+
+ request.parent = "parent/value"
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.create_job), "__call__") as call:
+ call.return_value = resources.Job()
+ client.create_job(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == request
+
+ # Establish that the field header was sent.
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_create_job_field_headers_async():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Any value that is part of the HTTP/1.1 URI should be sent as
+ # a field header. Set these to a non-empty value.
+ request = services.CreateJobRequest()
+
+ request.parent = "parent/value"
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.create_job), "__call__") as call:
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(resources.Job())
+ await client.create_job(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == request
+
+ # Establish that the field header was sent.
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"]
+
+
+def test_create_job_flattened():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.create_job), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = resources.Job()
+ # Call the method with a truthy value for each flattened field,
+ # using the keyword arguments to the method.
+ client.create_job(
+ parent="parent_value", job=resources.Job(name="name_value"),
+ )
+
+ # Establish that the underlying call was made with the expected
+ # request object values.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0].parent == "parent_value"
+ assert args[0].job == resources.Job(name="name_value")
+
+
+def test_create_job_flattened_error():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError):
+ client.create_job(
+ services.CreateJobRequest(),
+ parent="parent_value",
+ job=resources.Job(name="name_value"),
+ )
+
+
+@pytest.mark.asyncio
+async def test_create_job_flattened_async():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.create_job), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(resources.Job())
+ # Call the method with a truthy value for each flattened field,
+ # using the keyword arguments to the method.
+ response = await client.create_job(
+ parent="parent_value", job=resources.Job(name="name_value"),
+ )
+
+ # Establish that the underlying call was made with the expected
+ # request object values.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0].parent == "parent_value"
+ assert args[0].job == resources.Job(name="name_value")
+
+
+@pytest.mark.asyncio
+async def test_create_job_flattened_error_async():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError):
+ await client.create_job(
+ services.CreateJobRequest(),
+ parent="parent_value",
+ job=resources.Job(name="name_value"),
+ )
+
+
+def test_list_jobs(transport: str = "grpc", request_type=services.ListJobsRequest):
+ client = TranscoderServiceClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport=transport,
+ )
+
+ # Everything is optional in proto3 as far as the runtime is concerned,
+ # and we are mocking out the actual API, so just send an empty request.
+ request = request_type()
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.list_jobs), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = services.ListJobsResponse(
+ next_page_token="next_page_token_value", unreachable=["unreachable_value"],
+ )
+ response = client.list_jobs(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == services.ListJobsRequest()
+
+ # Establish that the response is the type that we expect.
+ assert isinstance(response, pagers.ListJobsPager)
+ assert response.next_page_token == "next_page_token_value"
+ assert response.unreachable == ["unreachable_value"]
+
+
+def test_list_jobs_from_dict():
+ test_list_jobs(request_type=dict)
+
+
+def test_list_jobs_empty_call():
+ # This test is a coverage failsafe to make sure that totally empty calls,
+ # i.e. request == None and no flattened fields passed, work.
+ client = TranscoderServiceClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport="grpc",
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.list_jobs), "__call__") as call:
+ client.list_jobs()
+ call.assert_called()
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == services.ListJobsRequest()
+
+
+@pytest.mark.asyncio
+async def test_list_jobs_async(
+ transport: str = "grpc_asyncio", request_type=services.ListJobsRequest
+):
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport=transport,
+ )
+
+ # Everything is optional in proto3 as far as the runtime is concerned,
+ # and we are mocking out the actual API, so just send an empty request.
+ request = request_type()
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.list_jobs), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+ services.ListJobsResponse(
+ next_page_token="next_page_token_value",
+ unreachable=["unreachable_value"],
+ )
+ )
+ response = await client.list_jobs(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == services.ListJobsRequest()
+
+ # Establish that the response is the type that we expect.
+ assert isinstance(response, pagers.ListJobsAsyncPager)
+ assert response.next_page_token == "next_page_token_value"
+ assert response.unreachable == ["unreachable_value"]
+
+
+@pytest.mark.asyncio
+async def test_list_jobs_async_from_dict():
+ await test_list_jobs_async(request_type=dict)
+
+
+def test_list_jobs_field_headers():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Any value that is part of the HTTP/1.1 URI should be sent as
+ # a field header. Set these to a non-empty value.
+ request = services.ListJobsRequest()
+
+ request.parent = "parent/value"
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.list_jobs), "__call__") as call:
+ call.return_value = services.ListJobsResponse()
+ client.list_jobs(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == request
+
+ # Establish that the field header was sent.
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_list_jobs_field_headers_async():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Any value that is part of the HTTP/1.1 URI should be sent as
+ # a field header. Set these to a non-empty value.
+ request = services.ListJobsRequest()
+
+ request.parent = "parent/value"
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.list_jobs), "__call__") as call:
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+ services.ListJobsResponse()
+ )
+ await client.list_jobs(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == request
+
+ # Establish that the field header was sent.
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"]
+
+
+def test_list_jobs_flattened():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.list_jobs), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = services.ListJobsResponse()
+ # Call the method with a truthy value for each flattened field,
+ # using the keyword arguments to the method.
+ client.list_jobs(parent="parent_value",)
+
+ # Establish that the underlying call was made with the expected
+ # request object values.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0].parent == "parent_value"
+
+
+def test_list_jobs_flattened_error():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError):
+ client.list_jobs(
+ services.ListJobsRequest(), parent="parent_value",
+ )
+
+
+@pytest.mark.asyncio
+async def test_list_jobs_flattened_async():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.list_jobs), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+ services.ListJobsResponse()
+ )
+ # Call the method with a truthy value for each flattened field,
+ # using the keyword arguments to the method.
+ response = await client.list_jobs(parent="parent_value",)
+
+ # Establish that the underlying call was made with the expected
+ # request object values.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0].parent == "parent_value"
+
+
+@pytest.mark.asyncio
+async def test_list_jobs_flattened_error_async():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError):
+ await client.list_jobs(
+ services.ListJobsRequest(), parent="parent_value",
+ )
+
+
+def test_list_jobs_pager():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.list_jobs), "__call__") as call:
+ # Set the response to a series of pages.
+ call.side_effect = (
+ services.ListJobsResponse(
+ jobs=[resources.Job(), resources.Job(), resources.Job(),],
+ next_page_token="abc",
+ ),
+ services.ListJobsResponse(jobs=[], next_page_token="def",),
+ services.ListJobsResponse(jobs=[resources.Job(),], next_page_token="ghi",),
+ services.ListJobsResponse(jobs=[resources.Job(), resources.Job(),],),
+ RuntimeError,
+ )
+
+ metadata = ()
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", ""),)),
+ )
+ pager = client.list_jobs(request={})
+
+ assert pager._metadata == metadata
+
+ results = list(pager)
+ assert len(results) == 6
+ assert all(isinstance(i, resources.Job) for i in results)
+
+
+def test_list_jobs_pages():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.list_jobs), "__call__") as call:
+ # Set the response to a series of pages.
+ call.side_effect = (
+ services.ListJobsResponse(
+ jobs=[resources.Job(), resources.Job(), resources.Job(),],
+ next_page_token="abc",
+ ),
+ services.ListJobsResponse(jobs=[], next_page_token="def",),
+ services.ListJobsResponse(jobs=[resources.Job(),], next_page_token="ghi",),
+ services.ListJobsResponse(jobs=[resources.Job(), resources.Job(),],),
+ RuntimeError,
+ )
+ pages = list(client.list_jobs(request={}).pages)
+ for page_, token in zip(pages, ["abc", "def", "ghi", ""]):
+ assert page_.raw_page.next_page_token == token
+
+
+@pytest.mark.asyncio
+async def test_list_jobs_async_pager():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.list_jobs), "__call__", new_callable=mock.AsyncMock
+ ) as call:
+ # Set the response to a series of pages.
+ call.side_effect = (
+ services.ListJobsResponse(
+ jobs=[resources.Job(), resources.Job(), resources.Job(),],
+ next_page_token="abc",
+ ),
+ services.ListJobsResponse(jobs=[], next_page_token="def",),
+ services.ListJobsResponse(jobs=[resources.Job(),], next_page_token="ghi",),
+ services.ListJobsResponse(jobs=[resources.Job(), resources.Job(),],),
+ RuntimeError,
+ )
+ async_pager = await client.list_jobs(request={},)
+ assert async_pager.next_page_token == "abc"
+ responses = []
+ async for response in async_pager:
+ responses.append(response)
+
+ assert len(responses) == 6
+ assert all(isinstance(i, resources.Job) for i in responses)
+
+
+@pytest.mark.asyncio
+async def test_list_jobs_async_pages():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.list_jobs), "__call__", new_callable=mock.AsyncMock
+ ) as call:
+ # Set the response to a series of pages.
+ call.side_effect = (
+ services.ListJobsResponse(
+ jobs=[resources.Job(), resources.Job(), resources.Job(),],
+ next_page_token="abc",
+ ),
+ services.ListJobsResponse(jobs=[], next_page_token="def",),
+ services.ListJobsResponse(jobs=[resources.Job(),], next_page_token="ghi",),
+ services.ListJobsResponse(jobs=[resources.Job(), resources.Job(),],),
+ RuntimeError,
+ )
+ pages = []
+ async for page_ in (await client.list_jobs(request={})).pages:
+ pages.append(page_)
+ for page_, token in zip(pages, ["abc", "def", "ghi", ""]):
+ assert page_.raw_page.next_page_token == token
+
+
+def test_get_job(transport: str = "grpc", request_type=services.GetJobRequest):
+ client = TranscoderServiceClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport=transport,
+ )
+
+ # Everything is optional in proto3 as far as the runtime is concerned,
+ # and we are mocking out the actual API, so just send an empty request.
+ request = request_type()
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.get_job), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = resources.Job(
+ name="name_value",
+ input_uri="input_uri_value",
+ output_uri="output_uri_value",
+ state=resources.Job.ProcessingState.PENDING,
+ ttl_after_completion_days=2670,
+ template_id="template_id_value",
+ )
+ response = client.get_job(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == services.GetJobRequest()
+
+ # Establish that the response is the type that we expect.
+ assert isinstance(response, resources.Job)
+ assert response.name == "name_value"
+ assert response.input_uri == "input_uri_value"
+ assert response.output_uri == "output_uri_value"
+ assert response.state == resources.Job.ProcessingState.PENDING
+ assert response.ttl_after_completion_days == 2670
+
+
+def test_get_job_from_dict():
+ test_get_job(request_type=dict)
+
+
+def test_get_job_empty_call():
+ # This test is a coverage failsafe to make sure that totally empty calls,
+ # i.e. request == None and no flattened fields passed, work.
+ client = TranscoderServiceClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport="grpc",
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.get_job), "__call__") as call:
+ client.get_job()
+ call.assert_called()
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == services.GetJobRequest()
+
+
+@pytest.mark.asyncio
+async def test_get_job_async(
+ transport: str = "grpc_asyncio", request_type=services.GetJobRequest
+):
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport=transport,
+ )
+
+ # Everything is optional in proto3 as far as the runtime is concerned,
+ # and we are mocking out the actual API, so just send an empty request.
+ request = request_type()
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.get_job), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+ resources.Job(
+ name="name_value",
+ input_uri="input_uri_value",
+ output_uri="output_uri_value",
+ state=resources.Job.ProcessingState.PENDING,
+ ttl_after_completion_days=2670,
+ )
+ )
+ response = await client.get_job(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == services.GetJobRequest()
+
+ # Establish that the response is the type that we expect.
+ assert isinstance(response, resources.Job)
+ assert response.name == "name_value"
+ assert response.input_uri == "input_uri_value"
+ assert response.output_uri == "output_uri_value"
+ assert response.state == resources.Job.ProcessingState.PENDING
+ assert response.ttl_after_completion_days == 2670
+
+
+@pytest.mark.asyncio
+async def test_get_job_async_from_dict():
+ await test_get_job_async(request_type=dict)
+
+
+def test_get_job_field_headers():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Any value that is part of the HTTP/1.1 URI should be sent as
+ # a field header. Set these to a non-empty value.
+ request = services.GetJobRequest()
+
+ request.name = "name/value"
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.get_job), "__call__") as call:
+ call.return_value = resources.Job()
+ client.get_job(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == request
+
+ # Establish that the field header was sent.
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_get_job_field_headers_async():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Any value that is part of the HTTP/1.1 URI should be sent as
+ # a field header. Set these to a non-empty value.
+ request = services.GetJobRequest()
+
+ request.name = "name/value"
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.get_job), "__call__") as call:
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(resources.Job())
+ await client.get_job(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == request
+
+ # Establish that the field header was sent.
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+def test_get_job_flattened():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.get_job), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = resources.Job()
+ # Call the method with a truthy value for each flattened field,
+ # using the keyword arguments to the method.
+ client.get_job(name="name_value",)
+
+ # Establish that the underlying call was made with the expected
+ # request object values.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0].name == "name_value"
+
+
+def test_get_job_flattened_error():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError):
+ client.get_job(
+ services.GetJobRequest(), name="name_value",
+ )
+
+
+@pytest.mark.asyncio
+async def test_get_job_flattened_async():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.get_job), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(resources.Job())
+ # Call the method with a truthy value for each flattened field,
+ # using the keyword arguments to the method.
+ response = await client.get_job(name="name_value",)
+
+ # Establish that the underlying call was made with the expected
+ # request object values.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0].name == "name_value"
+
+
+@pytest.mark.asyncio
+async def test_get_job_flattened_error_async():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError):
+ await client.get_job(
+ services.GetJobRequest(), name="name_value",
+ )
+
+
+def test_delete_job(transport: str = "grpc", request_type=services.DeleteJobRequest):
+ client = TranscoderServiceClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport=transport,
+ )
+
+ # Everything is optional in proto3 as far as the runtime is concerned,
+ # and we are mocking out the actual API, so just send an empty request.
+ request = request_type()
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.delete_job), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = None
+ response = client.delete_job(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == services.DeleteJobRequest()
+
+ # Establish that the response is the type that we expect.
+ assert response is None
+
+
+def test_delete_job_from_dict():
+ test_delete_job(request_type=dict)
+
+
+def test_delete_job_empty_call():
+ # This test is a coverage failsafe to make sure that totally empty calls,
+ # i.e. request == None and no flattened fields passed, work.
+ client = TranscoderServiceClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport="grpc",
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.delete_job), "__call__") as call:
+ client.delete_job()
+ call.assert_called()
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == services.DeleteJobRequest()
+
+
+@pytest.mark.asyncio
+async def test_delete_job_async(
+ transport: str = "grpc_asyncio", request_type=services.DeleteJobRequest
+):
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport=transport,
+ )
+
+ # Everything is optional in proto3 as far as the runtime is concerned,
+ # and we are mocking out the actual API, so just send an empty request.
+ request = request_type()
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.delete_job), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None)
+ response = await client.delete_job(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == services.DeleteJobRequest()
+
+ # Establish that the response is the type that we expect.
+ assert response is None
+
+
+@pytest.mark.asyncio
+async def test_delete_job_async_from_dict():
+ await test_delete_job_async(request_type=dict)
+
+
+def test_delete_job_field_headers():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Any value that is part of the HTTP/1.1 URI should be sent as
+ # a field header. Set these to a non-empty value.
+ request = services.DeleteJobRequest()
+
+ request.name = "name/value"
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.delete_job), "__call__") as call:
+ call.return_value = None
+ client.delete_job(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == request
+
+ # Establish that the field header was sent.
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_delete_job_field_headers_async():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Any value that is part of the HTTP/1.1 URI should be sent as
+ # a field header. Set these to a non-empty value.
+ request = services.DeleteJobRequest()
+
+ request.name = "name/value"
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.delete_job), "__call__") as call:
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None)
+ await client.delete_job(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == request
+
+ # Establish that the field header was sent.
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+def test_delete_job_flattened():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.delete_job), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = None
+ # Call the method with a truthy value for each flattened field,
+ # using the keyword arguments to the method.
+ client.delete_job(name="name_value",)
+
+ # Establish that the underlying call was made with the expected
+ # request object values.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0].name == "name_value"
+
+
+def test_delete_job_flattened_error():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError):
+ client.delete_job(
+ services.DeleteJobRequest(), name="name_value",
+ )
+
+
+@pytest.mark.asyncio
+async def test_delete_job_flattened_async():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.delete_job), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None)
+ # Call the method with a truthy value for each flattened field,
+ # using the keyword arguments to the method.
+ response = await client.delete_job(name="name_value",)
+
+ # Establish that the underlying call was made with the expected
+ # request object values.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0].name == "name_value"
+
+
+@pytest.mark.asyncio
+async def test_delete_job_flattened_error_async():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError):
+ await client.delete_job(
+ services.DeleteJobRequest(), name="name_value",
+ )
+
+
+def test_create_job_template(
+ transport: str = "grpc", request_type=services.CreateJobTemplateRequest
+):
+ client = TranscoderServiceClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport=transport,
+ )
+
+ # Everything is optional in proto3 as far as the runtime is concerned,
+ # and we are mocking out the actual API, so just send an empty request.
+ request = request_type()
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.create_job_template), "__call__"
+ ) as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = resources.JobTemplate(name="name_value",)
+ response = client.create_job_template(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == services.CreateJobTemplateRequest()
+
+ # Establish that the response is the type that we expect.
+ assert isinstance(response, resources.JobTemplate)
+ assert response.name == "name_value"
+
+
+def test_create_job_template_from_dict():
+ test_create_job_template(request_type=dict)
+
+
+def test_create_job_template_empty_call():
+ # This test is a coverage failsafe to make sure that totally empty calls,
+ # i.e. request == None and no flattened fields passed, work.
+ client = TranscoderServiceClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport="grpc",
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.create_job_template), "__call__"
+ ) as call:
+ client.create_job_template()
+ call.assert_called()
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == services.CreateJobTemplateRequest()
+
+
+@pytest.mark.asyncio
+async def test_create_job_template_async(
+ transport: str = "grpc_asyncio", request_type=services.CreateJobTemplateRequest
+):
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport=transport,
+ )
+
+ # Everything is optional in proto3 as far as the runtime is concerned,
+ # and we are mocking out the actual API, so just send an empty request.
+ request = request_type()
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.create_job_template), "__call__"
+ ) as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+ resources.JobTemplate(name="name_value",)
+ )
+ response = await client.create_job_template(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == services.CreateJobTemplateRequest()
+
+ # Establish that the response is the type that we expect.
+ assert isinstance(response, resources.JobTemplate)
+ assert response.name == "name_value"
+
+
+@pytest.mark.asyncio
+async def test_create_job_template_async_from_dict():
+ await test_create_job_template_async(request_type=dict)
+
+
+def test_create_job_template_field_headers():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Any value that is part of the HTTP/1.1 URI should be sent as
+ # a field header. Set these to a non-empty value.
+ request = services.CreateJobTemplateRequest()
+
+ request.parent = "parent/value"
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.create_job_template), "__call__"
+ ) as call:
+ call.return_value = resources.JobTemplate()
+ client.create_job_template(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == request
+
+ # Establish that the field header was sent.
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_create_job_template_field_headers_async():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Any value that is part of the HTTP/1.1 URI should be sent as
+ # a field header. Set these to a non-empty value.
+ request = services.CreateJobTemplateRequest()
+
+ request.parent = "parent/value"
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.create_job_template), "__call__"
+ ) as call:
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+ resources.JobTemplate()
+ )
+ await client.create_job_template(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == request
+
+ # Establish that the field header was sent.
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"]
+
+
+def test_create_job_template_flattened():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.create_job_template), "__call__"
+ ) as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = resources.JobTemplate()
+ # Call the method with a truthy value for each flattened field,
+ # using the keyword arguments to the method.
+ client.create_job_template(
+ parent="parent_value",
+ job_template=resources.JobTemplate(name="name_value"),
+ job_template_id="job_template_id_value",
+ )
+
+ # Establish that the underlying call was made with the expected
+ # request object values.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0].parent == "parent_value"
+ assert args[0].job_template == resources.JobTemplate(name="name_value")
+ assert args[0].job_template_id == "job_template_id_value"
+
+
+def test_create_job_template_flattened_error():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError):
+ client.create_job_template(
+ services.CreateJobTemplateRequest(),
+ parent="parent_value",
+ job_template=resources.JobTemplate(name="name_value"),
+ job_template_id="job_template_id_value",
+ )
+
+
+@pytest.mark.asyncio
+async def test_create_job_template_flattened_async():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.create_job_template), "__call__"
+ ) as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+ resources.JobTemplate()
+ )
+ # Call the method with a truthy value for each flattened field,
+ # using the keyword arguments to the method.
+ response = await client.create_job_template(
+ parent="parent_value",
+ job_template=resources.JobTemplate(name="name_value"),
+ job_template_id="job_template_id_value",
+ )
+
+ # Establish that the underlying call was made with the expected
+ # request object values.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0].parent == "parent_value"
+ assert args[0].job_template == resources.JobTemplate(name="name_value")
+ assert args[0].job_template_id == "job_template_id_value"
+
+
+@pytest.mark.asyncio
+async def test_create_job_template_flattened_error_async():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError):
+ await client.create_job_template(
+ services.CreateJobTemplateRequest(),
+ parent="parent_value",
+ job_template=resources.JobTemplate(name="name_value"),
+ job_template_id="job_template_id_value",
+ )
+
+
+def test_list_job_templates(
+ transport: str = "grpc", request_type=services.ListJobTemplatesRequest
+):
+ client = TranscoderServiceClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport=transport,
+ )
+
+ # Everything is optional in proto3 as far as the runtime is concerned,
+ # and we are mocking out the actual API, so just send an empty request.
+ request = request_type()
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.list_job_templates), "__call__"
+ ) as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = services.ListJobTemplatesResponse(
+ next_page_token="next_page_token_value", unreachable=["unreachable_value"],
+ )
+ response = client.list_job_templates(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == services.ListJobTemplatesRequest()
+
+ # Establish that the response is the type that we expect.
+ assert isinstance(response, pagers.ListJobTemplatesPager)
+ assert response.next_page_token == "next_page_token_value"
+ assert response.unreachable == ["unreachable_value"]
+
+
+def test_list_job_templates_from_dict():
+ test_list_job_templates(request_type=dict)
+
+
+def test_list_job_templates_empty_call():
+ # This test is a coverage failsafe to make sure that totally empty calls,
+ # i.e. request == None and no flattened fields passed, work.
+ client = TranscoderServiceClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport="grpc",
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.list_job_templates), "__call__"
+ ) as call:
+ client.list_job_templates()
+ call.assert_called()
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == services.ListJobTemplatesRequest()
+
+
+@pytest.mark.asyncio
+async def test_list_job_templates_async(
+ transport: str = "grpc_asyncio", request_type=services.ListJobTemplatesRequest
+):
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport=transport,
+ )
+
+ # Everything is optional in proto3 as far as the runtime is concerned,
+ # and we are mocking out the actual API, so just send an empty request.
+ request = request_type()
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.list_job_templates), "__call__"
+ ) as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+ services.ListJobTemplatesResponse(
+ next_page_token="next_page_token_value",
+ unreachable=["unreachable_value"],
+ )
+ )
+ response = await client.list_job_templates(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == services.ListJobTemplatesRequest()
+
+ # Establish that the response is the type that we expect.
+ assert isinstance(response, pagers.ListJobTemplatesAsyncPager)
+ assert response.next_page_token == "next_page_token_value"
+ assert response.unreachable == ["unreachable_value"]
+
+
+@pytest.mark.asyncio
+async def test_list_job_templates_async_from_dict():
+ await test_list_job_templates_async(request_type=dict)
+
+
+def test_list_job_templates_field_headers():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Any value that is part of the HTTP/1.1 URI should be sent as
+ # a field header. Set these to a non-empty value.
+ request = services.ListJobTemplatesRequest()
+
+ request.parent = "parent/value"
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.list_job_templates), "__call__"
+ ) as call:
+ call.return_value = services.ListJobTemplatesResponse()
+ client.list_job_templates(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == request
+
+ # Establish that the field header was sent.
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_list_job_templates_field_headers_async():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Any value that is part of the HTTP/1.1 URI should be sent as
+ # a field header. Set these to a non-empty value.
+ request = services.ListJobTemplatesRequest()
+
+ request.parent = "parent/value"
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.list_job_templates), "__call__"
+ ) as call:
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+ services.ListJobTemplatesResponse()
+ )
+ await client.list_job_templates(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == request
+
+ # Establish that the field header was sent.
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "parent=parent/value",) in kw["metadata"]
+
+
+def test_list_job_templates_flattened():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.list_job_templates), "__call__"
+ ) as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = services.ListJobTemplatesResponse()
+ # Call the method with a truthy value for each flattened field,
+ # using the keyword arguments to the method.
+ client.list_job_templates(parent="parent_value",)
+
+ # Establish that the underlying call was made with the expected
+ # request object values.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0].parent == "parent_value"
+
+
+def test_list_job_templates_flattened_error():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError):
+ client.list_job_templates(
+ services.ListJobTemplatesRequest(), parent="parent_value",
+ )
+
+
+@pytest.mark.asyncio
+async def test_list_job_templates_flattened_async():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.list_job_templates), "__call__"
+ ) as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+ services.ListJobTemplatesResponse()
+ )
+ # Call the method with a truthy value for each flattened field,
+ # using the keyword arguments to the method.
+ response = await client.list_job_templates(parent="parent_value",)
+
+ # Establish that the underlying call was made with the expected
+ # request object values.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0].parent == "parent_value"
+
+
+@pytest.mark.asyncio
+async def test_list_job_templates_flattened_error_async():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError):
+ await client.list_job_templates(
+ services.ListJobTemplatesRequest(), parent="parent_value",
+ )
+
+
+def test_list_job_templates_pager():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.list_job_templates), "__call__"
+ ) as call:
+ # Set the response to a series of pages.
+ call.side_effect = (
+ services.ListJobTemplatesResponse(
+ job_templates=[
+ resources.JobTemplate(),
+ resources.JobTemplate(),
+ resources.JobTemplate(),
+ ],
+ next_page_token="abc",
+ ),
+ services.ListJobTemplatesResponse(job_templates=[], next_page_token="def",),
+ services.ListJobTemplatesResponse(
+ job_templates=[resources.JobTemplate(),], next_page_token="ghi",
+ ),
+ services.ListJobTemplatesResponse(
+ job_templates=[resources.JobTemplate(), resources.JobTemplate(),],
+ ),
+ RuntimeError,
+ )
+
+ metadata = ()
+ metadata = tuple(metadata) + (
+ gapic_v1.routing_header.to_grpc_metadata((("parent", ""),)),
+ )
+ pager = client.list_job_templates(request={})
+
+ assert pager._metadata == metadata
+
+ results = [i for i in pager]
+ assert len(results) == 6
+ assert all(isinstance(i, resources.JobTemplate) for i in results)
+
+
+def test_list_job_templates_pages():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.list_job_templates), "__call__"
+ ) as call:
+ # Set the response to a series of pages.
+ call.side_effect = (
+ services.ListJobTemplatesResponse(
+ job_templates=[
+ resources.JobTemplate(),
+ resources.JobTemplate(),
+ resources.JobTemplate(),
+ ],
+ next_page_token="abc",
+ ),
+ services.ListJobTemplatesResponse(job_templates=[], next_page_token="def",),
+ services.ListJobTemplatesResponse(
+ job_templates=[resources.JobTemplate(),], next_page_token="ghi",
+ ),
+ services.ListJobTemplatesResponse(
+ job_templates=[resources.JobTemplate(), resources.JobTemplate(),],
+ ),
+ RuntimeError,
+ )
+ pages = list(client.list_job_templates(request={}).pages)
+ for page_, token in zip(pages, ["abc", "def", "ghi", ""]):
+ assert page_.raw_page.next_page_token == token
+
+
+@pytest.mark.asyncio
+async def test_list_job_templates_async_pager():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.list_job_templates),
+ "__call__",
+ new_callable=mock.AsyncMock,
+ ) as call:
+ # Set the response to a series of pages.
+ call.side_effect = (
+ services.ListJobTemplatesResponse(
+ job_templates=[
+ resources.JobTemplate(),
+ resources.JobTemplate(),
+ resources.JobTemplate(),
+ ],
+ next_page_token="abc",
+ ),
+ services.ListJobTemplatesResponse(job_templates=[], next_page_token="def",),
+ services.ListJobTemplatesResponse(
+ job_templates=[resources.JobTemplate(),], next_page_token="ghi",
+ ),
+ services.ListJobTemplatesResponse(
+ job_templates=[resources.JobTemplate(), resources.JobTemplate(),],
+ ),
+ RuntimeError,
+ )
+ async_pager = await client.list_job_templates(request={},)
+ assert async_pager.next_page_token == "abc"
+ responses = []
+ async for response in async_pager:
+ responses.append(response)
+
+ assert len(responses) == 6
+ assert all(isinstance(i, resources.JobTemplate) for i in responses)
+
+
+@pytest.mark.asyncio
+async def test_list_job_templates_async_pages():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.list_job_templates),
+ "__call__",
+ new_callable=mock.AsyncMock,
+ ) as call:
+ # Set the response to a series of pages.
+ call.side_effect = (
+ services.ListJobTemplatesResponse(
+ job_templates=[
+ resources.JobTemplate(),
+ resources.JobTemplate(),
+ resources.JobTemplate(),
+ ],
+ next_page_token="abc",
+ ),
+ services.ListJobTemplatesResponse(job_templates=[], next_page_token="def",),
+ services.ListJobTemplatesResponse(
+ job_templates=[resources.JobTemplate(),], next_page_token="ghi",
+ ),
+ services.ListJobTemplatesResponse(
+ job_templates=[resources.JobTemplate(), resources.JobTemplate(),],
+ ),
+ RuntimeError,
+ )
+ pages = []
+ async for page_ in (await client.list_job_templates(request={})).pages:
+ pages.append(page_)
+ for page_, token in zip(pages, ["abc", "def", "ghi", ""]):
+ assert page_.raw_page.next_page_token == token
+
+
+def test_get_job_template(
+ transport: str = "grpc", request_type=services.GetJobTemplateRequest
+):
+ client = TranscoderServiceClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport=transport,
+ )
+
+ # Everything is optional in proto3 as far as the runtime is concerned,
+ # and we are mocking out the actual API, so just send an empty request.
+ request = request_type()
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.get_job_template), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = resources.JobTemplate(name="name_value",)
+ response = client.get_job_template(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == services.GetJobTemplateRequest()
+
+ # Establish that the response is the type that we expect.
+ assert isinstance(response, resources.JobTemplate)
+ assert response.name == "name_value"
+
+
+def test_get_job_template_from_dict():
+ test_get_job_template(request_type=dict)
+
+
+def test_get_job_template_empty_call():
+ # This test is a coverage failsafe to make sure that totally empty calls,
+ # i.e. request == None and no flattened fields passed, work.
+ client = TranscoderServiceClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport="grpc",
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.get_job_template), "__call__") as call:
+ client.get_job_template()
+ call.assert_called()
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == services.GetJobTemplateRequest()
+
+
+@pytest.mark.asyncio
+async def test_get_job_template_async(
+ transport: str = "grpc_asyncio", request_type=services.GetJobTemplateRequest
+):
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport=transport,
+ )
+
+ # Everything is optional in proto3 as far as the runtime is concerned,
+ # and we are mocking out the actual API, so just send an empty request.
+ request = request_type()
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.get_job_template), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+ resources.JobTemplate(name="name_value",)
+ )
+ response = await client.get_job_template(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == services.GetJobTemplateRequest()
+
+ # Establish that the response is the type that we expect.
+ assert isinstance(response, resources.JobTemplate)
+ assert response.name == "name_value"
+
+
+@pytest.mark.asyncio
+async def test_get_job_template_async_from_dict():
+ await test_get_job_template_async(request_type=dict)
+
+
+def test_get_job_template_field_headers():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Any value that is part of the HTTP/1.1 URI should be sent as
+ # a field header. Set these to a non-empty value.
+ request = services.GetJobTemplateRequest()
+
+ request.name = "name/value"
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.get_job_template), "__call__") as call:
+ call.return_value = resources.JobTemplate()
+ client.get_job_template(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == request
+
+ # Establish that the field header was sent.
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_get_job_template_field_headers_async():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Any value that is part of the HTTP/1.1 URI should be sent as
+ # a field header. Set these to a non-empty value.
+ request = services.GetJobTemplateRequest()
+
+ request.name = "name/value"
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.get_job_template), "__call__") as call:
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+ resources.JobTemplate()
+ )
+ await client.get_job_template(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == request
+
+ # Establish that the field header was sent.
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+def test_get_job_template_flattened():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.get_job_template), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = resources.JobTemplate()
+ # Call the method with a truthy value for each flattened field,
+ # using the keyword arguments to the method.
+ client.get_job_template(name="name_value",)
+
+ # Establish that the underlying call was made with the expected
+ # request object values.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0].name == "name_value"
+
+
+def test_get_job_template_flattened_error():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError):
+ client.get_job_template(
+ services.GetJobTemplateRequest(), name="name_value",
+ )
+
+
+@pytest.mark.asyncio
+async def test_get_job_template_flattened_async():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(type(client.transport.get_job_template), "__call__") as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
+ resources.JobTemplate()
+ )
+ # Call the method with a truthy value for each flattened field,
+ # using the keyword arguments to the method.
+ response = await client.get_job_template(name="name_value",)
+
+ # Establish that the underlying call was made with the expected
+ # request object values.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0].name == "name_value"
+
+
+@pytest.mark.asyncio
+async def test_get_job_template_flattened_error_async():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError):
+ await client.get_job_template(
+ services.GetJobTemplateRequest(), name="name_value",
+ )
+
+
+def test_delete_job_template(
+ transport: str = "grpc", request_type=services.DeleteJobTemplateRequest
+):
+ client = TranscoderServiceClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport=transport,
+ )
+
+ # Everything is optional in proto3 as far as the runtime is concerned,
+ # and we are mocking out the actual API, so just send an empty request.
+ request = request_type()
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.delete_job_template), "__call__"
+ ) as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = None
+ response = client.delete_job_template(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == services.DeleteJobTemplateRequest()
+
+ # Establish that the response is the type that we expect.
+ assert response is None
+
+
+def test_delete_job_template_from_dict():
+ test_delete_job_template(request_type=dict)
+
+
+def test_delete_job_template_empty_call():
+ # This test is a coverage failsafe to make sure that totally empty calls,
+ # i.e. request == None and no flattened fields passed, work.
+ client = TranscoderServiceClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport="grpc",
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.delete_job_template), "__call__"
+ ) as call:
+ client.delete_job_template()
+ call.assert_called()
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == services.DeleteJobTemplateRequest()
+
+
+@pytest.mark.asyncio
+async def test_delete_job_template_async(
+ transport: str = "grpc_asyncio", request_type=services.DeleteJobTemplateRequest
+):
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport=transport,
+ )
+
+ # Everything is optional in proto3 as far as the runtime is concerned,
+ # and we are mocking out the actual API, so just send an empty request.
+ request = request_type()
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.delete_job_template), "__call__"
+ ) as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None)
+ response = await client.delete_job_template(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == services.DeleteJobTemplateRequest()
+
+ # Establish that the response is the type that we expect.
+ assert response is None
+
+
+@pytest.mark.asyncio
+async def test_delete_job_template_async_from_dict():
+ await test_delete_job_template_async(request_type=dict)
+
+
+def test_delete_job_template_field_headers():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Any value that is part of the HTTP/1.1 URI should be sent as
+ # a field header. Set these to a non-empty value.
+ request = services.DeleteJobTemplateRequest()
+
+ request.name = "name/value"
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.delete_job_template), "__call__"
+ ) as call:
+ call.return_value = None
+ client.delete_job_template(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == request
+
+ # Establish that the field header was sent.
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+@pytest.mark.asyncio
+async def test_delete_job_template_field_headers_async():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Any value that is part of the HTTP/1.1 URI should be sent as
+ # a field header. Set these to a non-empty value.
+ request = services.DeleteJobTemplateRequest()
+
+ request.name = "name/value"
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.delete_job_template), "__call__"
+ ) as call:
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None)
+ await client.delete_job_template(request)
+
+ # Establish that the underlying gRPC stub method was called.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0] == request
+
+ # Establish that the field header was sent.
+ _, _, kw = call.mock_calls[0]
+ assert ("x-goog-request-params", "name=name/value",) in kw["metadata"]
+
+
+def test_delete_job_template_flattened():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.delete_job_template), "__call__"
+ ) as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = None
+ # Call the method with a truthy value for each flattened field,
+ # using the keyword arguments to the method.
+ client.delete_job_template(name="name_value",)
+
+ # Establish that the underlying call was made with the expected
+ # request object values.
+ assert len(call.mock_calls) == 1
+ _, args, _ = call.mock_calls[0]
+ assert args[0].name == "name_value"
+
+
+def test_delete_job_template_flattened_error():
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError):
+ client.delete_job_template(
+ services.DeleteJobTemplateRequest(), name="name_value",
+ )
+
+
+@pytest.mark.asyncio
+async def test_delete_job_template_flattened_async():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Mock the actual call within the gRPC stub, and fake the request.
+ with mock.patch.object(
+ type(client.transport.delete_job_template), "__call__"
+ ) as call:
+ # Designate an appropriate return value for the call.
+ call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(None)
+ # Call the method with a truthy value for each flattened field,
+ # using the keyword arguments to the method.
+ response = await client.delete_job_template(name="name_value",)
+
+ # Establish that the underlying call was made with the expected
+ # request object values.
+ assert len(call.mock_calls)
+ _, args, _ = call.mock_calls[0]
+ assert args[0].name == "name_value"
+
+
+@pytest.mark.asyncio
+async def test_delete_job_template_flattened_error_async():
+ client = TranscoderServiceAsyncClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Attempting to call a method with both a request object and flattened
+ # fields is an error.
+ with pytest.raises(ValueError):
+ await client.delete_job_template(
+ services.DeleteJobTemplateRequest(), name="name_value",
+ )
+
+
+def test_credentials_transport_error():
+ # It is an error to provide credentials and a transport instance.
+ transport = transports.TranscoderServiceGrpcTransport(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+ with pytest.raises(ValueError):
+ client = TranscoderServiceClient(
+ credentials=ga_credentials.AnonymousCredentials(), transport=transport,
+ )
+
+ # It is an error to provide a credentials file and a transport instance.
+ transport = transports.TranscoderServiceGrpcTransport(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+ with pytest.raises(ValueError):
+ client = TranscoderServiceClient(
+ client_options={"credentials_file": "credentials.json"},
+ transport=transport,
+ )
+
+ # It is an error to provide scopes and a transport instance.
+ transport = transports.TranscoderServiceGrpcTransport(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+ with pytest.raises(ValueError):
+ client = TranscoderServiceClient(
+ client_options={"scopes": ["1", "2"]}, transport=transport,
+ )
+
+
+def test_transport_instance():
+ # A client may be instantiated with a custom transport instance.
+ transport = transports.TranscoderServiceGrpcTransport(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+ client = TranscoderServiceClient(transport=transport)
+ assert client.transport is transport
+
+
+def test_transport_get_channel():
+ # A client may be instantiated with a custom transport instance.
+ transport = transports.TranscoderServiceGrpcTransport(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+ channel = transport.grpc_channel
+ assert channel
+
+ transport = transports.TranscoderServiceGrpcAsyncIOTransport(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+ channel = transport.grpc_channel
+ assert channel
+
+
+@pytest.mark.parametrize(
+ "transport_class",
+ [
+ transports.TranscoderServiceGrpcTransport,
+ transports.TranscoderServiceGrpcAsyncIOTransport,
+ ],
+)
+def test_transport_adc(transport_class):
+ # Test default credentials are used if not provided.
+ with mock.patch.object(google.auth, "default") as adc:
+ adc.return_value = (ga_credentials.AnonymousCredentials(), None)
+ transport_class()
+ adc.assert_called_once()
+
+
+def test_transport_grpc_default():
+ # A client should use the gRPC transport by default.
+ client = TranscoderServiceClient(credentials=ga_credentials.AnonymousCredentials(),)
+ assert isinstance(client.transport, transports.TranscoderServiceGrpcTransport,)
+
+
+def test_transcoder_service_base_transport_error():
+ # Passing both a credentials object and credentials_file should raise an error
+ with pytest.raises(core_exceptions.DuplicateCredentialArgs):
+ transport = transports.TranscoderServiceTransport(
+ credentials=ga_credentials.AnonymousCredentials(),
+ credentials_file="credentials.json",
+ )
+
+
+def test_transcoder_service_base_transport():
+ # Instantiate the base transport.
+ with mock.patch(
+ "google.cloud.video.transcoder_v1.services.transcoder_service.transports.TranscoderServiceTransport.__init__"
+ ) as Transport:
+ Transport.return_value = None
+ transport = transports.TranscoderServiceTransport(
+ credentials=ga_credentials.AnonymousCredentials(),
+ )
+
+ # Every method on the transport should just blindly
+ # raise NotImplementedError.
+ methods = (
+ "create_job",
+ "list_jobs",
+ "get_job",
+ "delete_job",
+ "create_job_template",
+ "list_job_templates",
+ "get_job_template",
+ "delete_job_template",
+ )
+ for method in methods:
+ with pytest.raises(NotImplementedError):
+ getattr(transport, method)(request=object())
+
+
+@requires_google_auth_gte_1_25_0
+def test_transcoder_service_base_transport_with_credentials_file():
+ # Instantiate the base transport with a credentials file
+ with mock.patch.object(
+ google.auth, "load_credentials_from_file", autospec=True
+ ) as load_creds, mock.patch(
+ "google.cloud.video.transcoder_v1.services.transcoder_service.transports.TranscoderServiceTransport._prep_wrapped_messages"
+ ) as Transport:
+ Transport.return_value = None
+ load_creds.return_value = (ga_credentials.AnonymousCredentials(), None)
+ transport = transports.TranscoderServiceTransport(
+ credentials_file="credentials.json", quota_project_id="octopus",
+ )
+ load_creds.assert_called_once_with(
+ "credentials.json",
+ scopes=None,
+ default_scopes=("https://www.googleapis.com/auth/cloud-platform",),
+ quota_project_id="octopus",
+ )
+
+
+@requires_google_auth_lt_1_25_0
+def test_transcoder_service_base_transport_with_credentials_file_old_google_auth():
+ # Instantiate the base transport with a credentials file
+ with mock.patch.object(
+ google.auth, "load_credentials_from_file", autospec=True
+ ) as load_creds, mock.patch(
+ "google.cloud.video.transcoder_v1.services.transcoder_service.transports.TranscoderServiceTransport._prep_wrapped_messages"
+ ) as Transport:
+ Transport.return_value = None
+ load_creds.return_value = (ga_credentials.AnonymousCredentials(), None)
+ transport = transports.TranscoderServiceTransport(
+ credentials_file="credentials.json", quota_project_id="octopus",
+ )
+ load_creds.assert_called_once_with(
+ "credentials.json",
+ scopes=("https://www.googleapis.com/auth/cloud-platform",),
+ quota_project_id="octopus",
+ )
+
+
+def test_transcoder_service_base_transport_with_adc():
+ # Test the default credentials are used if credentials and credentials_file are None.
+ with mock.patch.object(google.auth, "default", autospec=True) as adc, mock.patch(
+ "google.cloud.video.transcoder_v1.services.transcoder_service.transports.TranscoderServiceTransport._prep_wrapped_messages"
+ ) as Transport:
+ Transport.return_value = None
+ adc.return_value = (ga_credentials.AnonymousCredentials(), None)
+ transport = transports.TranscoderServiceTransport()
+ adc.assert_called_once()
+
+
+@requires_google_auth_gte_1_25_0
+def test_transcoder_service_auth_adc():
+ # If no credentials are provided, we should use ADC credentials.
+ with mock.patch.object(google.auth, "default", autospec=True) as adc:
+ adc.return_value = (ga_credentials.AnonymousCredentials(), None)
+ TranscoderServiceClient()
+ adc.assert_called_once_with(
+ scopes=None,
+ default_scopes=("https://www.googleapis.com/auth/cloud-platform",),
+ quota_project_id=None,
+ )
+
+
+@requires_google_auth_lt_1_25_0
+def test_transcoder_service_auth_adc_old_google_auth():
+ # If no credentials are provided, we should use ADC credentials.
+ with mock.patch.object(google.auth, "default", autospec=True) as adc:
+ adc.return_value = (ga_credentials.AnonymousCredentials(), None)
+ TranscoderServiceClient()
+ adc.assert_called_once_with(
+ scopes=("https://www.googleapis.com/auth/cloud-platform",),
+ quota_project_id=None,
+ )
+
+
+@pytest.mark.parametrize(
+ "transport_class",
+ [
+ transports.TranscoderServiceGrpcTransport,
+ transports.TranscoderServiceGrpcAsyncIOTransport,
+ ],
+)
+@requires_google_auth_gte_1_25_0
+def test_transcoder_service_transport_auth_adc(transport_class):
+ # If credentials and host are not provided, the transport class should use
+ # ADC credentials.
+ with mock.patch.object(google.auth, "default", autospec=True) as adc:
+ adc.return_value = (ga_credentials.AnonymousCredentials(), None)
+ transport_class(quota_project_id="octopus", scopes=["1", "2"])
+ adc.assert_called_once_with(
+ scopes=["1", "2"],
+ default_scopes=("https://www.googleapis.com/auth/cloud-platform",),
+ quota_project_id="octopus",
+ )
+
+
+@pytest.mark.parametrize(
+ "transport_class",
+ [
+ transports.TranscoderServiceGrpcTransport,
+ transports.TranscoderServiceGrpcAsyncIOTransport,
+ ],
+)
+@requires_google_auth_lt_1_25_0
+def test_transcoder_service_transport_auth_adc_old_google_auth(transport_class):
+ # If credentials and host are not provided, the transport class should use
+ # ADC credentials.
+ with mock.patch.object(google.auth, "default", autospec=True) as adc:
+ adc.return_value = (ga_credentials.AnonymousCredentials(), None)
+ transport_class(quota_project_id="octopus")
+ adc.assert_called_once_with(
+ scopes=("https://www.googleapis.com/auth/cloud-platform",),
+ quota_project_id="octopus",
+ )
+
+
+@pytest.mark.parametrize(
+ "transport_class,grpc_helpers",
+ [
+ (transports.TranscoderServiceGrpcTransport, grpc_helpers),
+ (transports.TranscoderServiceGrpcAsyncIOTransport, grpc_helpers_async),
+ ],
+)
+def test_transcoder_service_transport_create_channel(transport_class, grpc_helpers):
+ # If credentials and host are not provided, the transport class should use
+ # ADC credentials.
+ with mock.patch.object(
+ google.auth, "default", autospec=True
+ ) as adc, mock.patch.object(
+ grpc_helpers, "create_channel", autospec=True
+ ) as create_channel:
+ creds = ga_credentials.AnonymousCredentials()
+ adc.return_value = (creds, None)
+ transport_class(quota_project_id="octopus", scopes=["1", "2"])
+
+ create_channel.assert_called_with(
+ "transcoder.googleapis.com:443",
+ credentials=creds,
+ credentials_file=None,
+ quota_project_id="octopus",
+ default_scopes=("https://www.googleapis.com/auth/cloud-platform",),
+ scopes=["1", "2"],
+ default_host="transcoder.googleapis.com",
+ ssl_credentials=None,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+
+@pytest.mark.parametrize(
+ "transport_class",
+ [
+ transports.TranscoderServiceGrpcTransport,
+ transports.TranscoderServiceGrpcAsyncIOTransport,
+ ],
+)
+def test_transcoder_service_grpc_transport_client_cert_source_for_mtls(transport_class):
+ cred = ga_credentials.AnonymousCredentials()
+
+ # Check ssl_channel_credentials is used if provided.
+ with mock.patch.object(transport_class, "create_channel") as mock_create_channel:
+ mock_ssl_channel_creds = mock.Mock()
+ transport_class(
+ host="squid.clam.whelk",
+ credentials=cred,
+ ssl_channel_credentials=mock_ssl_channel_creds,
+ )
+ mock_create_channel.assert_called_once_with(
+ "squid.clam.whelk:443",
+ credentials=cred,
+ credentials_file=None,
+ scopes=None,
+ ssl_credentials=mock_ssl_channel_creds,
+ quota_project_id=None,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+
+ # Check if ssl_channel_credentials is not provided, then client_cert_source_for_mtls
+ # is used.
+ with mock.patch.object(transport_class, "create_channel", return_value=mock.Mock()):
+ with mock.patch("grpc.ssl_channel_credentials") as mock_ssl_cred:
+ transport_class(
+ credentials=cred,
+ client_cert_source_for_mtls=client_cert_source_callback,
+ )
+ expected_cert, expected_key = client_cert_source_callback()
+ mock_ssl_cred.assert_called_once_with(
+ certificate_chain=expected_cert, private_key=expected_key
+ )
+
+
+def test_transcoder_service_host_no_port():
+ client = TranscoderServiceClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ client_options=client_options.ClientOptions(
+ api_endpoint="transcoder.googleapis.com"
+ ),
+ )
+ assert client.transport._host == "transcoder.googleapis.com:443"
+
+
+def test_transcoder_service_host_with_port():
+ client = TranscoderServiceClient(
+ credentials=ga_credentials.AnonymousCredentials(),
+ client_options=client_options.ClientOptions(
+ api_endpoint="transcoder.googleapis.com:8000"
+ ),
+ )
+ assert client.transport._host == "transcoder.googleapis.com:8000"
+
+
+def test_transcoder_service_grpc_transport_channel():
+ channel = grpc.secure_channel("http://localhost/", grpc.local_channel_credentials())
+
+ # Check that channel is used if provided.
+ transport = transports.TranscoderServiceGrpcTransport(
+ host="squid.clam.whelk", channel=channel,
+ )
+ assert transport.grpc_channel == channel
+ assert transport._host == "squid.clam.whelk:443"
+ assert transport._ssl_channel_credentials is None
+
+
+def test_transcoder_service_grpc_asyncio_transport_channel():
+ channel = aio.secure_channel("http://localhost/", grpc.local_channel_credentials())
+
+ # Check that channel is used if provided.
+ transport = transports.TranscoderServiceGrpcAsyncIOTransport(
+ host="squid.clam.whelk", channel=channel,
+ )
+ assert transport.grpc_channel == channel
+ assert transport._host == "squid.clam.whelk:443"
+ assert transport._ssl_channel_credentials is None
+
+
+# Remove this test when deprecated arguments (api_mtls_endpoint, client_cert_source) are
+# removed from grpc/grpc_asyncio transport constructor.
+@pytest.mark.parametrize(
+ "transport_class",
+ [
+ transports.TranscoderServiceGrpcTransport,
+ transports.TranscoderServiceGrpcAsyncIOTransport,
+ ],
+)
+def test_transcoder_service_transport_channel_mtls_with_client_cert_source(
+ transport_class,
+):
+ with mock.patch(
+ "grpc.ssl_channel_credentials", autospec=True
+ ) as grpc_ssl_channel_cred:
+ with mock.patch.object(
+ transport_class, "create_channel"
+ ) as grpc_create_channel:
+ mock_ssl_cred = mock.Mock()
+ grpc_ssl_channel_cred.return_value = mock_ssl_cred
+
+ mock_grpc_channel = mock.Mock()
+ grpc_create_channel.return_value = mock_grpc_channel
+
+ cred = ga_credentials.AnonymousCredentials()
+ with pytest.warns(DeprecationWarning):
+ with mock.patch.object(google.auth, "default") as adc:
+ adc.return_value = (cred, None)
+ transport = transport_class(
+ host="squid.clam.whelk",
+ api_mtls_endpoint="mtls.squid.clam.whelk",
+ client_cert_source=client_cert_source_callback,
+ )
+ adc.assert_called_once()
+
+ grpc_ssl_channel_cred.assert_called_once_with(
+ certificate_chain=b"cert bytes", private_key=b"key bytes"
+ )
+ grpc_create_channel.assert_called_once_with(
+ "mtls.squid.clam.whelk:443",
+ credentials=cred,
+ credentials_file=None,
+ scopes=None,
+ ssl_credentials=mock_ssl_cred,
+ quota_project_id=None,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+ assert transport.grpc_channel == mock_grpc_channel
+ assert transport._ssl_channel_credentials == mock_ssl_cred
+
+
+# Remove this test when deprecated arguments (api_mtls_endpoint, client_cert_source) are
+# removed from grpc/grpc_asyncio transport constructor.
+@pytest.mark.parametrize(
+ "transport_class",
+ [
+ transports.TranscoderServiceGrpcTransport,
+ transports.TranscoderServiceGrpcAsyncIOTransport,
+ ],
+)
+def test_transcoder_service_transport_channel_mtls_with_adc(transport_class):
+ mock_ssl_cred = mock.Mock()
+ with mock.patch.multiple(
+ "google.auth.transport.grpc.SslCredentials",
+ __init__=mock.Mock(return_value=None),
+ ssl_credentials=mock.PropertyMock(return_value=mock_ssl_cred),
+ ):
+ with mock.patch.object(
+ transport_class, "create_channel"
+ ) as grpc_create_channel:
+ mock_grpc_channel = mock.Mock()
+ grpc_create_channel.return_value = mock_grpc_channel
+ mock_cred = mock.Mock()
+
+ with pytest.warns(DeprecationWarning):
+ transport = transport_class(
+ host="squid.clam.whelk",
+ credentials=mock_cred,
+ api_mtls_endpoint="mtls.squid.clam.whelk",
+ client_cert_source=None,
+ )
+
+ grpc_create_channel.assert_called_once_with(
+ "mtls.squid.clam.whelk:443",
+ credentials=mock_cred,
+ credentials_file=None,
+ scopes=None,
+ ssl_credentials=mock_ssl_cred,
+ quota_project_id=None,
+ options=[
+ ("grpc.max_send_message_length", -1),
+ ("grpc.max_receive_message_length", -1),
+ ],
+ )
+ assert transport.grpc_channel == mock_grpc_channel
+
+
+def test_job_path():
+ project = "squid"
+ location = "clam"
+ job = "whelk"
+ expected = "projects/{project}/locations/{location}/jobs/{job}".format(
+ project=project, location=location, job=job,
+ )
+ actual = TranscoderServiceClient.job_path(project, location, job)
+ assert expected == actual
+
+
+def test_parse_job_path():
+ expected = {
+ "project": "octopus",
+ "location": "oyster",
+ "job": "nudibranch",
+ }
+ path = TranscoderServiceClient.job_path(**expected)
+
+ # Check that the path construction is reversible.
+ actual = TranscoderServiceClient.parse_job_path(path)
+ assert expected == actual
+
+
+def test_job_template_path():
+ project = "cuttlefish"
+ location = "mussel"
+ job_template = "winkle"
+ expected = "projects/{project}/locations/{location}/jobTemplates/{job_template}".format(
+ project=project, location=location, job_template=job_template,
+ )
+ actual = TranscoderServiceClient.job_template_path(project, location, job_template)
+ assert expected == actual
+
+
+def test_parse_job_template_path():
+ expected = {
+ "project": "nautilus",
+ "location": "scallop",
+ "job_template": "abalone",
+ }
+ path = TranscoderServiceClient.job_template_path(**expected)
+
+ # Check that the path construction is reversible.
+ actual = TranscoderServiceClient.parse_job_template_path(path)
+ assert expected == actual
+
+
+def test_common_billing_account_path():
+ billing_account = "squid"
+ expected = "billingAccounts/{billing_account}".format(
+ billing_account=billing_account,
+ )
+ actual = TranscoderServiceClient.common_billing_account_path(billing_account)
+ assert expected == actual
+
+
+def test_parse_common_billing_account_path():
+ expected = {
+ "billing_account": "clam",
+ }
+ path = TranscoderServiceClient.common_billing_account_path(**expected)
+
+ # Check that the path construction is reversible.
+ actual = TranscoderServiceClient.parse_common_billing_account_path(path)
+ assert expected == actual
+
+
+def test_common_folder_path():
+ folder = "whelk"
+ expected = "folders/{folder}".format(folder=folder,)
+ actual = TranscoderServiceClient.common_folder_path(folder)
+ assert expected == actual
+
+
+def test_parse_common_folder_path():
+ expected = {
+ "folder": "octopus",
+ }
+ path = TranscoderServiceClient.common_folder_path(**expected)
+
+ # Check that the path construction is reversible.
+ actual = TranscoderServiceClient.parse_common_folder_path(path)
+ assert expected == actual
+
+
+def test_common_organization_path():
+ organization = "oyster"
+ expected = "organizations/{organization}".format(organization=organization,)
+ actual = TranscoderServiceClient.common_organization_path(organization)
+ assert expected == actual
+
+
+def test_parse_common_organization_path():
+ expected = {
+ "organization": "nudibranch",
+ }
+ path = TranscoderServiceClient.common_organization_path(**expected)
+
+ # Check that the path construction is reversible.
+ actual = TranscoderServiceClient.parse_common_organization_path(path)
+ assert expected == actual
+
+
+def test_common_project_path():
+ project = "cuttlefish"
+ expected = "projects/{project}".format(project=project,)
+ actual = TranscoderServiceClient.common_project_path(project)
+ assert expected == actual
+
+
+def test_parse_common_project_path():
+ expected = {
+ "project": "mussel",
+ }
+ path = TranscoderServiceClient.common_project_path(**expected)
+
+ # Check that the path construction is reversible.
+ actual = TranscoderServiceClient.parse_common_project_path(path)
+ assert expected == actual
+
+
+def test_common_location_path():
+ project = "winkle"
+ location = "nautilus"
+ expected = "projects/{project}/locations/{location}".format(
+ project=project, location=location,
+ )
+ actual = TranscoderServiceClient.common_location_path(project, location)
+ assert expected == actual
+
+
+def test_parse_common_location_path():
+ expected = {
+ "project": "scallop",
+ "location": "abalone",
+ }
+ path = TranscoderServiceClient.common_location_path(**expected)
+
+ # Check that the path construction is reversible.
+ actual = TranscoderServiceClient.parse_common_location_path(path)
+ assert expected == actual
+
+
+def test_client_with_default_client_info():
+ client_info = gapic_v1.client_info.ClientInfo()
+
+ with mock.patch.object(
+ transports.TranscoderServiceTransport, "_prep_wrapped_messages"
+ ) as prep:
+ client = TranscoderServiceClient(
+ credentials=ga_credentials.AnonymousCredentials(), client_info=client_info,
+ )
+ prep.assert_called_once_with(client_info)
+
+ with mock.patch.object(
+ transports.TranscoderServiceTransport, "_prep_wrapped_messages"
+ ) as prep:
+ transport_class = TranscoderServiceClient.get_transport_class()
+ transport = transport_class(
+ credentials=ga_credentials.AnonymousCredentials(), client_info=client_info,
+ )
+ prep.assert_called_once_with(client_info)
From 86cd3f1c61a649d4c2810a730d7e141e1600fe68 Mon Sep 17 00:00:00 2001
From: "release-please[bot]"
<55107282+release-please[bot]@users.noreply.github.com>
Date: Mon, 12 Jul 2021 23:26:33 +0000
Subject: [PATCH 9/9] chore: release 0.4.0 (#63)
:robot: I have created a release \*beep\* \*boop\*
---
## [0.4.0](https://www.github.com/googleapis/python-video-transcoder/compare/v0.3.1...v0.4.0) (2021-07-09)
### Features
* add always_use_jwt_access ([#62](https://www.github.com/googleapis/python-video-transcoder/issues/62)) ([d43c40e](https://www.github.com/googleapis/python-video-transcoder/commit/d43c40e9ab80c42afd25efa1c2980d23dbc50ce2))
* Add Transcoder V1 ([#67](https://www.github.com/googleapis/python-video-transcoder/issues/67)) ([721d28e](https://www.github.com/googleapis/python-video-transcoder/commit/721d28ec565bfdb41a195167a989baf042ede228))
### Bug Fixes
* disable always_use_jwt_access ([#66](https://www.github.com/googleapis/python-video-transcoder/issues/66)) ([98d8b86](https://www.github.com/googleapis/python-video-transcoder/commit/98d8b860227a9b9a8b4cecc851ec547d7789ac66))
### Documentation
* omit mention of Python 2.7 in 'CONTRIBUTING.rst' ([#1127](https://www.github.com/googleapis/python-video-transcoder/issues/1127)) ([#58](https://www.github.com/googleapis/python-video-transcoder/issues/58)) ([1659ce8](https://www.github.com/googleapis/python-video-transcoder/commit/1659ce88ef94139a271be9719a4adaf4e3a600c0)), closes [#1126](https://www.github.com/googleapis/python-video-transcoder/issues/1126)
---
This PR was generated with [Release Please](https://github.com/googleapis/release-please). See [documentation](https://github.com/googleapis/release-please#release-please).
---
CHANGELOG.md | 18 ++++++++++++++++++
setup.py | 2 +-
2 files changed, 19 insertions(+), 1 deletion(-)
diff --git a/CHANGELOG.md b/CHANGELOG.md
index e916b90..6e22c73 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,5 +1,23 @@
# Changelog
+## [0.4.0](https://www.github.com/googleapis/python-video-transcoder/compare/v0.3.1...v0.4.0) (2021-07-09)
+
+
+### Features
+
+* add always_use_jwt_access ([#62](https://www.github.com/googleapis/python-video-transcoder/issues/62)) ([d43c40e](https://www.github.com/googleapis/python-video-transcoder/commit/d43c40e9ab80c42afd25efa1c2980d23dbc50ce2))
+* Add Transcoder V1 ([#67](https://www.github.com/googleapis/python-video-transcoder/issues/67)) ([721d28e](https://www.github.com/googleapis/python-video-transcoder/commit/721d28ec565bfdb41a195167a989baf042ede228))
+
+
+### Bug Fixes
+
+* disable always_use_jwt_access ([#66](https://www.github.com/googleapis/python-video-transcoder/issues/66)) ([98d8b86](https://www.github.com/googleapis/python-video-transcoder/commit/98d8b860227a9b9a8b4cecc851ec547d7789ac66))
+
+
+### Documentation
+
+* omit mention of Python 2.7 in 'CONTRIBUTING.rst' ([#1127](https://www.github.com/googleapis/python-video-transcoder/issues/1127)) ([#58](https://www.github.com/googleapis/python-video-transcoder/issues/58)) ([1659ce8](https://www.github.com/googleapis/python-video-transcoder/commit/1659ce88ef94139a271be9719a4adaf4e3a600c0)), closes [#1126](https://www.github.com/googleapis/python-video-transcoder/issues/1126)
+
### [0.3.1](https://www.github.com/googleapis/python-video-transcoder/compare/v0.3.0...v0.3.1) (2021-05-28)
diff --git a/setup.py b/setup.py
index 83242bc..f9a5501 100644
--- a/setup.py
+++ b/setup.py
@@ -19,7 +19,7 @@
import os
import setuptools # type: ignore
-version = "0.3.1"
+version = "0.4.0"
package_root = os.path.abspath(os.path.dirname(__file__))