Commit c3ceeec
feat: Add samples for STS GAPIC client (GoogleCloudPlatform#7091)
* feat: Add samples for GAPIC client
* refactor(storagetransfer): Update region tags
* refactor(storagetransfer)!: Unify `storage-transfer` and `storage/transfer-service` as `storagetransfer`
* refactor(storagetransfer): Update region tags for apiary samples (removes legacy `all` and `main` tags)
* style(storagetransfer): Rename new sample files to match existing names (keeps things a bit more uniform)
* docs(storagetransfer): Update copyright info for migrated files
* docs(storagetransfer): Update documentation for clarity
* docs(storagetransfer): Make `header-check` happy
* refactor(storagetransfer): Streamline quickstart
* refactor(storagetransfer): Updates for `Python Sample Authoring Guide` conformance
* refactor(storagetransfer): Repair apiary import
* chore(storagetransfer): Fix copyright year
* test(storagetransfer): Disable `enforce_type_hints` (this folder has old samples that we don't want to heavily modify)
* feat(storagetransfer): Add GAPIC retry sample
* refactor(storagetransfer): Clean up `Duration` import
* test(storagetransfer): Split up existing test file and add tests for GAPIC
* fix(storagetransfer): Correctly pass `retry` parameter
* feat(storagetransfer): Add create-client samples
* feat(storagetransfer): Add GAPIC sample for `storagetransfer_transfer_check`
* refactor(storagetransfer): Get more info from `get_operation`
* refactor(storagetransfer): Use conventional style for argument flags
* test(storagetransfer): Improve test reliability with `backoff`
* test(storagetransfer): Clean up transfer jobs post-test
* test(storagetransfer): Add tests for `transfer_check`
* refactor(storagetransfer): Add `description` to match apiary client
* test(storagetransfer): Add tests for `nearline_request`
* feat(storagetransfer): Improve `TransferOperation` samples by deserializing the payload
* docs(storagetransfer): Clarifications
* style(storagetransfer): Line break (80-char limit)
* test(storagetransfer): Source AWS secrets if available
* docs(storagetransfer): Update README input (updated for consistency, although not necessarily used)
* test(storagetransfer): Prepare and use shared, dynamic setup (use shared config, dynamically prepare buckets)
* test(storagetransfer): Add simple tests for create client
* refactor(storagetransfer): Add description to AWS request sample
* test(storagetransfer): Move `job_description` to shared test utility
* test(storagetransfer): Add tests for S3 -> GCS
* fix(storagetransfer): Correct `storagetransfer_create_transfer_client` region tag
* fix(storagetransfer): `BEGIN` -> `START`
* fix(storagetransfer): `BEGIN` -> `START`
* docs(storagetransfer): Typo
* feat(storagetransfer): Add `cloud-storage-dpes` and `python-samples-owners` to CODEOWNERS
* fix(storagetransfer): Correct shebang on apiary samples
* docs: Add comment for AWS secrets
* docs(storagetransfer): Fix product URL for README input
* refactor(storagetransfer): Remove generic `Exception` catch
* test(storagetransfer): Remove unnecessary `noxfile.py`
* docs(storagetransfer): Improve `create_client` test documentation
* docs(storagetransfer): Improve documentation on `aws-secrets.sh`
* docs(storagetransfer): Documentation updates

Co-authored-by: JesseLovelace <43148100+JesseLovelace@users.noreply.github.com>
1 parent 3ca114f, commit c3ceeec

31 files changed: 1316 additions & 148 deletions

.github/CODEOWNERS

Lines changed: 1 addition & 0 deletions
@@ -71,6 +71,7 @@
 /spanner/**/*.py @larkee @GoogleCloudPlatform/python-samples-owners
 /speech/**/*.py @telpirion @GoogleCloudPlatform/python-samples-owners
 /storage/**/*.py @GoogleCloudPlatform/cloud-storage-dpes @GoogleCloudPlatform/python-samples-owners
+/storagetransfer/**/*.py @GoogleCloudPlatform/cloud-storage-dpes @GoogleCloudPlatform/python-samples-owners
 /tables/automl/**/*.py @telpirion @GoogleCloudPlatform/python-samples-owners
 /tasks/**/*.py @averikitsch @GoogleCloudPlatform/python-samples-owners
 /texttospeech/**/*.py @telpirion @GoogleCloudPlatform/python-samples-owners

.kokoro/tests/run_tests.sh

Lines changed: 6 additions & 0 deletions
@@ -96,6 +96,12 @@ fi
 source ./testing/test-env.sh
 export GOOGLE_APPLICATION_CREDENTIALS=$(pwd)/testing/service-account.json

+# Import secrets for AWS integration testing. This can be used for products
+# such as Storage Transfer Service.
+if [[ -f "${KOKORO_GFILE_DIR}/aws-secrets.sh" ]]; then
+    source "${KOKORO_GFILE_DIR}/aws-secrets.sh"
+fi
+
 # For cloud-run session, we activate the service account for gcloud sdk.
 gcloud auth activate-service-account \
     --key-file "${GOOGLE_APPLICATION_CREDENTIALS}"
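The `aws-secrets.sh` file itself is not part of this commit; it lives in the Kokoro secrets directory. As a rough sketch (the variable names and values below are placeholders and assumptions, not the actual secret file), it would simply export the AWS credentials that the Storage Transfer Service tests read from the environment:

```shell
#!/bin/bash
# Hypothetical contents of aws-secrets.sh. Everything here is a placeholder;
# the real file is provisioned out of band and never committed.
export AWS_ACCESS_KEY_ID="AKIA_EXAMPLE_KEY_ID"
export AWS_SECRET_ACCESS_KEY="example-secret-access-key"
```

Because `run_tests.sh` only sources the file when it exists, environments without AWS credentials still run the rest of the test suite unchanged.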

storage/transfer_service/README.md

Lines changed: 3 additions & 0 deletions
@@ -0,0 +1,3 @@
+These samples have been moved.
+
+https://github.com/GoogleCloudPlatform/python-docs-samples/tree/main/storagetransfer

storage/transfer_service/README.rst.in

Lines changed: 0 additions & 30 deletions
This file was deleted.

storage/transfer_service/requirements-test.txt

Lines changed: 0 additions & 1 deletion
This file was deleted.

storage/transfer_service/sts_snippets_test.py

Lines changed: 0 additions & 89 deletions
This file was deleted.
File renamed without changes.

storagetransfer/README.rst.in

Lines changed: 42 additions & 0 deletions
@@ -0,0 +1,42 @@
+# This file is used to generate README.rst
+
+product:
+  name: Storage Transfer Service
+  short_name: Storage Transfer Service
+  url: https://cloud.google.com/storage-transfer-service/
+  description: >
+    `Storage Transfer Service`_: secure, low-cost
+    services for transferring data from cloud or on-premises sources.
+
+description: >
+  These samples demonstrate how to transfer data between Google Cloud Storage
+  and other storage systems.
+
+setup:
+- auth
+- install_deps
+
+samples:
+- name: Quickstart
+  file: quickstart.py
+  show_help: true
+- name: Create STS Client
+  file: create_client.py
+  show_help: true
+- name: Transfer to GCS Nearline
+  file: nearline_request.py
+  show_help: true
+- name: Transfer from AWS
+  file: aws_request.py
+  show_help: true
+- name: Check transfer status
+  file: transfer_check.py
+  show_help: true
+- name: Check Latest Transfer Operation
+  file: check_latest_transfer_operation.py
+  show_help: true
+- name: Get Transfer Job with Retries
+  file: get_transfer_job_with_retries.py
+  show_help: true
+
+folder: storagetransfer
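The `.rst.in` file above is consumed by the repository's README generator. Purely as an illustration (the snippet below is not the actual generator, and it inlines only a subset of the file), the `samples:` inventory can be recovered with stdlib tools:

```python
import re

# A subset of storagetransfer/README.rst.in, inlined for illustration.
readme_in = """\
samples:
- name: Quickstart
  file: quickstart.py
  show_help: true
- name: Transfer from AWS
  file: aws_request.py
  show_help: true
"""

# Pair each sample name with its file (naive line-based scan, not real YAML
# parsing; fine for this fixed two-entry example).
names = re.findall(r'^- name: (.+)$', readme_in, flags=re.M)
files = re.findall(r'^  file: (.+)$', readme_in, flags=re.M)
inventory = dict(zip(names, files))
print(inventory)  # → {'Quickstart': 'quickstart.py', 'Transfer from AWS': 'aws_request.py'}
```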

storagetransfer/aws_request.py

Lines changed: 126 additions & 0 deletions
@@ -0,0 +1,126 @@
+#!/usr/bin/env python
+
+# Copyright 2021 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""
+Command-line sample that creates a one-time transfer from Amazon S3 to
+Google Cloud Storage.
+"""
+
+
+import argparse
+
+# [START storagetransfer_transfer_from_aws]
+from datetime import datetime
+
+from google.cloud import storage_transfer
+
+
+def create_one_time_aws_transfer(
+        project_id: str, description: str,
+        source_bucket: str, aws_access_key_id: str,
+        aws_secret_access_key: str, sink_bucket: str):
+    """Creates a one-time transfer job from Amazon S3 to Google Cloud
+    Storage."""
+
+    client = storage_transfer.StorageTransferServiceClient()
+
+    # The ID of the Google Cloud Platform Project that owns the job
+    # project_id = 'my-project-id'
+
+    # A useful description for your transfer job
+    # description = 'My transfer job'
+
+    # AWS S3 source bucket name
+    # source_bucket = 'my-s3-source-bucket'
+
+    # AWS Access Key ID
+    # aws_access_key_id = 'AKIA...'
+
+    # AWS Secret Access Key
+    # aws_secret_access_key = 'HEAoMK2.../...ku8'
+
+    # Google Cloud Storage destination bucket name
+    # sink_bucket = 'my-gcs-destination-bucket'
+
+    now = datetime.utcnow()
+    # Setting the start date and the end date as
+    # the same time creates a one-time transfer
+    one_time_schedule = {
+        'day': now.day,
+        'month': now.month,
+        'year': now.year
+    }
+
+    transfer_job_request = storage_transfer.CreateTransferJobRequest({
+        'transfer_job': {
+            'project_id': project_id,
+            'description': description,
+            'status': storage_transfer.TransferJob.Status.ENABLED,
+            'schedule': {
+                'schedule_start_date': one_time_schedule,
+                'schedule_end_date': one_time_schedule
+            },
+            'transfer_spec': {
+                'aws_s3_data_source': {
+                    'bucket_name': source_bucket,
+                    'aws_access_key': {
+                        'access_key_id': aws_access_key_id,
+                        'secret_access_key': aws_secret_access_key,
+                    }
+                },
+                'gcs_data_sink': {
+                    'bucket_name': sink_bucket,
+                }
+            }
+        }
+    })
+
+    result = client.create_transfer_job(transfer_job_request)
+    print(f'Created transferJob: {result.name}')
+
+
+# [END storagetransfer_transfer_from_aws]
+
+if __name__ == "__main__":
+    parser = argparse.ArgumentParser(description=__doc__)
+    parser.add_argument(
+        '--project-id',
+        help='The ID of the Google Cloud Platform Project that owns the job',
+        required=True)
+    parser.add_argument(
+        '--description',
+        help='A useful description for your transfer job',
+        default='My transfer job')
+    parser.add_argument(
+        '--source-bucket',
+        help='AWS S3 source bucket name',
+        required=True)
+    parser.add_argument(
+        '--aws-access-key-id',
+        help='AWS access key ID',
+        required=True)
+    parser.add_argument(
+        '--aws-secret-access-key',
+        help='AWS secret access key',
+        required=True)
+    parser.add_argument(
+        '--sink-bucket',
+        help='Google Cloud Storage destination bucket name',
+        required=True)
+
+    args = parser.parse_args()
+
+    create_one_time_aws_transfer(**vars(args))
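The one-time scheduling trick in the sample above (identical `schedule_start_date` and `schedule_end_date`) can be checked in isolation. This sketch uses only the stdlib; the helper name and the fixed date are illustrative, not part of the sample:

```python
from datetime import datetime, timezone


def one_time_schedule(when: datetime) -> dict:
    """Build the dict form of a Storage Transfer Service Date message."""
    return {'day': when.day, 'month': when.month, 'year': when.year}


# Reusing one date for both fields is what makes the job run exactly once.
when = datetime(2021, 6, 15, tzinfo=timezone.utc)
schedule = {
    'schedule_start_date': one_time_schedule(when),
    'schedule_end_date': one_time_schedule(when),
}
print(schedule['schedule_start_date'])  # → {'day': 15, 'month': 6, 'year': 2021}
```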
Lines changed: 3 additions & 5 deletions
@@ -1,6 +1,6 @@
 #!/usr/bin/env python

-# Copyright 2015, Google, Inc.
+# Copyright 2015 Google LLC
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
 # You may obtain a copy of the License at
@@ -13,7 +13,6 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-# [START all]
 """Command-line sample that creates a one-time transfer from Amazon S3 to
 Google Cloud Storage.
@@ -31,7 +30,7 @@
 import googleapiclient.discovery


-# [START main]
+# [START storagetransfer_transfer_from_aws_apiary]
 def main(description, project_id, start_date, start_time, source_bucket,
          access_key_id, secret_access_key, sink_bucket):
     """Create a one-time transfer from Amazon S3 to Google Cloud Storage."""
@@ -76,7 +75,7 @@ def main(description, project_id, start_date, start_time, source_bucket,
     result = storagetransfer.transferJobs().create(body=transfer_job).execute()
     print('Returned transferJob: {}'.format(
         json.dumps(result, indent=4)))
-# [END main]
+# [END storagetransfer_transfer_from_aws_apiary]


 if __name__ == '__main__':
@@ -108,4 +107,3 @@ def main(description, project_id, start_date, start_time, source_bucket,
         args.access_key_id,
         args.secret_access_key,
         args.sink_bucket)
-# [END all]
