Commit ea50f55

Merge pull request #1905 from daspecster/add-logging-export-ownership-doc
Add docs, exporting logging to storage permissions.

2 parents 5c4db0d + 1e4405e

1 file changed: docs/logging-usage.rst (62 additions, 1 deletion)
Delete a metric:

.. doctest::

   >>> metric.exists()  # API call
   False

Export log entries using sinks
------------------------------
Sinks allow exporting entries which match a given filter to Cloud Storage
buckets, BigQuery datasets, or Cloud Pub/Sub topics.
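A sink's filter uses the Cloud Logging advanced filter syntax (e.g. comparisons on ``logName`` and ``severity``, joined with ``AND``). As a minimal sketch, in plain Python rather than the ``gcloud`` library, with the helper name chosen here for illustration, such a filter string might be composed like this:

```python
# Illustrative helper (not part of gcloud): build an advanced logs
# filter string from optional clauses, joined with AND as the
# Cloud Logging filter syntax expects.

def build_filter(log_name=None, min_severity=None):
    clauses = []
    if log_name is not None:
        clauses.append('logName = "%s"' % log_name)
    if min_severity is not None:
        clauses.append('severity >= %s' % min_severity)
    return ' AND '.join(clauses)

print(build_filter(min_severity='ERROR'))
# severity >= ERROR
```

A filter built this way can then be passed as the sink's filter when the sink is created.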
Export to Cloud Storage
~~~~~~~~~~~~~~~~~~~~~~~

Make sure that the storage bucket you want to export logs to has
`cloud-logs@google.com` as the owner. See `Set permission for writing exported logs`_.

Add `cloud-logs@google.com` as the owner of `my-bucket-name`:

.. doctest::

   >>> from gcloud import storage
   >>> client = storage.Client()
   >>> bucket = client.get_bucket('my-bucket-name')
   >>> bucket.acl.reload()
   >>> logs_group = bucket.acl.group('cloud-logs@google.com')
   >>> logs_group.grant_owner()
   >>> bucket.acl.add_entity(logs_group)
   >>> bucket.acl.save()

.. _Set permission for writing exported logs: https://cloud.google.com/logging/docs/export/configure_export#setting_product_name_short_permissions_for_writing_exported_logs
Export to BigQuery
~~~~~~~~~~~~~~~~~~

To export logs to BigQuery you must log into the Cloud Platform Console
and add `cloud-logs@google.com` to a dataset.

See: `Setting permissions for BigQuery`_

.. doctest::

   >>> from gcloud import bigquery
   >>> from gcloud.bigquery.dataset import AccessGrant
   >>> bigquery_client = bigquery.Client()
   >>> dataset = bigquery_client.dataset('my-dataset-name')
   >>> dataset.create()
   >>> dataset.reload()
   >>> grants = dataset.access_grants
   >>> grants.append(AccessGrant(
   ...     'WRITER', 'groupByEmail', 'cloud-logs@google.com'))
   >>> dataset.access_grants = grants
   >>> dataset.update()

.. _Setting permissions for BigQuery: https://cloud.google.com/logging/docs/export/configure_export#manual-access-bq
Export to Pub/Sub
~~~~~~~~~~~~~~~~~

To export logs to Pub/Sub you must log into the Cloud Platform Console
and add `cloud-logs@google.com` to a topic.

See: `Setting permissions for Pub/Sub`_

.. doctest::

   >>> from gcloud import pubsub
   >>> client = pubsub.Client()
   >>> topic = client.topic('your-topic-name')
   >>> policy = topic.get_iam_policy()
   >>> policy.owners.add(policy.group('cloud-logs@google.com'))
   >>> topic.set_iam_policy(policy)

.. _Setting permissions for Pub/Sub: https://cloud.google.com/logging/docs/export/configure_export#manual-access-pubsub
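Once permissions are in place, a sink is created with a destination string naming the export target. As a sketch in plain Python, where the helper functions are illustrative rather than library API, and the URI prefixes follow the Cloud Logging export documentation, the destination formats for the three target types look like this:

```python
# Illustrative helpers (not part of gcloud): format sink destination
# URIs for each supported export target, per the Cloud Logging
# export configuration docs.

def storage_destination(bucket_name):
    # Cloud Storage bucket destination
    return 'storage.googleapis.com/%s' % bucket_name

def bigquery_destination(project, dataset_name):
    # BigQuery dataset destination
    return 'bigquery.googleapis.com/projects/%s/datasets/%s' % (
        project, dataset_name)

def pubsub_destination(project, topic_name):
    # Cloud Pub/Sub topic destination
    return 'pubsub.googleapis.com/projects/%s/topics/%s' % (
        project, topic_name)

print(storage_destination('my-bucket-name'))
# storage.googleapis.com/my-bucket-name
```

The sink-creation examples that follow pass a destination in this form.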
Create a Cloud Storage sink:

.. doctest::