
Commit 1f8a79d

Merge pull request googleapis#2657 from dhermes/docstring-doctest-prework

Removing explicit (and implicit) doctest blocks from Sphinx docs.

2 parents c56e6e7 + 372290f

14 files changed: 241 additions and 152 deletions.
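The conversion in this commit is mechanical: explicit `.. doctest::` directives become `.. code-block:: python`, and implicit doctest blocks (a lead-in ending in `::` followed by a literal block of `>>>` lines) become a lead-in ending in `:` plus an explicit code-block. A minimal sketch of the explicit-directive half, assuming a simple line-based rewrite (the helper name and regex are illustrative, not taken from the commit):

```python
import re

def convert_doctest_directives(rst_text):
    """Replace explicit doctest directives with plain code blocks.

    Sphinx's doctest builder (sphinx.ext.doctest) collects and runs
    ``.. doctest::`` blocks; a ``.. code-block:: python`` renders the
    same way but is not collected for execution.
    """
    return re.sub(r'^(\s*)\.\. doctest::\s*$',
                  r'\1.. code-block:: python',
                  rst_text,
                  flags=re.MULTILINE)

before = """Poll until the job is complete:

.. doctest::

   >>> import time
"""
after = convert_doctest_directives(before)
print(after)
```

The implicit-block half (rewriting a `::` lead-in into `:` plus a directive, then re-indenting the `>>>` lines) is what the docstring hunks below do by hand.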


datastore/google/cloud/datastore/client.py (21 additions, 15 deletions)

@@ -447,29 +447,35 @@ def query(self, **kwargs):
 
         Passes our ``project``.
 
-        Using query to search a datastore::
+        Using query to search a datastore:
 
-            >>> from google.cloud import datastore
-            >>> client = datastore.Client()
-            >>> query = client.query(kind='MyKind')
-            >>> query.add_filter('property', '=', 'val')
+        .. code-block:: python
+
+            >>> from google.cloud import datastore
+            >>> client = datastore.Client()
+            >>> query = client.query(kind='MyKind')
+            >>> query.add_filter('property', '=', 'val')
 
         Using the query iterator's
         :meth:`~google.cloud.datastore.query.Iterator.next_page` method:
 
-            >>> query_iter = query.fetch()
-            >>> entities, more_results, cursor = query_iter.next_page()
-            >>> entities
-            [<list of Entity unmarshalled from protobuf>]
-            >>> more_results
-            <boolean of more results>
-            >>> cursor
-            <string containing cursor where fetch stopped>
+        .. code-block:: python
+
+            >>> query_iter = query.fetch()
+            >>> entities, more_results, cursor = query_iter.next_page()
+            >>> entities
+            [<list of Entity unmarshalled from protobuf>]
+            >>> more_results
+            <boolean of more results>
+            >>> cursor
+            <string containing cursor where fetch stopped>
 
         Under the hood this is doing:
 
-            >>> connection.run_query('project', query.to_protobuf())
-            [<list of Entity Protobufs>], cursor, more_results, skipped_results
+        .. code-block:: python
+
+            >>> connection.run_query('project', query.to_protobuf())
+            [<list of Entity Protobufs>], cursor, more_results, skipped_results
 
         :type kwargs: dict
         :param kwargs: Parameters for initializing and instance of

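The `next_page` example above returns an `(entities, more_results, cursor)` triple. A toy pager over an in-memory list illustrates that contract (this is a sketch of the tuple shape only, not the real datastore iterator, and makes no API calls):

```python
class Iterator:
    """Toy pager mimicking the (entities, more_results, cursor) contract."""

    def __init__(self, items, page_size=2):
        self.items = items
        self.page_size = page_size
        self.cursor = 0

    def next_page(self):
        # Slice out the next page and advance the cursor.
        page = self.items[self.cursor:self.cursor + self.page_size]
        self.cursor += len(page)
        more_results = self.cursor < len(self.items)
        # The real cursor is an opaque string; a stringified offset stands in.
        return page, more_results, str(self.cursor)

it = Iterator(['a', 'b', 'c'])
entities, more_results, cursor = it.next_page()
print(entities, more_results, cursor)  # ['a', 'b'] True 2
```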
datastore/google/cloud/datastore/connection.py (11 additions, 7 deletions)

@@ -474,16 +474,20 @@ def lookup(self, project, key_pbs,
 
         as output). It is used under the hood in
         :meth:`Client.get() <.datastore.client.Client.get>`:
 
-            >>> from google.cloud import datastore
-            >>> client = datastore.Client(project='project')
-            >>> key = client.key('MyKind', 1234)
-            >>> client.get(key)
-            [<Entity object>]
+        .. code-block:: python
+
+            >>> from google.cloud import datastore
+            >>> client = datastore.Client(project='project')
+            >>> key = client.key('MyKind', 1234)
+            >>> client.get(key)
+            [<Entity object>]
 
         Using a :class:`Connection` directly:
 
-            >>> connection.lookup('project', [key.to_protobuf()])
-            [<Entity protobuf>]
+        .. code-block:: python
+
+            >>> connection.lookup('project', [key.to_protobuf()])
+            [<Entity protobuf>]
 
         :type project: str
         :param project: The project to look up the keys in.

datastore/google/cloud/datastore/entity.py (14 additions, 7 deletions)

@@ -37,7 +37,10 @@ class Entity(dict):
 
     This means you could take an existing entity and change the key
    to duplicate the object.
 
-    Use :func:`google.cloud.datastore.get` to retrieve an existing entity.
+    Use :meth:`~google.cloud.datastore.client.Client.get` to retrieve an
+    existing entity:
+
+    .. code-block:: python
 
        >>> from google.cloud import datastore
        >>> client = datastore.Client()

@@ -47,16 +50,20 @@ class Entity(dict):
 
    You can the set values on the entity just like you would on any
    other dictionary.
 
-        >>> entity['age'] = 20
-        >>> entity['name'] = 'JJ'
-        >>> entity
-        <Entity[{'kind': 'EntityKind', id: 1234}] {'age': 20, 'name': 'JJ'}>
+    .. code-block:: python
+
+        >>> entity['age'] = 20
+        >>> entity['name'] = 'JJ'
+        >>> entity
+        <Entity[{'kind': 'EntityKind', id: 1234}] {'age': 20, 'name': 'JJ'}>
 
    And you can convert an entity to a regular Python dictionary with the
    ``dict`` builtin:
 
-        >>> dict(entity)
-        {'age': 20, 'name': 'JJ'}
+    .. code-block:: python
+
+        >>> dict(entity)
+        {'age': 20, 'name': 'JJ'}
 
    .. note::

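The entity examples above lean on the fact that `Entity` subclasses `dict`. A minimal stand-in (not the real class, which also handles keys, protobufs, and `exclude_from_indexes`) shows why plain dictionary operations and the `dict` builtin just work:

```python
class Entity(dict):
    """Toy stand-in for google.cloud.datastore.entity.Entity:
    property access is plain dict access."""

    def __init__(self, key=None):
        super().__init__()
        self.key = key  # stored as an attribute, not a dict entry

entity = Entity(key=('EntityKind', 1234))
entity['age'] = 20
entity['name'] = 'JJ'

# Converting to a regular dict keeps the properties, drops the key attribute.
print(dict(entity))  # {'age': 20, 'name': 'JJ'}
```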
datastore/google/cloud/datastore/key.py (6 additions, 0 deletions)

@@ -25,20 +25,26 @@ class Key(object):
 
     To create a basic key:
 
+    .. code-block:: python
+
        >>> Key('EntityKind', 1234)
        <Key[{'kind': 'EntityKind', 'id': 1234}]>
        >>> Key('EntityKind', 'foo')
        <Key[{'kind': 'EntityKind', 'name': 'foo'}]>
 
    To create a key with a parent:
 
+    .. code-block:: python
+
        >>> Key('Parent', 'foo', 'Child', 1234)
        <Key[{'kind': 'Parent', 'name': 'foo'}, {'kind': 'Child', 'id': 1234}]>
        >>> Key('Child', 1234, parent=parent_key)
        <Key[{'kind': 'Parent', 'name': 'foo'}, {'kind': 'Child', 'id': 1234}]>
 
    To create a partial key:
 
+    .. code-block:: python
+
        >>> Key('Parent', 'foo', 'Child')
        <Key[{'kind': 'Parent', 'name': 'foo'}, {'kind': 'Child'}]>

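The flat-path convention in the `Key` examples (alternating kind and id/name, a trailing kind for a partial key, and a parent's path prepended) can be sketched with a toy class. This is an illustration of the path structure only, not the real implementation:

```python
class Key:
    """Toy sketch of the flat-path convention used by datastore keys."""

    def __init__(self, *path_args, parent=None):
        # A parent's path is prepended to this key's own path.
        prefix = list(parent.path) if parent is not None else []
        self.path = prefix + self._parse(path_args)

    @staticmethod
    def _parse(path_args):
        path = []
        for i in range(0, len(path_args), 2):
            element = {'kind': path_args[i]}
            if i + 1 < len(path_args):
                id_or_name = path_args[i + 1]
                # Integers are IDs, strings are names.
                key_name = 'id' if isinstance(id_or_name, int) else 'name'
                element[key_name] = id_or_name
            # An odd number of arguments leaves the last element partial.
            path.append(element)
        return path

print(Key('EntityKind', 1234).path)        # [{'kind': 'EntityKind', 'id': 1234}]
print(Key('Parent', 'foo', 'Child').path)  # [{'kind': 'Parent', 'name': 'foo'}, {'kind': 'Child'}]
```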
datastore/google/cloud/datastore/transaction.py (22 additions, 9 deletions)

@@ -25,22 +25,27 @@ class Transaction(Batch):
 
     For example, the following snippet of code will put the two ``save``
     operations (either ``insert`` or ``upsert``) into the same
-    mutation, and execute those within a transaction::
+    mutation, and execute those within a transaction:
+
+    .. code-block:: python
 
-        >>> from google.cloud import datastore
        >>> client = datastore.Client()
        >>> with client.transaction():
        ...     client.put_multi([entity1, entity2])
 
-    Because it derives from :class:`Batch <.datastore.batch.Batch>`,
-    :class:`Transaction` also provides :meth:`put` and :meth:`delete` methods::
+    Because it derives from :class:`~google.cloud.datastore.batch.Batch`,
+    :class:`Transaction` also provides :meth:`put` and :meth:`delete` methods:
+
+    .. code-block:: python
 
        >>> with client.transaction() as xact:
        ...     xact.put(entity1)
        ...     xact.delete(entity2.key)
 
    By default, the transaction is rolled back if the transaction block
-    exits with an error::
+    exits with an error:
+
+    .. code-block:: python
 
        >>> with client.transaction():
        ...     do_some_work()

@@ -49,9 +54,13 @@ class Transaction(Batch):
 
    If the transaction block exists without an exception, it will commit
    by default.
 
-    .. warning:: Inside a transaction, automatically assigned IDs for
+    .. warning::
+
+        Inside a transaction, automatically assigned IDs for
        entities will not be available at save time! That means, if you
-        try::
+        try:
+
+        .. code-block:: python
 
           >>> with client.transaction():
          ...     entity = datastore.Entity(key=client.key('Thing'))

@@ -61,7 +70,9 @@ class Transaction(Batch):
 
    committed.
 
    Once you exit the transaction (or call :meth:`commit`), the
-    automatically generated ID will be assigned to the entity::
+    automatically generated ID will be assigned to the entity:
+
+    .. code-block:: python
 
        >>> with client.transaction():
        ...     entity = datastore.Entity(key=client.key('Thing'))

@@ -73,7 +84,9 @@ class Transaction(Batch):
 
    False
 
    If you don't want to use the context manager you can initialize a
-    transaction manually::
+    transaction manually:
+
+    .. code-block:: python
 
        >>> transaction = client.transaction()
        >>> transaction.begin()

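The commit-on-clean-exit, roll-back-on-error behavior documented above is the standard context-manager pattern. A self-contained sketch with a toy `Transaction` (not the datastore one, which issues API calls in `begin`, `commit`, and `rollback`):

```python
class Transaction:
    """Toy transaction: commits on clean exit, rolls back on exception."""

    def __init__(self):
        self.state = 'initial'

    def begin(self):
        self.state = 'in-progress'

    def commit(self):
        self.state = 'committed'

    def rollback(self):
        self.state = 'rolled back'

    def __enter__(self):
        self.begin()
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        if exc_type is None:
            self.commit()
        else:
            self.rollback()
        return False  # do not swallow the exception

with Transaction() as xact:
    pass
print(xact.state)  # committed

try:
    with Transaction() as xact:
        raise RuntimeError('boom')
except RuntimeError:
    pass
print(xact.state)  # rolled back
```

This also mirrors why manual use requires explicit `begin()`/`commit()` calls: outside the `with` block, nothing invokes `__enter__`/`__exit__` for you.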
docs/bigquery-usage.rst (16 additions, 16 deletions)

@@ -20,7 +20,7 @@ Authentication / Configuration
 :envvar:`GOOGLE_CLOUD_PROJECT` environment variables, create an instance of
 :class:`Client <google.cloud.bigquery.client.Client>`.
 
-.. doctest::
+.. code-block:: python
 
    >>> from google.cloud import bigquery
   >>> client = bigquery.Client()

@@ -39,7 +39,7 @@ To override the project inferred from the environment, pass an explicit
 ``project`` to the constructor, or to either of the alternative
 ``classmethod`` factories:
 
-.. doctest::
+.. code-block:: python
 
   >>> from google.cloud import bigquery
  >>> client = bigquery.Client(project='PROJECT_ID')

@@ -101,7 +101,7 @@ Patch metadata for a dataset:
 
 Replace the ACL for a dataset, and update all writeable fields:
 
-.. doctest::
+.. code-block:: python
 
   >>> from google.cloud import bigquery
  >>> client = bigquery.Client()

@@ -231,7 +231,7 @@ Querying data (asynchronous)
 
 Background a query, loading the results into a table:
 
-.. doctest::
+.. code-block:: python
 
   >>> from google.cloud import bigquery
  >>> client = bigquery.Client()

@@ -262,7 +262,7 @@ Background a query, loading the results into a table:
 
 Then, begin executing the job on the server:
 
-.. doctest::
+.. code-block:: python
 
   >>> job.begin() # API call
  >>> job.created

@@ -272,7 +272,7 @@ Then, begin executing the job on the server:
 
 Poll until the job is complete:
 
-.. doctest::
+.. code-block:: python
 
   >>> import time
  >>> retry_count = 100

@@ -287,7 +287,7 @@ Poll until the job is complete:
 
 Retrieve the results:
 
-.. doctest::
+.. code-block:: python
 
   >>> results = job.results()
  >>> rows, total_count, token = query.fetch_data() # API requet

@@ -306,7 +306,7 @@ Start a job loading data asynchronously from a set of CSV files, located on
 Google Cloud Storage, appending rows into an existing table. First, create
 the job locally:
 
-.. doctest::
+.. code-block:: python
 
   >>> from google.cloud import bigquery
  >>> from google.cloud.bigquery import SchemaField

@@ -337,7 +337,7 @@ the job locally:
 
 Then, begin executing the job on the server:
 
-.. doctest::
+.. code-block:: python
 
   >>> job.begin() # API call
  >>> job.created

@@ -347,7 +347,7 @@ Then, begin executing the job on the server:
 
 Poll until the job is complete:
 
-.. doctest::
+.. code-block:: python
 
   >>> import time
  >>> retry_count = 100

@@ -367,7 +367,7 @@ Exporting data (async)
 Start a job exporting a table's data asynchronously to a set of CSV files,
 located on Google Cloud Storage. First, create the job locally:
 
-.. doctest::
+.. code-block:: python
 
   >>> from google.cloud import bigquery
  >>> client = bigquery.Client()

@@ -395,7 +395,7 @@ located on Google Cloud Storage. First, create the job locally:
 
 Then, begin executing the job on the server:
 
-.. doctest::
+.. code-block:: python
 
   >>> job.begin() # API call
  >>> job.created

@@ -405,7 +405,7 @@ Then, begin executing the job on the server:
 
 Poll until the job is complete:
 
-.. doctest::
+.. code-block:: python
 
   >>> import time
  >>> retry_count = 100

@@ -424,7 +424,7 @@ Copy tables (async)
 
 First, create the job locally:
 
-.. doctest::
+.. code-block:: python
 
   >>> from google.cloud import bigquery
  >>> client = bigquery.Client()

@@ -449,7 +449,7 @@ First, create the job locally:
 
 Then, begin executing the job on the server:
 
-.. doctest::
+.. code-block:: python
 
   >>> job.begin() # API call
  >>> job.created

@@ -459,7 +459,7 @@ Then, begin executing the job on the server:
 
 Poll until the job is complete:
 
-.. doctest::
+.. code-block:: python
 
   >>> import time
  >>> retry_count = 100

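Several of the bigquery hunks above repeat the same poll-until-done loop (`retry_count = 100`, sleep, re-check). Extracted as a standalone sketch with a stub job object (the `reload`/`state` names follow the docs' pattern, but the job here is a fake, so no API is touched):

```python
import time

def wait_for_job(job, retry_count=100, delay=1):
    """Poll a job until it leaves the 'RUNNING' state or retries run out."""
    while retry_count > 0 and job.state == 'RUNNING':
        retry_count -= 1
        time.sleep(delay)
        job.reload()  # an API call in the real client

    return job.state

class FakeJob:
    """Stub job that finishes after a fixed number of reloads."""

    def __init__(self, ticks):
        self.ticks = ticks
        self.state = 'RUNNING'

    def reload(self):
        self.ticks -= 1
        if self.ticks <= 0:
            self.state = 'DONE'

print(wait_for_job(FakeJob(ticks=3), delay=0))  # DONE
```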