Commit 372290f
Removing explicit doctest blocks from Sphinx docs.
This way we can gradually turn them on with `doctest` and make sure they work piece by piece. Also converted some implicit code blocks (`::`) and some implicit doctest blocks (`:` followed by `>>>`) into explicit code blocks.
1 parent 94aa18c commit 372290f
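The behavior this commit relies on can be checked with the stdlib `doctest` parser: bare `>>>` lines form an *implicit* doctest block that doctest-running tools pick up, whereas content moved under an explicit `.. code-block:: python` directive is plain literal text that Sphinx's doctest builder skips. A minimal, stdlib-only sketch:

```python
import doctest

# A docstring fragment shaped like the ones touched in this commit.
# The bare ``>>>`` lines form an implicit doctest block.
docstring = """
Using query to search a datastore:

>>> 1 + 1
2
"""

examples = doctest.DocTestParser().get_examples(docstring)
assert len(examples) == 1             # the parser found the implicit block
assert examples[0].source == "1 + 1\n"
assert examples[0].want == "2\n"
```

This is why converting `::` literal blocks to explicit `.. code-block:: python` directives lets the untested examples be excluded while `.. doctest::` blocks are re-enabled one by one.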

File tree: 14 files changed, +241 -152 lines

datastore/google/cloud/datastore/client.py
Lines changed: 21 additions & 15 deletions

@@ -447,29 +447,35 @@ def query(self, **kwargs):
 
         Passes our ``project``.
 
-        Using query to search a datastore::
+        Using query to search a datastore:
 
-        >>> from google.cloud import datastore
-        >>> client = datastore.Client()
-        >>> query = client.query(kind='MyKind')
-        >>> query.add_filter('property', '=', 'val')
+        .. code-block:: python
+
+            >>> from google.cloud import datastore
+            >>> client = datastore.Client()
+            >>> query = client.query(kind='MyKind')
+            >>> query.add_filter('property', '=', 'val')
 
         Using the query iterator's
         :meth:`~google.cloud.datastore.query.Iterator.next_page` method:
 
-        >>> query_iter = query.fetch()
-        >>> entities, more_results, cursor = query_iter.next_page()
-        >>> entities
-        [<list of Entity unmarshalled from protobuf>]
-        >>> more_results
-        <boolean of more results>
-        >>> cursor
-        <string containing cursor where fetch stopped>
+        .. code-block:: python
+
+            >>> query_iter = query.fetch()
+            >>> entities, more_results, cursor = query_iter.next_page()
+            >>> entities
+            [<list of Entity unmarshalled from protobuf>]
+            >>> more_results
+            <boolean of more results>
+            >>> cursor
+            <string containing cursor where fetch stopped>
 
         Under the hood this is doing:
 
-        >>> connection.run_query('project', query.to_protobuf())
-        [<list of Entity Protobufs>], cursor, more_results, skipped_results
+        .. code-block:: python
+
+            >>> connection.run_query('project', query.to_protobuf())
+            [<list of Entity Protobufs>], cursor, more_results, skipped_results
 
         :type kwargs: dict
         :param kwargs: Parameters for initializing and instance of
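The edits in this file follow one mechanical pattern: drop a colon from the `::` lead-in and insert a `.. code-block:: python` directive before the doctest lines. A rough sketch of that transformation as a script — hypothetical (the commit itself was evidently hand-edited), and re-indenting the block body under the directive is deliberately left out:

```python
def explicit_code_blocks(docstring):
    """Rewrite ``Sentence::`` introducing a doctest block into a sentence
    followed by an explicit ``.. code-block:: python`` directive.

    A rough sketch: handles only the simple ``::`` + ``>>>`` shape and
    does not re-indent the block body under the new directive.
    """
    lines = docstring.splitlines()
    out = []
    for i, line in enumerate(lines):
        if line.rstrip().endswith("::") and not line.lstrip().startswith(".."):
            indent = line[: len(line) - len(line.lstrip())]
            # Only convert when the next non-blank line is a doctest prompt.
            nxt = next((l for l in lines[i + 1:] if l.strip()), "")
            if nxt.lstrip().startswith(">>>"):
                out.append(line.rstrip()[:-1])  # drop one trailing colon
                out.append("")
                out.append(indent + ".. code-block:: python")
                continue
        out.append(line)
    return "\n".join(out)

src = "Using query to search a datastore::\n\n>>> 1 + 1\n2"
res = explicit_code_blocks(src)
assert res.splitlines()[0] == "Using query to search a datastore:"
assert ".. code-block:: python" in res.splitlines()
```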

datastore/google/cloud/datastore/connection.py
Lines changed: 11 additions & 7 deletions

@@ -474,16 +474,20 @@ def lookup(self, project, key_pbs,
         as output). It is used under the hood in
         :meth:`Client.get() <.datastore.client.Client.get>`:
 
-        >>> from google.cloud import datastore
-        >>> client = datastore.Client(project='project')
-        >>> key = client.key('MyKind', 1234)
-        >>> client.get(key)
-        [<Entity object>]
+        .. code-block:: python
+
+            >>> from google.cloud import datastore
+            >>> client = datastore.Client(project='project')
+            >>> key = client.key('MyKind', 1234)
+            >>> client.get(key)
+            [<Entity object>]
 
         Using a :class:`Connection` directly:
 
-        >>> connection.lookup('project', [key.to_protobuf()])
-        [<Entity protobuf>]
+        .. code-block:: python
+
+            >>> connection.lookup('project', [key.to_protobuf()])
+            [<Entity protobuf>]
 
         :type project: str
         :param project: The project to look up the keys in.

datastore/google/cloud/datastore/entity.py
Lines changed: 14 additions & 7 deletions

@@ -37,7 +37,10 @@ class Entity(dict):
     This means you could take an existing entity and change the key
     to duplicate the object.
 
-    Use :func:`google.cloud.datastore.get` to retrieve an existing entity.
+    Use :meth:`~google.cloud.datastore.client.Client.get` to retrieve an
+    existing entity:
+
+    .. code-block:: python
 
     >>> from google.cloud import datastore
     >>> client = datastore.Client()
@@ -47,16 +50,20 @@ class Entity(dict):
     You can the set values on the entity just like you would on any
     other dictionary.
 
-    >>> entity['age'] = 20
-    >>> entity['name'] = 'JJ'
-    >>> entity
-    <Entity[{'kind': 'EntityKind', id: 1234}] {'age': 20, 'name': 'JJ'}>
+    .. code-block:: python
+
+        >>> entity['age'] = 20
+        >>> entity['name'] = 'JJ'
+        >>> entity
+        <Entity[{'kind': 'EntityKind', id: 1234}] {'age': 20, 'name': 'JJ'}>
 
     And you can convert an entity to a regular Python dictionary with the
     ``dict`` builtin:
 
-    >>> dict(entity)
-    {'age': 20, 'name': 'JJ'}
+    .. code-block:: python
+
+        >>> dict(entity)
+        {'age': 20, 'name': 'JJ'}
 
     .. note::
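The `dict`-like behavior in this docstring is ordinary `dict`-subclass behavior; a hypothetical minimal stand-in (not the library class) makes the shown semantics concrete:

```python
class Entity(dict):
    """Hypothetical stand-in for google.cloud.datastore.entity.Entity:
    a dict subclass that also carries a key, mirroring the docstring."""

    def __init__(self, key=None):
        super().__init__()
        self.key = key

entity = Entity(key=("EntityKind", 1234))
entity["age"] = 20          # item assignment works like any dict
entity["name"] = "JJ"
plain = dict(entity)        # the ``dict`` builtin strips the key
assert plain == {"age": 20, "name": "JJ"}
assert entity.key == ("EntityKind", 1234)
```

The design point the docstring makes — that the key lives on the object, not in the mapping — falls out directly: `dict(entity)` copies only the items.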

datastore/google/cloud/datastore/key.py
Lines changed: 6 additions & 0 deletions

@@ -25,20 +25,26 @@ class Key(object):
 
     To create a basic key:
 
+    .. code-block:: python
+
     >>> Key('EntityKind', 1234)
     <Key[{'kind': 'EntityKind', 'id': 1234}]>
     >>> Key('EntityKind', 'foo')
     <Key[{'kind': 'EntityKind', 'name': 'foo'}]>
 
     To create a key with a parent:
 
+    .. code-block:: python
+
     >>> Key('Parent', 'foo', 'Child', 1234)
     <Key[{'kind': 'Parent', 'name': 'foo'}, {'kind': 'Child', 'id': 1234}]>
     >>> Key('Child', 1234, parent=parent_key)
     <Key[{'kind': 'Parent', 'name': 'foo'}, {'kind': 'Child', 'id': 1234}]>
 
     To create a partial key:
 
+    .. code-block:: python
+
     >>> Key('Parent', 'foo', 'Child')
     <Key[{'kind': 'Parent', 'name': 'foo'}, {'kind': 'Child'}]>
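The flat-path convention in these examples — alternating kind and id/name, with a trailing kind making the key partial — can be sketched without the library (a hypothetical parser, not the `Key` implementation):

```python
def parse_flat_path(*path):
    """Hypothetical parser for the flat-path convention shown above:
    alternating kind / identifier pairs; a trailing kind with no
    identifier yields a partial key."""
    pairs = []
    i = 0
    while i < len(path):
        kind = path[i]
        ident = path[i + 1] if i + 1 < len(path) else None
        pairs.append((kind, ident))
        i += 2
    is_partial = pairs[-1][1] is None
    return pairs, is_partial

# Complete key: every kind has an id or name.
assert parse_flat_path("Parent", "foo", "Child", 1234) == (
    [("Parent", "foo"), ("Child", 1234)], False)

# Partial key: the trailing 'Child' has no identifier yet.
assert parse_flat_path("Parent", "foo", "Child") == (
    [("Parent", "foo"), ("Child", None)], True)
```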

datastore/google/cloud/datastore/transaction.py
Lines changed: 22 additions & 9 deletions

@@ -25,22 +25,27 @@ class Transaction(Batch):
 
     For example, the following snippet of code will put the two ``save``
     operations (either ``insert`` or ``upsert``) into the same
-    mutation, and execute those within a transaction::
+    mutation, and execute those within a transaction:
+
+    .. code-block:: python
 
-    >>> from google.cloud import datastore
     >>> client = datastore.Client()
     >>> with client.transaction():
     ...     client.put_multi([entity1, entity2])
 
-    Because it derives from :class:`Batch <.datastore.batch.Batch>`,
-    :class:`Transaction` also provides :meth:`put` and :meth:`delete` methods::
+    Because it derives from :class:`~google.cloud.datastore.batch.Batch`,
+    :class:`Transaction` also provides :meth:`put` and :meth:`delete` methods:
+
+    .. code-block:: python
 
     >>> with client.transaction() as xact:
     ...     xact.put(entity1)
     ...     xact.delete(entity2.key)
 
     By default, the transaction is rolled back if the transaction block
-    exits with an error::
+    exits with an error:
+
+    .. code-block:: python
 
     >>> with client.transaction():
     ...     do_some_work()
@@ -49,9 +54,13 @@ class Transaction(Batch):
     If the transaction block exists without an exception, it will commit
     by default.
 
-    .. warning:: Inside a transaction, automatically assigned IDs for
+    .. warning::
+
+        Inside a transaction, automatically assigned IDs for
         entities will not be available at save time! That means, if you
-        try::
+        try:
+
+        .. code-block:: python
 
         >>> with client.transaction():
         ...     entity = datastore.Entity(key=client.key('Thing'))
@@ -61,7 +70,9 @@ class Transaction(Batch):
         committed.
 
         Once you exit the transaction (or call :meth:`commit`), the
-        automatically generated ID will be assigned to the entity::
+        automatically generated ID will be assigned to the entity:
+
+        .. code-block:: python
 
         >>> with client.transaction():
         ...     entity = datastore.Entity(key=client.key('Thing'))
@@ -73,7 +84,9 @@ class Transaction(Batch):
     False
 
     If you don't want to use the context manager you can initialize a
-    transaction manually::
+    transaction manually:
+
+    .. code-block:: python
 
     >>> transaction = client.transaction()
     >>> transaction.begin()
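Gradually turning the converted blocks back on, as the commit message suggests, would go through Sphinx's doctest extension. A plausible `conf.py` fragment — the option names are from the Sphinx documentation, not from this repository:

```python
# conf.py (sketch): enable sphinx.ext.doctest so that explicit
# ``.. doctest::`` directives are executed by ``make doctest``.
extensions = [
    "sphinx.ext.doctest",
]

# Also run bare ``>>>`` blocks (reST "doctest blocks") as part of the
# named group; set this to "" to test only explicit ``.. doctest::``
# directives, which is what makes the piece-by-piece rollout possible.
doctest_test_doctest_blocks = "default"
```

With `doctest_test_doctest_blocks = ""`, only blocks deliberately marked `.. doctest::` run, so each docstring can be verified and re-enabled individually.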

docs/bigquery-usage.rst
Lines changed: 16 additions & 16 deletions

@@ -20,7 +20,7 @@ Authentication / Configuration
 :envvar:`GOOGLE_CLOUD_PROJECT` environment variables, create an instance of
 :class:`Client <google.cloud.bigquery.client.Client>`.
 
-.. doctest::
+.. code-block:: python
 
    >>> from google.cloud import bigquery
    >>> client = bigquery.Client()
@@ -39,7 +39,7 @@ To override the project inferred from the environment, pass an explicit
 ``project`` to the constructor, or to either of the alternative
 ``classmethod`` factories:
 
-.. doctest::
+.. code-block:: python
 
    >>> from google.cloud import bigquery
    >>> client = bigquery.Client(project='PROJECT_ID')
@@ -101,7 +101,7 @@ Patch metadata for a dataset:
 
 Replace the ACL for a dataset, and update all writeable fields:
 
-.. doctest::
+.. code-block:: python
 
    >>> from google.cloud import bigquery
    >>> client = bigquery.Client()
@@ -230,7 +230,7 @@ Querying data (asynchronous)
 
 Background a query, loading the results into a table:
 
-.. doctest::
+.. code-block:: python
 
    >>> from google.cloud import bigquery
    >>> client = bigquery.Client()
@@ -261,7 +261,7 @@ Background a query, loading the results into a table:
 
 Then, begin executing the job on the server:
 
-.. doctest::
+.. code-block:: python
 
    >>> job.begin() # API call
    >>> job.created
@@ -271,7 +271,7 @@ Then, begin executing the job on the server:
 
 Poll until the job is complete:
 
-.. doctest::
+.. code-block:: python
 
    >>> import time
    >>> retry_count = 100
@@ -286,7 +286,7 @@ Poll until the job is complete:
 
 Retrieve the results:
 
-.. doctest::
+.. code-block:: python
 
    >>> results = job.results()
   >>> rows, total_count, token = query.fetch_data() # API requet
@@ -305,7 +305,7 @@ Start a job loading data asynchronously from a set of CSV files, located on
 Google Cloud Storage, appending rows into an existing table. First, create
 the job locally:
 
-.. doctest::
+.. code-block:: python
 
    >>> from google.cloud import bigquery
    >>> from google.cloud.bigquery import SchemaField
@@ -336,7 +336,7 @@ the job locally:
 
 Then, begin executing the job on the server:
 
-.. doctest::
+.. code-block:: python
 
    >>> job.begin() # API call
    >>> job.created
@@ -346,7 +346,7 @@ Then, begin executing the job on the server:
 
 Poll until the job is complete:
 
-.. doctest::
+.. code-block:: python
 
    >>> import time
    >>> retry_count = 100
@@ -366,7 +366,7 @@ Exporting data (async)
 Start a job exporting a table's data asynchronously to a set of CSV files,
 located on Google Cloud Storage. First, create the job locally:
 
-.. doctest::
+.. code-block:: python
 
    >>> from google.cloud import bigquery
    >>> client = bigquery.Client()
@@ -394,7 +394,7 @@ located on Google Cloud Storage. First, create the job locally:
 
 Then, begin executing the job on the server:
 
-.. doctest::
+.. code-block:: python
 
    >>> job.begin() # API call
    >>> job.created
@@ -404,7 +404,7 @@ Then, begin executing the job on the server:
 
 Poll until the job is complete:
 
-.. doctest::
+.. code-block:: python
 
    >>> import time
    >>> retry_count = 100
@@ -423,7 +423,7 @@ Copy tables (async)
 
 First, create the job locally:
 
-.. doctest::
+.. code-block:: python
 
    >>> from google.cloud import bigquery
    >>> client = bigquery.Client()
@@ -448,7 +448,7 @@ First, create the job locally:
 
 Then, begin executing the job on the server:
 
-.. doctest::
+.. code-block:: python
 
    >>> job.begin() # API call
    >>> job.created
@@ -458,7 +458,7 @@ Then, begin executing the job on the server:
 
 Poll until the job is complete:
 
-.. doctest::
+.. code-block:: python
 
    >>> import time
    >>> retry_count = 100
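The poll-until-done loop repeated throughout these hunks is the standard pattern for BigQuery's asynchronous jobs. A self-contained sketch with a stubbed job object — `FakeJob` is hypothetical and only loosely mirrors the client-library job API, where `reload()` makes an API call to refresh `state`:

```python
import time

class FakeJob:
    """Stub standing in for a google.cloud.bigquery job: it reaches
    the 'done' state after a few reload() calls instead of hitting
    an API (names are illustrative, not the library's)."""

    def __init__(self, polls_until_done=3):
        self._polls_left = polls_until_done
        self.state = "running"

    def reload(self):
        # The real method refreshes job metadata via an API request.
        self._polls_left -= 1
        if self._polls_left <= 0:
            self.state = "done"

job = FakeJob()
retry_count = 100
while retry_count > 0 and job.state != "done":
    retry_count -= 1
    time.sleep(0)        # the docs sleep longer between real polls
    job.reload()         # refresh state

assert job.state == "done"
```

The retry bound matters: without `retry_count` the loop would spin forever on a job stuck in an error state that never reports `done`.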
