The source code for the library (and demo code) lives on GitHub. You can install the library quickly with pip:
$ pip install gcloud
In order to run the demo, you need to have registered an actual gcloud
project, so you'll need to provide some environment variables to facilitate
authentication to your project:

GCLOUD_TESTS_PROJECT_ID: Developers Console project ID (e.g. bamboo-shift-455).

GCLOUD_TESTS_DATASET_ID: The name of the dataset your tests connect to. This is typically the same as GCLOUD_TESTS_PROJECT_ID.

GOOGLE_APPLICATION_CREDENTIALS: The path to a JSON key file; see regression/app_credentials.json.sample as an example. Such a file can be downloaded directly from the developer's console by clicking "Generate new JSON key". See the private key docs for more details.
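For example, in a Bash-compatible shell you might export these variables before running the demo. The values below are placeholders only, not real credentials; substitute your own project ID and key file path:

```shell
# Placeholder values -- replace with your own project ID and key path.
export GCLOUD_TESTS_PROJECT_ID="bamboo-shift-455"

# The dataset ID is typically the same as the project ID.
export GCLOUD_TESTS_DATASET_ID="$GCLOUD_TESTS_PROJECT_ID"

# Path to the JSON key downloaded from the Developers Console.
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/app_credentials.json"
```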
Run the example script included in the package:
$ python -m gcloud.storage.demo
And that's it! You should be walking through a demonstration of using gcloud.storage to read and write data to Google Cloud Storage.
You can interact with a demo dataset in a Python interactive shell.
Start by importing the demo module and instantiating the demo connection:
>>> from gcloud.storage import demo
>>> connection = demo.get_connection()
Once you have the connection, you can create buckets and keys:
>>> connection.get_all_buckets()
[<Bucket: ...>, ...]
>>> bucket = connection.create_bucket('my-new-bucket')
>>> print bucket
<Bucket: my-new-bucket>
>>> key = bucket.new_key('my-test-file.txt')
>>> print key
<Key: my-new-bucket, my-test-file.txt>
>>> key = key.set_contents_from_string('this is test content!')
>>> print key.get_contents_as_string()
'this is test content!'
>>> print bucket.get_all_keys()
[<Key: my-new-bucket, my-test-file.txt>]
>>> key.delete()
>>> bucket.delete()
Note
The get_connection method is just a shortcut for:
>>> from gcloud import storage
>>> from gcloud.storage import demo
>>> connection = storage.get_connection(demo.PROJECT_ID)