"Managed Service for Apache Spark" is the new name for the product formerly known as "Dataproc on Compute Engine" (cluster deployment) and "Google Cloud Serverless for Apache Spark" (serverless deployment).
Managed Service for Apache Spark staging buckets
This document provides information about Managed Service for Apache Spark staging buckets.
Managed Service for Apache Spark creates a Cloud Storage staging bucket in your project, or reuses an existing staging bucket from previous batch creation requests. This is the same default bucket that Managed Service for Apache Spark clusters create. For more information, see Managed Service for Apache Spark staging and temp buckets.
Managed Service for Apache Spark stores workload dependencies, config files, and
job driver console output in the staging bucket.
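To see what the service has placed in a staging bucket, you can list its contents with the gcloud CLI. This is a minimal sketch: the bucket name below is a placeholder, and the `dataproc-staging-REGION-PROJECT_NUMBER-SUFFIX` naming pattern is an assumption about how auto-created staging buckets are named, not a guarantee.

```shell
# List all objects in an auto-created staging bucket (name is a
# placeholder; substitute the bucket that exists in your project).
# Auto-created staging buckets typically follow a pattern similar to
# dataproc-staging-REGION-PROJECT_NUMBER-SUFFIX.
gcloud storage ls --recursive gs://dataproc-staging-us-central1-123456789012-abcdef/
```

You can find the staging bucket associated with a specific batch workload in the batch's details, for example with `gcloud dataproc batches describe`.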
Managed Service for Apache Spark chooses a regional Cloud Storage location for the staging bucket according to the Compute Engine zone where your workload is deployed, and then creates and manages these project-level, per-location buckets.
Staging buckets created by Managed Service for Apache Spark are shared among workloads in the same region, and are created with a Cloud Storage soft delete retention duration of 0 seconds.
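If you prefer to control the bucket's location, retention, or lifecycle settings yourself, you can supply your own staging bucket when submitting a batch workload instead of relying on the auto-created one. A minimal sketch, assuming the `gcloud dataproc batches submit` command with its `--deps-bucket` flag; all bucket, class, and JAR names are placeholders:

```shell
# Submit a Spark batch workload that stages dependencies in a
# user-managed bucket rather than the auto-created default.
# Bucket, region, class, and JAR paths below are placeholders.
gcloud dataproc batches submit spark \
    --region=us-central1 \
    --deps-bucket=gs://my-staging-bucket \
    --class=org.example.MySparkApp \
    --jars=gs://my-bucket/my-spark-app.jar
```

Using your own bucket lets you set a nonzero soft delete retention duration or add lifecycle rules, which the auto-created staging buckets do not provide by default.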
Last updated 2026-04-08 UTC.