@@ -105,7 +105,7 @@ to transform the message data, and writes the results to a
 * [pom.xml](pom.xml)
 * [metadata.json](metadata.json)
 
-### Building a container image
+### Build the Flex Template
 
 > <details><summary>
 > <i>(Optional)</i> Run the Apache Beam pipeline locally for development.
@@ -139,55 +139,11 @@ This *Uber JAR* file has all the dependencies embedded so it.
 You can run this file as a standalone application with no external
 dependencies on other libraries.
 
-Now, we build the
-[Docker](https://docs.docker.com/engine/docker-overview/)
-image for the Apache Beam pipeline.
-We are using
-[Cloud Build](https://cloud.google.com/cloud-build)
-so we don't need a local installation of Docker.
-
-> ℹ️ You can speed up subsequent builds with
-> [Kaniko cache](https://cloud.google.com/cloud-build/docs/kaniko-cache)
-> in Cloud Build.
->
-> ```sh
-> # (Optional) Enable to use Kaniko cache by default.
-> gcloud config set builds/use_kaniko True
-> ```
-
-Cloud Build allows you to
-[build a Docker image using a `Dockerfile`](https://cloud.google.com/cloud-build/docs/quickstart-docker#build_using_dockerfile)
-and saves it into
-[Container Registry](https://cloud.google.com/container-registry/),
-where the image is accessible to other Google Cloud products.
-
-```sh
-export TEMPLATE_IMAGE="gcr.io/$PROJECT/samples/dataflow/streaming-beam-sql:latest"
-
-# Build the image into Container Registry, this is roughly equivalent to:
-#   gcloud auth configure-docker
-#   docker image build -t $TEMPLATE_IMAGE .
-#   docker push $TEMPLATE_IMAGE
-gcloud builds submit --tag "$TEMPLATE_IMAGE" .
-```
-
-> ℹ️ We use the [`.gcloudignore`](.gcloudignore) file to ignore large
-> files not used for the container image, such as build files.
-> This helps speed up the build by uploading less data.
->
-> To learn more about `.gcloudignore`, see
-> [`gcloud topic gcloudignore`](https://cloud.google.com/sdk/gcloud/reference/topic/gcloudignore)
-
-Images starting with `gcr.io/PROJECT/` are saved into your project's
-Container Registry, where the image is accessible to other Google Cloud products.
-
-### Creating a Flex Template
-
 To run a template, you need to create a *template spec* file containing all the
 necessary information to run the job, such as the SDK information and metadata.
 
 The [`metadata.json`](metadata.json) file contains additional information for
-the template such as the "name", "description", and input "parameters" field.
+the template, such as the `name`, `description`, and input `parameters` fields.
 
 We used
 [regular expressions](https://docs.microsoft.com/en-us/dotnet/standard/base-types/regular-expression-language-quick-reference)
@@ -198,31 +154,43 @@ and [BigQuery table](https://cloud.google.com/bigquery/docs/tables#table_naming)
 The template file must be created in a Cloud Storage location,
 and is used to run a new Dataflow job.
 
+The build also creates a container image that packages your pipeline as a
+self-contained application.
+Images starting with `gcr.io/PROJECT/` are saved into your project's
+Container Registry, where the image is accessible to other Google Cloud products.
+
 ```sh
 export TEMPLATE_PATH="gs://$BUCKET/samples/dataflow/templates/streaming-beam-sql.json"
+export TEMPLATE_IMAGE="gcr.io/$PROJECT/samples/dataflow/streaming-beam-sql:latest"
 
 # Build the Flex Template.
-gcloud beta dataflow flex-template build $TEMPLATE_PATH \
-    --image "$TEMPLATE_IMAGE" \
-    --sdk-language "JAVA" \
-    --metadata-file "metadata.json"
+gcloud dataflow flex-template build $TEMPLATE_PATH \
+    --image-gcr-path "$TEMPLATE_IMAGE" \
+    --sdk-language "JAVA" \
+    --flex-template-base-image JAVA11 \
+    --metadata-file "metadata.json" \
+    --jar "target/streaming-beam-sql-1.0.jar" \
+    --env FLEX_TEMPLATE_JAVA_MAIN_CLASS="org.apache.beam.samples.StreamingBeamSQL"
 ```
 
 The template is now available through the template file in the Cloud Storage
 location that you specified.
 
-### Running a Dataflow Flex Template pipeline
+### Running a Flex Template pipeline
 
 You can now run the Apache Beam pipeline in Dataflow by referring to the
 template file and passing the template
 [parameters](https://cloud.google.com/dataflow/docs/guides/specifying-exec-params#setting-other-cloud-dataflow-pipeline-options)
 required by the pipeline.
 
 ```sh
-# Run the Flex Template.
-gcloud beta dataflow flex-template run "streaming-beam-sql-`date +%Y%m%d-%H%M%S`" \
-    --template-file-gcs-location "$TEMPLATE_PATH" \
-    --parameters "inputSubscription=$SUBSCRIPTION,outputTable=$PROJECT:$DATASET.$TABLE"
+export REGION="us-central1"
+
+# Run the template.
+gcloud dataflow flex-template run "streaming-beam-sql-`date +%Y%m%d-%H%M%S`" \
+    --template-file-gcs-location "$TEMPLATE_PATH" \
+    --parameters inputSubscription="$SUBSCRIPTION" \
+    --parameters outputTable="$PROJECT:$DATASET.$TABLE" \
+    --region "$REGION"
 ```
 
 Check the results in BigQuery by running the following query:
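
The query itself falls outside this hunk. As a minimal sketch (not the README's actual query), assuming `$PROJECT`, `$DATASET`, and `$TABLE` are still exported from the earlier setup steps, a quick row-count check could be assembled and run with the `bq` CLI:

```sh
# Hypothetical check, not the query from the README.
# Assumes $PROJECT, $DATASET and $TABLE are still exported from earlier steps.
QUERY="SELECT COUNT(*) AS row_count FROM \`$PROJECT.$DATASET.$TABLE\`"
echo "$QUERY"

# Run it with the bq CLI (requires the Google Cloud SDK and credentials):
# bq query --use_legacy_sql=false "$QUERY"
```

The backticks around the table reference are escaped so the shell passes them through literally; BigQuery standard SQL uses them to quote the fully qualified `project.dataset.table` name.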