Which Java SDK class can you use to run your Dataflow programs locally?
Answer(s): B
DirectPipelineRunner executes the pipeline directly on the local machine, without any service-side optimization. It is useful for small-scale local execution and tests.
https://cloud.google.com/dataflow/java-sdk/JavaDoc/com/google/cloud/dataflow/sdk/runners/DirectPipelineRunner
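The explanation above can be sketched in code. This is a minimal example using the pre-Beam Dataflow Java SDK (1.x); the file paths are illustrative placeholders:

```java
import com.google.cloud.dataflow.sdk.Pipeline;
import com.google.cloud.dataflow.sdk.io.TextIO;
import com.google.cloud.dataflow.sdk.options.PipelineOptions;
import com.google.cloud.dataflow.sdk.options.PipelineOptionsFactory;
import com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner;

public class LocalRunExample {
  public static void main(String[] args) {
    PipelineOptions options = PipelineOptionsFactory.create();
    // DirectPipelineRunner runs the pipeline in-process on the local
    // machine, with no service-side optimization -- suited to small tests.
    options.setRunner(DirectPipelineRunner.class);

    Pipeline p = Pipeline.create(options);
    p.apply(TextIO.Read.from("input.txt"))   // local input file (assumed)
     .apply(TextIO.Write.to("output"));      // local output prefix (assumed)
    p.run();
  }
}
```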
The Dataflow SDKs have been recently transitioned into which Apache service?
Answer(s): D
The Dataflow SDKs were donated to the Apache Software Foundation and are being transitioned to Apache Beam.
https://cloud.google.com/dataflow/docs/
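The transition is visible in the package names: the same pipeline constructs moved from the `com.google.cloud.dataflow` namespace to `org.apache.beam`. A minimal Beam (2.x) skeleton, for comparison with the older SDK:

```java
// Pre-Beam Dataflow SDK (1.x) used:
//   import com.google.cloud.dataflow.sdk.Pipeline;
// The Apache Beam SDK (2.x), its successor, uses:
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class BeamSkeleton {
  public static void main(String[] args) {
    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
    Pipeline p = Pipeline.create(options);
    // Beam executes on the local DirectRunner by default; passing
    // --runner=DataflowRunner targets the Cloud Dataflow service instead.
    p.run().waitUntilFinish();
  }
}
```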
The _________ for Cloud Bigtable makes it possible to use Cloud Bigtable in a Cloud Dataflow pipeline.
Answer(s): A
The Cloud Dataflow connector for Cloud Bigtable makes it possible to use Cloud Bigtable in a Cloud Dataflow pipeline. You can use the connector for both batch and streaming operations.
https://cloud.google.com/bigtable/docs/dataflow-hbase
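A hedged sketch of the connector in use, writing HBase-style mutations from a Beam pipeline to a Bigtable table. Class names follow the `bigtable-hbase-beam` connector; the project, instance, and table identifiers are placeholders:

```java
import com.google.cloud.bigtable.beam.CloudBigtableIO;
import com.google.cloud.bigtable.beam.CloudBigtableTableConfiguration;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.hadoop.hbase.client.Mutation;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

public class BigtableConnectorExample {
  public static void main(String[] args) {
    // All identifiers below are illustrative placeholders.
    CloudBigtableTableConfiguration config =
        new CloudBigtableTableConfiguration.Builder()
            .withProjectId("my-project")
            .withInstanceId("my-instance")
            .withTableId("my-table")
            .build();

    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    // Build one HBase Put mutation and write it through the connector.
    Put put = new Put(Bytes.toBytes("row-key"));
    put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("qualifier"),
        Bytes.toBytes("value"));

    p.apply(Create.<Mutation>of(put))
     .apply(CloudBigtableIO.writeToTable(config));

    p.run().waitUntilFinish();
  }
}
```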
Does Dataflow process batch data pipelines or streaming data pipelines?
Dataflow uses a unified processing model and can execute both batch and streaming data pipelines.
https://cloud.google.com/dataflow/
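The unified model means the same transforms apply to bounded (batch) and unbounded (streaming) collections alike. A sketch in the Beam Java SDK; the bucket and topic names are illustrative, and the pipeline is not actually run since it would require GCP resources:

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.TypeDescriptors;

public class UnifiedModelExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.create());

    // Batch: a bounded source (files in Cloud Storage).
    PCollection<String> batch =
        p.apply(TextIO.read().from("gs://my-bucket/input-*.txt"));

    // Streaming: an unbounded source (a Pub/Sub topic).
    PCollection<String> stream =
        p.apply(PubsubIO.readStrings()
            .fromTopic("projects/my-project/topics/my-topic"));

    // The identical transform works on either kind of collection.
    batch.apply(MapElements.into(TypeDescriptors.strings())
        .via(String::toUpperCase));
    stream.apply(MapElements.into(TypeDescriptors.strings())
        .via(String::toUpperCase));
  }
}
```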