Free Google Cloud Architect Professional Exam Braindumps (page: 13)


For this question, refer to the TerramEarth case study. A new architecture that writes all incoming data to
BigQuery has been introduced. You notice that the data is dirty, and want to ensure data quality on an automated daily basis while managing cost.

What should you do?

  1. Set up a streaming Cloud Dataflow job, receiving data by the ingestion process. Clean the data in a Cloud Dataflow pipeline.
  2. Create a Cloud Function that reads data from BigQuery and cleans it. Trigger the Cloud Function from a Compute Engine instance.
  3. Create a SQL statement on the data in BigQuery, and save it as a view. Run the view daily, and save the result to a new table.
  4. Use Cloud Dataprep and configure the BigQuery tables as the source. Schedule a daily job to clean the data.

Answer(s): A
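
The streaming Cloud Dataflow job in option A sits between the ingestion process and BigQuery and applies the cleaning transforms before the write. A minimal Apache Beam (Python SDK) sketch of that shape, assuming a hypothetical Pub/Sub topic, output table, and placeholder clean_record() logic:

    # Streaming Beam sketch: read raw vehicle records from Pub/Sub,
    # clean them, and write the cleaned rows to BigQuery.
    # Topic, table, and clean_record() are illustrative placeholders.
    import json
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def clean_record(record):
        # Placeholder cleaning logic: normalise fields, drop bad values.
        record["vehicle_id"] = str(record.get("vehicle_id", "")).strip()
        return record

    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/vehicle-telemetry")
            | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "Clean" >> beam.Map(clean_record)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:telemetry.vehicle_data_clean",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
        )

Because the destination table already exists in this scenario, CREATE_NEVER avoids having to pass a schema to the BigQuery sink.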




For this question, refer to the TerramEarth case study. Considering the technical requirements, how should you reduce the unplanned vehicle downtime in GCP?

  1. Use BigQuery as the data warehouse. Connect all vehicles to the network and stream data into BigQuery using Cloud Pub/Sub and Cloud Dataflow. Use Google Data Studio for analysis and reporting.
  2. Use BigQuery as the data warehouse. Connect all vehicles to the network and upload gzip files to a Multi-Regional Cloud Storage bucket using gcloud. Use Google Data Studio for analysis and reporting.
  3. Use Cloud Dataproc Hive as the data warehouse. Upload gzip files to a Multi-Regional Cloud Storage bucket. Upload this data into BigQuery using gcloud. Use Google Data Studio for analysis and reporting.
  4. Use Cloud Dataproc Hive as the data warehouse. Directly stream data into partitioned Hive tables. Use Pig scripts to analyze data.

Answer(s): A
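
In option A the ingestion path starts with vehicles (or a gateway in front of them) publishing telemetry to Cloud Pub/Sub, from where a streaming Cloud Dataflow job loads it into BigQuery. A minimal publisher sketch with the google-cloud-pubsub client library; the project ID, topic name, and payload fields are illustrative assumptions:

    # Publish one vehicle telemetry message to a Pub/Sub topic.
    # Project ID, topic name, and payload fields are placeholders.
    import json
    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path("my-project", "vehicle-telemetry")

    payload = {
        "vehicle_id": "TE-000123",
        "timestamp": "2021-01-01T00:00:00Z",
        "engine_temp_c": 92.5,
        "fuel_level_pct": 61.0,
    }

    # Message bodies must be bytes; keyword arguments become attributes.
    future = publisher.publish(
        topic_path,
        data=json.dumps(payload).encode("utf-8"),
        vehicle_id=payload["vehicle_id"])
    print("Published message ID:", future.result())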




For this question, refer to the TerramEarth case study. You are asked to design a new architecture for ingesting the data from the 200,000 vehicles that are connected to a cellular network. You want to follow Google-recommended practices.
Considering the technical requirements, which components should you use for the ingestion of the data?

  1. Google Kubernetes Engine with an SSL Ingress
  2. Cloud IoT Core with public/private key pairs
  3. Compute Engine with project-wide SSH keys
  4. Compute Engine with specific SSH keys

Answer(s): B

Explanation:

https://cloud.google.com/solutions/iot-overview
https://cloud.google.com/iot/quotas
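
With Cloud IoT Core (answer B), each vehicle is registered in a device registry together with its public key, and the device authenticates by signing a short-lived JWT with the matching private key. A minimal sketch of that token creation using the PyJWT library, assuming a hypothetical project ID and key file:

    # Device side of the Cloud IoT Core public/private key pattern:
    # sign a short-lived JWT with the device's private key; IoT Core
    # verifies it against the public key stored in the device registry.
    # Project ID and key path are illustrative placeholders.
    import datetime
    import jwt  # PyJWT

    PROJECT_ID = "my-project"
    PRIVATE_KEY_FILE = "rsa_private.pem"

    def create_device_jwt(project_id, private_key_file):
        now = datetime.datetime.utcnow()
        claims = {
            "iat": now,                                   # issued at
            "exp": now + datetime.timedelta(minutes=60),  # short-lived
            "aud": project_id,                            # audience = GCP project
        }
        with open(private_key_file, "r") as f:
            private_key = f.read()
        return jwt.encode(claims, private_key, algorithm="RS256")

    # The token is presented as the MQTT password when the device
    # connects to the Cloud IoT Core MQTT bridge.
    token = create_device_jwt(PROJECT_ID, PRIVATE_KEY_FILE)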




TerramEarth has about 1 petabyte (PB) of vehicle testing data in a private data center. You want to move the data to Cloud Storage for your machine learning team. Currently, a 1-Gbps interconnect link is available for you. The machine learning team wants to start using the data in a month.
What should you do?

  1. Request Transfer Appliances from Google Cloud, export the data to appliances, and return the appliances to Google Cloud.
  2. Configure the Storage Transfer Service from Google Cloud to send the data from your data center to Cloud Storage.
  3. Make sure there are no other users consuming the 1 Gbps link, and use multi-thread transfer to upload the data to Cloud Storage.
  4. Export files to an encrypted USB device, send the device to Google Cloud, and request an import of the data to Cloud Storage.

Answer(s): A
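
The deciding factor is transfer time: roughly 1 PB over a single 1-Gbps link takes on the order of three to four months even at good utilisation, which misses the one-month deadline, so shipping Transfer Appliances is the practical choice. A back-of-the-envelope check (the 80% utilisation figure is an assumption):

    # Rough transfer-time estimate for 1 PB over a 1-Gbps link.
    # The 80% effective-utilisation figure is an assumption.
    data_bits = 1e15 * 8        # 1 PB in bits
    link_bps = 1e9              # 1 Gbps
    utilisation = 0.8           # assumed effective throughput

    seconds = data_bits / (link_bps * utilisation)
    days = seconds / 86400
    print(f"Estimated transfer time: {days:.0f} days")  # ~116 days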





