Google Cloud Architect Professional: Google Cloud Certified - Professional Cloud Architect
Free Practice Exam Questions (page: 8)
Updated On: 2-Jan-2026


You analyzed TerramEarth's business requirement to reduce downtime and found that the majority of the time savings can be achieved by reducing customers' wait time for parts. You decided to focus on reducing the three-week aggregate reporting time.

Which modifications to the company's processes should you recommend?

  1. Migrate from CSV to binary format, migrate from FTP to SFTP transport, and develop machine learning analysis of metrics
  2. Migrate from FTP to streaming transport, migrate from CSV to binary format, and develop machine learning analysis of metrics
  3. Increase fleet cellular connectivity to 80%, migrate from FTP to streaming transport, and develop machine learning analysis of metrics
  4. Migrate from FTP to SFTP transport, develop machine learning analysis of metrics, and increase dealer local inventory by a fixed factor

Answer(s): B

Explanation:

The Avro binary format is the preferred format for loading compressed data. Avro data is faster to load because the data can be read in parallel, even when the data blocks are compressed.
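As an illustration of the parallel Avro load path, here is a minimal sketch using the BigQuery Python client; the bucket, dataset, and table names are hypothetical:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Avro is self-describing, so no schema or delimiter settings are needed,
    # and compressed Avro data blocks can still be read in parallel.
    job_config = bigquery.LoadJobConfig(source_format=bigquery.SourceFormat.AVRO)

    load_job = client.load_table_from_uri(
        "gs://telemetry-ingest/vehicle-metrics-*.avro",  # hypothetical bucket
        "terramearth.telemetry.vehicle_metrics",         # hypothetical table
        job_config=job_config,
    )
    load_job.result()  # block until the load job completes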
Cloud Storage supports streaming transfers with the gsutil tool or boto library, based on HTTP chunked transfer encoding. Streaming transfers let you move data to and from Cloud Storage as soon as it becomes available, without first saving it to a separate file. They are useful when you have a process that generates data and you do not want to buffer it locally before uploading, or when you want to send the result of a computational pipeline directly into Cloud Storage.
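For the streaming half, a minimal sketch with the Cloud Storage Python client; the bucket, object, and read_sensor_stream names are hypothetical stand-ins:

    from google.cloud import storage

    def read_sensor_stream():
        # Hypothetical stand-in for the vehicle's live telemetry source.
        yield b"timestamp,engine_temp\n"
        yield b"2026-01-02T14:00:00Z,88.4\n"

    client = storage.Client()
    blob = client.bucket("telemetry-ingest").blob("vehicle-123/metrics.csv")

    # blob.open("wb") starts a streaming (resumable) upload, so data is sent
    # as it is produced instead of being buffered to a local file first.
    with blob.open("wb") as gcs_file:
        for chunk in read_sensor_stream():
            gcs_file.write(chunk)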


Reference:

https://cloud.google.com/storage/docs/streaming
https://cloud.google.com/bigquery/docs/loading-data




Which of TerramEarth's legacy enterprise processes will experience significant change as a result of increased Google Cloud Platform adoption?

  1. Opex/capex allocation, LAN changes, capacity planning
  2. Capacity planning, TCO calculations, opex/capex allocation
  3. Capacity planning, utilization measurement, data center expansion
  4. Data Center expansion, TCO calculations, utilization measurement

Answer(s): B




To speed up data retrieval, more vehicles will be upgraded to cellular connections and will be able to transmit data to the ETL process. The current FTP process is error-prone and restarts the data transfer from the beginning of the file whenever a connection fails, which happens often. You want to improve the reliability of the solution and minimize data transfer time on the cellular connections.

What should you do?

  1. Use one Google Container Engine cluster of FTP servers. Save the data to a Multi-Regional bucket. Run the ETL process using data in the bucket
  2. Use multiple Google Container Engine clusters running FTP servers located in different regions. Save the data to Multi-Regional buckets in US, EU, and Asia. Run the ETL process using the data in the bucket
  3. Directly transfer the files to different Google Cloud Multi-Regional Storage bucket locations in US, EU, and Asia using Google APIs over HTTP(S). Run the ETL process using the data in the bucket
  4. Directly transfer the files to a different Google Cloud Regional Storage bucket location in US, EU, and Asia using Google APIs over HTTP(S). Run the ETL process to retrieve the data from each Regional bucket

Answer(s): C
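The reliability gain in option C comes from resumable uploads over HTTP(S): a dropped cellular connection retries only the current chunk instead of resending the whole file, unlike the FTP process. A minimal sketch with the Cloud Storage Python client; the bucket and file names are hypothetical:

    from google.cloud import storage

    client = storage.Client()
    bucket = client.bucket("telemetry-us")  # hypothetical nearest bucket
    blob = bucket.blob("vehicle-123/2026-01-02.csv")

    # Setting chunk_size makes the upload resumable in fixed-size chunks
    # (must be a multiple of 256 KiB), so a failed connection only costs
    # the in-flight chunk, not the whole transfer.
    blob.chunk_size = 8 * 1024 * 1024  # 8 MiB

    blob.upload_from_filename("/var/telemetry/2026-01-02.csv")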




TerramEarth's 20 million vehicles are scattered around the world. Based on the vehicle's location, its telemetry data is stored in a Google Cloud Storage (GCS) regional bucket (US, Europe, or Asia). The CTO has asked you to run a report on the raw telemetry data to determine why vehicles are breaking down after 100,000 miles.
You want to run this job on all the data.

What is the most cost-effective way to run this job?

  1. Move all the data into 1 zone, then launch a Cloud Dataproc cluster to run the job
  2. Move all the data into 1 region, then launch a Google Cloud Dataproc cluster to run the job
  3. Launch a cluster in each region to preprocess and compress the raw data, then move the data into a multi-region bucket and use a Dataproc cluster to finish the job
  4. Launch a cluster in each region to preprocess and compress the raw data, then move the data into a region bucket and use a Cloud Dataproc cluster to finish the job

Answer(s): D
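
Explanation:

Preprocessing and compressing the raw telemetry inside each region means only the much smaller compressed output crosses region boundaries, which minimizes network egress charges, and consolidating into a regional bucket is cheaper than a multi-regional one. That cost difference is why option D edges out option C.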




TerramEarth has equipped all connected trucks with servers and sensors to collect telemetry data. Next year they want to use the data to train machine learning models. They want to store this data in the cloud while reducing costs.

What should they do?

  1. Have the vehicle's computer compress the data in hourly snapshots, and store it in a Google Cloud Storage (GCS) Nearline bucket
  2. Push the telemetry data in real-time to a streaming dataflow job that compresses the data, and store it in Google BigQuery
  3. Push the telemetry data in real-time to a streaming dataflow job that compresses the data, and store it in Cloud Bigtable
  4. Have the vehicle's computer compress the data in hourly snapshots, and store it in a GCS Coldline bucket

Answer(s): D

Explanation:

Coldline Storage is the best choice for data that you plan to access at most once a year, due to its slightly lower availability, 90-day minimum storage duration, costs for data access, and higher per-operation costs. For example:
Cold data storage - Infrequently accessed data, such as data stored for legal or regulatory reasons, can be stored at low cost as Coldline Storage and be available when you need it.
Disaster recovery - When a disaster recovery event occurs, recovery time is key. Cloud Storage provides low-latency access to data stored as Coldline Storage.
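A minimal sketch of the recommended setup with the Cloud Storage Python client; the bucket and file names are hypothetical:

    from google.cloud import storage

    client = storage.Client()

    # One-time setup: a bucket whose default storage class is Coldline.
    bucket = storage.Bucket(client, name="terramearth-telemetry-cold")
    bucket.storage_class = "COLDLINE"
    client.create_bucket(bucket, location="US")

    # Each hour, upload the vehicle's compressed snapshot.
    blob = bucket.blob("vehicle-123/2026-01-02T14.csv.gz")
    blob.upload_from_filename("snapshot-14.csv.gz")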


Reference:

https://cloud.google.com/storage/docs/storage-classes





