Google Cloud Architect Professional: Google Cloud Certified - Professional Cloud Architect
Free Practice Exam Questions
Updated On: 10-Jan-2026


For this question, refer to the TerramEarth case study. A new architecture that writes all incoming data to BigQuery has been introduced. You notice that the data is dirty, and want to ensure data quality on an automated daily basis while managing cost.

What should you do?

  1. Set up a streaming Cloud Dataflow job that receives data from the ingestion process. Clean the data in the Cloud Dataflow pipeline.
  2. Create a Cloud Function that reads data from BigQuery and cleans it. Trigger the Cloud Function from a Compute Engine instance.
  3. Create a SQL statement on the data in BigQuery, and save it as a view. Run the view daily, and save the result to a new table.
  4. Use Cloud Dataprep and configure the BigQuery tables as the source. Schedule a daily job to clean the data.

Answer(s): C
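
For context on option C, the cleaning SQL can be saved as a view and materialized into a new table on a daily schedule. Below is a minimal sketch using the google-cloud-bigquery Python client; the project, dataset, table, and cleaning rules are hypothetical placeholders, and the daily run would be driven by a scheduler such as BigQuery scheduled queries.

```python
from google.cloud import bigquery

client = bigquery.Client()

# One-time setup: save the cleaning SQL as a view
# (project, dataset, and column names are hypothetical).
client.query("""
    CREATE OR REPLACE VIEW `my-project.telemetry.clean_vehicle_data_v` AS
    SELECT vehicle_id, TIMESTAMP_TRUNC(event_time, SECOND) AS event_time, fuel_level
    FROM `my-project.telemetry.raw_vehicle_data`
    WHERE vehicle_id IS NOT NULL AND fuel_level BETWEEN 0 AND 100
""").result()

# Daily step: query the view and write the result to a new table.
job_config = bigquery.QueryJobConfig(
    destination=bigquery.TableReference.from_string(
        "my-project.telemetry.clean_vehicle_data"
    ),
    write_disposition="WRITE_TRUNCATE",
)
client.query(
    "SELECT * FROM `my-project.telemetry.clean_vehicle_data_v`",
    job_config=job_config,
).result()
```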




For this question, refer to the TerramEarth case study. Considering the technical requirements, how should you reduce the unplanned vehicle downtime in GCP?

  1. Use BigQuery as the data warehouse. Connect all vehicles to the network and stream data into BigQuery using Cloud Pub/Sub and Cloud Dataflow. Use Google Data Studio for analysis and reporting.
  2. Use BigQuery as the data warehouse. Connect all vehicles to the network and upload gzip files to a Multi-Regional Cloud Storage bucket using gcloud. Use Google Data Studio for analysis and reporting.
  3. Use Cloud Dataproc Hive as the data warehouse. Upload gzip files to a Multi-Regional Cloud Storage bucket. Upload this data into BigQuery using gcloud. Use Google Data Studio for analysis and reporting.
  4. Use Cloud Dataproc Hive as the data warehouse. Directly stream data into partitioned Hive tables. Use Pig scripts to analyze data.

Answer(s): A
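
Option A's ingestion path (stream vehicle telemetry through Cloud Pub/Sub into BigQuery with Cloud Dataflow) is typically built as an Apache Beam streaming pipeline. A minimal sketch is below; the subscription, table, schema, and parse logic are hypothetical, and in production the pipeline would run on the DataflowRunner.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical resources; real names come from your project.
SUBSCRIPTION = "projects/my-project/subscriptions/vehicle-telemetry-sub"
TABLE = "my-project:telemetry.raw_vehicle_data"

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(subscription=SUBSCRIPTION)
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            TABLE,
            schema="vehicle_id:STRING,event_time:TIMESTAMP,fuel_level:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```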




For this question, refer to the TerramEarth case study. You are asked to design a new architecture for the ingestion of the data of the 200,000 vehicles that are connected to a cellular network. You want to follow Google-recommended practices.

Considering the technical requirements, which components should you use for the ingestion of the data?

  1. Google Kubernetes Engine with an SSL Ingress
  2. Cloud IoT Core with public/private key pairs
  3. Compute Engine with project-wide SSH keys
  4. Compute Engine with specific SSH keys

Answer(s): B
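
Option B relies on per-device public/private key pairs: the public key is registered with Cloud IoT Core (a service Google has since retired), and the device signs a short-lived JWT with its private key to authenticate to the MQTT bridge. A device-side sketch, assuming PyJWT and an ES256 key pair, is below; the project ID and key path are hypothetical.

```python
import datetime

import jwt  # PyJWT


def create_device_jwt(project_id: str, private_key_path: str) -> str:
    """Sign a short-lived JWT with the device's private key.

    The matching public key is the one registered for the device; the
    MQTT bridge verifies the token's signature against it.
    """
    with open(private_key_path, "r") as f:
        private_key = f.read()

    now = datetime.datetime.utcnow()
    claims = {
        "iat": now,
        "exp": now + datetime.timedelta(minutes=20),
        "aud": project_id,  # audience is the GCP project ID
    }
    return jwt.encode(claims, private_key, algorithm="ES256")


# Hypothetical usage: the returned token is used as the MQTT password.
token = create_device_jwt("my-project", "ec_private.pem")
```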




For this question, refer to the TerramEarth case study. You start to build a new application that uses a few Cloud Functions for the backend. One use case requires a Cloud Function func_display to invoke another Cloud Function func_query. You want func_query only to accept invocations from func_display. You also want to follow Google's recommended best practices.
What should you do?

  1. 1. Create a token and pass it in as an environment variable to func_display.
    2. When invoking func_query, include the token in the request.
    3. Pass the same token to func_query and reject the invocation if the tokens are different.
  2. 1. Make func_query 'Require authentication.'
    2. Create a unique service account and associate it to func_display.
    3. Grant the service account invoker role for func_query.
    4. Create an ID token in func_display and include the token in the request when invoking func_query.
  3. 1. Make func_query 'Require authentication' and only accept internal traffic.
    2. Create those two functions in the same VPC.
    3. Create an ingress firewall rule for func_query to only allow traffic from func_display.
  4. 1. Create those two functions in the same project and VPC.
    2. Make func_query only accept internal traffic.
    3. Create an ingress firewall rule for func_query to only allow traffic from func_display.
    4. Make sure both functions use the same service account.

Answer(s): B
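
Option B's last step (minting an ID token in func_display and attaching it to the call) is shown in the hedged sketch below, using the google-auth library and the requests package. The function URL is a hypothetical placeholder; when this runs inside a Cloud Function, fetch_id_token obtains the token for that function's own service account from the metadata server, and func_query must grant that account the Cloud Functions Invoker role.

```python
import requests

import google.auth.transport.requests
import google.oauth2.id_token

# Hypothetical URL of the receiving function; the token audience must match it.
FUNC_QUERY_URL = "https://us-central1-my-project.cloudfunctions.net/func_query"


def call_func_query(payload: dict) -> requests.Response:
    """Invoke func_query with an ID token identifying func_display's service account."""
    auth_request = google.auth.transport.requests.Request()
    id_token = google.oauth2.id_token.fetch_id_token(auth_request, FUNC_QUERY_URL)
    return requests.post(
        FUNC_QUERY_URL,
        json=payload,
        headers={"Authorization": f"Bearer {id_token}"},
        timeout=30,
    )
```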




For this question, refer to the TerramEarth case study. You have broken down a legacy monolithic application into a few containerized RESTful microservices. You want to run those microservices on Cloud Run. You also want to make sure the services are highly available with low latency to your customers.
What should you do?

  1. 1. Deploy Cloud Run services to multiple availability zones.
    2. Create an Apigee instance that points to the services.
    3. Create a global external HTTP(S) Load Balancing instance and attach Apigee to its backend.
  2. 1. Deploy Cloud Run services to multiple regions.
    2. Create serverless network endpoint groups pointing to the services.
    3. Add the serverless NEGs to a backend service that is used by a global HTTP(S) Load Balancing instance.
  3. 1. Deploy Cloud Run services to multiple regions.
    2. In Cloud DNS, create a geo-based DNS name that points to the services.
  4. 1. Deploy Cloud Run services to multiple availability zones.
    2. Create a TCP/IP global load balancer, and attach Apigee to its backend service.

Answer(s): C
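
Option B references serverless network endpoint groups, the regional resources that let a global external HTTP(S) load balancer route to a Cloud Run service in each region. A minimal sketch of creating one such NEG with the google-cloud-compute Python client is below; the message and field names are assumed from that client's generated API, and the project, region, and service names are hypothetical.

```python
from google.cloud import compute_v1

# Hypothetical identifiers.
PROJECT = "my-project"
REGION = "us-central1"
SERVICE = "orders-service"  # Cloud Run service deployed in REGION

# Serverless NEG pointing at the Cloud Run service in this region.
neg = compute_v1.NetworkEndpointGroup(
    name=f"{SERVICE}-neg-{REGION}",
    network_endpoint_type="SERVERLESS",
    cloud_run=compute_v1.NetworkEndpointGroupCloudRun(service=SERVICE),
)

client = compute_v1.RegionNetworkEndpointGroupsClient()
operation = client.insert(
    project=PROJECT,
    region=REGION,
    network_endpoint_group_resource=neg,
)
operation.result()  # wait for the regional operation to complete

# Repeat per region, then add each NEG as a backend of a global backend
# service attached to the external HTTP(S) load balancer.
```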


