Free Google Cloud Architect Professional Exam Braindumps (page: 19)


For this question, refer to the Helicopter Racing League (HRL) case study. The HRL development team releases a new version of their predictive capability application every Tuesday evening at 3 a.m. UTC to a repository. The security team at HRL has developed an in-house penetration test Cloud Function called Airwolf.
The security team wants to run Airwolf against the predictive capability application as soon as it is released every Tuesday. You need to set up Airwolf to run on this recurring weekly cadence.
What should you do?

  1. Set up Cloud Tasks and a Cloud Storage bucket that triggers a Cloud Function.
  2. Set up a Cloud Logging sink and a Cloud Storage bucket that triggers a Cloud Function.
  3. Configure the deployment job to notify a Pub/Sub queue that triggers a Cloud Function.
  4. Set up Identity and Access Management (IAM) and Confidential Computing to trigger a Cloud Function.

Answer(s): C
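Option 3 describes an event-driven flow: the deployment job publishes a message to a Pub/Sub topic as soon as the release lands, and that message triggers a Cloud Function, so Airwolf runs immediately rather than on a guessed schedule. A minimal sketch of such a 1st-gen (background) Pub/Sub-triggered Cloud Function, assuming a hypothetical message payload with a `release` field (the Airwolf invocation itself is only a placeholder):

```python
import base64
import json

def airwolf_trigger(event, context=None):
    """Background Cloud Function triggered by a Pub/Sub message.

    `event["data"]` carries the base64-encoded message body published by
    the deployment job; `context` carries event metadata. In a real setup
    this function would kick off the in-house Airwolf penetration test
    against the newly released predictive capability application.
    """
    # Pub/Sub delivers the message data base64-encoded.
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    release = payload.get("release", "unknown")
    # Hypothetical hook: start Airwolf against this release.
    return f"Airwolf penetration test started against release {release}"
```

The deployment job only needs a final step that publishes the release identifier to the topic; no cron schedule has to be kept in sync with the Tuesday 3 a.m. UTC release time.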




For this question, refer to the Helicopter Racing League (HRL) case study. HRL wants better prediction accuracy from their ML prediction models. They want you to use Google's AI Platform so HRL can understand and interpret the predictions.
What should you do?

  1. Use Explainable AI.
  2. Use Vision AI.
  3. Use Google Cloud's operations suite.
  4. Use Jupyter Notebooks.

Answer(s): A


Reference:

https://cloud.google.com/ai-platform/prediction/docs/ai-explanations/preparing-metadata
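The reference above covers preparing explanation metadata so AI Platform can attribute predictions to input features. A rough sketch of an `explanation_metadata.json` for a TensorFlow model (the tensor and feature names below are placeholders, not from the case study):

```json
{
  "inputs": {
    "telemetry_features": {
      "input_tensor_name": "dense_input:0"
    }
  },
  "outputs": {
    "prediction": {
      "output_tensor_name": "dense_3/Sigmoid:0"
    }
  },
  "framework": "tensorflow"
}
```

This file is uploaded alongside the SavedModel so that explanation requests can map attributions back to named inputs and outputs.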




For this question, refer to the Helicopter Racing League (HRL) case study. HRL is looking for a cost-effective approach for storing their race data, such as telemetry. They want to keep all historical records, train models using only the previous season's data, and plan for data growth in terms of volume and information collected.
You need to propose a data solution. Considering HRL business requirements and the goals expressed by
CEO S. Hawke, what should you do?

  1. Use Firestore for its scalable and flexible document-based database. Use collections to aggregate race data by season and event.
  2. Use Cloud Spanner for its scalability and ability to version schemas with zero downtime. Split race data using season as a primary key.
  3. Use BigQuery for its scalability and ability to add columns to a schema. Partition race data based on season.
  4. Use Cloud SQL for its ability to automatically manage storage increases and compatibility with MySQL. Use separate database instances for each season.

Answer(s): C


Reference:

https://cloud.google.com/bigquery/public-data
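Partitioning by season means a training job that reads only the previous season scans a single partition instead of the full history, which keeps query cost proportional to one season's data. One way to express this in BigQuery is integer-range partitioning on a `season` column; the sketch below builds the DDL with hypothetical dataset, table, and column names:

```python
def telemetry_table_ddl(dataset="hrl", table="race_telemetry",
                        first_season=2010, last_season=2040):
    # Integer-range partitioning: one partition per season value, so a
    # WHERE season = 2023 filter prunes every other partition. New columns
    # can later be added with ALTER TABLE ... ADD COLUMN as data grows.
    return (
        f"CREATE TABLE `{dataset}.{table}` (\n"
        "  season INT64,\n"
        "  event_id STRING,\n"
        "  recorded_at TIMESTAMP,\n"
        "  telemetry JSON\n"
        ") PARTITION BY RANGE_BUCKET(\n"
        f"  season, GENERATE_ARRAY({first_season}, {last_season}, 1))"
    )

print(telemetry_table_ddl())
```

If the data also carries event timestamps, time-based partitioning on `recorded_at` (with clustering on `season`) would be an equally valid layout; the point of the answer is that BigQuery scales, allows additive schema changes, and prunes partitions at query time.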




For this question, refer to the Helicopter Racing League (HRL) case study. A recent finance audit of cloud infrastructure noted that an exceptionally high number of Compute Engine instances are allocated to video encoding and transcoding. You suspect that these virtual machines are zombie machines that were not deleted after their workloads completed. You need to quickly get a list of which VM instances are idle.
What should you do?

  1. Log into each Compute Engine instance and collect disk, CPU, memory, and network usage statistics for analysis.
  2. Use the gcloud compute instances list command to list the virtual machine instances that have the idle: true label set.
  3. Use the gcloud recommender command to list the idle virtual machine instances.
  4. From the Google Console, identify which Compute Engine instances in the managed instance groups are no longer responding to health check probes.

Answer(s): C


Reference:

https://cloud.google.com/compute/docs/instances/viewing-and-applying-idle-vm-recommendations
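The Recommender service surfaces idle-VM findings per project and zone through the `google.compute.instance.IdleResourceRecommender` recommender. As a sketch, the helper below assembles the `gcloud` invocation (project and zone are placeholders; the command itself must be run where the gcloud CLI is installed and authenticated):

```python
def idle_vm_recommendations_cmd(project, zone):
    # Builds the gcloud command that lists idle-VM recommendations for one
    # zone; run it per zone (or loop over zones) to cover the project.
    return [
        "gcloud", "recommender", "recommendations", "list",
        f"--project={project}",
        f"--location={zone}",
        "--recommender=google.compute.instance.IdleResourceRecommender",
    ]

print(" ".join(idle_vm_recommendations_cmd("my-project", "us-central1-a")))
```

This is faster than logging into each instance (option 1) and works even for VMs outside managed instance groups, which options 2 and 4 would miss.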





