Free Google Cloud Architect Professional Exam Braindumps


For this question, refer to the Mountkirk Games case study. Mountkirk Games wants to migrate from their current analytics and statistics reporting model to one that meets their technical requirements on Google Cloud Platform.
Which two steps should be part of their migration plan? (Choose two.)

  A. Evaluate the impact of migrating their current batch ETL code to Cloud Dataflow.
  B. Write a schema migration plan to denormalize data for better performance in BigQuery.
  C. Draw an architecture diagram that shows how to move from a single MySQL database to a MySQL cluster.
  D. Load 10 TB of analytics data from a previous game into a Cloud SQL instance, and run test queries against the full dataset to confirm that they complete successfully.
  E. Integrate Cloud Armor to defend against possible SQL injection attacks in analytics files uploaded to Cloud Storage.

Answer(s): A,B

Explanation:

Migrating the existing batch ETL jobs to Cloud Dataflow (A) and denormalizing the schema with nested and repeated fields for BigQuery (B) are the two steps that move the reporting workload onto managed, scalable analytics services. Reference:
https://cloud.google.com/bigquery/docs/loading-data#loading_denormalized_nested_and_repeated_data
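The sketch below shows how options A and B could fit together: a minimal Apache Beam batch pipeline (runnable on Cloud Dataflow) that reads exported game-activity JSON from Cloud Storage and loads it into BigQuery with a denormalized schema that uses nested, repeated fields. The project ID, bucket names, table name, and field names are hypothetical placeholders, not part of the case study.

```python
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Denormalized schema: one row per player, with repeated nested session records.
TABLE_SCHEMA = {
    "fields": [
        {"name": "player_id", "type": "STRING", "mode": "REQUIRED"},
        {"name": "platform", "type": "STRING", "mode": "NULLABLE"},
        {
            "name": "sessions",
            "type": "RECORD",
            "mode": "REPEATED",
            "fields": [
                {"name": "started_at", "type": "TIMESTAMP", "mode": "NULLABLE"},
                {"name": "score", "type": "INTEGER", "mode": "NULLABLE"},
            ],
        },
    ]
}

def run():
    options = PipelineOptions(
        runner="DataflowRunner",                           # use "DirectRunner" to test locally
        project="mountkirk-demo",                          # hypothetical project ID
        region="us-central1",
        temp_location="gs://mountkirk-demo-temp/bq-load",  # hypothetical staging bucket
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadActivityFiles" >> beam.io.ReadFromText("gs://mountkirk-demo-analytics/*.json")
            | "ParseJson" >> beam.Map(json.loads)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "mountkirk-demo:analytics.player_activity",
                schema=TABLE_SCHEMA,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )

if __name__ == "__main__":
    run()
```

Switching the runner to DirectRunner lets the same pipeline be evaluated locally before estimating the impact of running it on Dataflow at full scale.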




For this question, refer to the Mountkirk Games case study. You need to analyze and define the technical architecture for the compute workloads for your company, Mountkirk Games. Considering the Mountkirk Games business and technical requirements, what should you do?

  A. Create network load balancers. Use preemptible Compute Engine instances.
  B. Create network load balancers. Use non-preemptible Compute Engine instances.
  C. Create a global load balancer with managed instance groups and autoscaling policies. Use preemptible Compute Engine instances.
  D. Create a global load balancer with managed instance groups and autoscaling policies. Use non-preemptible Compute Engine instances.

Answer(s): D
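As a rough illustration of answer D, the script below drives gcloud to build the pieces in order: a non-preemptible instance template, a managed instance group, a CPU-based autoscaling policy, and a global backend service for the HTTP(S) load balancer. The project ID, resource names, machine type, and thresholds are hypothetical, and the URL map, target proxy, and forwarding rule that complete the load balancer are omitted for brevity.

```python
"""Sketch: provision autoscaled, non-preemptible game backends behind a global
HTTP(S) load balancer. Requires the gcloud CLI to be installed and authenticated."""
import subprocess

PROJECT = "mountkirk-demo"   # hypothetical project ID
ZONE = "us-central1-a"

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. Non-preemptible instance template (simply omit the --preemptible flag).
run(["gcloud", "compute", "instance-templates", "create", "game-backend-tmpl",
     "--project", PROJECT, "--machine-type", "e2-standard-4",
     "--image-family", "debian-12", "--image-project", "debian-cloud"])

# 2. Managed instance group built from the template.
run(["gcloud", "compute", "instance-groups", "managed", "create", "game-backend-mig",
     "--project", PROJECT, "--zone", ZONE,
     "--template", "game-backend-tmpl", "--size", "3"])

# 3. Autoscaling policy driven by CPU utilization.
run(["gcloud", "compute", "instance-groups", "managed", "set-autoscaling", "game-backend-mig",
     "--project", PROJECT, "--zone", ZONE,
     "--min-num-replicas", "3", "--max-num-replicas", "50",
     "--target-cpu-utilization", "0.6"])

# 4. Global backend service for the HTTP(S) load balancer, with the MIG as its backend.
run(["gcloud", "compute", "health-checks", "create", "http", "game-hc",
     "--project", PROJECT, "--port", "8080"])
run(["gcloud", "compute", "backend-services", "create", "game-backend-svc",
     "--project", PROJECT, "--global", "--protocol", "HTTP",
     "--health-checks", "game-hc", "--port-name", "http"])
run(["gcloud", "compute", "backend-services", "add-backend", "game-backend-svc",
     "--project", PROJECT, "--global",
     "--instance-group", "game-backend-mig", "--instance-group-zone", ZONE])
```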




For this question, refer to the Mountkirk Games case study. Mountkirk Games wants to design their solution for the future in order to take advantage of cloud and technology improvements as they become available.
Which two steps should they take? (Choose two.)

  A. Store as much analytics and game activity data as financially feasible today so it can be used to train machine learning models to predict user behavior in the future.
  B. Begin packaging their game backend artifacts in container images and running them on Kubernetes Engine to improve the ability to scale up or down based on game activity.
  C. Set up a CI/CD pipeline using Jenkins and Spinnaker to automate canary deployments and improve development velocity.
  D. Adopt a schema versioning tool to reduce downtime when adding new game features that require storing additional player data in the database.
  E. Implement a weekly rolling maintenance process for the Linux virtual machines so they can apply critical kernel patches and package updates and reduce the risk of 0-day vulnerabilities.

Answer(s): B,C
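For option B, a containerized backend on Kubernetes Engine typically pairs a Deployment with a HorizontalPodAutoscaler so capacity follows game activity. The sketch below renders both manifests from Python; the image path, resource requests, and scaling thresholds are hypothetical placeholders. In practice the CI/CD pipeline from option C would build the image and apply these manifests during a canary rollout.

```python
"""Sketch: render a Deployment and HorizontalPodAutoscaler for a containerized
game backend on GKE. Requires PyYAML (pip install pyyaml)."""
import yaml

APP = "game-backend"
IMAGE = "us-docker.pkg.dev/mountkirk-demo/games/backend:1.0.0"  # hypothetical image

deployment = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": APP, "labels": {"app": APP}},
    "spec": {
        "replicas": 3,
        "selector": {"matchLabels": {"app": APP}},
        "template": {
            "metadata": {"labels": {"app": APP}},
            "spec": {
                "containers": [{
                    "name": APP,
                    "image": IMAGE,
                    "ports": [{"containerPort": 8080}],
                    "resources": {"requests": {"cpu": "500m", "memory": "512Mi"}},
                }]
            },
        },
    },
}

# Scale up or down with game activity, using CPU utilization as a simple proxy metric.
hpa = {
    "apiVersion": "autoscaling/v2",
    "kind": "HorizontalPodAutoscaler",
    "metadata": {"name": APP},
    "spec": {
        "scaleTargetRef": {"apiVersion": "apps/v1", "kind": "Deployment", "name": APP},
        "minReplicas": 3,
        "maxReplicas": 50,
        "metrics": [{
            "type": "Resource",
            "resource": {"name": "cpu",
                         "target": {"type": "Utilization", "averageUtilization": 60}},
        }],
    },
}

# Write both manifests to one file for `kubectl apply -f game-backend.yaml`.
with open("game-backend.yaml", "w") as f:
    yaml.safe_dump_all([deployment, hpa], f, sort_keys=False)
```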




For this question, refer to the Mountkirk Games case study. Mountkirk Games wants you to design a way to test the analytics platform's resilience to changes in mobile network latency.
What should you do?

  A. Deploy failure injection software to the game analytics platform that can inject additional latency to mobile client analytics traffic.
  B. Build a test client that can be run from a mobile phone emulator on a Compute Engine virtual machine, and run multiple copies in Google Cloud Platform regions all over the world to generate realistic traffic.
  C. Add the ability to introduce a random amount of delay before beginning to process analytics files uploaded from mobile devices.
  D. Create an opt-in beta of the game that runs on players' mobile devices and collects response times from analytics endpoints running in Google Cloud Platform regions all over the world.

Answer(s): A

Explanation:

Injecting additional latency into the mobile client analytics traffic directly simulates the condition being tested, so the platform's behavior under degraded networks can be observed in a controlled, repeatable way. An opt-in beta only measures real-world response times and gives no control over how much latency the platform is exposed to.
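A minimal sketch of the fault-injection idea behind answer A: wrap the analytics ingestion path so each incoming payload is delayed by a random, configurable amount, simulating slow or jittery mobile networks during a resilience test. The handler name, delay range, and payload shape are hypothetical stand-ins for a real failure-injection tool.

```python
"""Sketch: inject artificial latency in front of an analytics ingestion handler."""
import asyncio
import random
import time

def inject_latency(min_ms=50, max_ms=2000):
    """Decorator that adds a random delay before each call, mimicking mobile RTT jitter."""
    def wrap(handler):
        async def delayed(*args, **kwargs):
            delay = random.uniform(min_ms, max_ms) / 1000.0
            await asyncio.sleep(delay)            # simulated extra network latency
            return await handler(*args, **kwargs)
        return delayed
    return wrap

@inject_latency(min_ms=200, max_ms=3000)          # exaggerate latency for the test run
async def ingest_analytics(payload: dict) -> dict:
    # Placeholder for the real ingestion path (e.g. publish to Pub/Sub, write to GCS).
    return {"received_at": time.time(), "events": len(payload.get("events", []))}

async def main():
    # Fire a burst of delayed client uploads and confirm the platform keeps up.
    payloads = [{"events": [{"type": "match_end"}] * 5} for _ in range(10)]
    results = await asyncio.gather(*(ingest_analytics(p) for p in payloads))
    print(f"processed {len(results)} delayed uploads")

if __name__ == "__main__":
    asyncio.run(main())
```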





