Free Google Cloud Architect Professional Exam Braindumps (page: 23)


During a high-traffic portion of the day, one of your relational databases crashes, but the replica is never promoted to master. You want to avoid this in the future.
What should you do?

  A. Use a different database.
  B. Choose larger instances for your database.
  C. Create snapshots of your database more regularly.
  D. Implement routinely scheduled failovers of your databases.

Answer(s): D

Explanation:

https://cloud.google.com/solutions/dr-scenarios-planning-guide
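As an illustration, regularly rehearsed failovers can be exercised from the command line. The sketch below assumes Cloud SQL; the instance names are placeholders, not from the question:

```shell
# Promote a read replica to a standalone primary (e.g. when the
# primary has crashed and automatic promotion did not occur).
# "my-db-replica" is a placeholder instance name.
gcloud sql instances promote-replica my-db-replica

# For a Cloud SQL instance with high-availability configuration,
# a failover to the standby can be triggered on demand -- running
# this on a schedule verifies the failover path actually works.
gcloud sql instances failover my-db-instance
```

Scheduling such failovers during low-traffic windows confirms that promotion works before a real outage forces the issue.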



Your organization requires that metrics from all applications be retained for 5 years for future analysis in possible legal proceedings.
Which approach should you use?

  A. Grant the security team access to the logs in each Project.
  B. Configure Stackdriver Monitoring for all Projects, and export to BigQuery.
  C. Configure Stackdriver Monitoring for all Projects with the default retention policies.
  D. Configure Stackdriver Monitoring for all Projects, and export to Google Cloud Storage.

Answer(s): D

Explanation:

Overview of storage classes, price, and use cases: https://cloud.google.com/storage/docs/storage-classes
Why export logs? https://cloud.google.com/logging/docs/export/
Stackdriver Quotas and Limits for Monitoring: https://cloud.google.com/monitoring/quotas

BigQuery pricing: https://cloud.google.com/bigquery/pricing
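A minimal sketch of the export-to-Cloud-Storage approach, assuming a Logging sink into a Coldline bucket (the bucket and sink names below are hypothetical):

```shell
# Create a Coldline bucket -- a low-cost class suited to rarely
# accessed, long-retention data. Bucket name is a placeholder.
gsutil mb -c coldline gs://example-metrics-archive

# Route the project's logs to that bucket via a Logging sink.
gcloud logging sinks create metrics-archive-sink \
    storage.googleapis.com/example-metrics-archive

# Optionally enforce the 5-year requirement with a bucket
# retention policy, so objects cannot be deleted early.
gsutil retention set 5y gs://example-metrics-archive
```

Cloud Storage is generally the cheapest destination for data that must be retained for years but queried rarely, which fits the legal-hold scenario better than keeping it hot in BigQuery.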



Your company has decided to build a backup replica of their on-premises user authentication PostgreSQL database on Google Cloud Platform. The database is 4 TB, and large updates are frequent. Replication requires private address space communication.
Which networking approach should you use?

  A. Google Cloud Dedicated Interconnect
  B. Google Cloud VPN connected to the data center network
  C. A NAT and TLS translation gateway installed on-premises
  D. A Google Compute Engine instance with a VPN server installed connected to the data center network

Answer(s): A

Explanation:

https://cloud.google.com/docs/enterprise/best-practices-for-enterprise-organizations

Google Cloud Dedicated Interconnect provides direct physical connections and RFC 1918 communication between your on-premises network and Google's network. Dedicated Interconnect enables you to transfer large amounts of data between networks, which can be more cost effective than purchasing additional bandwidth over the public Internet or using VPN tunnels.

Benefits:
- Traffic between your on-premises network and your VPC network doesn't traverse the public Internet. Traffic traverses a dedicated connection with fewer hops, meaning there are fewer points of failure where traffic might get dropped or disrupted.
- Your VPC network's internal (RFC 1918) IP addresses are directly accessible from your on-premises network. You don't need to use a NAT device or VPN tunnel to reach internal IP addresses. Currently, you can only reach internal IP addresses over a dedicated connection. To reach Google external IP addresses, you must use a separate connection.
- You can scale your connection to Google based on your needs. Connection capacity is delivered over one or more 10 Gbps Ethernet connections, with a maximum of eight connections (80 Gbps total per interconnect).
- The cost of egress traffic from your VPC network to your on-premises network is reduced. A dedicated connection is generally the least expensive method if you have a high volume of traffic to and from Google's network.


Reference:

https://cloud.google.com/interconnect/docs/details/dedicated
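To make the setup concrete, after the physical Dedicated Interconnect is provisioned, a VLAN attachment ties it to a Cloud Router so RFC 1918 routes are exchanged with the on-premises network. All resource names below are placeholders:

```shell
# Create a VLAN attachment on an existing Dedicated Interconnect
# and associate it with a Cloud Router in the same region.
gcloud compute interconnects attachments dedicated create my-attachment \
    --interconnect my-interconnect \
    --router my-router \
    --region us-central1
```

The Cloud Router then advertises VPC subnet routes over BGP, giving the on-premises PostgreSQL primary private-address-space reachability to its replica without any public-Internet hop.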



Your company is forecasting a sharp increase in the number and size of Apache Spark and Hadoop jobs being run in your local datacenter. You want to utilize the cloud to help you scale for this upcoming demand with the least amount of operations work and code change.
Which product should you use?

  A. Google Cloud Dataflow
  B. Google Cloud Dataproc
  C. Google Compute Engine
  D. Google Container Engine

Answer(s): B

Explanation:

Google Cloud Dataproc is a fast, easy-to-use, low-cost and fully managed service that lets you run the Apache Spark and Apache Hadoop ecosystem on Google Cloud Platform. Cloud Dataproc provisions big or small clusters rapidly, supports many popular job types, and is integrated with other Google Cloud Platform services, such as Google Cloud Storage and Stackdriver Logging, thus helping you reduce TCO.


Reference:

https://cloud.google.com/dataproc/docs/resources/faq
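A sketch of how little changes when moving an existing Spark job to Dataproc: the job jar and main class below are hypothetical, but the workflow is just "create cluster, submit the same jar":

```shell
# Spin up a small Dataproc cluster (names and region are placeholders).
gcloud dataproc clusters create spark-migration-test \
    --region us-central1 \
    --num-workers 2

# Submit an existing Spark job unchanged -- only the submission
# target moves from the on-premises cluster to Dataproc.
gcloud dataproc jobs submit spark \
    --cluster spark-migration-test \
    --region us-central1 \
    --class org.example.MySparkJob \
    --jars gs://example-bucket/my-spark-job.jar
```

Because Dataproc runs stock Spark and Hadoop, existing jobs typically need no code changes, which is why it beats Dataflow (a different programming model) and raw Compute Engine (far more operations work) for this scenario.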





