Free Google Associate Cloud Engineer Exam Braindumps (page: 37)


You created several resources in multiple Google Cloud projects. All projects are linked to different billing accounts. To better estimate future charges, you want to have a single visual representation of all costs incurred. You want to include new cost data as soon as possible.
What should you do?

  A. Configure Billing Data Export to BigQuery and visualize the data in Data Studio.
  B. Visit the Cost Table page to get a CSV export and visualize it using Data Studio.
  C. Enter all resources into the Pricing Calculator to get an estimate of the monthly cost.
  D. Use the Reports view in the Cloud Billing Console to view the desired cost information.

Answer(s): A

Explanation:

https://cloud.google.com/billing/docs/how-to/export-data-bigquery "Cloud Billing export to BigQuery enables you to export detailed Google Cloud billing data (such as usage, cost estimates, and pricing data) automatically throughout the day to a BigQuery dataset that you specify."


Reference:

https://cloud.google.com/billing/docs/how-to/visualize-data
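Once Billing Data Export is enabled, the exported table can be queried directly before connecting it to Data Studio. A minimal sketch, assuming a project named `my-project`, a dataset named `billing_export`, and the standard export table naming (the real table name ends with your billing account ID):

```shell
# Sum the exported cost data per project across all linked billing
# accounts. The project, dataset, and table names below are
# placeholders -- substitute your own export destination.
bq query --use_legacy_sql=false '
SELECT
  project.name AS project_name,
  SUM(cost) AS total_cost
FROM `my-project.billing_export.gcp_billing_export_v1_XXXXXX`
GROUP BY project_name
ORDER BY total_cost DESC'
```

Because the export runs automatically throughout the day, a Data Studio dashboard built on this table reflects new cost data without any manual CSV steps.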



Your company has workloads running on Compute Engine and on-premises. The Google Cloud Virtual Private Cloud (VPC) is connected to your WAN over a Virtual Private Network (VPN). You need to deploy a new Compute Engine instance and ensure that no public Internet traffic can be routed to it.
What should you do?

  A. Create the instance without a public IP address.
  B. Create the instance with Private Google Access enabled.
  C. Create a deny-all egress firewall rule on the VPC network.
  D. Create a route on the VPC to route all traffic to the instance over the VPN tunnel.

Answer(s): A

Explanation:

A VM with no external IP address cannot receive traffic from the public internet; it remains reachable from on-premises over the VPN. Private Google Access does not address this requirement: it only lets VMs without external IPs reach Google APIs and services.
https://cloud.google.com/vpc/docs/private-google-access
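The correct option can be sketched with a single `gcloud` command: omitting an external IP is done with the `--no-address` flag. The instance, network, subnet, and zone names below are illustrative:

```shell
# Create a Compute Engine instance with no external IP address
# (--no-address), so no public internet traffic can be routed to it.
# It is still reachable over internal IPs, e.g. via the VPN tunnel.
gcloud compute instances create internal-only-vm \
    --zone=us-central1-a \
    --network=my-vpc \
    --subnet=my-subnet \
    --no-address
```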



Your team maintains the infrastructure for your organization. The current infrastructure requires changes. You need to share your proposed changes with the rest of the team. You want to follow Google's recommended best practices.
What should you do?

  A. Use Deployment Manager templates to describe the proposed changes and store them in a Cloud Storage bucket.
  B. Use Deployment Manager templates to describe the proposed changes and store them in Cloud Source Repositories.
  C. Apply the change in a development environment, run gcloud compute instances list, and then save the output in a shared Cloud Storage bucket.
  D. Apply the change in a development environment, run gcloud compute instances list, and then save the output in Cloud Source Repositories.

Answer(s): B

Explanation:

Deployment Manager templates let you describe your proposed infrastructure changes declaratively, and Cloud Source Repositories let you store those templates and collaborate on them with your team. Cloud Source Repositories are fully featured, scalable, private Git repositories you can use to store, manage, and track changes to your code.
https://cloud.google.com/source-repositories/docs/features
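A rough sketch of the recommended workflow, using hypothetical repository, project, and file names (the template files and commit message are placeholders):

```shell
# Create a Cloud Source Repository to hold the Deployment Manager
# templates, then push the proposed changes for the team to review.
gcloud source repos create infra-templates --project=my-project
gcloud source repos clone infra-templates --project=my-project
cd infra-templates

# Add the proposed Deployment Manager templates (paths illustrative).
cp ~/proposed-changes/*.yaml ~/proposed-changes/*.jinja .
git add .
git commit -m "Propose infrastructure changes"
git push origin master
```

Teammates can then clone the repository, review the templates, and track every revision through normal Git history.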



You have a Compute Engine instance hosting an application used between 9 AM and 6 PM on weekdays. You want to back up this instance daily for disaster recovery purposes. You want to keep the backups for 30 days. You want the Google-recommended solution with the least management overhead and the least number of services.
What should you do?

  A. 1. Update your instance's metadata to add the following value: snapshot-schedule: 0 1 * * *
     2. Update your instance's metadata to add the following value: snapshot-retention: 30
  B. 1. In the Cloud Console, go to the Compute Engine Disks page and select your instance's disk.
     2. In the Snapshot Schedule section, select Create Schedule and configure the following parameters:
        - Schedule frequency: Daily
        - Start time: 1:00 AM - 2:00 AM
        - Autodelete snapshots after 30 days
  C. 1. Create a Cloud Function that creates a snapshot of your instance's disk.
     2. Create a Cloud Function that deletes snapshots that are older than 30 days.
     3. Use Cloud Scheduler to trigger both Cloud Functions daily at 1:00 AM.
  D. 1. Create a bash script in the instance that copies the content of the disk to Cloud Storage.
     2. Create a bash script in the instance that deletes data older than 30 days in the backup Cloud Storage bucket.
     3. Configure the instance's crontab to execute these scripts daily at 1:00 AM.

Answer(s): B

Explanation:

From the documentation on creating scheduled snapshots for persistent disks: "This document describes how to create a snapshot schedule to regularly and automatically back up your zonal and regional persistent disks. Use snapshot schedules as a best practice to back up your Compute Engine workloads. After creating a snapshot schedule, you can apply it to one or more persistent disks."
https://cloud.google.com/compute/docs/disks/scheduled-snapshots
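The console steps in the correct option have a `gcloud` equivalent: create a snapshot-schedule resource policy, then attach it to the disk. The policy, disk, region, and zone names below are illustrative:

```shell
# Create a daily snapshot schedule starting at 1:00 AM that
# automatically deletes snapshots after 30 days.
gcloud compute resource-policies create snapshot-schedule daily-backup \
    --region=us-central1 \
    --start-time=01:00 \
    --daily-schedule \
    --max-retention-days=30

# Attach the schedule to the instance's persistent disk.
gcloud compute disks add-resource-policies my-instance-disk \
    --resource-policies=daily-backup \
    --zone=us-central1-a
```

This keeps both backup creation and retention fully managed by Compute Engine, with no extra services such as Cloud Functions or Cloud Scheduler.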





