Free Google Associate Cloud Engineer Exam Braindumps (page: 29)


You need to assign a Cloud Identity and Access Management (Cloud IAM) role to an external auditor. The auditor needs to have permissions to review your Google Cloud Platform (GCP) Audit Logs and also to review your Data Access logs.
What should you do?

  A. Assign the auditor the IAM role roles/logging.privateLogViewer. Perform the export of logs to Cloud Storage.
  B. Assign the auditor the IAM role roles/logging.privateLogViewer. Direct the auditor to also review the logs for changes to Cloud IAM policy.
  C. Assign the auditor's IAM user to a custom role that has the logging.privateLogEntries.list permission. Perform the export of logs to Cloud Storage.
  D. Assign the auditor's IAM user to a custom role that has the logging.privateLogEntries.list permission. Direct the auditor to also review the logs for changes to Cloud IAM policy.

Answer(s): B

Explanation:

Google Cloud provides Cloud Audit Logs, an integral part of Cloud Logging. It consists of two log streams for each project, Admin Activity and Data Access, which are generated by Google Cloud services to help you answer the question of "who did what, where, and when?" within your Google Cloud projects. Data Access logs are private logs, so viewing them requires the logging.privateLogEntries.list permission; the predefined role roles/logging.privateLogViewer includes this permission in addition to access to the other logs. Google recommends using a predefined role over a custom role whenever a predefined role fits, and no export is needed simply to review the logs, so assigning roles/logging.privateLogViewer and directing the auditor to review the logs for Cloud IAM policy changes is correct.
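For illustration, here is a minimal sketch of granting the predefined role with the gcloud CLI; the project ID and auditor address are placeholder values, not from the question:

    # Grant the external auditor the Private Logs Viewer role on the project
    gcloud projects add-iam-policy-binding my-project \
        --member="user:auditor@example.com" \
        --role="roles/logging.privateLogViewer"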


Reference:

https://cloud.google.com/iam/docs/job-functions/auditing#scenario_external_auditors



You are managing several Google Cloud Platform (GCP) projects and need access to all logs for the past 60 days. You want to be able to explore and quickly analyze the log contents. You want to follow Google-recommended practices to obtain the combined logs for all projects.
What should you do?

  A. Navigate to Stackdriver Logging and select resource.labels.project_id="*"
  B. Create a Stackdriver Logging Export with a Sink destination to a BigQuery dataset. Configure the table expiration to 60 days.
  C. Create a Stackdriver Logging Export with a Sink destination to Cloud Storage. Create a lifecycle rule to delete objects after 60 days.
  D. Configure a Cloud Scheduler job to read from Stackdriver and store the logs in BigQuery. Configure the table expiration to 60 days.

Answer(s): B

Explanation:

Navigate to Stackdriver Logging and select resource.labels.project_id="*". is not right.

Log entries are held in Stackdriver Logging for a limited time known as the retention period, which is 30 days by default. After that, the entries are deleted. To keep log entries longer than 30 days (let alone the 60 days required here), you need to export them outside of Stackdriver Logging by configuring log sinks.


Reference:

https://cloud.google.com/blog/products/gcp/best-practices-for-working-with-google-cloud-audit-logging

Configure a Cloud Scheduler job to read from Stackdriver and store the logs in BigQuery. Configure the table expiration to 60 days. is not right.

While this could be made to work, it makes no sense to use a Cloud Scheduler job to read from Stackdriver and store the logs in BigQuery when Google provides a feature (export sinks) that does exactly the same thing and works out of the box.

https://cloud.google.com/logging/docs/export/configure_export_v2

Create a Stackdriver Logging Export with a Sink destination to Cloud Storage. Create a lifecycle rule to delete objects after 60 days. is not right.

You can export logs by creating one or more sinks that include a logs query and an export destination. Supported destinations for exported log entries are Cloud Storage, BigQuery, and Pub/Sub.

https://cloud.google.com/logging/docs/export/configure_export_v2

Sinks are limited to exporting log entries from the exact resource in which the sink was created: a Google Cloud project, organization, folder, or billing account. To export from all the projects of an organization at once, you can create an aggregated sink that exports log entries from all the projects, folders, and billing accounts of a Google Cloud organization.
https://cloud.google.com/logging/docs/export/aggregated_sinks

Either way, we now have the data in Cloud Storage, but querying log data in Cloud Storage is much harder than querying a BigQuery dataset. For this reason, we should prefer BigQuery over Cloud Storage.

Create a Stackdriver Logging Export with a Sink destination to a BigQuery dataset. Configure the table expiration to 60 days. is the right answer.

As noted above, a sink supports Cloud Storage, BigQuery, and Pub/Sub as export destinations, and an organization-level aggregated sink can export log entries from all the projects, folders, and billing accounts of a Google Cloud organization.

Either way, we now have the data in a BigQuery dataset. Querying a BigQuery dataset is easier and quicker than analyzing the contents of a Cloud Storage bucket. As our requirement is to quickly analyze the log contents, we should prefer BigQuery over Cloud Storage.
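As a sketch, an organization-level aggregated sink to BigQuery could be created with the gcloud CLI as follows; the sink name, organization ID, project, and dataset are placeholder values:

    # Create an aggregated sink that exports log entries from all projects,
    # folders, and billing accounts under the organization to a BigQuery dataset
    gcloud logging sinks create all-projects-logs \
        bigquery.googleapis.com/projects/my-project/datasets/all_logs \
        --organization=123456789012 --include-children

    # The command prints the sink's writer identity (a service account);
    # grant it the BigQuery Data Editor role on the destination dataset
    # so the sink can write log entries into it.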

Also, you can control storage costs and optimize storage usage by setting the default table expiration for newly created tables in a dataset. If you set the property when the dataset is created, any table created in the dataset is deleted after the expiration period. If you set the property after the dataset is created, only new tables are deleted after the expiration period. For example, if you set the default table expiration to 7 days, older data is automatically deleted after 1 week.
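A minimal sketch of setting the 60-day default table expiration with the bq CLI; the dataset name is a placeholder, and the value is in seconds (60 x 86400 = 5184000):

    # New tables created in the dataset will expire 60 days after creation
    bq update --default_table_expiration 5184000 my-project:all_logs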

https://cloud.google.com/bigquery/docs/best-practices-storage


https://cloud.google.com/blog/products/gcp/best-practices-for-working-with-google-cloud-audit-logging



You need to reduce GCP service costs for a division of your company using the fewest possible steps. You need to turn off all configured services in an existing GCP project.
What should you do?

  A. 1. Verify that you are assigned the Project Owner IAM role for this project.
     2. Locate the project in the GCP console, click Shut down, and then enter the project ID.
  B. 1. Verify that you are assigned the Project Owner IAM role for this project.
     2. Switch to the project in the GCP console, locate the resources, and delete them.
  C. 1. Verify that you are assigned the Organization Administrator IAM role for this project.
     2. Locate the project in the GCP console, enter the project ID, and then click Shut down.
  D. 1. Verify that you are assigned the Organization Administrator IAM role for this project.
     2. Switch to the project in the GCP console, locate the resources, and delete them.

Answer(s): A

Explanation:

https://cloud.google.com/run/docs/tutorials/gcloud
https://cloud.google.com/resource-manager/docs/creating-managing-projects
https://cloud.google.com/iam/docs/understanding-roles#primitive_roles

You can shut down projects using the Cloud Console. When you shut down a project, the following happens immediately: all billing and traffic serving stops, and you lose access to the project. The owners of the project are notified and can stop the deletion within 30 days; otherwise the project is scheduled to be deleted after 30 days. However, some resources may be deleted much earlier.
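For illustration, the same shutdown can be performed with the gcloud CLI; the project ID is a placeholder:

    # Shut down the project; it enters a 30-day pending-deletion state
    gcloud projects delete my-project

    # Within the 30-day window, an owner can reverse the shutdown
    gcloud projects undelete my-project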



You are configuring service accounts for an application that spans multiple projects. Virtual machines (VMs) running in the web-applications project need access to BigQuery datasets in crm-databases-proj. You want to follow Google-recommended practices to give access to the service account in the web-applications project.
What should you do?

  1. Give "project owner" for web-applications appropriate roles to crm-databases- proj
  2. Give "project owner" role to crm-databases-proj and the web-applications project.
  3. Give "project owner" role to crm-databases-proj and bigquery.dataViewer role to web- applications.
  4. Give bigquery.dataViewer role to crm-databases-proj and appropriate roles to web-applications.

Answer(s): D


Explanation:

The bigquery.dataViewer role provides permissions to read a dataset's metadata and list the tables in the dataset, as well as to read data and metadata from the dataset's tables. Granting this role on crm-databases-proj to the service account used by the VMs in the web-applications project is exactly what we need to fulfil this requirement, and it follows the least-privilege principle. Granting the broad "project owner" role on crm-databases-proj, as the other options do, violates that principle.
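A minimal sketch of this grant with the gcloud CLI; the service account email is a hypothetical placeholder for the account the VMs in web-applications run as:

    # Allow the web-applications service account to read BigQuery datasets
    # in crm-databases-proj (least privilege: dataViewer only)
    gcloud projects add-iam-policy-binding crm-databases-proj \
        --member="serviceAccount:web-app-sa@web-applications.iam.gserviceaccount.com" \
        --role="roles/bigquery.dataViewer"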


Reference:

https://cloud.google.com/iam/docs/understanding-roles#bigquery-roles






Post your Comments and Discuss the Google Associate Cloud Engineer exam with other Community members:

Narasimha commented on December 21, 2024
It is helpful for the GCP ACE exam.
INDIA

Preeti commented on December 20, 2024
For those who took the exam recently: how many questions on the exam were from this dump?
INDIA

Preeti commented on December 20, 2024
Have any of you taken the exam recently and passed just by using this dump?
INDIA

Sultan commented on December 04, 2024
Helpful for clearing the ACE exam.
Anonymous

Mike commented on November 19, 2024
In my opinion, they work well for me, but it depends on how you approach them. My method isn't about memorizing the exact questions and answers from the practice tests to use on the real exam. Instead, I focus on understanding why I got certain questions wrong so I can deepen my comprehension of the material.
EUROPEAN UNION

Prabhat Kumar commented on November 06, 2024
Google Associate Cloud Engineer
EUROPEAN UNION

Shawn commented on October 24, 2024
As you must know by now, the exam is extremely hard. The only way to pass is to know the questions and answers, and I found these dump questions very relevant to the actual exam.
Canada

Soniksha commented on October 10, 2024
I purchased the full version of this exam and it turned out quite accurate. I passed with the help of this exam.
UNITED STATES

Paras Gupta commented on September 17, 2024
Great, it is a good course.
Anonymous

Chesare commented on September 12, 2024
Have any of you taken the exam recently and passed just by using this dump?
MEXICO

kkraj commented on September 11, 2024
Starting to prepare for the exam.
Anonymous

Thanvi commented on August 29, 2024
Preparing for the exam.
Anonymous

Thanvi commented on August 29, 2024
Checking questions.
Anonymous

Vinay G commented on July 24, 2024
Preparing for the exam.
Anonymous

gk commented on July 14, 2024
Checking questions.
UNITED STATES

Kacha-Aloo commented on June 18, 2022
The questions in this exam dump are valid. I passed my exam yesterday. Now going to enjoy some cricket.
INDIA

Cow-Toy commented on August 06, 2021
I wrote my exam this morning and passed with a 78% mark. While practicing with the Xengine simulator I kept getting 85% or more, but on the real exam I got 78%. This means that about 5% of the answers are wrong, or I got them wrong. Regardless, I passed.
UNITED STATES

Nerd-Boy commented on December 23, 2020
I got the buy-1-get-1-free deal. I passed my first exam today. Going for the next one. It looks like due to COVID-19 it is easier to pass your certification exam.
UNITED STATES