Google ASSOCIATE-CLOUD-ENGINEER Exam
Associate Cloud Engineer (Page 11)

Updated On: 25-Jan-2026

You are using Data Studio to visualize a table from your data warehouse that is built on top of BigQuery. Data is appended to the data warehouse during the day. At night, the daily summary is recalculated by overwriting the table. You just noticed that the charts in Data Studio are broken, and you want to analyze the problem.
What should you do?

  A. Use the BigQuery interface to review the nightly job and look for any errors.
  B. Review the Error Reporting page in the Cloud Console to find any errors.
  C. In Cloud Logging, create a filter for your Data Studio report.
  D. Use the open-source CLI tool Snapshot Debugger to find out why the data was not refreshed correctly.

Answer(s): A

Explanation:

The daily summary table is rebuilt each night by a job that overwrites it. If that job fails, the table Data Studio reads from may be missing or incomplete, which breaks the charts. The BigQuery interface shows the job history, including the status and error details of the nightly job, so that is the place to start the analysis. Snapshot Debugger (Cloud Debugger) inspects the state of a running application without stopping or slowing it down (https://cloud.google.com/debugger/docs); it has nothing to do with refreshing data in BigQuery.
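The nightly job can also be inspected from the command line with the `bq` tool. The commands below are a sketch; the job ID shown is a placeholder you would replace with one returned by `bq ls`:

```shell
# List the ten most recent BigQuery jobs across all users in the project.
bq ls -j -a -n 10

# Show the full details (including any errorResult) of a specific job.
# The job ID below is hypothetical.
bq show --format=prettyjson -j bqjob_r1234_000001
```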



Your company wants to standardize the creation and management of multiple Google Cloud resources using Infrastructure as Code. You want to minimize the amount of repetitive code needed to manage the environment. What should you do?

  A. Create a bash script that contains all required steps as gcloud commands.
  B. Develop templates for the environment using Cloud Deployment Manager.
  C. Use curl in a terminal to send a REST request to the relevant Google API for each individual resource.
  D. Use the Cloud Console interface to provision and manage all related resources.

Answer(s): B

Explanation:

You can use Google Cloud Deployment Manager to create a set of Google Cloud resources and manage them as a unit, called a deployment. For example, if your team's development environment needs two virtual machines (VMs) and a BigQuery database, you can define these resources in a configuration file, and use Deployment Manager to create, change, or delete these resources. You can make the configuration file part of your team's code repository, so that anyone can create the same environment with consistent results. https://cloud.google.com/deployment-manager/docs/quickstart
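A minimal Deployment Manager configuration can be sketched as below. The resource name, zone, machine type, and image family are illustrative, not prescribed by the question:

```yaml
# vm-env.yaml - hypothetical minimal Deployment Manager configuration
# defining a single Compute Engine VM as one deployable unit.
resources:
- name: dev-vm-1
  type: compute.v1.instance
  properties:
    zone: us-central1-a
    machineType: zones/us-central1-a/machineTypes/e2-medium
    disks:
    - deviceName: boot
      boot: true
      autoDelete: true
      initializeParams:
        sourceImage: projects/debian-cloud/global/images/family/debian-12
    networkInterfaces:
    - network: global/networks/default
```

The environment would then be created with `gcloud deployment-manager deployments create dev-env --config vm-env.yaml`, and everyone applying the same file gets the same resources.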



You have developed a containerized web application that will serve internal colleagues during business hours. You want to ensure that no costs are incurred outside of the hours the application is used. You have just created a new Google Cloud project and want to deploy the application.
What should you do?

  A. Deploy the container on Cloud Run for Anthos, and set the minimum number of instances to zero.
  B. Deploy the container on Cloud Run (fully managed), and set the minimum number of instances to zero.
  C. Deploy the container on App Engine flexible environment with autoscaling, and set the value min_instances to zero in app.yaml.
  D. Deploy the container on App Engine flexible environment with manual scaling, and set the value instances to zero in app.yaml.

Answer(s): B

Explanation:

Cloud Run (fully managed) scales to zero when no requests are being served, so no instances run, and no compute cost is incurred, outside business hours. Cloud Run for Anthos runs on a GKE cluster whose nodes keep incurring cost even when the service is idle, and the App Engine flexible environment cannot scale down to zero instances.
https://cloud.google.com/kuberun/docs/architecture-overview#components_in_the_default_installation



You are working for a hospital that stores its medical images in an on-premises data room. The hospital wants to use Cloud Storage for archival storage of these images. The hospital wants an automated process to upload any new medical images to Cloud Storage. You need to design and implement a solution.
What should you do?

  A. Deploy a Dataflow job from the batch template "Datastore to Cloud Storage". Schedule the batch job on the desired interval.
  B. In the Cloud Console, go to Cloud Storage. Upload the relevant images to the appropriate bucket.
  C. Create a script that uses the gsutil command-line interface to synchronize the on-premises storage with Cloud Storage. Schedule the script as a cron job.
  D. Create a Pub/Sub topic, and enable a Cloud Storage trigger for the Pub/Sub topic. Create an application that sends all medical images to the Pub/Sub topic.

Answer(s): C

Explanation:

The hospital requires Cloud Storage for archival storage and wants the upload of new medical images to be automated, so a gsutil script scheduled as a cron job fits: gsutil can copy the on-premises images to Cloud Storage, and cron automates the process. Pub/Sub works in the wrong direction here: a Cloud Storage trigger publishes notifications when objects in a bucket change; it does not move files from on-premises storage into Cloud Storage.
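The synchronization step can be sketched as follows; the local path and bucket name are hypothetical:

```shell
#!/bin/bash
# sync_images.sh - mirror new on-premises images into the archive bucket.
# rsync only transfers files that are new or changed; -r recurses into
# subdirectories and -m parallelizes the transfer.
gsutil -m rsync -r /data/medical-images gs://hospital-image-archive
```

Scheduled via cron, e.g. `0 * * * * /opt/scripts/sync_images.sh` to run at the top of every hour.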



You are running a data warehouse on BigQuery. A partner company is offering a recommendation engine based on the data in your data warehouse. The partner company is also running their application on Google Cloud. They manage the resources in their own project, but they need access to the BigQuery dataset in your project. You want to provide the partner company with access to the dataset. What should you do?

  A. Create a Service Account in your own project, and grant this Service Account access to BigQuery in your project.
  B. Create a Service Account in your own project, and ask the partner to grant this Service Account access to BigQuery in their project.
  C. Ask the partner to create a Service Account in their project, and have them give the Service Account access to BigQuery in their project.
  D. Ask the partner to create a Service Account in their project, and grant their Service Account access to the BigQuery dataset in your project.

Answer(s): D

Explanation:

The partner's application runs in their project, so they should create and manage the service account there. A service account is an identity that can be granted IAM roles across project boundaries, so you then grant that service account access (for example, read access) on the dataset in your project.
https://gtseres.medium.com/using-service-accounts-across-projects-in-gcp-cf9473fef8f0#:~:text=Go%20to%20the%20destination%20project,Voila!
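Granting the partner's service account read access on the dataset can be sketched with the `bq` tool; the project, dataset, and service-account names below are hypothetical:

```shell
# Export the current dataset definition, including its "access" list.
bq show --format=prettyjson my_project:warehouse_ds > ds.json

# Edit ds.json and append an entry to the "access" array, e.g.:
#   { "role": "READER",
#     "userByEmail": "recommender@partner-project.iam.gserviceaccount.com" }

# Apply the updated access list back to the dataset.
bq update --source ds.json my_project:warehouse_ds
```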



Page 11 of 63 (questions 51 - 55 of 343)


