Free Professional Cloud Developer Exam Braindumps (page: 18)

Page 18 of 82

Your application requires service accounts to authenticate to GCP services using credentials available on its host Compute Engine virtual machine instances. You want to distribute these credentials to the host instances as securely as possible.
What should you do?

  A. Use HTTP signed URLs to securely provide access to the required resources.
  B. Use the instance's service account Application Default Credentials to authenticate to the required resources.
  C. Generate a P12 file from the GCP Console after the instance is deployed, and copy the credentials to the host instance before starting the application.
  D. Commit the credential JSON file into your application's source repository, and have your CI/CD process package it with the software that is deployed to the instance.

Answer(s): B


Reference:

https://cloud.google.com/compute/docs/api/how-tos/authorization
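Answer B avoids distributing key files entirely: on Compute Engine, Application Default Credentials resolve to the service account attached to the instance, and tokens are fetched from the local metadata server. A minimal sketch of that mechanism, using only the standard library (the endpoint and required `Metadata-Flavor` header are the documented metadata-server interface; the request is only built here, since it can be sent only from inside a GCE instance):

```python
import urllib.request

# Documented metadata-server endpoint for the default service account's
# access token; reachable only from within a Compute Engine instance.
METADATA_TOKEN_URL = (
    "http://metadata.google.internal/computeMetadata/v1/"
    "instance/service-accounts/default/token"
)

def build_token_request() -> urllib.request.Request:
    # The metadata server rejects requests without this header.
    return urllib.request.Request(
        METADATA_TOKEN_URL, headers={"Metadata-Flavor": "Google"}
    )

req = build_token_request()
# On a GCE instance, urllib.request.urlopen(req) would return a JSON
# body containing an OAuth2 access token; client libraries do this
# automatically when you rely on Application Default Credentials.
```

This is why options C and D are insecure by comparison: both place long-lived key material on disk or in source control, while ADC never exposes a key file at all.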



Your application is deployed in a Google Kubernetes Engine (GKE) cluster. You want to expose this application publicly behind a Cloud Load Balancing HTTP(S) load balancer.
What should you do?

  A. Configure a GKE Ingress resource.
  B. Configure a GKE Service resource.
  C. Configure a GKE Ingress resource with type: LoadBalancer.
  D. Configure a GKE Service resource with type: LoadBalancer.

Answer(s): A


Reference:

https://cloud.google.com/kubernetes-engine/docs/concepts/ingress
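An Ingress is the GKE resource that provisions an external HTTP(S) load balancer (a `Service` of `type: LoadBalancer` creates a network load balancer instead, and `type:` is not a valid field on an Ingress, which rules out C and D). A minimal sketch, assuming an existing Service named `web` exposing port 80 (all names are placeholders):

```yaml
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: web-ingress
spec:
  defaultBackend:
    service:
      name: web
      port:
        number: 80
```

Note that the backing Service typically needs to be of `type: NodePort` (or use container-native load balancing) for the GKE Ingress controller to route to it.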



Your company is planning to migrate their on-premises Hadoop environment to the cloud. The increasing cost of storing and maintaining data in HDFS is a major concern for your company. You also want to make minimal changes to existing data analytics jobs and existing architecture.
How should you proceed with the migration?

  A. Migrate your data stored in Hadoop to BigQuery. Change your jobs to source their information from BigQuery instead of the on-premises Hadoop environment.
  B. Create Compute Engine instances with HDD instead of SSD to save costs. Then perform a full migration of your existing environment into the new one in Compute Engine instances.
  C. Create a Cloud Dataproc cluster on Google Cloud Platform, and then migrate your Hadoop environment to the new Cloud Dataproc cluster. Move your HDFS data into larger HDD disks to save on storage costs.
  D. Create a Cloud Dataproc cluster on Google Cloud Platform, and then migrate your Hadoop code objects to the new cluster. Move your data to Cloud Storage and leverage the Cloud Dataproc connector to run jobs on that data.

Answer(s): D
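Option D works with minimal job changes because the Cloud Storage connector lets Hadoop and Spark jobs read `gs://` URIs directly, so typically only the path scheme changes while job logic stays intact. A sketch of that path rewrite (the helper name and bucket are hypothetical, for illustration only):

```python
def to_gcs_path(hdfs_path: str, bucket: str) -> str:
    """Map an hdfs:// path onto the equivalent gs:// path.

    Drops the scheme and the namenode host:port, keeping the file
    path; non-HDFS paths are returned unchanged.
    """
    prefix = "hdfs://"
    if hdfs_path.startswith(prefix):
        _, _, rest = hdfs_path[len(prefix):].partition("/")
        return f"gs://{bucket}/{rest}"
    return hdfs_path
```

Because Cloud Storage decouples storage from the cluster, the Dataproc cluster itself can be sized for compute only (or even deleted between jobs), which directly addresses the HDFS storage-cost concern.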



Your data is stored in Cloud Storage buckets. Fellow developers have reported that data downloaded from Cloud Storage is resulting in slow API performance.
You want to research the issue to provide details to the GCP support team.
Which command should you run?

  A. gsutil test -o output.json gs://my-bucket
  B. gsutil perfdiag -o output.json gs://my-bucket
  C. gcloud compute scp example-instance:~/test-data -o output.json gs://my-bucket
  D. gcloud services test -o output.json gs://my-bucket

Answer(s): B


Reference:

https://groups.google.com/forum/#!topic/gce-discussion/xBl9Jq5HDsY
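`gsutil perfdiag` runs latency and throughput diagnostics against a bucket and writes a JSON report that can be attached to a support case. A usage sketch (the bucket name is a placeholder; flag values are illustrative, so check `gsutil help perfdiag` for your version):

```shell
# Run the performance diagnostic against the bucket and save the
# machine-readable report for the GCP support team.
gsutil perfdiag -o output.json gs://my-bucket
```

The other options are distractors: `gsutil test` and `gcloud services test` are not real subcommands, and `gcloud compute scp` copies files to/from VM instances rather than diagnosing Cloud Storage performance.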





