Free Google Cloud Architect Professional Exam Braindumps (page: 4)


For this question, refer to the TerramEarth case study.

Your development team has created a structured API to retrieve vehicle data. They want to allow third parties to develop tools for dealerships that use this vehicle event data. You want to support delegated authorization against this data. What should you do?

  1. Build or leverage an OAuth-compatible access control system.
  2. Build SAML 2.0 SSO compatibility into your authentication system.
  3. Restrict data access based on the source IP address of the partner systems.
  4. Create secondary credentials for each dealer that can be given to the trusted third party.

Answer(s): A

Explanation:

https://cloud.google.com/appengine/docs/flexible/go/authorizing-apps https://cloud.google.com/docs/enterprise/best-practices-for-enterprise-organizations#delegate_application_authorization_with_oauth2

Delegate application authorization with OAuth2

Cloud Platform APIs support OAuth 2.0, and scopes provide granular authorization over the methods that are supported. Cloud Platform supports both service-account OAuth and user-account OAuth (also called three-legged OAuth).
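To make the delegated (three-legged) flow concrete, here is a minimal sketch of how a third-party dealership tool could obtain a scoped user token and call the vehicle event API. The scope URL, client_secret.json file, and API endpoint are illustrative placeholders, not details from the case study.

```python
# Sketch: three-legged OAuth 2.0 flow for a hypothetical third-party dealership tool.
# The scope, client_secret.json, and endpoint below are placeholders (assumptions).
from google_auth_oauthlib.flow import InstalledAppFlow
from google.auth.transport.requests import AuthorizedSession

SCOPES = ["https://www.googleapis.com/auth/cloud-platform.read-only"]  # example scope

# The dealer (resource owner) consents in a browser; the tool never sees the
# dealer's credentials, only a scoped access/refresh token it can use on their behalf.
flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", scopes=SCOPES)
credentials = flow.run_local_server(port=0)

# Call the (hypothetical) vehicle event API with the delegated credentials.
session = AuthorizedSession(credentials)
response = session.get("https://vehicle-api.example.com/v1/events?vehicle=12345")
print(response.status_code)
```

The key point the answer tests is that the token is scoped and revocable per dealer, which is exactly what options such as shared secondary credentials or IP allow-lists cannot provide.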


Reference:

https://cloud.google.com/docs/enterprise/best-practices-for-enterprise-organizations#delegate_application_authorization_with_oauth2 https://cloud.google.com/appengine/docs/flexible/go/authorizing-apps




For this question, refer to the TerramEarth case study.

TerramEarth plans to connect all 20 million vehicles in the field to the cloud. This increases the volume to 20 million 600-byte records per second, or roughly 40 TB per hour. How should you design the data ingestion?

  1. Vehicles write data directly to GCS.
  2. Vehicles write data directly to Google Cloud Pub/Sub.
  3. Vehicles stream data directly to Google BigQuery.
  4. Vehicles continue to write data using the existing system (FTP).

Answer(s): B

Explanation:

https://cloud.google.com/solutions/data-lifecycle-cloud-platform https://cloud.google.com/solutions/designing-connected-vehicle-platform
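As a quick sanity check on the stated volume: 20,000,000 records/s × 600 bytes ≈ 12 GB/s, which is about 43 TB per hour, consistent with the ~40 TB/hour figure in the question. A minimal publishing sketch with the google-cloud-pubsub client is below; the project ID, topic name, and record payload are assumed placeholders, not values from the case study.

```python
# Sketch: a vehicle (or edge gateway) publishing one 600-byte telemetry record
# to Cloud Pub/Sub. Project ID, topic name, and payload are placeholders.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("terramearth-project", "vehicle-telemetry")  # hypothetical names

record = b"\x00" * 600  # stand-in for one 600-byte vehicle event record

# publish() batches messages client-side and returns a future; Pub/Sub absorbs
# ingest spikes and decouples producers from downstream consumers
# (e.g. Dataflow writing to BigQuery or Cloud Storage).
future = publisher.publish(topic_path, data=record, vehicle_id="V-12345")
print(future.result())  # message ID once the publish is acknowledged
```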




For this question, refer to the TerramEarth case study.

You analyzed TerramEarth's business requirement to reduce downtime and found that they can achieve a majority of the time savings by reducing customers' wait time for parts. You decided to focus on reducing the three-week aggregate reporting time. Which modifications to the company's processes should you recommend?

  1. Migrate from CSV to binary format, migrate from FTP to SFTP transport, and develop machine learning analysis of metrics.
  2. Migrate from FTP to streaming transport, migrate from CSV to binary format, and develop machine learning analysis of metrics.
  3. Increase fleet cellular connectivity to 80%, migrate from FTP to streaming transport, and develop machine learning analysis of metrics.
  4. Migrate from FTP to SFTP transport, develop machine learning analysis of metrics, and increase dealer local inventory by a fixed factor.

Answer(s): C

Explanation:

The Avro binary format is the preferred format for loading compressed data. Avro data is faster to load because the data can be read in parallel, even when the data blocks are compressed. Cloud Storage supports streaming transfers with the gsutil tool or boto library, based on HTTP chunked transfer encoding. Streaming data lets you stream data to and from your Cloud Storage account as soon as it becomes available without requiring that the data be first saved to a separate file. Streaming transfers are useful if you have a process that generates data and you do not want to buffer it locally before uploading it, or if you want to send the result from a computational pipeline directly into Cloud Storage.
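As a rough sketch of such a streaming transfer, using the google-cloud-storage Python client rather than gsutil (the bucket and object names are illustrative assumptions, not from the case study), a pipeline can hand its output to Cloud Storage without writing an intermediate local file:

```python
# Sketch: stream data from a producing process straight into Cloud Storage,
# analogous to `some_command | gsutil cp - gs://bucket/object`.
# Bucket and object names below are illustrative placeholders.
import sys
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("terramearth-telemetry")   # hypothetical bucket
blob = bucket.blob("ingest/fleet-metrics.avro")   # hypothetical object

# upload_from_file() accepts any file-like object, so bytes arriving on stdin
# are uploaded as they are read, without first being saved to a local file.
blob.upload_from_file(sys.stdin.buffer, content_type="application/octet-stream")
print(f"uploaded gs://{bucket.name}/{blob.name}")
```

A producing pipeline would then pipe its output into this script instead of buffering a CSV on disk and pushing it over FTP afterwards.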


Reference:

https://cloud.google.com/storage/docs/streaming https://cloud.google.com/bigquery/docs/loading-data




For this question, refer to the TerramEarth case study.

Which of TerramEarth's legacy enterprise processes will experience significant change as a result of increased Google Cloud Platform adoption?

  1. Opex/capex allocation, LAN changes, capacity planning
  2. Capacity planning, TCO calculations, opex/capex allocation
  3. Capacity planning, utilization measurement, data center expansion
  4. Data Center expansion, TCO calculations, utilization measurement

Answer(s): B

Explanation:

Capacity planning, TCO calculations, opex/capex allocation. From the case study, we can conclude that management (CXO) is concerned with rapid provisioning of resources (infrastructure) for growth as well as with cost management: optimizing infrastructure cost, trading up-front capital expenditures (capex) for ongoing operating expenditures (opex), and understanding total cost of ownership (TCO). These concerns map directly to capacity planning, TCO calculations, and opex/capex allocation.





