Free Professional Data Engineer Exam Braindumps

Your startup has never implemented a formal security policy. Currently, everyone in the company has access to the datasets stored in Google BigQuery. Teams have freedom to use the service as they see fit, and they have not documented their use cases. You have been asked to secure the data warehouse. You need to discover what everyone is doing.
What should you do first?

  A. Use Google Stackdriver Audit Logs to review data access.
  B. Get the identity and access management (IAM) policy of each table.
  C. Use Stackdriver Monitoring to see the usage of BigQuery query slots.
  D. Use the Google Cloud Billing API to see what account the warehouse is being billed to.

Answer(s): A
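Audit Logs are the right first step because they record who actually read which datasets. As a minimal sketch (the project ID is a placeholder), the Cloud Logging filter for BigQuery data-access audit entries can be composed like this and passed to `gcloud logging read` or the Logging API:

```python
# Sketch: compose a Cloud Logging (formerly Stackdriver Logging) filter
# that surfaces BigQuery data-access audit log entries.
# "my-project" is a placeholder project ID.

def bigquery_data_access_filter(project_id: str) -> str:
    """Return a Logging filter for BigQuery data-access audit logs."""
    log_name = (
        f"projects/{project_id}/logs/"
        "cloudaudit.googleapis.com%2Fdata_access"
    )
    return f'logName="{log_name}" resource.type="bigquery_resource"'

if __name__ == "__main__":
    print(bigquery_data_access_filter("my-project"))
```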




Flowlogistic's management has determined that the current Apache Kafka servers cannot handle the data volume for their real-time inventory tracking system. You need to build a new system on Google Cloud Platform (GCP) that will feed the proprietary tracking software. The system must be able to ingest data from a variety of global sources, process and query in real time, and store the data reliably.
Which combination of GCP products should you choose?

  A. Cloud Pub/Sub, Cloud Dataflow, and Cloud Storage
  B. Cloud Pub/Sub, Cloud Dataflow, and Local SSD
  C. Cloud Pub/Sub, Cloud SQL, and Cloud Storage
  D. Cloud Load Balancing, Cloud Dataflow, and Cloud Storage

Answer(s): A
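The Cloud Pub/Sub → Cloud Dataflow → Cloud Storage combination maps cleanly onto the requirements: Pub/Sub ingests from global sources, Dataflow processes in real time, and Cloud Storage stores reliably. As a rough stdlib simulation of the role each product plays (a queue stands in for the topic, a function for the Dataflow step, a dict for the storage sink; none of this is actual GCP client-library code):

```python
import json
import queue

topic = queue.Queue()          # stands in for the Pub/Sub topic
storage: dict[str, str] = {}   # stands in for the Cloud Storage bucket

def publish(message: dict) -> None:
    """Tracking devices publish events to the topic."""
    topic.put(json.dumps(message))

def process_and_store() -> None:
    """Dataflow-style step: read, transform, write a durable object."""
    while not topic.empty():
        event = json.loads(topic.get())
        event["status"] = "processed"             # streaming transform
        object_name = f"tracking/{event['package_id']}.json"
        storage[object_name] = json.dumps(event)  # write to the sink

publish({"package_id": "PKG-1", "location": "BER"})
process_and_store()
print(sorted(storage))  # -> ['tracking/PKG-1.json']
```

The point of the decoupling is that publishers and the processing layer scale independently, which is exactly what the overloaded Kafka cluster could not do.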




Flowlogistic's CEO wants to gain rapid insight into their customer base so his sales team can be better informed in the field. This team is not very technical, so they've purchased a visualization tool to simplify the creation of BigQuery reports. However, they've been overwhelmed by all the data in the table, and are spending a lot of money on queries trying to find the data they need. You want to solve their problem in the most cost-effective way.
What should you do?

  A. Export the data into a Google Sheet for visualization.
  B. Create an additional table with only the necessary columns.
  C. Create a view on the table to present to the visualization tool.
  D. Create identity and access management (IAM) roles on the appropriate columns, so only they appear in a query.

Answer(s): C




Flowlogistic is rolling out their real-time inventory tracking system. The tracking devices will all send package-tracking messages, which will now go to a single Google Cloud Pub/Sub topic instead of the Apache Kafka cluster. A subscriber application will then process the messages for real-time reporting and store them in Google BigQuery for historical analysis. You want to ensure the package data can be analyzed over time.

Which approach should you take?

  A. Attach the timestamp on each message in the Cloud Pub/Sub subscriber application as they are received.
  B. Attach the timestamp and Package ID on the outbound message from each publisher device as they are sent to Cloud Pub/Sub.
  C. Use the NOW() function in BigQuery to record the event's time.
  D. Use the automatically generated timestamp from Cloud Pub/Sub to order the data.

Answer(s): B
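Stamping the event time at the publisher is what makes time-series analysis trustworthy: subscriber receipt times and Pub/Sub's publish timestamps reflect delivery, not when the package event actually happened. A minimal sketch of a publisher-side message builder (the message shape and field names are illustrative, not a Pub/Sub API):

```python
import json
import time

def build_tracking_message(package_id: str, location: str) -> bytes:
    """Build a tracking message stamped at the source device."""
    payload = {
        "package_id": package_id,
        "location": location,
        # Event time captured at the publisher, not at the subscriber:
        "event_timestamp": time.time(),
    }
    return json.dumps(payload).encode("utf-8")

msg = build_tracking_message("PKG-42", "FRA")
decoded = json.loads(msg)
print(decoded["package_id"])  # -> PKG-42
```

Downstream, BigQuery can then order and window the historical data on `event_timestamp` regardless of delivery delays or out-of-order arrival.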





