Free Google Cloud Certified Professional Data Engineer Exam Braindumps (page: 32)

Page 32 of 68

To give a user read permission for only the first three columns of a table, which access control method would you use?

  A. Primitive role
  B. Predefined role
  C. Authorized view
  D. It's not possible to give access to only the first three columns of a table.

Answer(s): C

Explanation:

An authorized view lets you share query results with particular users and groups without giving them read access to the underlying tables. The view must be created in a dataset separate from the one that contains the tables it queries.
When you create an authorized view, you use the view's SQL query to restrict access to only the rows and columns you want the users to see, such as only the first three columns of a table.
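For illustration, here is a minimal sketch of this pattern using the google-cloud-bigquery Python client; the project, dataset, table, and column names are hypothetical. The view selects only three columns of the source table, lives in a separate dataset, and is then added to the source dataset's access list as an authorized view. Users would then be granted read access on the view's dataset only, not on the source dataset.

    from google.cloud import bigquery

    client = bigquery.Client()

    # Define the view in a dataset separate from the source tables
    # (hypothetical project/dataset/table names).
    view = bigquery.Table("my-project.shared_views.orders_limited")
    view.view_query = """
        SELECT order_id, customer_id, order_date
        FROM `my-project.private_data.orders`
    """
    view = client.create_table(view)

    # Authorize the view to read the source dataset on behalf of its users.
    source_dataset = client.get_dataset("my-project.private_data")
    entries = list(source_dataset.access_entries)
    entries.append(
        bigquery.AccessEntry(None, "view", view.reference.to_api_repr())
    )
    source_dataset.access_entries = entries
    client.update_dataset(source_dataset, ["access_entries"])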


Reference:

https://cloud.google.com/bigquery/docs/views#authorized-views



Which of the following is NOT a valid use case to select HDD (hard disk drives) as the storage for Google Cloud Bigtable?

  A. You expect to store at least 10 TB of data.
  B. You will mostly run batch workloads with scans and writes, rather than frequently executing random reads of a small number of rows.
  C. You need to integrate with Google BigQuery.
  D. You will not use the data to back a user-facing or latency-sensitive application.

Answer(s): C

Explanation:

BigQuery integration has no bearing on the choice between SSD and HDD storage, so it is not a valid reason to select HDD. HDD storage is appropriate when you will store large amounts of data (at least 10 TB), run mostly batch workloads dominated by scans and writes, and do not need to serve a latency-sensitive, user-facing application. For example, if you plan to store extensive historical data for a large number of remote-sensing devices and then use the data to generate daily reports, the cost savings of HDD storage may justify the performance tradeoff. On the other hand, if you plan to use the data to drive a real-time dashboard, HDD storage probably would not make sense: reads would be much more frequent, and reads are much slower on HDD storage.
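As an illustration of where this choice is made, here is a minimal sketch using the google-cloud-bigtable Python client; the project, instance, cluster, and zone names are hypothetical. The storage type is set per cluster when the instance is created.

    from google.cloud import bigtable
    from google.cloud.bigtable import enums

    client = bigtable.Client(project="my-project", admin=True)
    instance = client.instance("sensor-archive", display_name="Sensor archive")

    # HDD is selected per cluster at creation time (hypothetical IDs and zone).
    cluster = instance.cluster(
        "sensor-archive-c1",
        location_id="us-central1-b",
        serve_nodes=3,
        default_storage_type=enums.StorageType.HDD,
    )
    operation = instance.create(clusters=[cluster])
    operation.result(timeout=300)  # wait for the instance to be ready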


Reference:

https://cloud.google.com/bigtable/docs/choosing-ssd-hdd



Cloud Dataproc is a managed Apache Hadoop and Apache _____ service.

  A. Blaze
  B. Spark
  C. Fire
  D. Ignite

Answer(s): B

Explanation:

Cloud Dataproc is a managed Apache Spark and Apache Hadoop service that lets you use open source data tools for batch processing, querying, streaming, and machine learning.
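As an illustration of the service in use, here is a minimal sketch that submits a PySpark batch job to an existing Dataproc cluster with the google-cloud-dataproc Python client; the project, region, cluster name, and Cloud Storage paths are hypothetical.

    from google.cloud import dataproc_v1

    region = "us-central1"
    job_client = dataproc_v1.JobControllerClient(
        client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
    )

    # Submit a PySpark job to a hypothetical existing cluster.
    job = {
        "placement": {"cluster_name": "example-cluster"},
        "pyspark_job": {"main_python_file_uri": "gs://my-bucket/wordcount.py"},
    }
    operation = job_client.submit_job_as_operation(
        request={"project_id": "my-project", "region": region, "job": job}
    )
    result = operation.result()  # blocks until the job finishes
    print(result.driver_output_resource_uri)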


Reference:

https://cloud.google.com/dataproc/docs/



The Dataflow SDKs have recently been transitioned into which Apache project?

  A. Apache Spark
  B. Apache Hadoop
  C. Apache Kafka
  D. Apache Beam

Answer(s): D

Explanation:

The Dataflow SDKs were donated to the Apache Software Foundation and have been transitioned into the Apache Beam project. Pipelines are now written with the Apache Beam SDKs, and Cloud Dataflow acts as a managed runner that executes them.
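To make the relationship concrete, here is a minimal word-count sketch written with the Apache Beam Python SDK; the project ID and Cloud Storage paths are hypothetical. The same pipeline code runs locally on the DirectRunner or on Cloud Dataflow simply by changing the runner option.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        runner="DataflowRunner",        # use "DirectRunner" for local testing
        project="my-project",           # hypothetical project ID
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "Read" >> beam.io.ReadFromText("gs://my-bucket/input.txt")
            | "Split" >> beam.FlatMap(lambda line: line.split())
            | "PairWithOne" >> beam.Map(lambda word: (word, 1))
            | "Count" >> beam.CombinePerKey(sum)
            | "Format" >> beam.Map(lambda kv: f"{kv[0]}: {kv[1]}")
            | "Write" >> beam.io.WriteToText("gs://my-bucket/counts")
        )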


Reference:

https://cloud.google.com/dataflow/docs/






Post your comments and discuss the Google Cloud Certified Professional Data Engineer exam with other community members:

Guru Gee commented on March 23, 2024
Got to buy the full version to get all the answers. But questions are real and I cleared this paper.
INDIA
upvote

Mahitha Govindu commented on February 08, 2024
useful content
Anonymous
upvote

hello commented on October 31, 2023
good content
Anonymous
upvote

p das commented on December 07, 2023
very good questions
UNITED STATES
upvote

Simon Mukabana commented on November 09, 2023
Did the exam on 3rd and almost all questions came from this dump.
Anonymous
upvote

Swati commented on September 11, 2023
Really helpful
Anonymous
upvote