Free Professional Data Engineer Exam Braindumps (page: 30)


Which of these numbers are adjusted by a neural network as it learns from a training dataset (select 2 answers)?

  A. Weights
  B. Biases
  C. Continuous features
  D. Input values

Answer(s): A,B

Explanation:

A neural network is a simple mechanism that's implemented with basic math. The only difference between the traditional programming model and a neural network is that you let the computer determine the parameters (the weights and biases) by learning from training datasets.
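As a minimal illustration (not taken from the referenced article), the toy snippet below fits a single neuron to y = 2x + 1 with gradient descent. The weight and bias are the only numbers adjusted during training; the input values and features stay fixed.

```python
import numpy as np

# Toy training set for y = 2x + 1
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])

w, b = 0.0, 0.0   # parameters the network learns
lr = 0.05         # learning rate

for _ in range(500):
    pred = w * x + b
    error = pred - y
    # Gradients of the mean squared error with respect to w and b
    grad_w = 2 * np.mean(error * x)
    grad_b = 2 * np.mean(error)
    w -= lr * grad_w   # the weight is adjusted
    b -= lr * grad_b   # the bias is adjusted

print(round(w, 2), round(b, 2))   # approaches 2.0 and 1.0
```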


Reference:

https://cloud.google.com/blog/big-data/2016/07/understanding-neural-networks-with-tensorflow-playground



Google Cloud Bigtable indexes a single value in each row. This value is called the _______.

  A. primary key
  B. unique key
  C. row key
  D. master key

Answer(s): C

Explanation:

Cloud Bigtable is a sparsely populated table that can scale to billions of rows and thousands of columns, allowing you to store terabytes or even petabytes of data. A single value in each row is indexed; this value is known as the row key.
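For illustration, a point lookup by row key with the Python Bigtable client might look like the sketch below; the project, instance, and table names are placeholders, not values from the question.

```python
from google.cloud import bigtable

# Hypothetical project, instance, and table names for illustration.
client = bigtable.Client(project="my-project")
table = client.instance("my-instance").table("my-table")

# The row key is the only indexed value, so point lookups go through it.
row = table.read_row(b"user#12345")
if row is not None:
    for family, columns in row.cells.items():
        for qualifier, cells in columns.items():
            print(family, qualifier.decode(), cells[0].value)
```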


Reference:

https://cloud.google.com/bigtable/docs/overview



Cloud Dataproc charges you only for what you really use with _____ billing.

  A. month-by-month
  B. minute-by-minute
  C. week-by-week
  D. hour-by-hour

Answer(s): B

Explanation:

One of the advantages of Cloud Dataproc is its low cost. Dataproc charges for what you really use with minute-by-minute billing and a low, ten-minute-minimum billing period.
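As a rough worked example of minute-by-minute billing with a ten-minute minimum, the sketch below computes a cluster charge; the per-vCPU rate is assumed for illustration only and is not quoted from the pricing page.

```python
# Assumed rate for illustration only (not from the Dataproc pricing page).
RATE_PER_VCPU_HOUR = 0.010

def dataproc_charge(vcpus, minutes_used):
    # Minute-by-minute billing with a ten-minute minimum, per the explanation above.
    billable_minutes = max(minutes_used, 10)
    return vcpus * (billable_minutes / 60) * RATE_PER_VCPU_HOUR

print(dataproc_charge(vcpus=32, minutes_used=7))    # billed for 10 minutes
print(dataproc_charge(vcpus=32, minutes_used=43))   # billed for 43 minutes
```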


Reference:

https://cloud.google.com/dataproc/docs/concepts/overview



What is the recommended action to do in order to switch between SSD and HDD storage for your Google Cloud Bigtable instance?

  A. create a third instance and sync the data from the two storage types via batch jobs
  B. export the data from the existing instance and import the data into a new instance
  C. run parallel instances where one is HDD and the other is SSD
  D. the selection is final and you must continue using the same storage type

Answer(s): B

Explanation:

When you create a Cloud Bigtable instance and cluster, your choice of SSD or HDD storage for the cluster is permanent. You cannot use the Google Cloud Platform Console to change the type of storage that is used for the cluster. If you need to convert an existing HDD cluster to SSD, or vice versa, you can export the data from the existing instance and import the data into a new instance. Alternatively, you can write a Cloud Dataflow or Hadoop MapReduce job that copies the data from one instance to another.
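A minimal sketch of the copy approach, using the Python Bigtable client with hypothetical project and instance names, is shown below. It is a single-process loop for illustration; for large tables the Dataflow or Hadoop MapReduce jobs mentioned above are the more appropriate route.

```python
from google.cloud import bigtable

# Hypothetical project and instance names for illustration.
client = bigtable.Client(project="my-project", admin=True)
src_table = client.instance("hdd-instance").table("my-table")
dst_table = client.instance("ssd-instance").table("my-table")

batch = []
for row in src_table.read_rows():              # stream every row from the source
    new_row = dst_table.direct_row(row.row_key)
    for family, columns in row.cells.items():
        for qualifier, cells in columns.items():
            for cell in cells:
                new_row.set_cell(family, qualifier, cell.value,
                                 timestamp=cell.timestamp)
    batch.append(new_row)
    if len(batch) >= 100:                       # flush in small batches
        dst_table.mutate_rows(batch)
        batch = []
if batch:
    dst_table.mutate_rows(batch)
```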


Reference:

https://cloud.google.com/bigtable/docs/choosing-ssd-hdd





