Free AWS-Certified-Big-Data-Specialty Exam Braindumps

The job management API for AWS Snowball is a network protocol based on HTTP that uses a(n) _______ model.

  1. RPC
  2. MPI
  3. publish/subscribe
  4. RMI

Answer(s): A

Explanation:

The job management API for AWS Snowball is a network protocol based on HTTP. It uses JSON (RFC 4627) documents for HTTP request/response bodies and follows an RPC model, in which there is a fixed set of operations and the syntax for each operation is known to clients without any prior interaction.
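
To illustrate the RPC model, here is a minimal Python sketch using boto3 (the region is an example; AWS credentials and at least one Snowball job in the account are assumed). Each SDK method maps to one of the API's fixed operations, with JSON request and response bodies.

  import boto3

  # Hypothetical region; each client method corresponds to one fixed
  # RPC operation of the Snowball job management API.
  snowball = boto3.client("snowball", region_name="us-east-1")

  # ListJobs operation: JSON request body, JSON response body.
  jobs = snowball.list_jobs()
  for entry in jobs["JobListEntries"]:
      # DescribeJob operation, called once per job ID.
      detail = snowball.describe_job(JobId=entry["JobId"])
      print(entry["JobId"], detail["JobMetadata"]["JobState"])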


Reference:

http://docs.aws.amazon.com/snowball/latest/api-reference/api-reference.html



Which statements are true about re-sharding in Amazon Kinesis?

  1. The shard or pair of shards that result from the re-sharding operation are referred to as child shards.
  2. When you re-shard, data records that were flowing to the parent shards are rerouted to flow to the child shards based on the hash key values that the data record partition keys map to.
  3. The shard or pair of shards that the re-sharding operation acts on are referred to as parent shards.
  4. After you call a re-sharding operation, you do not need to wait for the stream to become active again.

Answer(s): A,B,C

Explanation:

Kinesis Streams supports re-sharding which enables you to adjust the number of shards in your stream in order to adapt to changes in the rate of data flow through the stream.
The shard or pair of shards that the re-sharding operation acts on are referred to as parent shards. The shard or pair of shards that result from the re-sharding operation are referred to as child shards. After you call a re-sharding operation, you need to wait for the stream to become active again.
When you re-shard, data records that were flowing to the parent shards are rerouted to flow to the child shards based on the hash key values that the data record partition keys map to.
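
As a minimal sketch of one re-sharding operation (a shard split) with boto3, assuming a hypothetical stream named "example-stream" with at least one open shard: the parent shard's hash key range is split at its midpoint into two child shards, and the code then waits for the stream to become active again before doing anything else.

  import boto3

  kinesis = boto3.client("kinesis", region_name="us-east-1")  # example region

  # Look up the parent shard and its hash key range.
  stream = kinesis.describe_stream(StreamName="example-stream")
  parent = stream["StreamDescription"]["Shards"][0]
  low = int(parent["HashKeyRange"]["StartingHashKey"])
  high = int(parent["HashKeyRange"]["EndingHashKey"])

  # Split the parent at the midpoint of its hash key range; records are
  # then rerouted to whichever child shard covers the hash of their
  # partition key.
  kinesis.split_shard(
      StreamName="example-stream",
      ShardToSplit=parent["ShardId"],
      NewStartingHashKey=str((low + high) // 2),
  )

  # Wait for the stream to return to ACTIVE before further re-sharding.
  kinesis.get_waiter("stream_exists").wait(StreamName="example-stream")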


Reference:

http://docs.aws.amazon.com/streams/latest/dev/working-with-kinesis.html



In AWS Data Pipeline, an activity is (choose one)

  1. A pipeline component that defines the work to perform
  2. The database schema of the pipeline data
  3. A set of scripts loaded at run time
  4. A read/write event from the primary database

Answer(s): A

Explanation:

In AWS Data Pipeline, an activity is a pipeline component that defines the work to perform.

Reference:

http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-managingpipeline.html
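
For illustration, here is a sketch of a pipeline definition fragment, written as a Python dict in the pipeline-definition JSON style (the IDs, command, and schedule reference are hypothetical). The ShellCommandActivity object is the activity, i.e. the component that defines the work to perform.

  # Hypothetical pipeline definition fragment.
  activity_definition = {
      "objects": [
          {
              "id": "MyShellActivity",           # hypothetical identifier
              "type": "ShellCommandActivity",    # the activity type
              "command": "echo hello",           # the work to perform
              "schedule": {"ref": "MySchedule"}  # when to perform it
          }
      ]
  }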



All AWS Data Pipeline schedules must have (choose two)

  1. an execution time
  2. a start date
  3. a frequency
  4. an end date

Answer(s): B,C
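
As a companion sketch in the same definition style (identifiers and dates are hypothetical), here is a Schedule object showing the two required fields: a start date (startDateTime) and a frequency (period). An end date (endDateTime) is optional.

  # Hypothetical Schedule object for a pipeline definition.
  schedule_definition = {
      "objects": [
          {
              "id": "MySchedule",
              "type": "Schedule",
              "startDateTime": "2024-01-01T00:00:00",  # required start date
              "period": "1 day"                        # required frequency
          }
      ]
  }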





