Free AWS-Certified-Big-Data-Specialty Exam Braindumps

When importing data from your facility to AWS, what are the schemas supported by Snowball?

  1. Locally mounted storage (e.g., C:\) for the data source, and Amazon S3 (s3://) or HDFS (hdfs://) for the destination
  2. Locally mounted storage (e.g., C:\) for the data source and Amazon S3 (s3://) for the destination
  3. HDFS (hdfs://) for the data source and Amazon S3 (s3://) for the destination
  4. Locally mounted storage (e.g., C:\) or HDFS (hdfs://) for the data source, and Amazon S3 (s3://) for the destination

Answer(s): D

Explanation:

The Snowball client uses schemas to define what kind of data is transferred between the client's data center and a Snowball. The schemas are declared when a command is issued. Currently, Snowball supports the following schemas: locally mounted storage (e.g., C:\) or HDFS (hdfs://) for the data source, and Amazon S3 (s3://) for the destination.
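
The distinction is only in the URI scheme on each side of the copy. As an illustration only (this helper is hypothetical, not part of the Snowball client or any AWS SDK), a small Python check of a (source, destination) pair against these schemas might look like:

  from urllib.parse import urlparse

  # Hypothetical helper (not part of the Snowball client or any AWS SDK):
  # accepts local paths or hdfs:// as the source, and only s3:// as the destination.
  SOURCE_SCHEMES = {"", "hdfs"}   # "" covers locally mounted paths such as /mnt/data
  DEST_SCHEMES = {"s3"}

  def is_valid_import_pair(source, destination):
      src = urlparse(source).scheme.lower()
      dst = urlparse(destination).scheme.lower()
      if len(src) == 1:           # Windows drive letters (C:\...) parse as a one-letter scheme
          src = ""
      return src in SOURCE_SCHEMES and dst in DEST_SCHEMES

  print(is_valid_import_pair("C:\\data", "s3://my-bucket/prefix"))           # True
  print(is_valid_import_pair("hdfs://namenode:8020/logs", "s3://my-bucket")) # True
  print(is_valid_import_pair("s3://my-bucket", "hdfs://namenode:8020/out"))  # False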


Reference:

http://docs.aws.amazon.com/snowball/latest/ug/using-client.html



Which of the following is NOT a standard activity in AWS Data Pipeline?

  1. SnsAlarm activity
  2. ShellCommandActivity
  3. HiveActivity
  4. EmrActivity

Answer(s): A

Explanation:

In AWS Data Pipeline, an activity is a pipeline component that defines the work to perform. AWS Data Pipeline provides several pre-packaged activities that accommodate common scenarios, such as moving data from one location to another, running Hive queries, and so on. Activities are extensible, so you can run your own custom scripts to support endless combinations.
AWS Data Pipeline supports the following types of activities:
- CopyActivity: Copies data from one location to another.
- EmrActivity: Runs an Amazon EMR cluster.
- HiveActivity: Runs a Hive query on an Amazon EMR cluster.
- HiveCopyActivity: Runs a Hive query on an Amazon EMR cluster with support for advanced data filtering and for S3DataNode and DynamoDBDataNode.
- PigActivity: Runs a Pig script on an Amazon EMR cluster.
- RedshiftCopyActivity: Copies data to and from Amazon Redshift tables.
- ShellCommandActivity: Runs a custom UNIX/Linux shell command as an activity.
- SqlActivity: Runs a SQL query on a database.
In contrast, SnsAlarm is an action that sends an Amazon SNS notification; it is referenced from an activity's onFail or onSuccess field rather than being an activity itself, as sketched below.
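
A minimal boto3 sketch (all names, roles, and the topic ARN are placeholders) that registers a pipeline containing a real ShellCommandActivity whose onFail field references an SnsAlarm; the SnsAlarm appears in the definition as a notification action, not as an activity:

  import boto3

  # Minimal sketch with placeholder names; assumes the default Data Pipeline IAM roles exist.
  dp = boto3.client("datapipeline", region_name="us-east-1")

  pipeline_id = dp.create_pipeline(name="demo-pipeline", uniqueId="demo-pipeline-001")["pipelineId"]

  objects = [
      {"id": "Default", "name": "Default", "fields": [
          {"key": "scheduleType", "stringValue": "ondemand"},
          {"key": "role", "stringValue": "DataPipelineDefaultRole"},                  # assumed role name
          {"key": "resourceRole", "stringValue": "DataPipelineDefaultResourceRole"},  # assumed role name
      ]},
      # ShellCommandActivity is a standard activity type:
      {"id": "EchoActivity", "name": "EchoActivity", "fields": [
          {"key": "type", "stringValue": "ShellCommandActivity"},
          {"key": "command", "stringValue": "echo hello"},
          {"key": "workerGroup", "stringValue": "demo-workers"},   # placeholder worker group
          {"key": "onFail", "refValue": "FailureAlarm"},
      ]},
      # SnsAlarm is an action referenced from onFail/onSuccess, not an activity:
      {"id": "FailureAlarm", "name": "FailureAlarm", "fields": [
          {"key": "type", "stringValue": "SnsAlarm"},
          {"key": "topicArn", "stringValue": "arn:aws:sns:us-east-1:111122223333:demo-topic"},  # placeholder
          {"key": "subject", "stringValue": "Activity failed"},
          {"key": "message", "stringValue": "EchoActivity failed; see the pipeline console."},
          {"key": "role", "stringValue": "DataPipelineDefaultRole"},
      ]},
  ]

  print(dp.put_pipeline_definition(pipelineId=pipeline_id, pipelineObjects=objects))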


Reference:

http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-concepts-activities.html



In AWS Snowball, the ______ value is a 29-character code used to decrypt the manifest file.

  1. UnlockCode
  2. IAM code
  3. IAM private key
  4. KeyCode

Answer(s): A

Explanation:

In AWS Snowball, the UnlockCode value is a 29-character code with 25 alphanumeric characters and 4 hyphens used to decrypt the manifest file. As a best practice, AWS recommends that you don't save a copy of the UnlockCode in the same location as the manifest file for that job.
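
The unlock code can be retrieved with the GetJobUnlockCode API. A minimal boto3 sketch, where the JobId is a placeholder in Snowball's JID format:

  import boto3

  # Minimal sketch; the JobId below is a placeholder.
  snowball = boto3.client("snowball", region_name="us-east-1")

  resp = snowball.get_job_unlock_code(JobId="JID123e4567-e89b-12d3-a456-426655440000")
  unlock_code = resp["UnlockCode"]

  # 29 characters total: 25 alphanumerics plus 4 hyphens. Per the best practice
  # above, don't store this alongside the manifest file for the same job.
  print(len(unlock_code), unlock_code.count("-"))   # expected: 29 4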


Reference:

http://docs.aws.amazon.com/snowball/latest/apireference/API_GetJobUnlockCode.html



Which statements are true of sequence numbers in Amazon Kinesis? (choose three)

  1. Sequence numbers are assigned by Amazon Kinesis when a data producer calls the PutRecords operation to add data to an Amazon Kinesis stream.
  2. A data pipeline is a group of data records in a stream.
  3. The longer the time period between PutRecord or PutRecords requests, the larger the sequence number becomes.
  4. Sequence numbers are assigned by Amazon Kinesis when a data producer calls the PutRecord operation to add data to an Amazon Kinesis stream.

Answer(s): A,C,D

Explanation:

Sequence numbers in Amazon Kinesis are assigned by Amazon Kinesis when a data producer calls the PutRecord or PutRecords operation to add data to an Amazon Kinesis stream. Sequence numbers for the same partition key generally increase over time: the longer the time period between PutRecord or PutRecords requests, the larger the sequence numbers become.
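
A minimal boto3 sketch (stream name and partition key are placeholders, and the stream is assumed to already exist) showing that the sequence number is assigned by the service and returned from PutRecord:

  import boto3

  # Minimal sketch; "demo-stream" and "sensor-42" are placeholders.
  kinesis = boto3.client("kinesis", region_name="us-east-1")

  first = kinesis.put_record(StreamName="demo-stream", Data=b"payload-1", PartitionKey="sensor-42")
  second = kinesis.put_record(StreamName="demo-stream", Data=b"payload-2", PartitionKey="sensor-42")

  # Kinesis assigns sequence numbers server-side; for the same partition key,
  # later writes generally receive larger sequence numbers.
  print(first["SequenceNumber"], second["SequenceNumber"])
  assert int(second["SequenceNumber"]) > int(first["SequenceNumber"])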


Reference:

http://docs.aws.amazon.com/streams/latest/dev/working-with-kinesis.html





