When importing data from your facility to AWS, what are the schemas supported by Snowball?
Answer(s): D
The Snowball client uses schemas to define what kind of data is transferred between the client's data center and a Snowball. The schemas are declared when a command is issued. Currently, Snowball supports the following schemas: locally mounted storage (for example, C:\) or HDFS (hdfs://) as the data source, and Amazon S3 (s3://) as the destination.
http://docs.aws.amazon.com/snowball/latest/ug/using-client.html
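As a rough illustration only (not taken from the referenced documentation), the sketch below checks whether a source/destination pair uses the schemas described above; the helper name and the example paths are hypothetical.

```python
# Hypothetical helper illustrating the schemas the Snowball client accepts:
# locally mounted storage or hdfs:// as the source, s3:// as the destination.
import re

def is_supported_transfer(source: str, destination: str) -> bool:
    # Source: a local path (e.g. "C:\\data" or "/mnt/data") or an hdfs:// URI.
    local_or_hdfs = (
        source.startswith("hdfs://")
        or re.match(r"^[A-Za-z]:\\", source) is not None  # Windows drive path
        or source.startswith("/")                          # POSIX mount point
    )
    # Destination: must be an s3:// URI.
    return local_or_hdfs and destination.startswith("s3://")

# Example (hypothetical paths):
print(is_supported_transfer("hdfs://namenode:8020/logs", "s3://my-bucket/logs"))  # True
print(is_supported_transfer("s3://my-bucket/logs", "/mnt/data"))                  # False
```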
Which of the following is NOT a standard activity in AWS Data Pipeline?
Answer(s): A
In AWS Data Pipeline, an activity is a pipeline component that defines the work to perform. AWS Data Pipeline provides several pre-packaged activities that accommodate common scenarios, such as moving data from one location to another, running Hive queries, and so on. Activities are extensible, so you can run your own custom scripts to support endless combinations. AWS Data Pipeline supports the following types of activities:
- CopyActivity: Copies data from one location to another.
- EmrActivity: Runs an Amazon EMR cluster.
- HiveActivity: Runs a Hive query on an Amazon EMR cluster.
- HiveCopyActivity: Runs a Hive query on an Amazon EMR cluster with support for advanced data filtering and support for S3DataNode and DynamoDBDataNode.
- PigActivity: Runs a Pig script on an Amazon EMR cluster.
- RedshiftCopyActivity: Copies data to and from Amazon Redshift tables.
- ShellCommandActivity: Runs a custom UNIX/Linux shell command as an activity.
- SqlActivity: Runs a SQL query on a database.
http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-concepts-activities.html
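As a hedged sketch of how one of these activity types appears in a pipeline definition: the pipeline name, IDs, schedule, and shell command below are made up, and a real definition would also need a workerGroup or runsOn resource.

```python
# Minimal sketch: registering a pipeline with a single ShellCommandActivity.
import boto3

dp = boto3.client("datapipeline")

pipeline = dp.create_pipeline(name="example-pipeline", uniqueId="example-pipeline-1")

dp.put_pipeline_definition(
    pipelineId=pipeline["pipelineId"],
    pipelineObjects=[
        {
            "id": "Default",
            "name": "Default",
            "fields": [
                {"key": "scheduleType", "stringValue": "ondemand"},
            ],
        },
        {
            "id": "EchoActivity",
            "name": "EchoActivity",
            "fields": [
                # "type" selects one of the pre-packaged activities listed above.
                {"key": "type", "stringValue": "ShellCommandActivity"},
                {"key": "command", "stringValue": "echo hello"},
            ],
        },
    ],
)
```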
In AWS Snowball, the ______ value is a 29-character code used to decrypt the manifest file.
In AWS Snowball, the UnlockCode value is a 29-character code with 25 alphanumeric characters and 4 hyphens used to decrypt the manifest file. As a best practice, AWS recommends that you don't save a copy of the UnlockCode in the same location as the manifest file for that job.
http://docs.aws.amazon.com/snowball/latest/apireference/API_GetJobUnlockCode.html
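A brief boto3 sketch of the GetJobUnlockCode call described above; the job ID is a placeholder, and the same caution applies in code: don't persist the code next to the manifest.

```python
# Sketch: retrieving the 29-character unlock code for a Snowball job via boto3.
# The job ID below is a placeholder, not a real job.
import boto3

snowball = boto3.client("snowball")

response = snowball.get_job_unlock_code(JobId="JID123e4567-e89b-12d3-a456-426655440000")
unlock_code = response["UnlockCode"]  # 25 alphanumeric characters plus 4 hyphens

# Best practice: use the code to unlock the device, but don't store it alongside
# the manifest file for the same job.
print(len(unlock_code))  # expected: 29
```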
Which statements are true of sequence numbers in Amazon Kinesis? (choose three)
Answer(s): A,C,D
Sequence numbers in Amazon Kinesis are assigned by Amazon Kinesis when a data producer calls the PutRecord operation to add data to an Amazon Kinesis stream. Sequence numbers are also assigned when a data producer calls the PutRecords operation. Sequence numbers for the same partition key generally increase over time: the longer the time period between PutRecord or PutRecords requests, the larger the sequence numbers become.
http://docs.aws.amazon.com/streams/latest/dev/working-with-kinesis.html
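As a hedged illustration (the stream name, partition key, and payloads are made up), the following shows how the sequence number comes back from each PutRecord call and generally grows for the same partition key over time.

```python
# Sketch: sequence numbers are returned by Kinesis on each PutRecord call.
import time
import boto3

kinesis = boto3.client("kinesis")

sequence_numbers = []
for i in range(3):
    response = kinesis.put_record(
        StreamName="example-stream",
        Data=f"event-{i}".encode("utf-8"),
        PartitionKey="device-42",
    )
    sequence_numbers.append(int(response["SequenceNumber"]))
    time.sleep(1)  # later requests generally get larger sequence numbers

# For the same partition key, sequence numbers generally increase over time.
print(sequence_numbers == sorted(sequence_numbers))  # expected: True
```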