Free AWS-Certified-Big-Data-Specialty Exam Braindumps (page: 9)


What does it mean if an AWS Snowball action returns a “ServiceUnavailable” error?

  1. The request has failed due to a temporary failure of the server.
  2. The action or operation requested is invalid.
  3. The request processing has failed because of an unknown error, exception or failure.
  4. The request signature does not conform to AWS standards.

Answer(s): A

Explanation:

A ServiceUnavailable error is returned when the request has failed due to a temporary failure of the server. By contrast: an IncompleteSignature error is returned when the request signature does not conform to AWS standards; an InternalFailure error is returned when the request processing has failed because of an unknown error, exception, or failure; and an InvalidAction error is returned when the action or operation requested is invalid.


Reference:

http://docs.aws.amazon.com/snowball/latest/api-reference/CommonErrors.html
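Because ServiceUnavailable signals a temporary server-side failure, the usual client response is to wait and retry with backoff rather than fail immediately. A minimal sketch of that pattern (generic retry logic only; `ServiceUnavailableError` and `flaky_call` are hypothetical stand-ins, not the real Snowball client):

```python
import time

class ServiceUnavailableError(Exception):
    """Stand-in for the ServiceUnavailable error an AWS API can return."""

def with_retries(api_call, max_attempts=4, base_delay=0.1):
    """Retry a call on ServiceUnavailable with exponential backoff.

    A temporary server failure is expected to clear on its own, so the
    client sleeps a growing interval between attempts and re-raises only
    after the final attempt fails.
    """
    for attempt in range(max_attempts):
        try:
            return api_call()
        except ServiceUnavailableError:
            if attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Example: a call that fails twice before succeeding.
attempts = {"n": 0}
def flaky_call():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ServiceUnavailableError("temporary server failure")
    return "ok"

result = with_retries(flaky_call)
print(result)  # "ok", reached on the third attempt
```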



In AWS Snowball, what happens if you try to transfer an object with a trailing slash in its name?

  1. If the name update causes a conflict (e.g., if an object with the target new name already exists), then the job fails.
  2. It is not transferred during the import/export process.
  3. Its name is auto-updated before transfer: any trailing slash is replaced by a dash.
  4. It is treated as a directory and the job fails if it turns out to be a regular file.

Answer(s): B

Explanation:

In AWS Snowball, objects with trailing slashes in their names (/ or \) are not transferred. Before exporting any objects with trailing slashes, you should update their names and remove the slash.


Reference:

http://docs.aws.amazon.com/snowball/latest/ug/create-export-job.html
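Since objects whose names end in a trailing slash are silently skipped, it is worth flagging them before creating an export job. A minimal pre-flight check, assuming the key names have already been listed (pure string logic; fetching keys from S3 is left out):

```python
def find_skipped_keys(keys):
    """Return the keys Snowball would not transfer: those ending in '/' or '\\'."""
    return [k for k in keys if k.endswith("/") or k.endswith("\\")]

keys = ["Photos/Cats/tabby.jpg", "Photos/Cats/", "Logs\\2017\\"]
skipped = find_skipped_keys(keys)
print(skipped)  # ['Photos/Cats/', 'Logs\\2017\\']
```

Any key reported here should be renamed (the trailing slash removed) before the export job is created.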



What does the following command accomplish?
snowball cp -n hdfs://localhost:9000/ImportantPhotos/Cats s3://MyBucket/Photos/Cats

  1. It imports data from a HDFS cluster to a Snowball.
  2. It exports data from S3 to a HDFS cluster.
  3. It imports data from S3 to a Snowball.
  4. It exports data from a HDFS cluster to S3.

Answer(s): A

Explanation:

To transfer file data from a Hadoop Distributed File System (HDFS) to a Snowball, you specify the Name node URI as the source schema, which has the hdfs://hostname:port format. For example: snowball cp -n hdfs://localhost:9000/ImportantPhotos/Cats s3://MyBucket/Photos/Cats


Reference:

http://docs.aws.amazon.com/snowball/latest/ug/using-client.html
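The direction of the copy follows from the URI schemes on each side: copying from `hdfs://` to `s3://` stages data onto the Snowball for import into S3, while copying from `s3://` goes the other way. A minimal sketch of that rule (illustrative only, not part of the Snowball client itself):

```python
from urllib.parse import urlparse

def transfer_direction(source, destination):
    """Classify a 'snowball cp' as an import or export from the URI schemes.

    Copying toward an s3:// address stages data for import into S3;
    copying from an s3:// address is an export.
    """
    src = urlparse(source).scheme
    dst = urlparse(destination).scheme
    if dst == "s3":
        return "import"
    if src == "s3":
        return "export"
    raise ValueError("one side of the copy must be an s3:// address")

direction = transfer_direction(
    "hdfs://localhost:9000/ImportantPhotos/Cats",
    "s3://MyBucket/Photos/Cats",
)
print(direction)  # "import"
```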



Which of the statements below are correct for Amazon Kinesis streams? (choose three)

  1. A record is composed of a sequence number and data blob
  2. A record is the unit of data stored in the Amazon Kinesis Stream
  3. A record is composed of a sequence number, partition key, and data blob.
  4. Each record in the stream has a sequence number that is assigned by Kinesis Streams.

Answer(s): B,C,D

Explanation:

With Amazon Kinesis streams:
A record is the unit of data stored in the Amazon Kinesis stream.
A record is composed of a sequence number, partition key, and data blob.
Each record in the stream has a sequence number that is assigned by Kinesis Streams.


Reference:

http://docs.aws.amazon.com/streams/latest/dev/working-with-kinesis.html
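The three correct statements can be pictured as a simple data structure: the producer supplies the partition key and the data blob, and Kinesis Streams assigns the sequence number when the record is stored. A minimal illustrative model (not the boto3 API; the field values below are made up):

```python
from dataclasses import dataclass

@dataclass
class KinesisRecord:
    sequence_number: str  # assigned by Kinesis Streams when the record is stored
    partition_key: str    # supplied by the producer; determines the shard
    data: bytes           # the data blob carried by the record

record = KinesisRecord(
    sequence_number="49590338271490256608559692538361571095921575989136588898",
    partition_key="Cats",
    data=b"tabby.jpg",
)
print(record.partition_key, record.data)
```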





