Free DBS-C01 Exam Braindumps (page: 38)


A database specialist must load 25 GB of data files from a company’s on-premises storage to an Amazon Neptune database.
Which approach to load the data is FASTEST?

  A. Upload the data to Amazon S3 and use the Loader command to load the data from Amazon S3 into the Neptune database.
  B. Write a utility to read the data from the on-premises storage and run INSERT statements in a loop to load the data into the Neptune database.
  C. Use the AWS CLI to load the data directly from the on-premises storage into the Neptune database.
  D. Use AWS DataSync to load the data directly from the on-premises storage into the Neptune database.

Answer(s): A


Reference:

https://docs.aws.amazon.com/neptune/latest/userguide/bulk-load.html
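For context, the Neptune bulk loader (option A) is invoked over HTTP against the cluster's loader endpoint after the files have been staged in Amazon S3. A minimal Python sketch is shown below; the cluster endpoint, S3 URI, IAM role ARN, and Region are placeholder values, and the request must be issued from inside the Neptune cluster's VPC.

```python
# Sketch: start a Neptune bulk load from S3 (placeholder endpoint, bucket, role).
import requests

NEPTUNE_ENDPOINT = "https://my-cluster.cluster-abc123.us-east-1.neptune.amazonaws.com:8182"

payload = {
    "source": "s3://my-bucket/neptune-load-data/",   # data files uploaded from on premises
    "format": "csv",                                  # Gremlin CSV; other formats are supported
    "iamRoleArn": "arn:aws:iam::123456789012:role/NeptuneLoadFromS3",
    "region": "us-east-1",
    "failOnError": "FALSE",
    "parallelism": "OVERSUBSCRIBE",
}

# The loader runs asynchronously; the response contains a loadId to poll for status.
resp = requests.post(f"{NEPTUNE_ENDPOINT}/loader", json=payload)
resp.raise_for_status()
load_id = resp.json()["payload"]["loadId"]

# Check progress of the load job.
status = requests.get(f"{NEPTUNE_ENDPOINT}/loader/{load_id}").json()
print(status["payload"]["overallStatus"]["status"])
```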



A finance company needs to make sure that its MySQL database backups are available for the most recent 90 days. All of the MySQL databases are hosted on Amazon RDS for MySQL DB instances. A database specialist must implement a solution that meets the backup retention requirement with the least possible development effort.
Which approach should the database specialist take?

  A. Use AWS Backup to build a backup plan for the required retention period. Assign the DB instances to the backup plan.
  B. Modify the DB instances to enable the automated backup option. Select the required backup retention period.
  C. Automate a daily cron job on an Amazon EC2 instance to create MySQL dumps, transfer the dumps to Amazon S3, and implement an S3 Lifecycle policy to meet the retention requirement.
  D. Use AWS Lambda to schedule a daily manual snapshot of the DB instances. Delete snapshots that exceed the retention requirement.

Answer(s): A
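RDS automated backups are capped at 35 days of retention, which is why AWS Backup is needed for 90 days. As a rough illustration of option A, the retention requirement can be expressed in an AWS Backup plan roughly as sketched below; the plan name, vault, schedule, IAM role ARN, and DB instance ARN are placeholders.

```python
# Sketch: AWS Backup plan with 90-day retention for RDS for MySQL instances (placeholder values).
import boto3

backup = boto3.client("backup")

plan = backup.create_backup_plan(
    BackupPlan={
        "BackupPlanName": "mysql-90-day-retention",
        "Rules": [
            {
                "RuleName": "daily-backup",
                "TargetBackupVaultName": "Default",
                "ScheduleExpression": "cron(0 5 * * ? *)",   # daily at 05:00 UTC
                "Lifecycle": {"DeleteAfterDays": 90},        # retention requirement
            }
        ],
    }
)

# Assign the RDS DB instances to the plan (resource ARN and IAM role are placeholders).
backup.create_backup_selection(
    BackupPlanId=plan["BackupPlanId"],
    BackupSelection={
        "SelectionName": "mysql-db-instances",
        "IamRoleArn": "arn:aws:iam::123456789012:role/service-role/AWSBackupDefaultServiceRole",
        "Resources": ["arn:aws:rds:us-east-1:123456789012:db:finance-mysql-prod"],
    },
)
```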



An online advertising company uses an Amazon DynamoDB table as its data store. The table has Amazon DynamoDB Streams enabled and has a global secondary index on one of the keys. The table is encrypted using an AWS Key Management Service (AWS KMS) customer managed key.

The company has decided to expand its operations globally and wants to replicate the database in a different AWS Region by using DynamoDB global tables. Upon review, an administrator notices the following:

- No role with the dynamodb:CreateGlobalTable permission exists in the account.
- An empty table with the same name exists in the new Region where replication is desired.
- A global secondary index with the same partition key but a different sort key exists in the new Region where replication is desired.

Which configurations will block the creation of a global table or the creation of a replica in the new Region? (Choose two.)

  A. A global secondary index with the same partition key but a different sort key exists in the new Region where replication is desired.
  B. An empty table with the same name exists in the Region where replication is desired.
  C. No role with the dynamodb:CreateGlobalTable permission exists in the account.
  D. DynamoDB Streams is enabled for the table.
  E. The table is encrypted using a KMS customer managed key.

Answer(s): A,D
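For context, the dynamodb:CreateGlobalTable permission mentioned in the question maps to the legacy (version 2017.11.29) global tables API, which can be called roughly as in the sketch below; the table name and Regions are placeholders, and the call fails if the replica tables and their indexes do not line up with the source table.

```python
# Sketch: creating a global table across two Regions (legacy 2017.11.29 API, placeholder names).
import boto3

dynamodb = boto3.client("dynamodb", region_name="us-east-1")

# Fails if replica tables/GSIs don't match, or if the caller lacks dynamodb:CreateGlobalTable.
dynamodb.create_global_table(
    GlobalTableName="ad-events",
    ReplicationGroup=[
        {"RegionName": "us-east-1"},
        {"RegionName": "eu-west-1"},
    ],
)
```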



A large automobile company is migrating the database of a critical financial application to Amazon DynamoDB.

The company’s risk and compliance policy requires that every change in the database be recorded as a log entry for audits. The system anticipates more than 500,000 log entries each minute. Log entries should be stored in batches of at least 100,000 records per file, in Apache Parquet format.
How should a database specialist implement these requirements with DynamoDB?

  A. Enable Amazon DynamoDB Streams on the table. Create an AWS Lambda function triggered by the stream. Write the log entries to an Amazon S3 object.
  B. Create a backup plan in AWS Backup to back up the DynamoDB table once a day. Create an AWS Lambda function that restores the backup in another table and compares both tables for changes. Generate the log entries and write them to an Amazon S3 object.
  C. Enable AWS CloudTrail logs on the table. Create an AWS Lambda function that reads the log files once an hour and filters DynamoDB API actions. Write the filtered log files to Amazon S3.
  D. Enable Amazon DynamoDB Streams on the table. Create an AWS Lambda function triggered by the stream. Write the log entries to an Amazon Kinesis Data Firehose delivery stream with buffering and Amazon S3 as the destination.

Answer(s): D
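A minimal sketch of the Lambda function in option D might look like the following; the delivery stream name is a placeholder, and the Firehose buffering hints plus a record format conversion configured on the delivery stream would handle batching the records into Parquet files in Amazon S3.

```python
# Sketch: Lambda triggered by DynamoDB Streams, forwarding change records to Kinesis Data Firehose.
import json
import boto3

firehose = boto3.client("firehose")
DELIVERY_STREAM = "audit-log-delivery"  # placeholder; Parquet conversion is configured on the stream


def handler(event, context):
    # Each stream record describes one item-level change (INSERT/MODIFY/REMOVE).
    records = [
        {"Data": (json.dumps({
            "eventName": r["eventName"],
            "keys": r["dynamodb"].get("Keys"),
            "newImage": r["dynamodb"].get("NewImage"),
            "oldImage": r["dynamodb"].get("OldImage"),
            "timestamp": r["dynamodb"].get("ApproximateCreationDateTime"),
        }, default=str) + "\n").encode("utf-8")}
        for r in event["Records"]
    ]

    # PutRecordBatch accepts up to 500 records per call; Firehose buffers and delivers to S3.
    for i in range(0, len(records), 500):
        firehose.put_record_batch(DeliveryStreamName=DELIVERY_STREAM, Records=records[i:i + 500])
```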






Post your comments and discuss the Amazon DBS-C01 exam with other community members:

Pedro commented on April 27, 2024
Thanks for the dumps. It was an easy pass.
UNITED STATES
upvote

Keran commented on April 26, 2024
All of these questions are in the real exam. I just wrote my test yesterday. This is a valid exam dump.
Anonymous
upvote

Mungara commented on March 14, 2023
Thanks to these exam dumps, I felt confident and passed my exam with ease.
UNITED STATES
upvote

otaku commented on August 11, 2022
Just passed my DBS-C01 exam. This site really helped a lot.
Anonymous
upvote

Erik commented on March 02, 2022
These braindump questions make passing very easy.
UNITED KINGDOM
upvote

Sanjev commented on January 12, 2022
This is the easiest way to get a 90%. Perfect exam dumps.
UNITED STATES
upvote

Abigail commented on September 20, 2021
I know using prep courses is not ethical, but I had to do this as this exam is way too hard to pass on your own. This prep course got me out of trouble.
UNITED STATES
upvote