Free DBS-C01 Exam Braindumps (page: 35)


A company is moving its fraud detection application from on premises to the AWS Cloud and is using Amazon Neptune for data storage. The company has set up a 1 Gbps AWS Direct Connect connection to migrate 25 TB of fraud detection data from the on-premises data center to a Neptune DB instance. The company already has an Amazon S3 bucket and an S3 VPC endpoint, and 80% of the company’s network bandwidth is available.

How should the company perform this data load?

  1. Use an AWS SDK with a multipart upload to transfer the data from on premises to the S3 bucket. Use the Copy command for Neptune to move the data in bulk from the S3 bucket to the Neptune DB instance.
  2. Use AWS Database Migration Service (AWS DMS) to transfer the data from on premises to the S3 bucket. Use the Loader command for Neptune to move the data in bulk from the S3 bucket to the Neptune DB instance.
  3. Use AWS DataSync to transfer the data from on premises to the S3 bucket. Use the Loader command for Neptune to move the data in bulk from the S3 bucket to the Neptune DB instance.
  4. Use the AWS CLI to transfer the data from on premises to the S3 bucket. Use the Copy command for Neptune to move the data in bulk from the S3 bucket to the Neptune DB instance.

Answer(s): C
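
For reference, once AWS DataSync has staged the files in the S3 bucket, the Neptune bulk load is started by POSTing a job to the cluster's loader endpoint, which reads from S3 through the existing S3 VPC endpoint. A minimal Python sketch; the cluster endpoint, bucket, and IAM role ARN below are hypothetical placeholders:

    import requests

    # Hypothetical cluster endpoint -- substitute your own.
    NEPTUNE = "https://my-cluster.cluster-abc123.us-east-1.neptune.amazonaws.com:8182"

    job = {
        "source": "s3://my-fraud-data-bucket/neptune-load/",  # files staged by DataSync
        "format": "csv",              # Gremlin CSV; RDF loads use ntriples, turtle, etc.
        "iamRoleArn": "arn:aws:iam::123456789012:role/NeptuneLoadFromS3",
        "region": "us-east-1",
        "failOnError": "FALSE",
        "parallelism": "OVERSUBSCRIBE",
    }

    resp = requests.post(f"{NEPTUNE}/loader", json=job)
    resp.raise_for_status()
    print(resp.json())  # returns a loadId; poll GET {NEPTUNE}/loader/{loadId} for status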



A company migrated one of its business-critical database workloads to an Amazon Aurora Multi-AZ DB cluster. The company requires a very low RTO and needs to improve the application recovery time after database failovers.

Which approach meets these requirements?

  1. Set the max_connections parameter to 16,000 in the instance-level parameter group.
  2. Modify the client connection timeout to 300 seconds.
  3. Create an Amazon RDS Proxy database proxy and update client connections to point to the proxy endpoint.
  4. Enable the query cache at the instance level.

Answer(s): C
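
As a sketch of option C, the proxy can be created with boto3 and the Aurora cluster registered as its target; applications then connect to the proxy endpoint, which keeps pooled connections open across failovers and shortens recovery time. All names, ARNs, and subnet IDs below are hypothetical:

    import boto3

    rds = boto3.client("rds", region_name="us-east-1")

    # Create the proxy; authentication is delegated to a Secrets Manager secret.
    proxy = rds.create_db_proxy(
        DBProxyName="aurora-app-proxy",
        EngineFamily="MYSQL",
        Auth=[{
            "AuthScheme": "SECRETS",
            "SecretArn": "arn:aws:secretsmanager:us-east-1:123456789012:secret:db-creds",
            "IAMAuth": "DISABLED",
        }],
        RoleArn="arn:aws:iam::123456789012:role/rds-proxy-secrets-role",
        VpcSubnetIds=["subnet-0abc", "subnet-0def"],
    )

    # Register the Aurora cluster as the proxy target.
    rds.register_db_proxy_targets(
        DBProxyName="aurora-app-proxy",
        TargetGroupName="default",
        DBClusterIdentifiers=["my-aurora-cluster"],
    )

    # Applications then connect to proxy["DBProxy"]["Endpoint"] instead of the
    # cluster endpoint, so failovers no longer drop the application's connections.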



A company is using an Amazon RDS for MySQL DB instance for its internal applications. A security audit shows that the DB instance is not encrypted at rest. The company’s application team needs to encrypt the DB instance.

What should the team do to meet this requirement?

  1. Stop the DB instance and modify it to enable encryption. Apply this setting immediately without waiting for the next scheduled RDS maintenance window.
  2. Stop the DB instance and create an encrypted snapshot. Restore the encrypted snapshot to a new encrypted DB instance. Delete the original DB instance, and update the applications to point to the new encrypted DB instance.
  3. Stop the DB instance and create a snapshot. Copy the snapshot into another encrypted snapshot. Restore the encrypted snapshot to a new encrypted DB instance. Delete the original DB instance, and update the applications to point to the new encrypted DB instance.
  4. Create an encrypted read replica of the DB instance. Promote the read replica to master. Delete the original DB instance, and update the applications to point to the new encrypted DB instance.

Answer(s): C


Reference:

https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Overview.Encryption.html
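
Option C's snapshot-copy-restore sequence maps directly onto three RDS API calls plus waiters. A minimal boto3 sketch with hypothetical instance and key identifiers:

    import boto3

    rds = boto3.client("rds", region_name="us-east-1")

    # 1. Snapshot the unencrypted source instance.
    rds.create_db_snapshot(
        DBSnapshotIdentifier="mysql-unencrypted-snap",
        DBInstanceIdentifier="internal-mysql",
    )
    rds.get_waiter("db_snapshot_available").wait(
        DBSnapshotIdentifier="mysql-unencrypted-snap")

    # 2. Copy the snapshot, specifying a KMS key to produce an encrypted copy.
    rds.copy_db_snapshot(
        SourceDBSnapshotIdentifier="mysql-unencrypted-snap",
        TargetDBSnapshotIdentifier="mysql-encrypted-snap",
        KmsKeyId="alias/aws/rds",  # or a customer-managed key
    )
    rds.get_waiter("db_snapshot_available").wait(
        DBSnapshotIdentifier="mysql-encrypted-snap")

    # 3. Restore the encrypted snapshot to a new, encrypted instance.
    rds.restore_db_instance_from_db_snapshot(
        DBInstanceIdentifier="internal-mysql-encrypted",
        DBSnapshotIdentifier="mysql-encrypted-snap",
    )
    # Repoint the applications at the new instance, then delete the original.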



A database specialist must create nightly backups of an Amazon DynamoDB table in a mission-critical workload as part of a disaster recovery strategy.

Which backup methodology should the database specialist use to MINIMIZE management overhead?

  1. Install the AWS CLI on an Amazon EC2 instance. Write a CLI command that creates a backup of the DynamoDB table. Create a scheduled job or task that executes the command on a nightly basis.
  2. Create an AWS Lambda function that creates a backup of the DynamoDB table. Create an Amazon CloudWatch Events rule that executes the Lambda function on a nightly basis.
  3. Create a backup plan using AWS Backup, specify a backup frequency of every 24 hours, and give the plan a nightly backup window.
  4. Configure DynamoDB backup and restore for an on-demand backup frequency of every 24 hours.

Answer(s): C

Explanation:

On-demand backup allows you to create full backups of your Amazon DynamoDB table for data archiving, helping you meet your corporate and governmental regulatory requirements. You can back up tables from a few megabytes to hundreds of terabytes of data, with no impact on the performance or availability of your production applications. Backups process in seconds regardless of the size of your tables, so you do not have to worry about backup schedules or long-running processes. In addition, all backups are automatically encrypted, cataloged, easily discoverable, and retained until explicitly deleted. AWS Backup builds on this capability and adds scheduled execution, a configurable backup window, and lifecycle management, so a backup plan with a 24-hour frequency (option C) delivers the nightly backups with the least management overhead.
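
A sketch of option C with boto3: two calls define the plan and attach the table, after which AWS Backup runs the nightly backups with no scheduler or custom code to maintain. Vault, role, and table identifiers below are hypothetical:

    import boto3

    backup = boto3.client("backup", region_name="us-east-1")

    # Plan: one rule firing nightly at 05:00 UTC (cron fields: min hr dom mon dow yr).
    plan = backup.create_backup_plan(BackupPlan={
        "BackupPlanName": "dynamodb-nightly",
        "Rules": [{
            "RuleName": "nightly",
            "TargetBackupVaultName": "Default",
            "ScheduleExpression": "cron(0 5 * * ? *)",
            "StartWindowMinutes": 60,                # the nightly backup window
            "Lifecycle": {"DeleteAfterDays": 35},
        }],
    })

    # Selection: attach the DynamoDB table to the plan.
    backup.create_backup_selection(
        BackupPlanId=plan["BackupPlanId"],
        BackupSelection={
            "SelectionName": "fraud-table",
            "IamRoleArn": "arn:aws:iam::123456789012:role/AWSBackupDefaultServiceRole",
            "Resources": [
                "arn:aws:dynamodb:us-east-1:123456789012:table/mission-critical-table"
            ],
        },
    )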






Post your comments and discuss the Amazon DBS-C01 exam with other community members:

Pedro commented on April 27, 2024
Thanks for the dumps. It was an easy pass.
UNITED STATES

Keran commented on April 26, 2024
All of these questions are in the real exam. I just wrote my test yesterday. These are valid exam dumps.
Anonymous

Mungara commented on March 14, 2023
Thanks to these exam dumps, I felt confident and passed my exam with ease.
UNITED STATES

otaku commented on August 11, 2022
Just passed my DBS-C01 exam. This site really helped a lot.
Anonymous

Erik commented on March 02, 2022
These braindump questions make passing very easy.
UNITED KINGDOM

Sanjev commented on January 12, 2022
This is the easiest way to get a 90%. Perfect exam dumps.
UNITED STATES

Abigail commented on September 20, 2021
I know using prep courses is not ethical, but I had to do this as this exam is way too hard to pass on your own. This prep course got me out of trouble.
UNITED STATES