Free DBS-C01 Exam Braindumps (page: 12)


An ecommerce company has tasked a Database Specialist with creating a reporting dashboard that visualizes critical business metrics that will be pulled from the core production database running on Amazon Aurora. Data that is read by the dashboard should be available within 100 milliseconds of an update.
The Database Specialist needs to review the current configuration of the Aurora DB cluster and develop a cost-effective solution. The solution needs to accommodate the unpredictable read workload from the reporting dashboard without any impact on the write availability and performance of the DB cluster.

Which solution meets these requirements?

  A. Turn on the serverless option in the DB cluster so it can automatically scale based on demand.
  B. Provision a clone of the existing DB cluster for the new reporting workload.
  C. Create a separate DB cluster for the new workload, refresh it from the source DB cluster, and set up ongoing replication using AWS DMS change data capture (CDC).
  D. Add an automatic scaling policy to the DB cluster to add Aurora Replicas to the cluster based on CPU consumption.

Answer(s): D
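
Why D works: Aurora Replicas share the cluster's storage volume, so they serve reads with replica lag that is typically well under 100 milliseconds and add no load to the writer. Below is a minimal sketch of option D in Python (boto3), assuming a hypothetical cluster named prod-aurora-cluster; the capacity limits and 60% CPU target are illustrative values, not part of the question.

    import boto3

    # Aurora Replica auto scaling is configured through Application Auto Scaling.
    autoscaling = boto3.client("application-autoscaling")

    # Register the cluster's replica count as a scalable target.
    autoscaling.register_scalable_target(
        ServiceNamespace="rds",
        ResourceId="cluster:prod-aurora-cluster",   # hypothetical cluster name
        ScalableDimension="rds:cluster:ReadReplicaCount",
        MinCapacity=1,
        MaxCapacity=15,  # Aurora supports up to 15 replicas per cluster
    )

    # Target-tracking policy: add or remove replicas based on average reader CPU.
    autoscaling.put_scaling_policy(
        PolicyName="dashboard-reader-scaling",
        ServiceNamespace="rds",
        ResourceId="cluster:prod-aurora-cluster",
        ScalableDimension="rds:cluster:ReadReplicaCount",
        PolicyType="TargetTrackingScaling",
        TargetTrackingScalingPolicyConfiguration={
            "PredefinedMetricSpecification": {
                "PredefinedMetricType": "RDSReaderAverageCPUUtilization"
            },
            "TargetValue": 60.0,  # illustrative CPU target
        },
    )

Pointing the dashboard at the cluster's reader endpoint keeps the unpredictable read workload entirely off the writer instance.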



A retail company is about to migrate its online and mobile store to AWS. The company’s CEO has strategic plans to grow the brand globally. A Database Specialist has been challenged to provide predictable read and write database performance with minimal operational overhead.
What should the Database Specialist do to meet these requirements?

  A. Use Amazon DynamoDB global tables to synchronize transactions
  B. Use Amazon EMR to copy the orders table data across Regions
  C. Use Amazon Aurora Global Database to synchronize all transactions
  D. Use Amazon DynamoDB Streams to replicate all DynamoDB transactions and sync them

Answer(s): A

Explanation:

DynamoDB global tables provide a fully managed, multi-Region, multi-active database with single-digit-millisecond read and write performance in each Region, which satisfies both the global-growth plan and the minimal-operational-overhead requirement.

Reference:

https://aws.amazon.com/dynamodb/
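
For illustration, here is a minimal sketch of option A in Python (boto3), assuming the current global tables version (2019.11.21) and a hypothetical orders table: the table is created in one Region with streams enabled, then a replica Region is added.

    import boto3

    ddb = boto3.client("dynamodb", region_name="us-east-1")

    # Create the base table; a stream with new and old images is required
    # before a replica Region can be added.
    ddb.create_table(
        TableName="orders",  # hypothetical table name
        AttributeDefinitions=[{"AttributeName": "order_id", "AttributeType": "S"}],
        KeySchema=[{"AttributeName": "order_id", "KeyType": "HASH"}],
        BillingMode="PAY_PER_REQUEST",
        StreamSpecification={
            "StreamEnabled": True,
            "StreamViewType": "NEW_AND_OLD_IMAGES",
        },
    )
    ddb.get_waiter("table_exists").wait(TableName="orders")

    # Add a replica Region; DynamoDB manages the multi-active replication,
    # so each Region gets local, predictable read/write latency.
    ddb.update_table(
        TableName="orders",
        ReplicaUpdates=[{"Create": {"RegionName": "eu-west-1"}}],
    )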



A company is closing one of its remote data centers. This site runs a 100 TB on-premises data warehouse solution. The company plans to use the AWS Schema Conversion Tool (AWS SCT) and AWS DMS for the migration to AWS. The site network bandwidth is 500 Mbps. A Database Specialist wants to migrate the on-premises data using Amazon S3 as the data lake and Amazon Redshift as the data warehouse. This move must take place during a 2-week period when source systems are shut down for maintenance. The data must stay encrypted at rest and in transit.
Which approach has the least risk and the highest likelihood of a successful data transfer?

  A. Set up a VPN tunnel to encrypt data over the network from the data center to AWS. Use AWS SCT and apply the converted schema to Amazon Redshift. Once complete, start an AWS DMS task to move the data from the source to Amazon S3. Use AWS Glue to load the data from Amazon S3 to Amazon Redshift.
  B. Use AWS SCT and apply the converted schema to Amazon Redshift. Start an AWS DMS task with two AWS Snowball Edge devices to copy data from on-premises to Amazon S3 with AWS KMS encryption. Use AWS DMS to finish copying data to Amazon Redshift.
  C. Use AWS SCT and apply the converted schema to Amazon Redshift. Once complete, use a fleet of 10 TB dedicated encrypted drives with the AWS Import/Export feature to copy data from on-premises to Amazon S3 with AWS KMS encryption. Use AWS Glue to load the data to Amazon Redshift.
  D. Set up a VPN tunnel to encrypt data over the network from the data center to AWS. Use a native database export feature to export the data and compress the files. Use the aws s3 cp command with multipart upload to upload these files to Amazon S3 with AWS KMS encryption. Once complete, load the data to Amazon Redshift using AWS Glue.

Answer(s): B
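
The deciding factor here is simple arithmetic: even with the 500 Mbps link fully dedicated to the migration, 100 TB cannot move over the network inside the 2-week maintenance window, which is why the Snowball Edge approach in option B carries the least risk. A quick check in Python:

    # Best-case transfer time for 100 TB over a 500 Mbps link.
    data_bits = 100 * 10**12 * 8   # 100 TB in bits
    link_bps = 500 * 10**6         # 500 Mbps in bits per second

    seconds = data_bits / link_bps
    days = seconds / 86400
    print(f"{days:.1f} days")      # ~18.5 days, beyond the 2-week window

Snowball Edge devices also encrypt data at rest with AWS KMS, satisfying the encryption requirement without betting the deadline on sustained network throughput.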



A company is looking to migrate a 1 TB Oracle database from on-premises to an Amazon Aurora PostgreSQL DB cluster. The company's Database Specialist discovered that the Oracle database stores 100 GB of large binary objects (LOBs) across multiple tables. The Oracle database has a maximum LOB size of 500 MB and an average LOB size of 350 MB. The Database Specialist has chosen AWS DMS with the largest replication instance to migrate the data.
How should the Database Specialist optimize the database migration using AWS DMS?

  A. Create a single task using full LOB mode with a LOB chunk size of 500 MB to migrate the data and LOBs together
  B. Create two tasks: task1 with LOB tables using full LOB mode with a LOB chunk size of 500 MB, and task2 without LOBs
  C. Create two tasks: task1 with LOB tables using limited LOB mode with a maximum LOB size of 500 MB, and task2 without LOBs
  D. Create a single task using limited LOB mode with a maximum LOB size of 500 MB to migrate data and LOBs together

Answer(s): B
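
For illustration, here is a minimal sketch of task1 from option B in Python (boto3), with hypothetical ARNs and table selection; note that DMS task settings express LobChunkSize in kilobytes, so the question's 500 MB chunk is mapped to 512000 KB here as an assumption.

    import boto3
    import json

    dms = boto3.client("dms")

    # Full LOB mode migrates LOBs in chunks without truncating them,
    # which matters because the source LOBs run up to 500 MB.
    lob_task_settings = {
        "TargetMetadata": {
            "FullLobMode": True,
            "LimitedSizeLobMode": False,
            "LobChunkSize": 512000,  # kilobytes; assumed mapping of 500 MB
        }
    }

    # Select only the LOB-bearing tables for this task (hypothetical names).
    table_mappings = {
        "rules": [{
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "lob-tables-only",
            "object-locator": {"schema-name": "APP", "table-name": "LOB_%"},
            "rule-action": "include",
        }]
    }

    dms.create_replication_task(
        ReplicationTaskIdentifier="task1-lob-tables",
        SourceEndpointArn="arn:aws:dms:...",       # placeholder
        TargetEndpointArn="arn:aws:dms:...",       # placeholder
        ReplicationInstanceArn="arn:aws:dms:...",  # placeholder
        MigrationType="full-load",
        TableMappings=json.dumps(table_mappings),
        ReplicationTaskSettings=json.dumps(lob_task_settings),
    )

task2 would be the same call with default LOB settings and a selection rule that excludes the LOB tables, letting both tasks run in parallel so the bulky LOB transfer does not hold up the rest of the 1 TB migration.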






Post your comments and discuss the Amazon DBS-C01 exam with other community members:

Pedro commented on April 27, 2024
Thanks for the dumps. It was an easy pass.
UNITED STATES

Keran commented on April 26, 2024
All of these questions are in the real exam. I just wrote my test yesterday. These are valid exam dumps.
Anonymous

Mungara commented on March 14, 2023
Thanks to these exam dumps, I felt confident and passed my exam with ease.
UNITED STATES

otaku commented on August 11, 2022
Just passed my DBS-C01 exam. This site really helped a lot.
Anonymous

Erik commented on March 02, 2022
These braindump questions make passing very easy.
UNITED KINGDOM

Sanjev commented on January 12, 2022
This is the easiest way to get a 90%. Perfect exam dumps.
UNITED STATES

Abigail commented on September 20, 2021
I know using prep courses is not ethical, but I had to do this as this exam is way too hard to pass on your own. This prep course got me out of trouble.
UNITED STATES