Free DBS-C01 Exam Braindumps (page: 45)

Page 45 of 82

A company is building a web application on AWS. The application requires the database to support read and write operations in multiple AWS Regions simultaneously. The database also needs to propagate data changes between Regions as the changes occur. The application must be highly available and must provide latency of single-digit milliseconds.
Which solution meets these requirements?

  1. Amazon DynamoDB global tables
  2. Amazon DynamoDB streams with AWS Lambda to replicate the data
  3. An Amazon ElastiCache for Redis cluster with cluster mode enabled and multiple shards
  4. An Amazon Aurora global database

Answer(s): A

Explanation:

Amazon DynamoDB global tables enable you to read and write your data locally, providing single-digit-millisecond latency for your globally distributed application at any scale. Writes in one Region are automatically propagated to the replica tables in the other Regions, which satisfies the multi-Region read/write and change-propagation requirements.


Reference:

https://aws.amazon.com/dynamodb/global-tables/
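As a hedged sketch (the table name and Regions below are hypothetical), a global table can be set up by adding replica Regions to an existing table via `UpdateTable`. The helper only builds the boto3 request parameters; the live call is shown commented out.

```python
# Sketch: build the UpdateTable parameters that add replica Regions to an
# existing DynamoDB table, turning it into a global table (version 2019.11.21).
# Table name and Regions are hypothetical; the live boto3 call is commented out.

def build_add_replicas_request(table_name, regions):
    """Return UpdateTable parameters that create one replica per Region."""
    return {
        "TableName": table_name,
        "ReplicaUpdates": [
            {"Create": {"RegionName": region}} for region in regions
        ],
    }

params = build_add_replicas_request("WebAppData", ["us-west-2", "eu-west-1"])
# import boto3
# boto3.client("dynamodb", region_name="us-east-1").update_table(**params)
print(params["ReplicaUpdates"])
```

Each `{"Create": ...}` entry asks DynamoDB to provision a replica table in that Region; the application can then read and write against the local replica endpoint.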



A company is using Amazon Neptune as the graph database for one of its products. The company’s data science team accidentally created large amounts of temporary information during an ETL process. The Neptune DB cluster automatically increased the storage space to accommodate the new data, but the data science team deleted the unused information.
What should a database specialist do to avoid unnecessary charges for the unused cluster volume space?

  1. Take a snapshot of the cluster volume. Restore the snapshot in another cluster with a smaller volume size.
  2. Use the AWS CLI to turn on automatic resizing of the cluster volume.
  3. Export the cluster data into a new Neptune DB cluster.
  4. Add a Neptune read replica to the cluster. Promote this replica as a new primary DB instance. Reset the storage space of the cluster.

Answer(s): C

Explanation:

The Neptune cluster volume grows automatically as data is added, but deleting data does not shrink it: the allocated space is reused for future data while the cluster continues to be billed for the high-water mark. Exporting the remaining data into a new Neptune DB cluster produces a volume sized only to the data that is actually present, avoiding charges for the unused space.


Reference:

https://docs.aws.amazon.com/neptune/latest/userguide/feature-overview-storage.html
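Before exporting, it can help to size the allocated volume via the Neptune `VolumeBytesUsed` CloudWatch metric. The sketch below (cluster identifier is hypothetical) only builds the `get_metric_statistics` request parameters; the live boto3 call is commented out.

```python
# Sketch: build the CloudWatch request that reads a Neptune cluster's
# VolumeBytesUsed metric, so the allocated space can be compared with the
# data actually retained. Cluster id is hypothetical; the live call is
# commented out.
from datetime import datetime, timedelta

def build_volume_metric_request(cluster_id, hours=24):
    """Return get_metric_statistics parameters for VolumeBytesUsed."""
    now = datetime.utcnow()
    return {
        "Namespace": "AWS/Neptune",
        "MetricName": "VolumeBytesUsed",
        "Dimensions": [{"Name": "DBClusterIdentifier", "Value": cluster_id}],
        "StartTime": now - timedelta(hours=hours),
        "EndTime": now,
        "Period": 3600,          # one datapoint per hour
        "Statistics": ["Maximum"],
    }

params = build_volume_metric_request("graph-prod-cluster")
# import boto3
# boto3.client("cloudwatch").get_metric_statistics(**params)
print(params["MetricName"])
```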



A database specialist is responsible for designing a highly available solution for online transaction processing (OLTP) using Amazon RDS for MySQL production databases. Disaster recovery requirements include a cross-Region deployment along with an RPO of 5 minutes and RTO of 30 minutes.
What should the database specialist do to align to the high availability and disaster recovery requirements?

  1. Use a Multi-AZ deployment in each Region.
  2. Use read replica deployments in all Availability Zones of the secondary Region.
  3. Use Multi-AZ and read replica deployments within a Region
  4. Use Multi-AZ and deploy a read replica in a secondary Region.

Answer(s): D

Explanation:

Amazon RDS Multi-AZ provides high availability and data protection within a Region by maintaining a synchronous standby. For disaster recovery, a read replica deployed in a secondary Region receives changes asynchronously with typically low lag, which satisfies the 5-minute RPO, and the replica can be promoted to a standalone primary well within the 30-minute RTO.


Reference:

https://dataintegration.info/managed-disaster-recovery-with-amazon-rds-for-oracle-cross-region-automated-backups-part-1
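As a hedged sketch (ARN, identifiers, and Regions are hypothetical), a cross-Region replica is created by calling `create_db_instance_read_replica` with a client in the destination Region and the source instance's full ARN. The helper only builds the request parameters; the live call is commented out.

```python
# Sketch: parameters for creating a cross-Region RDS for MySQL read replica.
# The client must be created in the DESTINATION Region, and the source is
# referenced by its full ARN. Identifiers and Regions are hypothetical.

def build_cross_region_replica_request(source_arn, replica_id, kms_key_id=None):
    """Return create_db_instance_read_replica parameters for a DR replica."""
    params = {
        "DBInstanceIdentifier": replica_id,
        "SourceDBInstanceIdentifier": source_arn,  # full ARN for cross-Region
    }
    if kms_key_id:  # required when the source instance is encrypted
        params["KmsKeyId"] = kms_key_id
    return params

src = "arn:aws:rds:us-east-1:123456789012:db:oltp-prod"
params = build_cross_region_replica_request(src, "oltp-prod-dr")
# import boto3
# boto3.client("rds", region_name="us-west-2").create_db_instance_read_replica(**params)
print(params["DBInstanceIdentifier"])
```

During a Regional outage, promoting `oltp-prod-dr` (e.g. via `promote_read_replica`) makes it a writable standalone instance in the secondary Region.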



A media company wants to use zero-downtime patching (ZDP) for its Amazon Aurora MySQL database. Multiple processing applications are using SSL certificates to connect to database endpoints and the read replicas.
Which factor will have the LEAST impact on the success of ZDP?

  1. Binary logging is enabled, or binary log replication is in progress.
  2. Current SSL connections are open to the database.
  3. Temporary tables or table locks are in use.
  4. The value of the lower_case_table_names server parameter was set to 0 when the tables were created.

Answer(s): D

Explanation:

In Aurora MySQL 2.10 and higher, Aurora can perform a zero-downtime patch even when binary log replication is enabled or SSL connections are open. Conditions such as temporary tables or table locks can still cause Aurora to fall back to a standard, restarting patch, whereas the value of the lower_case_table_names parameter is not among the conditions that affect whether ZDP succeeds.


Reference:

https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/AuroraMySQL.Updates.Patching.html
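The distinction above can be summarized with a small pure helper. The condition names below are illustrative labels, not an AWS API; the set of risk conditions follows the ZDP considerations in the patching documentation under that assumption.

```python
# Sketch: classify which session conditions can make Aurora fall back to a
# standard (restarting) patch instead of a zero-downtime patch. Labels are
# illustrative, not an AWS API.

ZDP_RISK_CONDITIONS = {
    "binlog_replication",    # tolerated from Aurora MySQL 2.10 onward
    "open_ssl_connections",  # tolerated from Aurora MySQL 2.10 onward
    "temporary_tables",
    "table_locks",
}

def zdp_fallback_risks(conditions):
    """Return the subset of conditions that can prevent a zero-downtime patch."""
    return sorted(set(conditions) & ZDP_RISK_CONDITIONS)

# lower_case_table_names is not a ZDP risk condition, matching answer D.
print(zdp_fallback_risks(["temporary_tables", "lower_case_table_names_0"]))
```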





