Free DBS-C01 Exam Braindumps (page: 21)

Page 21 of 82

A Database Specialist must create a read replica to isolate read-only queries for an Amazon RDS for MySQL DB instance. Immediately after creating the read replica, users that query it report slow response times.
What could be causing these slow response times?

  A. New volumes created from snapshots load lazily in the background
  B. Long-running statements on the master
  C. Insufficient resources on the master
  D. Overload of a single replication thread by excessive writes on the master

Answer(s): A



A company developed an AWS CloudFormation template used to create all new Amazon DynamoDB tables in its AWS account. The template configures provisioned throughput capacity using hard-coded values. The company wants to change the template so that the tables it creates in the future have independently configurable read and write capacity units assigned.
Which solution will enable this change?

  A. Add values for the rcuCount and wcuCount parameters to the Mappings section of the template. Configure DynamoDB to provision throughput capacity using the stack's mappings.
  B. Add values for two Number parameters, rcuCount and wcuCount, to the template. Replace the hard-coded values with calls to the Ref intrinsic function, referencing the new parameters.
  C. Add values for the rcuCount and wcuCount parameters as outputs of the template. Configure DynamoDB to provision throughput capacity using the stack outputs.
  D. Add values for the rcuCount and wcuCount parameters to the Mappings section of the template. Replace the hard-coded values with calls to the Ref intrinsic function, referencing the new parameters.

Answer(s): B
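
The correct approach (Number parameters referenced with Ref) can be sketched as a minimal CloudFormation fragment. The parameter names rcuCount and wcuCount come from the question; the table name and key schema are illustrative placeholders:

```yaml
Parameters:
  rcuCount:
    Type: Number
    Default: 5
  wcuCount:
    Type: Number
    Default: 5

Resources:
  ExampleTable:                    # illustrative resource name
    Type: AWS::DynamoDB::Table
    Properties:
      AttributeDefinitions:
        - AttributeName: pk
          AttributeType: S
      KeySchema:
        - AttributeName: pk
          KeyType: HASH
      ProvisionedThroughput:
        ReadCapacityUnits: !Ref rcuCount    # replaces the hard-coded value
        WriteCapacityUnits: !Ref wcuCount   # replaces the hard-coded value
```

Because rcuCount and wcuCount are stack parameters, each future stack created from this template can supply its own read and write capacity values at creation time. Mappings and Outputs (options A, C, and D) cannot accept per-stack input, which is why they do not satisfy the requirement.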



A retail company with its main office in New York and another office in Tokyo plans to build a database solution on AWS. The company’s main workload consists of a mission-critical application that updates its application data in a data store. The team at the Tokyo office is building dashboards with complex analytical queries using the application data. The dashboards will be used to make buying decisions, so they need to have access to the application data in less than 1 second.
Which solution meets these requirements?

  A. Use an Amazon RDS DB instance deployed in the us-east-1 Region with a read replica instance in the ap-northeast-1 Region. Create an Amazon ElastiCache cluster in the ap-northeast-1 Region to cache application data from the replica to generate the dashboards.
  B. Use an Amazon DynamoDB global table in the us-east-1 Region with replication into the ap-northeast-1 Region. Use Amazon QuickSight for displaying dashboard results.
  C. Use an Amazon RDS for MySQL DB instance deployed in the us-east-1 Region with a read replica instance in the ap-northeast-1 Region. Have the dashboard application read from the read replica.
  D. Use an Amazon Aurora global database. Deploy the writer instance in the us-east-1 Region and the replica in the ap-northeast-1 Region. Have the dashboard application read from the replica in the ap-northeast-1 Region.

Answer(s): D



A company is using Amazon RDS for PostgreSQL. The Security team wants all database connection requests to be logged and retained for 180 days. The RDS for PostgreSQL DB instance is currently using the default parameter group. A Database Specialist has identified that setting the log_connections parameter to 1 will enable connection logging.

Which combination of steps should the Database Specialist take to meet the logging and retention requirements? (Choose two.)

  A. Update the log_connections parameter in the default parameter group
  B. Create a custom parameter group, update the log_connections parameter, and associate the parameter group with the DB instance
  C. Enable publishing of database engine logs to Amazon CloudWatch Logs and set the event expiration to 180 days
  D. Enable publishing of database engine logs to an Amazon S3 bucket and set the lifecycle policy to 180 days
  E. Connect to the RDS PostgreSQL host and update the log_connections parameter in the postgresql.conf file

Answer(s): B,C
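
The two correct steps can be sketched with the AWS CLI. Default parameter groups cannot be modified, which rules out option A, and RDS does not allow shell access to postgresql.conf, which rules out option E. The parameter group name, engine family, and instance identifier below are placeholders:

```shell
# Create a custom parameter group (the default group is read-only)
aws rds create-db-parameter-group \
    --db-parameter-group-name custom-postgres \
    --db-parameter-group-family postgres13 \
    --description "Custom group with connection logging enabled"

# Enable connection logging; log_connections is dynamic, so it applies immediately
aws rds modify-db-parameter-group \
    --db-parameter-group-name custom-postgres \
    --parameters "ParameterName=log_connections,ParameterValue=1,ApplyMethod=immediate"

# Associate the custom parameter group with the DB instance
aws rds modify-db-instance \
    --db-instance-identifier my-db-instance \
    --db-parameter-group-name custom-postgres \
    --apply-immediately

# Publish PostgreSQL engine logs to CloudWatch Logs
aws rds modify-db-instance \
    --db-instance-identifier my-db-instance \
    --cloudwatch-logs-export-configuration '{"EnableLogTypes":["postgresql"]}' \
    --apply-immediately

# Retain the exported log group for 180 days
aws logs put-retention-policy \
    --log-group-name /aws/rds/instance/my-db-instance/postgresql \
    --retention-in-days 180
```

RDS exports logs to a log group named /aws/rds/instance/<instance-id>/postgresql, and CloudWatch Logs retention is set per log group, which satisfies the 180-day requirement without managing an S3 lifecycle.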


Reference:

https://aws.amazon.com/blogs/database/working-with-rds-and-aurora-postgresql-logs-part-1/





