Free DBS-C01 Exam Braindumps (page: 43)


A company has an on-premises SQL Server database. The users access the database using Active Directory authentication. The company successfully migrated its database to Amazon RDS for SQL Server. However, the company is concerned about user authentication in the AWS Cloud environment.
Which solution should a database specialist provide for the user to authenticate?

  A. Deploy Active Directory Federation Services (AD FS) on premises and configure it with an on-premises Active Directory. Set up delegation between the on-premises AD FS and AWS Security Token Service (AWS STS) to map user identities to a role using the AmazonRDSDirectoryServiceAccess managed IAM policy.
  B. Establish a forest trust between the on-premises Active Directory and AWS Directory Service for Microsoft Active Directory. Use AWS SSO to configure an Active Directory user delegated to access the databases in RDS for SQL Server.
  C. Use Active Directory Connector to redirect directory requests to the company's on-premises Active Directory without caching any information in the cloud. Use the RDS master user credentials to connect to the DB instance and configure SQL Server logins and users from the Active Directory users and groups.
  D. Establish a forest trust between the on-premises Active Directory and AWS Directory Service for Microsoft Active Directory. Ensure RDS for SQL Server is using mixed mode authentication. Use the RDS master user credentials to connect to the DB instance and configure SQL Server logins and users from the Active Directory users and groups.

Answer(s): D


Reference:

https://docs.aws.amazon.com/directoryservice/latest/admin-guide/ms_ad_tutorial_setup_trust.html
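Option D works because a forest trust lets the AWS Managed Microsoft AD domain that the RDS instance is joined to authenticate on-premises identities, while mixed mode keeps the master (SQL) login usable for administration. A minimal boto3 sketch of the domain-join step follows; the instance identifier, directory ID, and role name are placeholders, and the role is assumed to carry the AmazonRDSDirectoryServiceAccess managed policy.

    import boto3

    rds = boto3.client("rds", region_name="us-east-1")

    # Join an existing RDS for SQL Server instance to AWS Managed Microsoft AD.
    rds.modify_db_instance(
        DBInstanceIdentifier="sqlserver-prod",          # placeholder instance name
        Domain="d-1234567890",                          # placeholder directory ID
        DomainIAMRoleName="rds-directoryservice-role",  # role with AmazonRDSDirectoryServiceAccess
        ApplyImmediately=True,
    )

Once the instance is domain-joined, connect with the RDS master user credentials and create Windows-authenticated logins for the trusted on-premises users and groups (for example, CREATE LOGIN [CORP\AppUsers] FROM WINDOWS in T-SQL).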



A company uses an Amazon Redshift cluster to run its analytical workloads. Corporate policy requires that the company’s data be encrypted at rest with customer managed keys. The company’s disaster recovery plan requires that backups of the cluster be copied into another AWS Region on a regular basis.
How should a database specialist automate the process of backing up the cluster data in compliance with these policies?

  A. Copy the AWS Key Management Service (AWS KMS) customer managed key from the source Region to the destination Region. Set up an AWS Glue job in the source Region to copy the latest snapshot of the Amazon Redshift cluster from the source Region to the destination Region. Use a time-based schedule in AWS Glue to run the job on a daily basis.
  B. Create a new AWS Key Management Service (AWS KMS) customer managed key in the destination Region. Create a snapshot copy grant in the destination Region specifying the new key. In the source Region, configure cross-Region snapshots for the Amazon Redshift cluster specifying the destination Region, the snapshot copy grant, and retention periods for the snapshot.
  C. Copy the AWS Key Management Service (AWS KMS) customer managed key from the source Region to the destination Region. Create Amazon S3 buckets in each Region using the keys from their respective Regions. Use Amazon EventBridge (Amazon CloudWatch Events) to schedule an AWS Lambda function in the source Region to copy the latest snapshot to the S3 bucket in that Region. Configure S3 Cross-Region Replication to copy the snapshots to the destination Region, specifying the source and destination KMS key IDs in the replication configuration.
  D. Use the same customer-supplied key materials to create a CMK with the same private key in the destination Region. Configure cross-Region snapshots in the source Region targeting the destination Region. Specify the corresponding CMK in the destination Region to encrypt the snapshot.

Answer(s): B


Reference:

https://docs.aws.amazon.com/kms/latest/developerguide/kms-dg.pdf
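The two API calls behind option B are small enough to sketch. The following boto3 example assumes us-east-1 as the source Region and us-west-2 as the destination; the key description, grant name, cluster identifier, and retention period are placeholders.

    import boto3

    # Destination Region: create a customer managed key and a snapshot copy grant.
    kms_dest = boto3.client("kms", region_name="us-west-2")
    key_id = kms_dest.create_key(Description="Redshift DR snapshots")["KeyMetadata"]["KeyId"]

    redshift_dest = boto3.client("redshift", region_name="us-west-2")
    redshift_dest.create_snapshot_copy_grant(
        SnapshotCopyGrantName="dr-snapshot-grant",
        KmsKeyId=key_id,
    )

    # Source Region: enable cross-Region snapshot copy using that grant.
    redshift_src = boto3.client("redshift", region_name="us-east-1")
    redshift_src.enable_snapshot_copy(
        ClusterIdentifier="analytics-cluster",
        DestinationRegion="us-west-2",
        RetentionPeriod=7,  # days to retain copied automated snapshots
        SnapshotCopyGrantName="dr-snapshot-grant",
    )

The grant must be created in the destination Region because that is where Amazon Redshift re-encrypts the copied snapshots with the new customer managed key.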



A database specialist is launching a test graph database using Amazon Neptune for the first time. The database specialist needs to insert millions of rows of test observations from a .csv file that is stored in Amazon S3. The database specialist has been using a series of API calls to upload the data to the Neptune DB instance.
Which combination of steps would allow the database specialist to upload the data faster? (Choose three.)

  A. Ensure Amazon Cognito returns the proper AWS STS tokens to authenticate the Neptune DB instance to the S3 bucket hosting the CSV file.
  B. Ensure the vertices and edges are specified in different .csv files with proper header column formatting.
  C. Use AWS DMS to move data from Amazon S3 to the Neptune Loader.
  D. Curl the S3 URI while inside the Neptune DB instance and then run the addVertex or addEdge commands.
  E. Ensure an IAM role for the Neptune DB instance is configured with the appropriate permissions to allow access to the file in the S3 bucket.
  F. Create an S3 VPC endpoint and issue an HTTP POST to the database’s loader endpoint.

Answer(s): B,E,F


Reference:

https://docs.aws.amazon.com/neptune/latest/userguide/bulk-load-data.html
https://docs.aws.amazon.com/neptune/latest/userguide/dms-neptune.html
https://docs.aws.amazon.com/neptune/latest/userguide/bulk-load-data.html#bulk-load-prereqs-s3
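Options B, E, and F together describe the Neptune bulk loader: vertex and edge data in separate, properly formatted .csv files in S3, an IAM role attached to the DB cluster that can read the bucket, and an HTTP POST to the loader endpoint, which reaches S3 through a VPC gateway endpoint. A minimal sketch of that POST using the requests library; the endpoint, bucket path, and role ARN are placeholders.

    import requests

    # Placeholder Neptune cluster endpoint; the loader listens on port 8182.
    LOADER_URL = ("https://my-neptune.cluster-abc123.us-east-1"
                  ".neptune.amazonaws.com:8182/loader")

    payload = {
        "source": "s3://my-bucket/observations/",  # vertices and edges in separate .csv files
        "format": "csv",                           # Gremlin CSV format with header columns
        "iamRoleArn": "arn:aws:iam::123456789012:role/NeptuneLoadFromS3",
        "region": "us-east-1",
        "failOnError": "FALSE",
    }

    resp = requests.post(LOADER_URL, json=payload)
    print(resp.json())  # returns a loadId that can be polled at /loader/{loadId}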



A company is using Amazon DynamoDB global tables for an online gaming application. The game has players around the world. As the game has become more popular, the volume of requests to DynamoDB has increased significantly. Recently, players have reported that the game state is inconsistent between players in different countries. A database specialist observes that the ReplicationLatency metric for some of the replica tables is too high.
Which approach will alleviate the problem?

  A. Configure all replica tables to use DynamoDB auto scaling.
  B. Configure a DynamoDB Accelerator (DAX) cluster on each of the replicas.
  C. Configure the primary table to use DynamoDB auto scaling and the replica tables to use manually provisioned capacity.
  D. Configure the table-level write throughput limit service quota to a higher value.

Answer(s): A

Explanation:

Using DynamoDB auto scaling is the recommended way to manage throughput capacity settings for replica tables that use provisioned mode. When a replica is under-provisioned, writes replicated from other Regions are throttled and ReplicationLatency rises, so enabling auto scaling on all replica tables keeps each Region's write capacity in step with the global write load.


Reference:

https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/globaltables_reqs_bestpractices.html
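For reference, replica write capacity is managed through Application Auto Scaling. A minimal boto3 sketch of registering one replica table and attaching a target-tracking policy; the table name, Region, capacity limits, and target utilization are placeholders.

    import boto3

    aas = boto3.client("application-autoscaling", region_name="eu-west-1")

    # Register the replica table's write capacity as a scalable target.
    aas.register_scalable_target(
        ServiceNamespace="dynamodb",
        ResourceId="table/GameState",  # placeholder table name
        ScalableDimension="dynamodb:table:WriteCapacityUnits",
        MinCapacity=100,
        MaxCapacity=4000,
    )

    # Track consumed write capacity at roughly 70% of provisioned throughput.
    aas.put_scaling_policy(
        PolicyName="GameStateWriteScaling",
        ServiceNamespace="dynamodb",
        ResourceId="table/GameState",
        ScalableDimension="dynamodb:table:WriteCapacityUnits",
        PolicyType="TargetTrackingScaling",
        TargetTrackingScalingPolicyConfiguration={
            "TargetValue": 70.0,
            "PredefinedMetricSpecification": {
                "PredefinedMetricType": "DynamoDBWriteCapacityUtilization"
            },
        },
    )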






Post your Comments and Discuss Amazon DBS-C01 exam with other Community members:

Pedro commented on April 27, 2024
Thanks for the dumps. It was an easy pass.
UNITED STATES

Keran commented on April 26, 2024
All of these questions are in the real exam. I just wrote my test yesterday. These are valid exam dumps.
Anonymous

Mungara commented on March 14, 2023
Thanks to this exam dumps, I felt confident and passed my exam with ease.
UNITED STATES

otaku commented on August 11, 2022
Just passed my DBS-C01 exam. This site really helped a lot.
Anonymous

Erik commented on March 02, 2022
These braindump questions make passing very easy.
UNITED KINGDOM

Sanjev commented on January 12, 2022
This is the easiest way to get a 90%. Perfect exam dumps.
UNITED STATES

Abigail commented on September 20, 2021
I know using prep courses is not ethical, but I had to do this as this exam is way too hard to pass on your own. This prep course got me out of trouble.
UNITED STATES