Free SAP-C01 Exam Braindumps (page: 52)


An online retail company is migrating its legacy on-premises .NET application to AWS. The application runs on load-balanced frontend web servers, load-balanced application servers, and a Microsoft SQL Server database.

The company wants to use AWS managed services where possible and does not want to rewrite the application. A solutions architect needs to implement a solution to resolve scaling issues and minimize licensing costs as the application scales.

Which solution will meet these requirements MOST cost-effectively?

  1. Deploy Amazon EC2 instances in an Auto Scaling group behind an Application Load Balancer for the web tier and for the application tier. Use Amazon Aurora PostgreSQL with Babelfish turned on to replatform the SQL Server database.
  2. Create images of all the servers by using AWS Database Migration Service (AWS DMS). Deploy Amazon EC2 instances that are based on the on-premises imports. Deploy the instances in an Auto Scaling group behind a Network Load Balancer for the web tier and for the application tier. Use Amazon DynamoDB as the database tier.
  3. Containerize the web frontend tier and the application tier. Provision an Amazon Elastic Kubernetes Service (Amazon EKS) cluster. Create an Auto Scaling group behind a Network Load Balancer for the web tier and for the application tier. Use Amazon RDS for SQL Server to host the database.
  4. Separate the application functions into AWS Lambda functions. Use Amazon API Gateway for the web frontend tier and the application tier. Migrate the data to Amazon S3. Use Amazon Athena to query the data.

Answer(s): A

Explanation:

A: Deploying EC2 instances in Auto Scaling groups behind Application Load Balancers for the web and application tiers resolves the scaling issues without rewriting the .NET application. Babelfish for Aurora PostgreSQL accepts the SQL Server wire protocol (TDS) and T-SQL, so the database can be replatformed with minimal application changes while eliminating SQL Server licensing costs, making this the most cost-effective solution.
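For context, here is a minimal boto3 sketch of the database part of option A, assuming Aurora PostgreSQL 15 and placeholder names and sizes (babelfish-params, retail-babelfish, db.r6g.large). Babelfish is enabled through the rds.babelfish_status parameter in a cluster parameter group, after which the .NET application keeps connecting over the familiar TDS port 1433; the web and application tiers stay on EC2 and only the database layer changes engine.

```python
# Hedged sketch: provision Aurora PostgreSQL with Babelfish for the migrated
# SQL Server workload. All identifiers, versions, and the password are
# placeholder assumptions, not values from the question.
import boto3

rds = boto3.client("rds")

# Babelfish is switched on through a cluster parameter group.
rds.create_db_cluster_parameter_group(
    DBClusterParameterGroupName="babelfish-params",
    DBParameterGroupFamily="aurora-postgresql15",
    Description="Enable Babelfish for the migrated SQL Server workload",
)
rds.modify_db_cluster_parameter_group(
    DBClusterParameterGroupName="babelfish-params",
    Parameters=[{
        "ParameterName": "rds.babelfish_status",
        "ParameterValue": "on",
        "ApplyMethod": "pending-reboot",
    }],
)

# Create the Aurora PostgreSQL cluster that will accept T-SQL over port 1433.
rds.create_db_cluster(
    DBClusterIdentifier="retail-babelfish",
    Engine="aurora-postgresql",
    EngineVersion="15.4",  # assumed version; any Babelfish-capable release works
    MasterUsername="masteruser",
    MasterUserPassword="change-me-please",  # placeholder; store in Secrets Manager in practice
    DBClusterParameterGroupName="babelfish-params",
)

# Add a writer instance to the cluster.
rds.create_db_instance(
    DBInstanceIdentifier="retail-babelfish-writer",
    DBClusterIdentifier="retail-babelfish",
    Engine="aurora-postgresql",
    DBInstanceClass="db.r6g.large",
)
```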



A software-as-a-service (SaaS) provider exposes APIs through an Application Load Balancer (ALB). The ALB connects to an Amazon Elastic Kubernetes Service (Amazon EKS) cluster that is deployed in the us-east-1 Region. The exposed APIs use a few non-standard REST methods: LINK, UNLINK, LOCK, and UNLOCK.

Users outside the United States are reporting long and inconsistent response times for these APIs. A solutions architect needs to resolve this problem with a solution that minimizes operational overhead.

Which solution meets these requirements?

  1. Add an Amazon CloudFront distribution. Configure the ALB as the origin.
  2. Add an Amazon API Gateway edge-optimized API endpoint to expose the APIs. Configure the ALB as the target.
  3. Add an accelerator in AWS Global Accelerator. Configure the ALB as the origin.
  4. Deploy the APIs to two additional AWS Regions: eu-west-1 and ap-southeast-2. Add latency-based routing records in Amazon Route 53.

Answer(s): C

Explanation:

C: An accelerator in AWS Global Accelerator routes user traffic over the AWS global network to the nearest edge location and then to the ALB, giving users outside the United States faster and more consistent response times. Because Global Accelerator operates at the transport layer (TCP), it is agnostic to HTTP methods, so the non-standard LINK, UNLINK, LOCK, and UNLOCK methods pass through unchanged, whereas CloudFront and API Gateway accept only a fixed set of standard HTTP methods. The solution also avoids multi-Region deployments, keeping operational overhead low.
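As a rough illustration of option C, the boto3 sketch below creates an accelerator, a TCP listener, and an endpoint group that points at the existing ALB. The accelerator name, port, and ALB ARN are placeholder assumptions; note that the Global Accelerator control-plane API is served from us-west-2 even though the ALB stays in us-east-1.

```python
# Hedged sketch: front the us-east-1 ALB with AWS Global Accelerator.
import boto3

ga = boto3.client("globalaccelerator", region_name="us-west-2")

accelerator = ga.create_accelerator(
    Name="saas-api-accelerator",
    IpAddressType="IPV4",
    Enabled=True,
)["Accelerator"]

# TCP listener: Global Accelerator works at layer 4, so the non-standard
# HTTP methods (LINK, UNLINK, LOCK, UNLOCK) pass through untouched.
listener = ga.create_listener(
    AcceleratorArn=accelerator["AcceleratorArn"],
    Protocol="TCP",
    PortRanges=[{"FromPort": 443, "ToPort": 443}],
)["Listener"]

# Register the existing ALB as the endpoint in us-east-1.
ga.create_endpoint_group(
    ListenerArn=listener["ListenerArn"],
    EndpointGroupRegion="us-east-1",
    EndpointConfigurations=[{
        "EndpointId": "arn:aws:elasticloadbalancing:us-east-1:...",  # placeholder ALB ARN
        "ClientIPPreservationEnabled": True,
    }],
)
```

Clients would then resolve the accelerator's static anycast IP addresses (or its DNS name) instead of the ALB directly, with no change to the API code running on EKS.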



A company runs an IoT application in the AWS Cloud. The company has millions of sensors that collect data from houses in the United States. The sensors use the MQTT protocol to connect and send data to a custom MQTT broker. The MQTT broker stores the data on a single Amazon EC2 instance. The sensors connect to the broker through the domain named iot.example.com. The company uses Amazon Route 53 as its DNS service. The company stores the data in Amazon DynamoDB.

On several occasions, the amount of data has overloaded the MQTT broker and has resulted in lost sensor data. The company must improve the reliability of the solution.

Which solution will meet these requirements?

  1. Create an Application Load Balancer (ALB) and an Auto Scaling group for the MQTT broker. Use the Auto Scaling group as the target for the ALB. Update the DNS record in Route 53 to an alias record. Point the alias record to the ALB. Use the MQTT broker to store the data.
  2. Set up AWS IoT Core to receive the sensor data. Create and configure a custom domain to connect to AWS IoT Core. Update the DNS record in Route 53 to point to the AWS IoT Core Data-ATS endpoint. Configure an AWS IoT rule to store the data.
  3. Create a Network Load Balancer (NLB). Set the MQTT broker as the target. Create an AWS Global Accelerator accelerator. Set the NLB as the endpoint for the accelerator. Update the DNS record in Route 53 to a multivalue answer record. Set the Global Accelerator IP addresses as values. Use the MQTT broker to store the data.
  4. Set up AWS IoT Greengrass to receive the sensor data. Update the DNS record in Route 53 to point to the AWS IoT Greengrass endpoint. Configure an AWS IoT rule to invoke an AWS Lambda function to store the data.

Answer(s): B

Explanation:

B: AWS IoT Core is a managed MQTT broker that scales to millions of concurrent device connections, which removes the single EC2 instance as both a bottleneck and a single point of failure. A custom domain configuration lets the sensors keep connecting through iot.example.com, and updating the Route 53 record to the AWS IoT Core Data-ATS endpoint requires no changes on the devices. An AWS IoT rule can then write the incoming messages directly to DynamoDB, improving reliability and scalability with minimal operational overhead.
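A possible boto3 sketch of option B is shown below: a custom domain configuration for iot.example.com and a topic rule that writes incoming MQTT messages straight into DynamoDB. The certificate ARN, IAM role ARN, topic filter, and table name are placeholders, not details from the question.

```python
# Hedged sketch: replace the self-managed MQTT broker with AWS IoT Core.
import boto3

iot = boto3.client("iot")

# Serve MQTT on the company's existing domain name.
iot.create_domain_configuration(
    domainConfigurationName="retail-iot",
    domainName="iot.example.com",
    serverCertificateArns=["arn:aws:acm:us-east-1:...:certificate/..."],  # placeholder ACM cert
)

# Route every message on the sensor topic into DynamoDB, replacing the EC2 broker.
iot.create_topic_rule(
    ruleName="StoreSensorData",
    topicRulePayload={
        "sql": "SELECT * FROM 'sensors/#'",  # assumed topic filter
        "awsIotSqlVersion": "2016-03-23",
        "ruleDisabled": False,
        "actions": [{
            "dynamoDBv2": {
                "roleArn": "arn:aws:iam::...:role/iot-dynamodb-write",  # placeholder role
                "putItem": {"tableName": "SensorData"},  # assumed table name
            }
        }],
    },
)
```

The remaining step is a Route 53 alias or CNAME change so that iot.example.com resolves to the domain configuration's endpoint instead of the EC2 instance.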



A company has Linux-based Amazon EC2 instances. Users must access the instances by using SSH with EC2 SSH key pairs. Each machine requires a unique EC2 key pair.

The company wants to implement a key rotation policy that will, upon request, automatically rotate all the EC2 key pairs and keep the keys in a securely encrypted place. The company will accept less than 1 minute of downtime during key rotation.

Which solution will meet these requirements?

  1. Store all the keys in AWS Secrets Manager. Define a Secrets Manager rotation schedule to invoke an AWS Lambda function to generate new key pairs. Replace public keys on EC2 instances. Update the private keys in Secrets Manager.
  2. Store all the keys in Parameter Store, a capability of AWS Systems Manager, as a string. Define a Systems Manager maintenance window to invoke an AWS Lambda function to generate new key pairs. Replace public keys on EC2 instances. Update the private keys in Parameter Store.
  3. Import the EC2 key pairs into AWS Key Management Service (AWS KMS). Configure automatic key rotation for these key pairs. Create an Amazon EventBridge scheduled rule to invoke an AWS Lambda function to initiate the key rotation in AWS KMS.
  4. Add all the EC2 instances to Fleet Manager, a capability of AWS Systems Manager. Define a Systems Manager maintenance window to issue a Systems Manager Run Command document to generate new key pairs and to rotate public keys to all the instances in Fleet Manager.

Answer(s): A

Explanation:

A: Storing the private keys in AWS Secrets Manager keeps them encrypted with AWS KMS and under fine-grained access control, and Secrets Manager rotation can run on a schedule or on demand. The rotation invokes an AWS Lambda function that generates a new key pair for each instance, replaces the public key on the instance, and updates the private key in Secrets Manager, so rotation completes with well under a minute of disruption. Parameter Store has no built-in rotation mechanism, and AWS KMS does not manage or rotate EC2 SSH key pairs, so option A is the only choice that meets all the requirements.
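As a simplified sketch of what such a rotation Lambda function might do (a production rotation function would follow Secrets Manager's multi-step rotation protocol and iterate over every instance), the code below assumes the instances are managed by AWS Systems Manager and uses placeholder user and path names.

```python
# Hedged sketch: rotate one instance's SSH key pair and store the new
# private key in Secrets Manager. Paths, user names, and secret IDs are
# illustrative assumptions.
import boto3
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

secrets = boto3.client("secretsmanager")
ssm = boto3.client("ssm")

def rotate_key_pair(secret_id: str, instance_id: str) -> None:
    # 1. Generate a new SSH key pair in memory.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=4096)
    public_openssh = private_key.public_key().public_bytes(
        serialization.Encoding.OpenSSH, serialization.PublicFormat.OpenSSH
    ).decode()
    private_pem = private_key.private_bytes(
        serialization.Encoding.PEM,
        serialization.PrivateFormat.TraditionalOpenSSL,
        serialization.NoEncryption(),
    ).decode()

    # 2. Replace the public key on the instance via SSM Run Command
    #    (assumes the SSM agent and an instance profile are in place).
    ssm.send_command(
        InstanceIds=[instance_id],
        DocumentName="AWS-RunShellScript",
        Parameters={"commands": [
            f"echo '{public_openssh}' > /home/ec2-user/.ssh/authorized_keys"
        ]},
    )

    # 3. Store the new private key as the current version of the secret.
    secrets.put_secret_value(SecretId=secret_id, SecretString=private_pem)
```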






Post your Comments and Discuss Amazon SAP-C01 exam with other Community members:

Mike commented on October 08, 2024
Not bad at all
CANADA

Petro UA commented on October 01, 2024
I hate DNS questions, so I need to practice more.
UNITED STATES

Gilbert commented on September 14, 2024
Can't wait to pass mine
Anonymous

Paresh commented on April 19, 2023
There were only 3 new questions in my exam that I did not see in this dump. The rest of the questions were all word for word from this dump.
UNITED STATES

Matthew commented on October 18, 2022
An extremely helpful study package. I highly recommend.
UNITED STATES

Peter commented on June 23, 2022
I thought these were practice exam questions but they turned out to be real questions from the actual exam.
NETHERLANDS

Henry commented on September 29, 2021
I do not have the words to thank you guys. Passing this exam was creating many scary thoughts. I am glad I used your braindumps and passed. I can get a beer and relax now.
AUSTRALIA

Nik commented on April 12, 2021
I would not be able to pass my exam without your help. You guys rock!
SINGAPORE

Rohit commented on January 09, 2021
Thank you for the 50% sale. I really appreciate this price cut during this extraordinary time when everyone is having financial problems.
INDIA

Roger-That commented on December 23, 2020
The 20% holiday discount is a sweet deal. Thank you for the discount code.
UNITED STATES

Duke commented on October 23, 2020
It is helpful. Questions are real. Purchase is easy, but the only problem is that there is no option to pay in Euro. Only USD.
GERMANY

Tan Jin commented on September 09, 2020
The questions from this exam dump are valid. I got 88% in my exam today.
SINGAPORE

Dave commented on November 05, 2019
Useful practice questions to get a feel of the actual exam. Some of the answers are not correct so please exercise caution.
EUROPEAN UNION

Je commented on October 02, 2018
Great
UNITED STATES

Invisible Angel commented on January 11, 2018
Have yet to try it, but most recommend it.
NEW ZEALAND

Mic commented on December 26, 2017
Nice dumps, site is secure and checkout process is a breeze.
UNITED STATES