Free SAP-C01 Exam Braindumps (page: 17)


A company that uses AWS Organizations allows developers to experiment on AWS. As part of the landing zone that the company has deployed, developers use their company email address to request an account. The company wants to ensure that developers are not launching costly services or running services unnecessarily. The company must give developers a fixed monthly budget to limit their AWS costs.

Which combination of steps will meet these requirements? (Choose three.)

  A. Create an SCP to set a fixed monthly account usage limit. Apply the SCP to the developer accounts.
  B. Use AWS Budgets to create a fixed monthly budget for each developer’s account as part of the account creation process.
  C. Create an SCP to deny access to costly services and components. Apply the SCP to the developer accounts.
  D. Create an IAM policy to deny access to costly services and components. Apply the IAM policy to the developer accounts.
  E. Create an AWS Budgets alert action to terminate services when the budgeted amount is reached. Configure the action to terminate all services.
  F. Create an AWS Budgets alert action to send an Amazon Simple Notification Service (Amazon SNS) notification when the budgeted amount is reached. Invoke an AWS Lambda function to terminate all services.

Answer(s): B,C,F

Explanation:

B) Use AWS Budgets to create a fixed monthly budget for each developer’s account as part of the account creation process. AWS Budgets allows you to set a fixed monthly budget for each account, ensuring cost limits are in place from the start.
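
As an illustration, budget creation can be automated with a short boto3 call during account provisioning. This is a minimal sketch; the account ID, budget name, and dollar amount are placeholder values.

```python
import boto3

budgets = boto3.client("budgets")

# Create a fixed monthly cost budget for a newly provisioned developer
# account. The account ID and amount below are hypothetical.
budgets.create_budget(
    AccountId="111122223333",  # account that owns the budget
    Budget={
        "BudgetName": "developer-monthly-budget",
        "BudgetType": "COST",
        "TimeUnit": "MONTHLY",
        "BudgetLimit": {"Amount": "200", "Unit": "USD"},
    },
)
```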

C) Create an SCP to deny access to costly services and components. Apply the SCP to the developer accounts. An SCP (Service Control Policy) restricts access to expensive services, preventing developers from launching costly resources.
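
A sketch of how such an SCP could be created and attached from the organization's management account follows; the denied services, the instance-type condition, and the target account ID are illustrative choices, not a prescribed deny list.

```python
import json
import boto3

org = boto3.client("organizations")

# Illustrative SCP: block a few typically expensive services outright and
# restrict EC2 launches to small burstable instance types.
scp = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Deny", "Action": ["redshift:*", "sagemaker:*"], "Resource": "*"},
        {
            "Effect": "Deny",
            "Action": "ec2:RunInstances",
            "Resource": "arn:aws:ec2:*:*:instance/*",
            "Condition": {"StringNotLike": {"ec2:InstanceType": ["t2.*", "t3.*"]}},
        },
    ],
}

policy = org.create_policy(
    Name="DenyCostlyServices",
    Description="Prevent developer accounts from launching costly services",
    Type="SERVICE_CONTROL_POLICY",
    Content=json.dumps(scp),
)
org.attach_policy(
    PolicyId=policy["Policy"]["PolicySummary"]["Id"],
    TargetId="111122223333",  # placeholder developer account ID
)
```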

F) Create an AWS Budgets alert action to send an Amazon Simple Notification Service (Amazon SNS) notification when the budgeted amount is reached. Invoke an AWS Lambda function to terminate all services. This ensures that when the budget is exceeded, an SNS alert triggers a Lambda function to terminate running services, effectively enforcing the cost limit.
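
For the Lambda side of option F, a minimal handler sketch is shown below. It interprets "terminate all services" narrowly as stopping every running EC2 instance; a real implementation would also need to cover the other billable resource types the developers use.

```python
import boto3

ec2 = boto3.client("ec2")

def handler(event, context):
    # Invoked via the SNS topic that the AWS Budgets alert publishes to
    # when the fixed monthly budget is reached.
    instance_ids = []
    paginator = ec2.get_paginator("describe_instances")
    for page in paginator.paginate(
        Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
    ):
        for reservation in page["Reservations"]:
            instance_ids += [i["InstanceId"] for i in reservation["Instances"]]
    if instance_ids:
        ec2.stop_instances(InstanceIds=instance_ids)
    return {"stopped": instance_ids}
```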



A company has applications in an AWS account that is named Source. The account is in an organization in AWS Organizations. One of the applications uses AWS Lambda functions and stores inventory data in an Amazon Aurora database. The application deploys the Lambda functions by using a deployment package. The company has configured automated backups for Aurora.

The company wants to migrate the Lambda functions and the Aurora database to a new AWS account that is named Target. The application processes critical data, so the company must minimize downtime.

Which solution will meet these requirements?

  A. Download the Lambda function deployment package from the Source account. Use the deployment package and create new Lambda functions in the Target account. Share the automated Aurora DB cluster snapshot with the Target account.
  B. Download the Lambda function deployment package from the Source account. Use the deployment package and create new Lambda functions in the Target account. Share the Aurora DB cluster with the Target account by using AWS Resource Access Manager (AWS RAM). Grant the Target account permission to clone the Aurora DB cluster.
  C. Use AWS Resource Access Manager (AWS RAM) to share the Lambda functions and the Aurora DB cluster with the Target account. Grant the Target account permission to clone the Aurora DB cluster.
  D. Use AWS Resource Access Manager (AWS RAM) to share the Lambda functions with the Target account. Share the automated Aurora DB cluster snapshot with the Target account.

Answer(s): B

Explanation:

B) Download the Lambda function deployment package from the Source account. Use the deployment package and create new Lambda functions in the Target account. Share the Aurora DB cluster with the Target account by using AWS Resource Access Manager (AWS RAM). Grant the Target account permission to clone the Aurora DB cluster. This solution minimizes downtime by allowing the migration of the Lambda functions and the Aurora database to the Target account. Using AWS RAM to share the Aurora DB cluster allows the Target account to create a clone of the database, ensuring data continuity. Downloading and deploying the Lambda functions ensures that they are properly migrated to the new account.
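
The two halves of this flow can be sketched with boto3 as follows; the account IDs, cluster ARN, and identifiers are placeholders.

```python
import boto3

# In the Source account: share the Aurora cluster through AWS RAM.
ram = boto3.client("ram")
ram.create_resource_share(
    name="share-inventory-aurora",
    resourceArns=["arn:aws:rds:us-east-1:111111111111:cluster:inventory-db"],
    principals=["222222222222"],    # the Target account ID
    allowExternalPrincipals=False,  # assumes both accounts are in the same organization
)

# In the Target account: create a copy-on-write clone of the shared cluster.
rds = boto3.client("rds")
rds.restore_db_cluster_to_point_in_time(
    DBClusterIdentifier="inventory-db-clone",
    RestoreType="copy-on-write",
    SourceDBClusterIdentifier="arn:aws:rds:us-east-1:111111111111:cluster:inventory-db",
    UseLatestRestorableTime=True,
)
```

Because the clone is copy-on-write, it becomes available quickly and only diverging data consumes new storage, which is what keeps downtime low compared to a snapshot restore.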



A company runs a Python script on an Amazon EC2 instance to process data. The script runs every 10 minutes. The script ingests files from an Amazon S3 bucket and processes the files. On average, the script takes approximately 5 minutes to process each file. The script will not reprocess a file that it has already processed.

The company reviewed Amazon CloudWatch metrics and noticed that the EC2 instance is idle for approximately 40% of the time because of the file processing speed. The company wants to make the workload highly available and scalable. The company also wants to reduce long-term management overhead.

Which solution will meet these requirements MOST cost-effectively?

  A. Migrate the data processing script to an AWS Lambda function. Use an S3 event notification to invoke the Lambda function to process the objects when the company uploads the objects.
  B. Create an Amazon Simple Queue Service (Amazon SQS) queue. Configure Amazon S3 to send event notifications to the SQS queue. Create an EC2 Auto Scaling group with a minimum size of one instance. Update the data processing script to poll the SQS queue. Process the S3 objects that the SQS message identifies.
  C. Migrate the data processing script to a container image. Run the data processing container on an EC2 instance. Configure the container to poll the S3 bucket for new objects and to process the resulting objects.
  D. Migrate the data processing script to a container image that runs on Amazon Elastic Container Service (Amazon ECS) on AWS Fargate. Create an AWS Lambda function that calls the Fargate RunTask API operation when the container processes the file. Use an S3 event notification to invoke the Lambda function.

Answer(s): A

Explanation:

A) Migrate the data processing script to an AWS Lambda function. Use an S3 event notification to invoke the Lambda function to process the objects when the company uploads the objects. This solution makes the workload highly available and scalable with minimal management overhead. By using AWS Lambda and S3 event notifications, the company avoids idle EC2 time, paying only for the actual processing time, which reduces costs. Lambda automatically scales to handle incoming events, eliminating the need to manage EC2 instances or containers, making it the most cost-effective solution.
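
A skeleton of the migrated handler might look like the following; the processing logic is a stand-in for the company's existing Python script.

```python
import urllib.parse
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # One invocation per S3 event notification; each record identifies an
    # uploaded object, so there is no polling and no reprocessing bookkeeping.
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        process_file(body)

def process_file(data: bytes) -> None:
    # Placeholder for the existing processing logic.
    ...
```

At roughly 5 minutes per file, each invocation also fits comfortably within Lambda's 15-minute maximum timeout.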



A financial services company in North America plans to release a new online web application to its customers on AWS. The company will launch the application in the us-east-1 Region on Amazon EC2 instances. The application must be highly available and must dynamically scale to meet user traffic. The company also wants to implement a disaster recovery environment for the application in the us-west-1 Region by using active-passive failover.

Which solution will meet these requirements?

  A. Create a VPC in us-east-1 and a VPC in us-west-1. Configure VPC peering. In the us-east-1 VPC, create an Application Load Balancer (ALB) that extends across multiple Availability Zones in both VPCs. Create an Auto Scaling group that deploys the EC2 instances across the multiple Availability Zones in both VPCs. Place the Auto Scaling group behind the ALB.
  B. Create a VPC in us-east-1 and a VPC in us-west-1. In the us-east-1 VPC, create an Application Load Balancer (ALB) that extends across multiple Availability Zones in that VPC. Create an Auto Scaling group that deploys the EC2 instances across the multiple Availability Zones in the us-east-1 VPC. Place the Auto Scaling group behind the ALB. Set up the same configuration in the us-west-1 VPC. Create an Amazon Route 53 hosted zone. Create separate records for each ALB. Enable health checks to ensure high availability between Regions.
  C. Create a VPC in us-east-1 and a VPC in us-west-1. In the us-east-1 VPC, create an Application Load Balancer (ALB) that extends across multiple Availability Zones in that VPC. Create an Auto Scaling group that deploys the EC2 instances across the multiple Availability Zones in the us-east-1 VPC. Place the Auto Scaling group behind the ALB. Set up the same configuration in the us-west-1 VPC. Create an Amazon Route 53 hosted zone. Create separate records for each ALB. Enable health checks and configure a failover routing policy for each record.
  D. Create a VPC in us-east-1 and a VPC in us-west-1. Configure VPC peering. In the us-east-1 VPC, create an Application Load Balancer (ALB) that extends across multiple Availability Zones in both VPCs. Create an Auto Scaling group that deploys the EC2 instances across the multiple Availability Zones in both VPCs. Place the Auto Scaling group behind the ALB. Create an Amazon Route 53 hosted zone. Create a record for the ALB.

Answer(s): C

Explanation:

C) Create a VPC in us-east-1 and a VPC in us-west-1. In the us-east-1 VPC, create an Application Load Balancer (ALB) that extends across multiple Availability Zones in that VPC. Create an Auto Scaling group that deploys the EC2 instances across the multiple Availability Zones in the us-east-1 VPC. Place the Auto Scaling group behind the ALB. Set up the same configuration in the us-west-1 VPC. Create an Amazon Route 53 hosted zone. Create separate records for each ALB. Enable health checks and configure a failover routing policy for each record.

This solution ensures high availability by using an ALB and Auto Scaling group across multiple Availability Zones in us-east-1. It also provides an active-passive failover disaster recovery environment in us-west-1. By configuring Route 53 with failover routing and health checks, traffic will automatically switch to us-west-1 in case of an outage in us-east-1.
This setup meets the requirements for scalability, high availability, and disaster recovery.
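
The failover records could be created along these lines; the hosted zone ID, domain name, and ALB DNS names are placeholders, and the regional ELB hosted zone IDs in the comments should be verified against AWS documentation.

```python
import boto3

route53 = boto3.client("route53")

route53.change_resource_record_sets(
    HostedZoneId="Z123EXAMPLE",  # placeholder public hosted zone
    ChangeBatch={"Changes": [
        {
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": "app.example.com",
                "Type": "A",
                "SetIdentifier": "primary-us-east-1",
                "Failover": "PRIMARY",
                "AliasTarget": {
                    "HostedZoneId": "Z35SXDOTRQ7X7K",  # ELB zone, us-east-1 (verify)
                    "DNSName": "primary-alb-123.us-east-1.elb.amazonaws.com",
                    "EvaluateTargetHealth": True,  # fail over when targets are unhealthy
                },
            },
        },
        {
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": "app.example.com",
                "Type": "A",
                "SetIdentifier": "secondary-us-west-1",
                "Failover": "SECONDARY",
                "AliasTarget": {
                    "HostedZoneId": "Z368ELLRRE2KJ0",  # ELB zone, us-west-1 (verify)
                    "DNSName": "dr-alb-456.us-west-1.elb.amazonaws.com",
                    "EvaluateTargetHealth": True,
                },
            },
        },
    ]},
)
```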





