Free SAP-C01 Exam Braindumps (page: 49)

Page 49 of 134

An online gaming company needs to rehost its gaming platform on AWS. The company's gaming application requires high performance computing (HPC) processing and has a leaderboard that changes frequently. A compute optimized Ubuntu instance hosts a Node.js application for game display. Game state is tracked in an on-premises Redis instance.

The company needs a migration strategy that optimizes application performance.

Which solution will meet these requirements?

  1. Create an Auto Scaling group of m5.large Amazon EC2 Spot Instances behind an Application Load Balancer. Use an Amazon ElastiCache for Redis cluster to maintain the leaderboard.
  2. Create an Auto Scaling group of c5.large Amazon EC2 Spot Instances behind an Application Load Balancer. Use an Amazon OpenSearch Service cluster to maintain the leaderboard.
  3. Create an Auto Scaling group of c5.large Amazon EC2 On-Demand Instances behind an Application Load Balancer. Use an Amazon ElastiCache for Redis cluster to maintain the leaderboard.
  4. Create an Auto Scaling group of m5.large Amazon EC2 On-Demand Instances behind an Application Load Balancer. Use an Amazon DynamoDB table to maintain the leaderboard.

Answer(s): C

Explanation:

-C) Create an Auto Scaling group of c5.large Amazon EC2 On-Demand Instances behind an Application Load Balancer. Use an Amazon ElastiCache for Redis cluster to maintain the leaderboard: The c5.large instance type belongs to the compute optimized C5 family, which matches the application's HPC processing requirement. On-Demand Instances provide consistent capacity for the gaming platform, whereas Spot Instances can be interrupted. ElastiCache for Redis offers in-memory storage that is ideal for frequently updated data such as a gaming leaderboard, and it also lets the on-premises Redis game-state store be rehosted without code changes.
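The leaderboard pattern that makes Redis the right fit here is the sorted set (ZADD, ZINCRBY, ZREVRANGE). The sketch below models those semantics in plain Python rather than against a live ElastiCache endpoint, so the class and method bodies are illustrative stand-ins, not the redis-py API:

```python
# In-memory model of the Redis sorted-set commands a game leaderboard
# typically uses. Against ElastiCache for Redis you would issue the same
# commands through a client such as redis-py; this stand-in only
# illustrates the semantics.

class Leaderboard:
    def __init__(self):
        self._scores = {}  # member -> score

    def zadd(self, member, score):
        """Set a player's score (Redis ZADD)."""
        self._scores[member] = score

    def zincrby(self, member, delta):
        """Increment a player's score (Redis ZINCRBY)."""
        self._scores[member] = self._scores.get(member, 0) + delta
        return self._scores[member]

    def zrevrange(self, start, stop):
        """Top players, highest score first, stop inclusive
        (Redis ZREVRANGE ... WITHSCORES)."""
        ranked = sorted(self._scores.items(), key=lambda kv: -kv[1])
        return ranked[start:stop + 1]

board = Leaderboard()
board.zadd("alice", 1200)
board.zadd("bob", 900)
board.zincrby("bob", 400)      # bob -> 1300
top = board.zrevrange(0, 1)    # [("bob", 1300), ("alice", 1200)]
```

In Redis itself these updates are O(log N) per score change, which is why a frequently changing leaderboard is the canonical sorted-set use case.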



A solutions architect is designing an application to accept timesheet entries from employees on their mobile devices. Timesheets will be submitted weekly, with most of the submissions occurring on Friday. The data must be stored in a format that allows payroll administrators to run monthly reports. The infrastructure must be highly available and scale to match the rate of incoming data and reporting requests.

Which combination of steps meets these requirements while minimizing operational overhead? (Choose two.)

  1. Deploy the application to Amazon EC2 On-Demand Instances with load balancing across multiple Availability Zones. Use scheduled Amazon EC2 Auto Scaling to add capacity before the high volume of submissions on Fridays.
  2. Deploy the application in a container using Amazon Elastic Container Service (Amazon ECS) with load balancing across multiple Availability Zones. Use scheduled Service Auto Scaling to add capacity before the high volume of submissions on Fridays.
  3. Deploy the application front end to an Amazon S3 bucket served by Amazon CloudFront. Deploy the application backend using Amazon API Gateway with an AWS Lambda proxy integration.
  4. Store the timesheet submission data in Amazon Redshift. Use Amazon QuickSight to generate the reports using Amazon Redshift as the data source.
  5. Store the timesheet submission data in Amazon S3. Use Amazon Athena and Amazon QuickSight to generate the reports using Amazon S3 as the data source.

Answer(s): C,E

Explanation:

-C) Deploy the application front end to an Amazon S3 bucket served by Amazon CloudFront. Deploy the application backend using Amazon API Gateway with an AWS Lambda proxy integration: This solution provides a highly available, scalable, and serverless architecture with minimal operational overhead. S3 and CloudFront can handle large numbers of users without requiring infrastructure management, while API Gateway and Lambda automatically scale to match the demand.
-E) Store the timesheet submission data in Amazon S3. Use Amazon Athena and Amazon QuickSight to generate the reports using Amazon S3 as the data source: Storing data in S3 is cost-effective and highly available. Using Athena allows for querying data directly in S3 without the need for complex ETL processes, and QuickSight provides an easy way to visualize the data. This combination minimizes operational complexity and provides the ability to generate reports as needed.
These options ensure scalability, availability, and minimal operational overhead while supporting reporting and data storage needs.
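One detail that keeps the Athena reports fast and cheap is writing the timesheet objects to S3 under date-based partition prefixes, so a monthly report scans only one month of data. A sketch of such a key scheme (the prefix and naming convention are illustrative assumptions, not part of the question):

```python
from datetime import date

def timesheet_key(employee_id, submitted):
    """Build an S3 object key partitioned by year/month, so an Athena
    table partitioned on (year, month) can prune the scan to a single
    month per payroll report."""
    return (f"timesheets/year={submitted.year}/month={submitted.month:02d}/"
            f"{employee_id}-{submitted.isoformat()}.json")

key = timesheet_key("emp-1042", date(2023, 9, 29))
# -> timesheets/year=2023/month=09/emp-1042-2023-09-29.json
```

An Athena table defined over the `timesheets/` prefix with `year` and `month` as partition columns then lets the monthly QuickSight report filter on those columns instead of scanning the whole bucket.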



A company is storing sensitive data in an Amazon S3 bucket. The company must log all activities for objects in the S3 bucket and must keep the logs for 5 years. The company's security team also must receive an email notification every time there is an attempt to delete data in the S3 bucket.

Which combination of steps will meet these requirements MOST cost-effectively? (Choose three.)

  1. Configure AWS CloudTrail to log S3 data events.
  2. Configure S3 server access logging for the S3 bucket.
  3. Configure Amazon S3 to send object deletion events to Amazon Simple Email Service (Amazon SES).
  4. Configure Amazon S3 to send object deletion events to an Amazon EventBridge event bus that publishes to an Amazon Simple Notification Service (Amazon SNS) topic.
  5. Configure Amazon S3 to send the logs to Amazon Timestream with data storage tiering.
  6. Configure a new S3 bucket to store the logs with an S3 Lifecycle policy.

Answer(s): A,D,F

Explanation:

-A) Configure AWS CloudTrail to log S3 data events: CloudTrail can log S3 object-level API activity, including data access and deletion events. This helps meet the requirement of logging all activities for objects in the S3 bucket.
-D) Configure Amazon S3 to send object deletion events to an Amazon EventBridge event bus that publishes to an Amazon Simple Notification Service (Amazon SNS) topic: This setup allows for automatic email notifications to be sent whenever there is an attempt to delete an object in the S3 bucket. SNS sends the email, and EventBridge triggers the notification based on deletion events.
-F) Configure a new S3 bucket to store the logs with an S3 Lifecycle policy: Storing logs in S3 is cost-effective. Using an S3 Lifecycle policy ensures that logs are automatically archived or deleted after 5 years, meeting the retention requirement.
This solution meets the company's needs for logging, notifications, and cost-effectiveness with minimal operational complexity.
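The retention and notification pieces reduce to two small configuration documents. A sketch of both as the dictionaries you could pass to boto3's `put_bucket_lifecycle_configuration` and EventBridge's `put_rule` — the rule names, bucket name, and the 30-day Glacier transition are illustrative assumptions:

```python
# S3 Lifecycle rule for the log bucket: transition logs to Glacier for
# cheap long-term storage, then expire them after 5 years (5 * 365 days).
lifecycle_config = {
    "Rules": [
        {
            "ID": "retain-s3-activity-logs-5-years",
            "Status": "Enabled",
            "Filter": {"Prefix": ""},  # apply to every object in the log bucket
            "Transitions": [
                {"Days": 30, "StorageClass": "GLACIER"}
            ],
            "Expiration": {"Days": 5 * 365},
        }
    ]
}

# EventBridge event pattern for S3 object-deletion events; a rule with
# this pattern can target an SNS topic that emails the security team.
# The bucket name is a placeholder.
delete_event_pattern = {
    "source": ["aws.s3"],
    "detail-type": ["Object Deleted"],
    "detail": {"bucket": {"name": ["sensitive-data-bucket"]}},
}
```

The source bucket must have EventBridge notifications enabled for the `Object Deleted` events to be delivered to the default event bus.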



A company is building a hybrid environment that includes servers in an on-premises data center and in the AWS Cloud. The company has deployed Amazon EC2 instances in three VPCs. Each VPC is in a different AWS Region. The company has established an AWS Direct Connect connection to the data center from the Region that is closest to the data center.

The company needs the servers in the on-premises data center to have access to the EC2 instances in all three VPCs. The servers in the on-premises data center also must have access to AWS public services.

Which combination of steps will meet these requirements with the LEAST cost? (Choose two.)

  1. Create a Direct Connect gateway in the Region that is closest to the data center. Attach the Direct Connect connection to the Direct Connect gateway. Use the Direct Connect gateway to connect the VPCs in the other two Regions.
  2. Set up additional Direct Connect connections from the on-premises data center to the other two Regions.
  3. Create a private VIF. Establish an AWS Site-to-Site VPN connection over the private VIF to the VPCs in the other two Regions.
  4. Create a public VIF. Establish an AWS Site-to-Site VPN connection over the public VIF to the VPCs in the other two Regions.
  5. Use VPC peering to establish a connection between the VPCs across the Regions. Create a private VIF with the existing Direct Connect connection to connect to the peered VPCs.

Answer(s): A,D

Explanation:

-A) Create a Direct Connect gateway in the Region that is closest to the data center. Attach the Direct Connect connection to the Direct Connect gateway. Use the Direct Connect gateway to connect the VPCs in the other two Regions: The Direct Connect gateway allows for centralized and cost-effective connectivity between on-premises data centers and VPCs in multiple AWS Regions. This setup avoids the need for multiple Direct Connect connections, reducing costs.
-D) Create a public VIF. Establish an AWS Site-to-Site VPN connection over the public VIF to the VPCs in the other two Regions: A public VIF provides access to AWS public services, such as Amazon S3 and DynamoDB, over the existing Direct Connect connection. Because Site-to-Site VPN endpoints are public AWS endpoints, VPN tunnels to the VPCs in the other two Regions can also be carried over the same public VIF, providing secure communication without additional Direct Connect connections and minimizing costs.
This combination provides secure, scalable, and cost-effective access to EC2 instances in multiple Regions and AWS public services.
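Option A's moving parts reduce to a private VIF attached to a Direct Connect gateway, plus one gateway association per VPC (via each VPC's virtual private gateway), including the VPCs in the other two Regions. A sketch of the request payloads, shaped like the parameter dictionaries boto3's `directconnect` client accepts — all IDs here are placeholders:

```python
# Private VIF on the existing DX connection, attached to a Direct Connect
# gateway rather than to a single VPC, so one physical connection can
# reach VPCs in any Region.
private_vif_params = {
    "connectionId": "dxcon-EXAMPLE",
    "newPrivateVirtualInterface": {
        "virtualInterfaceName": "onprem-to-dxgw",
        "vlan": 101,
        "asn": 65000,
        "directConnectGatewayId": "dxgw-EXAMPLE",
    },
}

# One association per VPC: the DX gateway is associated with each VPC's
# virtual private gateway, including the two VPCs in other Regions.
associations = [
    {"directConnectGatewayId": "dxgw-EXAMPLE", "virtualGatewayId": vgw}
    for vgw in ("vgw-region-a", "vgw-region-b", "vgw-region-c")
]
```

The Direct Connect gateway carries only private VPC traffic; access to AWS public services (and to the Site-to-Site VPN endpoints in option D) travels over the public VIF.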





