Free SAP-C01 Exam Braindumps (page: 11)


A solutions architect needs to advise a company on how to migrate its on-premises data processing application to the AWS Cloud. Currently, users upload input files through a web portal. The web server then stores the uploaded files on a NAS and messages the processing server over a message queue. Each media file can take up to 1 hour to process. The company has determined that the number of media files awaiting processing is significantly higher during business hours, with the number of files rapidly declining after business hours.

What is the MOST cost-effective migration recommendation?

  A. Create a queue using Amazon SQS. Configure the existing web server to publish to the new queue. When there are messages in the queue, invoke an AWS Lambda function to pull requests from the queue and process the files. Store the processed files in an Amazon S3 bucket.
  B. Create a queue using Amazon MQ. Configure the existing web server to publish to the new queue. When there are messages in the queue, create a new Amazon EC2 instance to pull requests from the queue and process the files. Store the processed files in Amazon EFS. Shut down the EC2 instance after the task is complete.
  C. Create a queue using Amazon MQ. Configure the existing web server to publish to the new queue. When there are messages in the queue, invoke an AWS Lambda function to pull requests from the queue and process the files. Store the processed files in Amazon EFS.
  D. Create a queue using Amazon SQS. Configure the existing web server to publish to the new queue. Use Amazon EC2 instances in an EC2 Auto Scaling group to pull requests from the queue and process the files. Scale the EC2 instances based on the SQS queue length. Store the processed files in an Amazon S3 bucket.

Answer(s): D

Explanation:

D) Create a queue using Amazon SQS. Configure the existing web server to publish to the new queue. Use Amazon EC2 instances in an EC2 Auto Scaling group to pull requests from the queue and process the files. Scale the EC2 instances based on the SQS queue length. Store the processed files in an Amazon S3 bucket is the correct solution because it is the most cost-effective option.

Amazon SQS provides a reliable and scalable message queue for handling the file processing requests.
EC2 Auto Scaling ensures that the number of instances dynamically adjusts based on the load (queue length), saving costs during low-demand periods.
Storing the processed files in Amazon S3 is cost-efficient and scalable for long-term storage, compared to more expensive options like Amazon EFS.
This setup provides flexibility, cost savings, and the ability to handle varying loads throughout the day.
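
As a rough illustration of the queue-length-based scaling in option D, the sketch below (Python with boto3) creates a step-scaling policy on a hypothetical Auto Scaling group and a CloudWatch alarm on the SQS backlog. The resource names and the threshold of 10 messages are assumptions, not values from the question.

    import boto3

    autoscaling = boto3.client("autoscaling")
    cloudwatch = boto3.client("cloudwatch")

    # Step-scaling policy: add worker instances when the backlog grows.
    policy = autoscaling.put_scaling_policy(
        AutoScalingGroupName="media-workers-asg",   # hypothetical ASG name
        PolicyName="scale-out-on-queue-depth",
        PolicyType="StepScaling",
        AdjustmentType="ChangeInCapacity",
        StepAdjustments=[{"MetricIntervalLowerBound": 0.0, "ScalingAdjustment": 2}],
    )

    # Alarm on the number of visible messages in the hypothetical SQS queue.
    cloudwatch.put_metric_alarm(
        AlarmName="media-queue-backlog",
        Namespace="AWS/SQS",
        MetricName="ApproximateNumberOfMessagesVisible",
        Dimensions=[{"Name": "QueueName", "Value": "media-processing-queue"}],
        Statistic="Average",
        Period=300,
        EvaluationPeriods=1,
        Threshold=10,
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=[policy["PolicyARN"]],
    )

A matching scale-in alarm (or a target tracking policy on backlog per instance) would bring capacity back down after business hours, which is where the cost savings come from.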



A company is using Amazon OpenSearch Service to analyze data. The company loads data into an OpenSearch Service cluster with 10 data nodes from an Amazon S3 bucket that uses S3 Standard storage. The data resides in the cluster for 1 month for read-only analysis. After 1 month, the company deletes the index that contains the data from the cluster. For compliance purposes, the company must retain a copy of all input data.

The company is concerned about ongoing costs and asks a solutions architect to recommend a new solution. Which solution will meet these requirements MOST cost-effectively?

  A. Replace all the data nodes with UltraWarm nodes to handle the expected capacity. Transition the input data from S3 Standard to S3 Glacier Deep Archive when the company loads the data into the cluster.
  B. Reduce the number of data nodes in the cluster to 2. Add UltraWarm nodes to handle the expected capacity. Configure the indexes to transition to UltraWarm when OpenSearch Service ingests the data. Transition the input data to S3 Glacier Deep Archive after 1 month by using an S3 Lifecycle policy.
  C. Reduce the number of data nodes in the cluster to 2. Add UltraWarm nodes to handle the expected capacity. Configure the indexes to transition to UltraWarm when OpenSearch Service ingests the data. Add cold storage nodes to the cluster. Transition the indexes from UltraWarm to cold storage. Delete the input data from the S3 bucket after 1 month by using an S3 Lifecycle policy.
  D. Reduce the number of data nodes in the cluster to 2. Add instance-backed data nodes to handle the expected capacity. Transition the input data from S3 Standard to S3 Glacier Deep Archive when the company loads the data into the cluster.

Answer(s): B

Explanation:

B) Reduce the number of data nodes in the cluster to 2. Add UltraWarm nodes to handle the expected capacity. Configure the indexes to transition to UltraWarm when OpenSearch Service ingests the data. Transition the input data to S3 Glacier Deep Archive after 1 month by using an S3 Lifecycle policy is the correct answer.

This solution significantly reduces costs by:

- Reducing the number of data nodes (which are more expensive) to only 2.
- Using UltraWarm nodes for cost-effective storage of read-only data that is frequently queried during the first month.
- Using an S3 Lifecycle policy to transition the input data to S3 Glacier Deep Archive, the most cost-effective long-term storage class, once the data is no longer needed for analysis in OpenSearch.
This combination reduces costs for both the OpenSearch cluster and S3 storage, meeting the company's cost requirements.
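
For example, the S3 Lifecycle transition in option B could be configured with boto3 roughly as follows; the bucket name and prefix are hypothetical.

    import boto3

    s3 = boto3.client("s3")

    # Move input objects to Glacier Deep Archive 30 days after creation,
    # which is roughly when the index is deleted from the OpenSearch cluster.
    s3.put_bucket_lifecycle_configuration(
        Bucket="opensearch-input-data",              # hypothetical bucket name
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "archive-after-analysis",
                    "Filter": {"Prefix": "input/"},  # hypothetical prefix
                    "Status": "Enabled",
                    "Transitions": [{"Days": 30, "StorageClass": "DEEP_ARCHIVE"}],
                }
            ]
        },
    )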



A company has 10 accounts that are part of an organization in AWS Organizations. AWS Config is configured in each account. All accounts belong to either the Prod OU or the NonProd OU.

The company has set up an Amazon EventBridge rule in each AWS account to notify an Amazon Simple Notification Service (Amazon SNS) topic when an Amazon EC2 security group inbound rule is created with 0.0.0.0/0 as the source. The company’s security team is subscribed to the SNS topic.

For all accounts in the NonProd OU, the security team needs to remove the ability to create a security group inbound rule that includes 0.0.0.0/0 as the source.

Which solution will meet this requirement with the LEAST operational overhead?

  A. Modify the EventBridge rule to invoke an AWS Lambda function to remove the security group inbound rule and to publish to the SNS topic. Deploy the updated rule to the NonProd OU.
  B. Add the vpc-sg-open-only-to-authorized-ports AWS Config managed rule to the NonProd OU.
  C. Configure an SCP to allow the ec2:AuthorizeSecurityGroupIngress action when the value of the aws:SourceIp condition key is not 0.0.0.0/0. Apply the SCP to the NonProd OU.
  D. Configure an SCP to deny the ec2:AuthorizeSecurityGroupIngress action when the value of the aws:SourceIp condition key is 0.0.0.0/0. Apply the SCP to the NonProd OU.

Answer(s): C

Explanation:

C) Configure an SCP to allow the ec2:AuthorizeSecurityGroupIngress action when the value of the aws:SourceIp condition key is not 0.0.0.0/0. Apply the SCP to the NonProd OU is the correct solution.

Using a Service Control Policy (SCP) in AWS Organizations is the most effective way to prevent security group rules from allowing 0.0.0.0/0 (i.e., open to the internet) in the NonProd OU. This SCP can be applied across all accounts in the NonProd OU and will prevent any user from creating security group rules that include this source IP.

This approach minimizes operational overhead as the SCP is centrally managed and enforced across all accounts, ensuring that the restriction is consistently applied without needing to modify or deploy additional AWS Config rules or EventBridge rules.
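
As a sketch of the mechanics only, the following boto3 calls create an SCP and attach it to the NonProd OU. The policy statement simply mirrors the wording of the selected answer, and the OU ID is a placeholder, so treat it as illustrative rather than a tested policy.

    import json
    import boto3

    org = boto3.client("organizations")

    # Policy content mirroring the selected answer's wording (illustrative only).
    scp_document = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "RestrictOpenIngress",
                "Effect": "Allow",
                "Action": "ec2:AuthorizeSecurityGroupIngress",
                "Resource": "*",
                "Condition": {"NotIpAddress": {"aws:SourceIp": "0.0.0.0/0"}},
            }
        ],
    }

    policy = org.create_policy(
        Content=json.dumps(scp_document),
        Description="Restrict 0.0.0.0/0 inbound rules in NonProd",
        Name="restrict-open-ingress",
        Type="SERVICE_CONTROL_POLICY",
    )

    # Attach the SCP to the NonProd OU (placeholder OU ID).
    org.attach_policy(
        PolicyId=policy["Policy"]["PolicySummary"]["Id"],
        TargetId="ou-xxxx-nonprod",
    )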



A company hosts a Git repository in an on-premises data center. The company uses webhooks to invoke functionality that runs in the AWS Cloud. The company hosts the webhook logic on a set of Amazon EC2 instances in an Auto Scaling group that the company set as a target for an Application Load Balancer (ALB). The Git server calls the ALB for the configured webhooks. The company wants to move the solution to a serverless architecture.

Which solution will meet these requirements with the LEAST operational overhead?

  A. For each webhook, create and configure an AWS Lambda function URL. Update the Git servers to call the individual Lambda function URLs.
  B. Create an Amazon API Gateway HTTP API. Implement each webhook logic in a separate AWS Lambda function. Update the Git servers to call the API Gateway endpoint.
  C. Deploy the webhook logic to AWS App Runner. Create an ALB, and set App Runner as the target. Update the Git servers to call the ALB endpoint.
  D. Containerize the webhook logic. Create an Amazon Elastic Container Service (Amazon ECS) cluster, and run the webhook logic in AWS Fargate. Create an Amazon API Gateway REST API, and set Fargate as the target. Update the Git servers to call the API Gateway endpoint.

Answer(s): B

Explanation:

B) Create an Amazon API Gateway HTTP API. Implement each webhook logic in a separate AWS Lambda function. Update the Git servers to call the API Gateway endpoint is the correct solution because it offers a fully serverless architecture with minimal operational overhead. Using API Gateway allows the Git server to call a single endpoint, and each webhook can trigger a different AWS Lambda function, which processes the webhook logic. This solution is cost-effective, scales automatically, and removes the need to manage EC2 instances or an Auto Scaling group.
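
A minimal sketch of one such webhook handler (a Python Lambda function behind an HTTP API route) might look like the following; the payload field names are assumptions about what the Git server sends.

    import json

    def lambda_handler(event, context):
        # API Gateway HTTP APIs deliver the webhook body as a string (payload v2.0).
        body = json.loads(event.get("body") or "{}")
        repository = body.get("repository", "unknown")  # assumed field name
        ref = body.get("ref", "unknown")                # assumed field name

        # ... the existing webhook logic for this repository/branch would run here ...
        print(f"Webhook received for {repository} on {ref}")

        return {"statusCode": 200, "body": json.dumps({"status": "accepted"})}

Each webhook gets its own route and function, so the Git servers only need the single API Gateway endpoint URL.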






Post your comments and discuss the Amazon SAP-C01 exam with other community members:

Mike commented on October 08, 2024
Not bad at all
CANADA

Petro UA commented on October 01, 2024
Hate DNS questions, so I need to practice more.
UNITED STATES

Gilbert commented on September 14, 2024
Can't wait to pass mine
Anonymous

Paresh commented on April 19, 2023
There were only 3 new questions that I did not see in this exam dump. The rest of the questions were all word for word from this dump.
UNITED STATES

Matthew commented on October 18, 2022
An extremely helpful study package. I highly recommend.
UNITED STATES

Peter commented on June 23, 2022
I thought these were practice exam questions, but they turned out to be real questions from the actual exam.
NETHERLANDS

Henry commented on September 29, 2021
I do not have the words to thank you guys. Passing this exam was creating many scary thoughts. I am glad I used your braindumps and passed. I can get a beer and relax now.
AUSTRALIA

Nik commented on April 12, 2021
I would not be able to pass my exam without your help. You guys rock!
SINGAPORE

Rohit commented on January 09, 2021
Thank you for the 50% sale. I really appreciate this price cut during this extraordinary time when everyone is having financial problems.
INDIA

Roger-That commented on December 23, 2020
The 20% holiday discount is a sweet deal. Thank you for the discount code.
UNITED STATES

Duke commented on October 23, 2020
It is helpful. Questions are real. Purchase is easy, but the only problem is there is no option to pay in euros. Only USD.
GERMANY

Tan Jin commented on September 09, 2020
The questions from this exam dump are valid. I got 88% in my exam today.
SINGAPORE

Dave commented on November 05, 2019
Useful practice questions to get a feel of the actual exam. Some of the answers are not correct so please exercise caution.
EUROPEAN UNION

Je commented on October 02, 2018
Great
UNITED STATES

Invisible Angel commented on January 11, 2018
Have yet to try. But most recommend it
NEW ZEALAND

Mic commented on December 26, 2017
Nice dumps, site is secure and checkout process is a breeze.
UNITED STATES