Free SAP-C01 Exam Braindumps (page: 46)

A company’s public API runs as tasks on Amazon Elastic Container Service (Amazon ECS). The tasks run on AWS Fargate behind an Application Load Balancer (ALB) and are configured with Service Auto Scaling for the tasks based on CPU utilization. This service has been running well for several months.

Recently, API performance slowed down and made the application unusable. The company discovered that a significant number of SQL injection attacks had occurred against the API and that the API service had scaled out to its maximum task count.

A solutions architect needs to implement a solution that prevents SQL injection attacks from reaching the ECS API service. The solution must allow legitimate traffic through and must maximize operational efficiency.

Which solution meets these requirements?

  1. Create a new AWS WAF web ACL to monitor the HTTP requests and HTTPS requests that are forwarded to the ALB in front of the ECS tasks.
  2. Create a new AWS WAF Bot Control implementation. Add a rule in the AWS WAF Bot Control managed rule group to monitor traffic and allow only legitimate traffic to the ALB in front of the ECS tasks.
  3. Create a new AWS WAF web ACL. Add a new rule that blocks requests that match the SQL database rule group. Set the web ACL to allow all other traffic that does not match those rules. Attach the web ACL to the ALB in front of the ECS tasks.
  4. Create a new AWS WAF web ACL. Create a new empty IP set in AWS WAF. Add a new rule to the web ACL to block requests that originate from IP addresses in the new IP set. Create an AWS Lambda function that scrapes the API logs for IP addresses that send SQL injection attacks, and add those IP addresses to the IP set. Attach the web ACL to the ALB in front of the ECS tasks.

Answer(s): C

Explanation:

The selected solution is C because it blocks SQL injection attacks while maximizing operational efficiency.
1. SQL injection protection: Creating an AWS WAF web ACL with a rule that blocks requests matching the SQL database managed rule group directly targets malicious traffic and prevents it from reaching the API service.
2. Legitimate traffic allowed: The web ACL blocks only requests that match the SQL injection patterns and allows all other traffic, so valid users can continue to access the API without disruption.
3. Operational efficiency: AWS WAF is a managed service that requires minimal operational overhead. Once configured, it filters requests automatically, with no ongoing manual intervention, strengthening the API's security posture.
This approach provides a proactive defense against SQL injection while maintaining performance and availability for legitimate users.
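As a rough illustration of option C, the sketch below uses boto3 (Python) to create a regional web ACL containing AWS's managed SQL injection rule group (AWSManagedRulesSQLiRuleSet, the managed rule group behind the "SQL database" rules) and attach it to the ALB. The ACL name, Region, and ALB ARN are placeholders, not values from the question.

```python
import boto3

# Web ACLs that protect an ALB are regional; create the client in the ALB's Region.
wafv2 = boto3.client("wafv2", region_name="us-east-1")  # placeholder Region

# Create a web ACL that allows traffic by default and adds the AWS-managed
# SQL injection rule group, which blocks matching requests.
acl = wafv2.create_web_acl(
    Name="api-sqli-protection",       # hypothetical name
    Scope="REGIONAL",                 # REGIONAL for ALBs; CLOUDFRONT is for distributions
    DefaultAction={"Allow": {}},      # all non-matching (legitimate) traffic passes
    Rules=[
        {
            "Name": "aws-sqli-rule-group",
            "Priority": 0,
            "Statement": {
                "ManagedRuleGroupStatement": {
                    "VendorName": "AWS",
                    "Name": "AWSManagedRulesSQLiRuleSet",
                }
            },
            # "None" keeps the block actions defined inside the rule group.
            "OverrideAction": {"None": {}},
            "VisibilityConfig": {
                "SampledRequestsEnabled": True,
                "CloudWatchMetricsEnabled": True,
                "MetricName": "SqliRuleGroup",
            },
        }
    ],
    VisibilityConfig={
        "SampledRequestsEnabled": True,
        "CloudWatchMetricsEnabled": True,
        "MetricName": "ApiSqliWebAcl",
    },
)

# Attach the web ACL to the ALB in front of the ECS tasks (placeholder ARN).
wafv2.associate_web_acl(
    WebACLArn=acl["Summary"]["ARN"],
    ResourceArn="arn:aws:elasticloadbalancing:us-east-1:111122223333:"
                "loadbalancer/app/api-alb/0123456789abcdef",
)
```

A common rollout pattern is to set OverrideAction to {"Count": {}} first and watch the CloudWatch metrics to confirm the rule group does not match legitimate requests before switching to enforcement.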



An environmental company is deploying sensors in major cities throughout a country to measure air quality. The sensors connect to AWS IoT Core to ingest time-series data readings. The company stores the data in Amazon DynamoDB.

For business continuity, the company must have the ability to ingest and store data in two AWS Regions.

Which solution will meet these requirements?

  1. Create an Amazon Route 53 alias failover routing policy with values for AWS IoT Core data endpoints in both Regions. Migrate the data to Amazon Aurora global tables.
  2. Create a domain configuration for AWS IoT Core in each Region. Create an Amazon Route 53 latency-based routing policy. Use AWS IoT Core data endpoints in both Regions as values. Migrate the data to Amazon MemoryDB for Redis and configure cross-Region replication.
  3. Create a domain configuration for AWS IoT Core in each Region. Create an Amazon Route 53 health check that evaluates domain configuration health. Create a failover routing policy with values for the domain name from the AWS IoT Core domain configurations. Update the DynamoDB table to a global table.
  4. Create an Amazon Route 53 latency-based routing policy. Use AWS IoT Core data endpoints in both Regions as values. Configure DynamoDB streams and cross-Region data replication.

Answer(s): C

Explanation:

The selected solution is C because it meets the requirement to ingest and store data in two AWS Regions while ensuring business continuity.
1. Domain configuration for AWS IoT Core: Creating a domain configuration in each Region gives the sensors a stable endpoint in both Regions, so ingestion can continue wherever traffic is routed.
2. Amazon Route 53 health checks: Health checks continuously monitor the health of the IoT Core domain configurations. If one Region becomes unavailable, Route 53 detects the failure.
3. Failover routing policy: The failover routing policy automatically routes traffic to the secondary Region when the primary Region fails, so data ingestion continues without interruption.
4. Global tables in DynamoDB: Updating the DynamoDB table to a global table replicates the data across both Regions automatically, so the time-series readings are stored redundantly and durably in each Region.
This approach maximizes resilience and allows the company to keep ingesting and storing air quality data even during a Regional outage.
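A minimal sketch of the two scriptable pieces of option C, assuming boto3 and placeholder names: adding a replica Region to the existing DynamoDB table (global tables version 2019.11.21) and creating the PRIMARY/SECONDARY failover record pair in Route 53. The domain name, hosted zone ID, health check ID, and IoT endpoint values are hypothetical.

```python
import boto3

# --- DynamoDB: add a replica Region to the existing table -------------------
# With global tables (version 2019.11.21), adding a replica is an update_table
# call on the source table. Assumes DynamoDB Streams (NEW_AND_OLD_IMAGES) is
# already enabled, which global tables require.
dynamodb = boto3.client("dynamodb", region_name="us-east-1")
dynamodb.update_table(
    TableName="AirQualityReadings",   # placeholder table name
    ReplicaUpdates=[{"Create": {"RegionName": "eu-west-1"}}],
)

# --- Route 53: failover routing between the two IoT Core endpoints ----------
route53 = boto3.client("route53")

def failover_change(role, target, set_id, health_check_id=None):
    """Build one half of a PRIMARY/SECONDARY failover record pair."""
    record = {
        "Name": "iot.example.com.",   # custom domain from the IoT domain configurations
        "Type": "CNAME",
        "TTL": 60,
        "SetIdentifier": set_id,
        "Failover": role,             # "PRIMARY" or "SECONDARY"
        "ResourceRecords": [{"Value": target}],
    }
    if health_check_id:
        record["HealthCheckId"] = health_check_id  # the health check drives failover
    return {"Action": "UPSERT", "ResourceRecordSet": record}

route53.change_resource_record_sets(
    HostedZoneId="Z0EXAMPLE",         # placeholder hosted zone
    ChangeBatch={
        "Changes": [
            failover_change("PRIMARY", "abc123-ats.iot.us-east-1.amazonaws.com",
                            "iot-primary", health_check_id="hc-primary-placeholder"),
            failover_change("SECONDARY", "abc123-ats.iot.eu-west-1.amazonaws.com",
                            "iot-secondary"),
        ]
    },
)
```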



A company uses AWS Organizations for a multi-account setup in the AWS Cloud. The company's finance team has a data processing application that uses AWS Lambda and Amazon DynamoDB. The company's marketing team wants to access the data that is stored in the DynamoDB table.

The DynamoDB table contains confidential data. The marketing team can have access to only specific attributes of data in the DynamoDB table. The finance team and the marketing team have separate AWS accounts.

What should a solutions architect do to provide the marketing team with the appropriate access to the DynamoDB table?

  1. Create an SCP to grant the marketing team's AWS account access to the specific attributes of the DynamoDB table. Attach the SCP to the OU of the finance team.
  2. Create an IAM role in the finance team's account by using IAM policy conditions for specific DynamoDB attributes (fine-grained access control). Establish trust with the marketing team's account. In the marketing team's account, create an IAM role that has permissions to assume the IAM role in the finance team's account.
  3. Create a resource-based IAM policy that includes conditions for specific DynamoDB attributes (fine-grained access control). Attach the policy to the DynamoDB table. In the marketing team's account, create an IAM role that has permissions to access the DynamoDB table in the finance team's account.
  4. Create an IAM role in the finance team's account to access the DynamoDB table. Use an IAM permissions boundary to limit the access to the specific attributes. In the marketing team's account, create an IAM role that has permissions to assume the IAM role in the finance team's account.

Answer(s): B

Explanation:

The selected solution is B because it gives the marketing team access to specific attributes of the DynamoDB table while maintaining fine-grained access control. A solutions architect creates an IAM role in the finance team's account whose policy conditions restrict access to the permitted DynamoDB attributes, and establishes trust with the marketing team's account. A role in the marketing team's account can then assume the finance-account role to read only the data the team needs. This ensures the marketing team sees only the specific attributes it is permitted to see, without exposing the rest of the table's contents, which satisfies the data confidentiality requirement.
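To make the mechanics concrete, here is a hedged sketch of option B in the finance team's account, using boto3. The account IDs, table ARN, role name, and attribute names are hypothetical; the dynamodb:Attributes condition key is AWS's documented mechanism for fine-grained (attribute-level) access control.

```python
import json
import boto3

iam = boto3.client("iam")  # run in the finance team's account

MARKETING_ACCOUNT_ID = "222233334444"  # placeholder account ID

# Trust policy: let principals in the marketing account assume this role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": f"arn:aws:iam::{MARKETING_ACCOUNT_ID}:root"},
        "Action": "sts:AssumeRole",
    }],
}

# Permissions policy: fine-grained access control via the dynamodb:Attributes
# condition key. Only the listed attributes may be read, and reads must use
# SPECIFIC_ATTRIBUTES projections so no other attributes leak.
permissions_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["dynamodb:GetItem", "dynamodb:Query", "dynamodb:BatchGetItem"],
        "Resource": "arn:aws:dynamodb:us-east-1:111122223333:table/FinanceData",  # placeholder
        "Condition": {
            "ForAllValues:StringEquals": {
                "dynamodb:Attributes": ["CustomerId", "Region", "CampaignSpend"]  # hypothetical attributes
            },
            "StringEqualsIfExists": {"dynamodb:Select": "SPECIFIC_ATTRIBUTES"},
        },
    }],
}

iam.create_role(
    RoleName="marketing-dynamodb-readonly",  # hypothetical role name
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)
iam.put_role_policy(
    RoleName="marketing-dynamodb-readonly",
    PolicyName="fine-grained-attributes",
    PolicyDocument=json.dumps(permissions_policy),
)
```

The marketing account then creates its own role (or user policy) permitting sts:AssumeRole on the finance-account role's ARN, completing the cross-account handshake.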



A solutions architect is creating an application that stores objects in an Amazon S3 bucket. The solutions architect must deploy the application in two AWS Regions that will be used simultaneously. The objects in the two S3 buckets must remain synchronized with each other.

Which combination of steps will meet these requirements with the LEAST operational overhead? (Choose three.)

  1. Create an S3 Multi-Region Access Point. Change the application to refer to the Multi-Region Access Point.
  2. Configure two-way S3 Cross-Region Replication (CRR) between the two S3 buckets
  3. Modify the application to store objects in each S3 bucket
  4. Create an S3 Lifecycle rule for each S3 bucket to copy objects from one S3 bucket to the other S3 bucket
  5. Enable S3 Versioning for each S3 bucket
  6. Configure an event notification for each S3 bucket to invoke an AWS Lambda function to copy objects from one S3 bucket to the other S3 bucket

Answer(s): A,B,E

Explanation:

The selected solutions are A, B, and E because they keep the objects in the two S3 buckets synchronized while minimizing operational overhead:
- A: Creating an S3 Multi-Region Access Point gives the application a single global endpoint over both buckets, so it can access objects in either Region without per-Region configuration.
- B: Configuring two-way S3 Cross-Region Replication (CRR) automatically replicates changes made in either bucket to the other, keeping the two Regions synchronized.
- E: Enabling S3 Versioning on each bucket is a prerequisite for CRR and preserves all versions of an object, allowing recovery from accidental deletions or overwrites and maintaining data integrity across Regions.
Together, these steps synchronize objects in S3 buckets across two AWS Regions with minimal ongoing management.
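A sketch of the three chosen steps with boto3, under assumed bucket names, account ID, and replication role: enable versioning on both buckets (a CRR prerequisite), configure a replication rule in each direction, and create the Multi-Region Access Point.

```python
import uuid
import boto3

ACCOUNT_ID = "111122223333"                                   # placeholder
REPLICATION_ROLE = f"arn:aws:iam::{ACCOUNT_ID}:role/s3-crr"   # placeholder role
REGION_BUCKETS = {
    "us-east-1": "app-objects-use1",                          # placeholder buckets
    "eu-west-1": "app-objects-euw1",
}

# Step E: versioning is required before replication can be configured.
for region, bucket in REGION_BUCKETS.items():
    s3 = boto3.client("s3", region_name=region)
    s3.put_bucket_versioning(
        Bucket=bucket,
        VersioningConfiguration={"Status": "Enabled"},
    )

# Step B: two-way CRR means each bucket carries a rule pointing at the other.
buckets = list(REGION_BUCKETS.items())
for (src_region, src_bucket), (_, dst_bucket) in [
    (buckets[0], buckets[1]),
    (buckets[1], buckets[0]),
]:
    s3 = boto3.client("s3", region_name=src_region)
    s3.put_bucket_replication(
        Bucket=src_bucket,
        ReplicationConfiguration={
            "Role": REPLICATION_ROLE,
            "Rules": [{
                "ID": f"replicate-to-{dst_bucket}",
                "Status": "Enabled",
                "Priority": 1,
                "Filter": {},                                  # replicate all objects
                "DeleteMarkerReplication": {"Status": "Disabled"},
                "Destination": {"Bucket": f"arn:aws:s3:::{dst_bucket}"},
            }],
        },
    )

# Step A: a Multi-Region Access Point gives the application one global endpoint.
# MRAP control-plane calls are served through the us-west-2 endpoint.
s3control = boto3.client("s3control", region_name="us-west-2")
s3control.create_multi_region_access_point(
    AccountId=ACCOUNT_ID,
    ClientToken=str(uuid.uuid4()),
    Details={
        "Name": "app-objects-mrap",                            # hypothetical name
        "Regions": [{"Bucket": b} for b in REGION_BUCKETS.values()],
    },
)
```

The application then targets the Multi-Region Access Point alias rather than either bucket directly, which is what makes option A the low-overhead choice over options C, D, and F.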


