Free AWS-SOLUTIONS-ARCHITECT-PROFESSIONAL Exam Braindumps (page: 46)


A company’s public API runs as tasks on Amazon Elastic Container Service (Amazon ECS). The tasks run on AWS Fargate behind an Application Load Balancer (ALB) and are configured with Service Auto Scaling for the tasks based on CPU utilization. This service has been running well for several months.

Recently, API performance slowed down and made the application unusable. The company discovered that a significant number of SQL injection attacks had occurred against the API and that the API service had scaled out to its maximum task count.

A solutions architect needs to implement a solution that prevents SQL injection attacks from reaching the ECS API service. The solution must allow legitimate traffic through and must maximize operational efficiency.

Which solution meets these requirements?

  1. Create a new AWS WAF web ACL to monitor the HTTP requests and HTTPS requests that are forwarded to the ALB in front of the ECS tasks.
  2. Create a new AWS WAF Bot Control implementation. Add a rule in the AWS WAF Bot Control managed rule group to monitor traffic and allow only legitimate traffic to the ALB in front of the ECS tasks.
  3. Create a new AWS WAF web ACL. Add a new rule that blocks requests that match the SQL database rule group. Set the web ACL to allow all other traffic that does not match those rules. Attach the web ACL to the ALB in front of the ECS tasks.
  4. Create a new AWS WAF web ACL. Create a new empty IP set in AWS WAF. Add a new rule to the web ACL to block requests that originate from IP addresses in the new IP set. Create an AWS Lambda function that scrapes the API logs for IP addresses that send SQL injection attacks, and add those IP addresses to the IP set. Attach the web ACL to the ALB in front of the ECS tasks.

Answer(s): C

Explanation:

The selected solution is C because it effectively addresses the SQL injection attacks while maximizing operational efficiency.
1. SQL Injection Protection: Creating an AWS WAF web ACL with a rule that blocks requests matching the SQL database managed rule group directly targets malicious traffic and prevents it from reaching the API service.
2. Allow Legitimate Traffic: The web ACL blocks only requests that match SQL injection patterns, while allowing all other traffic to pass through. This ensures that valid users can still access the API without disruption.
3. Operational Efficiency: AWS WAF is a managed service that requires minimal operational overhead. Once configured, it filters requests automatically, without ongoing management or manual intervention.
This approach provides a proactive security measure against SQL injection while maintaining performance and availability for legitimate users.
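
For reference, here is a minimal boto3 sketch of this setup; the web ACL name, metric names, Region, and ALB ARN are illustrative placeholders, not values from the scenario:

    # Create a regional web ACL whose only rule is the AWS managed SQLi rule
    # group; everything the rule group does not block is allowed by default.
    import boto3

    wafv2 = boto3.client("wafv2", region_name="us-east-1")  # must match the ALB's Region

    acl = wafv2.create_web_acl(
        Name="api-sqli-protection",
        Scope="REGIONAL",                    # REGIONAL scope is required for ALBs
        DefaultAction={"Allow": {}},         # allow all traffic that no rule blocks
        Rules=[
            {
                "Name": "block-sqli",
                "Priority": 0,
                "Statement": {
                    "ManagedRuleGroupStatement": {
                        "VendorName": "AWS",
                        "Name": "AWSManagedRulesSQLiRuleSet",
                    }
                },
                "OverrideAction": {"None": {}},  # keep the rule group's block actions
                "VisibilityConfig": {
                    "SampledRequestsEnabled": True,
                    "CloudWatchMetricsEnabled": True,
                    "MetricName": "block-sqli",
                },
            }
        ],
        VisibilityConfig={
            "SampledRequestsEnabled": True,
            "CloudWatchMetricsEnabled": True,
            "MetricName": "api-sqli-protection",
        },
    )

    # Attach the web ACL to the ALB that fronts the ECS service.
    wafv2.associate_web_acl(
        WebACLArn=acl["Summary"]["ARN"],
        ResourceArn=(
            "arn:aws:elasticloadbalancing:us-east-1:111122223333:"
            "loadbalancer/app/api-alb/50dc6c495c0c9188"
        ),
    )

The "SQL database rule group" named in option C is the console name for the AWSManagedRulesSQLiRuleSet managed rule group used above.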



An environmental company is deploying sensors in major cities throughout a country to measure air quality. The sensors connect to AWS IoT Core to ingest time-series data readings. The company stores the data in Amazon DynamoDB.

For business continuity, the company must have the ability to ingest and store data in two AWS Regions.

Which solution will meet these requirements?

  1. Create an Amazon Route 53 alias failover routing policy with values for AWS IoT Core data endpoints in both Regions. Migrate the data to Amazon Aurora global tables.
  2. Create a domain configuration for AWS IoT Core in each Region. Create an Amazon Route 53 latency-based routing policy. Use AWS IoT Core data endpoints in both Regions as values. Migrate the data to Amazon MemoryDB for Redis and configure cross-Region replication.
  3. Create a domain configuration for AWS IoT Core in each Region. Create an Amazon Route 53 health check that evaluates domain configuration health. Create a failover routing policy with values for the domain name from the AWS IoT Core domain configurations. Update the DynamoDB table to a global table.
  4. Create an Amazon Route 53 latency-based routing policy. Use AWS IoT Core data endpoints in both Regions as values. Configure DynamoDB streams and cross-Region data replication.

Answer(s): C

Explanation:

The selected solution is C because it meets the requirement to ingest and store data in two AWS Regions while ensuring business continuity.
1. Domain Configuration for AWS IoT Core: Creating a domain configuration in each Region gives the sensors a stable, Region-specific endpoint to connect to in both Regions.
2. Amazon Route 53 Health Checks: Health checks continuously monitor the health of the domain configurations, so Route 53 can detect when one Region becomes unavailable.
3. Failover Routing Policy: The failover routing policy ensures that if the primary Region fails, traffic is automatically routed to the other Region, maintaining data ingestion without interruption.
4. Global Tables in DynamoDB: Updating the DynamoDB table to a global table provides automatic cross-Region replication, so the time-series sensor data is consistently stored in both Regions for redundancy and durability.
This approach maximizes resilience and allows the company to continue to ingest and store air quality data even during a Regional outage.
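
A minimal boto3 sketch of these pieces follows; the hosted zone ID, domain names, table name, and Regions are illustrative placeholders, and the exact health check target depends on how the domain configurations are set up:

    import boto3

    route53 = boto3.client("route53")

    # Health check against the primary Region's IoT Core domain configuration.
    hc = route53.create_health_check(
        CallerReference="iot-primary-hc-1",
        HealthCheckConfig={
            "Type": "HTTPS",
            "FullyQualifiedDomainName": "iot-use1.example.com",
            "Port": 443,
            "RequestInterval": 30,
            "FailureThreshold": 3,
        },
    )

    # PRIMARY/SECONDARY failover records that point the single device-facing
    # name at each Region's domain configuration.
    route53.change_resource_record_sets(
        HostedZoneId="Z123EXAMPLE",
        ChangeBatch={
            "Changes": [
                {
                    "Action": "UPSERT",
                    "ResourceRecordSet": {
                        "Name": "iot.example.com",
                        "Type": "CNAME",
                        "TTL": 60,
                        "SetIdentifier": "primary",
                        "Failover": "PRIMARY",
                        "HealthCheckId": hc["HealthCheck"]["Id"],
                        "ResourceRecords": [{"Value": "iot-use1.example.com"}],
                    },
                },
                {
                    "Action": "UPSERT",
                    "ResourceRecordSet": {
                        "Name": "iot.example.com",
                        "Type": "CNAME",
                        "TTL": 60,
                        "SetIdentifier": "secondary",
                        "Failover": "SECONDARY",
                        "ResourceRecords": [{"Value": "iot-usw2.example.com"}],
                    },
                },
            ]
        },
    )

    # Add a replica Region so the existing table becomes a global table
    # (requires the 2019.11.21 global tables version).
    boto3.client("dynamodb", region_name="us-east-1").update_table(
        TableName="AirQualityReadings",
        ReplicaUpdates=[{"Create": {"RegionName": "us-west-2"}}],
    )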



A company uses AWS Organizations for a multi-account setup in the AWS Cloud. The company's finance team has a data processing application that uses AWS Lambda and Amazon DynamoDB. The company's marketing team wants to access the data that is stored in the DynamoDB table.

The DynamoDB table contains confidential data. The marketing team can have access to only specific attributes of data in the DynamoDB table. The finance team and the marketing team have separate AWS accounts.

What should a solutions architect do to provide the marketing team with the appropriate access to the DynamoDB table?

  1. Create an SCP to grant the marketing team's AWS account access to the specific attributes of the DynamoDB table. Attach the SCP to the OU of the finance team.
  2. Create an IAM role in the finance team's account by using IAM policy conditions for specific DynamoDB attributes (fine-grained access control). Establish trust with the marketing team's account. In the marketing team's account, create an IAM role that has permissions to assume the IAM role in the finance team's account.
  3. Create a resource-based IAM policy that includes conditions for specific DynamoDB attributes (fine-grained access control). Attach the policy to the DynamoDB table. In the marketing team's account, create an IAM role that has permissions to access the DynamoDB table in the finance team's account.
  4. Create an IAM role in the finance team's account to access the DynamoDB table. Use an IAM permissions boundary to limit the access to the specific attributes. In the marketing team's account, create an IAM role that has permissions to assume the IAM role in the finance team's account.

Answer(s): B

Explanation:

The selected solution is B because it gives the marketing team access to specific attributes of the DynamoDB table while maintaining fine-grained access control. A solutions architect creates an IAM role in the finance team's account with IAM policy conditions that restrict access to the permitted DynamoDB attributes, and establishes a trust relationship with the marketing team's account. The marketing team then assumes this role to read the data it needs, seeing only the permitted attributes rather than the entire table's contents, which satisfies the data confidentiality requirements. By contrast, SCPs and permissions boundaries only set permission guardrails; they do not grant access on their own.
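
The following boto3 sketch illustrates the pattern; the account IDs, role name, table name, and attribute names are hypothetical placeholders:

    import json
    import boto3

    iam = boto3.client("iam")  # run in the finance team's account

    # Trust policy: allow principals in the marketing account to assume the role.
    trust_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::222233334444:root"},  # marketing account
            "Action": "sts:AssumeRole",
        }],
    }

    iam.create_role(
        RoleName="marketing-dynamodb-read",
        AssumeRolePolicyDocument=json.dumps(trust_policy),
    )

    # Fine-grained access control: dynamodb:Attributes restricts which
    # attributes a request may touch, and dynamodb:Select forces callers to
    # request SPECIFIC_ATTRIBUTES instead of whole items.
    access_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["dynamodb:GetItem", "dynamodb:Query", "dynamodb:Scan"],
            "Resource": "arn:aws:dynamodb:us-east-1:111122223333:table/FinanceData",
            "Condition": {
                "ForAllValues:StringEquals": {
                    "dynamodb:Attributes": ["CustomerId", "InvoiceMonth"]
                },
                "StringEqualsIfExists": {"dynamodb:Select": "SPECIFIC_ATTRIBUTES"},
            },
        }],
    }

    iam.put_role_policy(
        RoleName="marketing-dynamodb-read",
        PolicyName="fgac-specific-attributes",
        PolicyDocument=json.dumps(access_policy),
    )

A principal in the marketing account then assumes marketing-dynamodb-read and queries the table with Select=SPECIFIC_ATTRIBUTES, receiving only the permitted attributes.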



A solutions architect is creating an application that stores objects in an Amazon S3 bucket. The solutions architect must deploy the application in two AWS Regions that will be used simultaneously. The objects in the two S3 buckets must remain synchronized with each other.

Which combination of steps will meet these requirements with the LEAST operational overhead? (Choose three.)

  1. Create an S3 Multi-Region Access Point. Change the application to refer to the Multi-Region Access Point
  2. Configure two-way S3 Cross-Region Replication (CRR) between the two S3 buckets
  3. Modify the application to store objects in each S3 bucket
  4. Create an S3 Lifecycle rule for each S3 bucket to copy objects from one S3 bucket to the other S3 bucket
  5. Enable S3 Versioning for each S3 bucket
  6. Configure an event notification for each S3 bucket to invoke an AWS Lambda function to copy objects from one S3 bucket to the other S3 bucket

Answer(s): A, B, E

Explanation:

The selected solutions are A, B, and E because they keep the objects in the two S3 buckets synchronized while minimizing operational overhead:
- A: An S3 Multi-Region Access Point gives the application a single global endpoint that routes requests to the S3 buckets across Regions, so the application does not need per-bucket configuration.
- B: Two-way S3 Cross-Region Replication (CRR) automatically replicates changes made in either bucket to the other, providing synchronization between the two Regions without custom copy logic.
- E: S3 Versioning must be enabled on both buckets because CRR requires versioning on the source and destination buckets; it also preserves all object versions, protecting against accidental deletions or overwrites.
Together, these steps synchronize objects across the two AWS Regions with minimal management effort, whereas Lifecycle rules, application-level double writes, or Lambda-based copying (the other options) would add custom code and operational burden.
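
A minimal boto3 sketch of steps A, B, and E; the bucket names, replication role ARN, and account ID are placeholders, and only one replication direction is shown (apply the mirror-image configuration to the other bucket for two-way CRR):

    import uuid
    import boto3

    s3 = boto3.client("s3")

    # Step E: CRR requires S3 Versioning on both the source and the
    # destination bucket, so enable it everywhere first.
    for bucket in ("app-objects-use1", "app-objects-usw2"):
        s3.put_bucket_versioning(
            Bucket=bucket, VersioningConfiguration={"Status": "Enabled"}
        )

    # Step B (one direction): replicate everything from the us-east-1 bucket
    # to the us-west-2 bucket.
    s3.put_bucket_replication(
        Bucket="app-objects-use1",
        ReplicationConfiguration={
            "Role": "arn:aws:iam::111122223333:role/s3-crr-role",
            "Rules": [{
                "ID": "use1-to-usw2",
                "Status": "Enabled",
                "Priority": 1,
                "Filter": {"Prefix": ""},   # empty prefix = all objects
                "DeleteMarkerReplication": {"Status": "Enabled"},
                "Destination": {"Bucket": "arn:aws:s3:::app-objects-usw2"},
            }],
        },
    )

    # Step A: the Multi-Region Access Point gives the application one endpoint
    # that routes to the closest bucket. MRAP control-plane calls are served
    # from the us-west-2 endpoint, and creation completes asynchronously.
    boto3.client("s3control", region_name="us-west-2").create_multi_region_access_point(
        AccountId="111122223333",
        ClientToken=str(uuid.uuid4()),
        Details={
            "Name": "app-objects",
            "Regions": [
                {"Bucket": "app-objects-use1"},
                {"Bucket": "app-objects-usw2"},
            ],
        },
    )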





