Free AWS-SOLUTIONS-ARCHITECT-PROFESSIONAL Exam Braindumps (page: 68)

Page 68 of 134

A company wants to refactor its retail ordering web application that currently has a load-balanced Amazon EC2 instance fleet for web hosting, database API services, and business logic. The company needs to create a decoupled, scalable architecture with a mechanism for retaining failed orders while also minimizing operational costs.

Which solution will meet these requirements?

  1. Use Amazon S3 for web hosting with Amazon API Gateway for database API services. Use Amazon Simple Queue Service (Amazon SQS) for order queuing. Use Amazon Elastic Container Service (Amazon ECS) for business logic with Amazon SQS long polling for retaining failed orders.
  2. Use AWS Elastic Beanstalk for web hosting with Amazon API Gateway for database API services. Use Amazon MQ for order queuing. Use AWS Step Functions for business logic with Amazon S3 Glacier Deep Archive for retaining failed orders.
  3. Use Amazon S3 for web hosting with AWS AppSync for database API services. Use Amazon Simple Queue Service (Amazon SQS) for order queuing. Use AWS Lambda for business logic with an Amazon SQS dead-letter queue for retaining failed orders.
  4. Use Amazon Lightsail for web hosting with AWS AppSync for database API services. Use Amazon Simple Email Service (Amazon SES) for order queuing. Use Amazon Elastic Kubernetes Service (Amazon EKS) for business logic with Amazon OpenSearch Service for retaining failed orders.

Answer(s): C

Explanation:

C) Hosting the web front end on Amazon S3 and exposing the database APIs through AWS AppSync yields a highly scalable, low-cost architecture. Amazon Simple Queue Service (SQS) decouples order submission from order processing and scales automatically, and AWS Lambda runs the business logic serverlessly, minimizing operational overhead. Attaching an SQS dead-letter queue retains orders whose processing fails after a configured number of receive attempts, so they can be inspected or reprocessed later. Together these services refactor the application into the decoupled, scalable, cost-minimized design the question requires.
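As a minimal sketch of the failed-order retention piece: the source queue's RedrivePolicy attribute tells SQS where to move messages after repeated failed receives. The account ID and queue names below are hypothetical, used only for illustration.

```python
import json


def build_redrive_policy(dlq_arn: str, max_receives: int = 3) -> str:
    """RedrivePolicy attribute for an SQS source queue: after a message
    has been received max_receives times without being deleted, SQS
    moves it to the dead-letter queue for later inspection or replay."""
    return json.dumps({
        "deadLetterTargetArn": dlq_arn,
        "maxReceiveCount": max_receives,
    })


# Hypothetical ARN; in practice this would be the orders DLQ's real ARN,
# passed as the RedrivePolicy attribute when creating the source queue.
policy = build_redrive_policy("arn:aws:sqs:us-east-1:123456789012:orders-dlq")
print(policy)
```

The same JSON document would be supplied as the `RedrivePolicy` queue attribute when creating or updating the source order queue.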



A company hosts a web application on AWS in the us-east-1 Region. The application servers are distributed across three Availability Zones behind an Application Load Balancer. The database is hosted in a MySQL database on an Amazon EC2 instance. A solutions architect needs to design a cross-Region data recovery solution using AWS services with an RTO of less than 5 minutes and an RPO of less than 1 minute. The solutions architect is deploying application servers in us-west-2, and has configured Amazon Route 53 health checks and DNS failover to us-west-2.

Which additional step should the solutions architect take?

  1. Migrate the database to an Amazon RDS for MySQL instance with a cross-Region read replica in us-west-2.
  2. Migrate the database to an Amazon Aurora global database with the primary in us-east-1 and the secondary in us-west-2.
  3. Migrate the database to an Amazon RDS for MySQL instance with a Multi-AZ deployment.
  4. Create a MySQL standby database on an Amazon EC2 instance in us-west-2.

Answer(s): B

Explanation:

B) An Amazon Aurora global database replicates from the primary cluster in us-east-1 to the secondary cluster in us-west-2 at the storage layer, with typical replication lag under one second, comfortably meeting the RPO of less than 1 minute. If us-east-1 fails, the secondary cluster can be promoted to accept writes well within the 5-minute RTO. Combined with the Route 53 health checks and DNS failover already configured, this completes a cross-Region disaster recovery design that meets both targets.
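The RTO/RPO arithmetic behind the answer can be sketched as a simple check; the lag and failover figures below are illustrative assumptions, not measurements.

```python
def meets_dr_targets(replication_lag_s: float, failover_s: float,
                     rpo_s: float = 60.0, rto_s: float = 300.0) -> bool:
    """Check observed replication lag against the RPO target (maximum
    acceptable data-loss window) and failover duration against the RTO
    target (maximum acceptable downtime)."""
    return replication_lag_s < rpo_s and failover_s < rto_s


# Aurora global databases typically replicate with sub-second lag.
print(meets_dr_targets(replication_lag_s=0.9, failover_s=70))   # True
# A standby that lags by minutes would miss the 1-minute RPO.
print(meets_dr_targets(replication_lag_s=300, failover_s=70))   # False
```

This is why options that only replicate within a Region (Multi-AZ) or rely on slower, self-managed replication fall short of the stated targets.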



A company is using AWS Organizations to manage multiple accounts. Due to regulatory requirements, the company wants to restrict specific member accounts to certain AWS Regions where they are permitted to deploy resources. Resource tagging in those accounts must be enforced against a group standard and managed centrally with minimal configuration.

What should a solutions architect do to meet these requirements?

  1. Create an AWS Config rule in the specific member accounts to limit Regions and apply a tag policy.
  2. From the AWS Billing and Cost Management console, in the management account, disable Regions for the specific member accounts and apply a tag policy on the root.
  3. Associate the specific member accounts with the root. Apply a tag policy and an SCP using conditions to limit Regions.
  4. Associate the specific member accounts with a new OU. Apply a tag policy and an SCP using conditions to limit Regions.

Answer(s): D

Explanation:

D) Placing the specific member accounts in a new Organizational Unit (OU) and attaching both a tag policy and a Service Control Policy (SCP) to that OU is the best approach. The SCP uses a condition on the requested Region to deny actions outside the permitted Regions, while the tag policy enforces the group tagging standard. Because both policies attach at the OU level, any account moved into the OU inherits them automatically, providing centralized management and fulfilling the regulatory requirements with minimal configuration.
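A minimal sketch of the Region-limiting SCP is below. The two Regions are example values (the question names none), and the exemption list for global services follows the pattern AWS publishes for this kind of policy; a real policy would be tuned to the organization's needs.

```python
import json

# Example Regions only; substitute the Regions the regulator permits.
ALLOWED_REGIONS = ["eu-west-1", "eu-central-1"]

scp = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyOutsidePermittedRegions",
        "Effect": "Deny",
        # Global services (IAM, Organizations, STS) are exempted so the
        # deny does not break account administration from other Regions.
        "NotAction": ["iam:*", "organizations:*", "sts:*"],
        "Resource": "*",
        "Condition": {
            "StringNotEquals": {"aws:RequestedRegion": ALLOWED_REGIONS}
        },
    }],
}
print(json.dumps(scp, indent=2))
```

Attached to the new OU, this denies any request targeting a Region outside the allowed list; the companion tag policy (a separate policy type in Organizations) handles the tagging standard.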



A company has an application that generates reports and stores them in an Amazon S3 bucket. When a user accesses their report, the application generates a signed URL to allow the user to download the report. The company's security team has discovered that the files are public and that anyone can download them without authentication. The company has suspended the generation of new reports until the problem is resolved.

Which set of actions will immediately remediate the security issue without impacting the application's normal workflow?

  1. Create an AWS Lambda function that applies a deny all policy for users who are not authenticated. Create a scheduled event to invoke the Lambda function.
  2. Review the AWS Trusted Advisor bucket permissions check and implement the recommended actions.
  3. Run a script that puts a private ACL on all of the objects in the bucket.
  4. Use the Block Public Access feature in Amazon S3 to set the IgnorePublicAcls option to TRUE on the bucket.

Answer(s): D

Explanation:

D) Enabling the Block Public Access setting IgnorePublicAcls on the bucket causes Amazon S3 to immediately ignore any public ACLs on the bucket and its objects, removing unauthenticated access without modifying the objects themselves. The application's workflow is unaffected because presigned URLs are authorized by the signer's IAM credentials rather than by ACLs, so authenticated users can still download their reports while the public exposure is closed off.
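The remediation is a single configuration change. Sketched below is the PublicAccessBlockConfiguration payload that the S3 PutPublicAccessBlock operation accepts; the answer only requires IgnorePublicAcls, and the other three flags are shown (as assumptions about a fuller lockdown) for context.

```python
def public_access_block(ignore_public_acls: bool = True) -> dict:
    """PublicAccessBlockConfiguration payload for S3 PutPublicAccessBlock.
    IgnorePublicAcls=True makes S3 ignore existing public ACLs on the
    bucket and its objects; presigned URLs keep working because they
    authenticate with the signer's IAM credentials, not ACLs."""
    return {
        "BlockPublicAcls": True,          # reject attempts to set new public ACLs
        "IgnorePublicAcls": ignore_public_acls,  # neutralize ACLs already set
        "BlockPublicPolicy": True,        # reject public bucket policies
        "RestrictPublicBuckets": True,    # limit access to AWS principals only
    }


cfg = public_access_block()
print(cfg)
```

In practice this dictionary would be passed as the configuration argument of the PutPublicAccessBlock call against the report bucket.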


