Free AWS-SOLUTIONS-ARCHITECT-PROFESSIONAL Exam Braindumps (page: 35)


A company runs a content management application on a single Windows Amazon EC2 instance in a development environment. The application reads and writes static content to a 2 TB Amazon Elastic Block Store (Amazon EBS) volume that is attached to the instance as the root device. The company plans to deploy this application in production as a highly available and fault-tolerant solution that runs on at least three EC2 instances across multiple Availability Zones.

A solutions architect must design a solution that joins all the instances that run the application to an Active Directory domain. The solution also must implement Windows ACLs to control access to file contents. The application must always maintain exactly the same content on all running instances at any given point in time.

Which solution will meet these requirements with the LEAST management overhead?

  1. Create an Amazon Elastic File System (Amazon EFS) file share. Create an Auto Scaling group that extends across three Availability Zones and maintains a minimum size of three instances. Implement a user data script to install the application, join the instance to the AD domain, and mount the EFS file share.
  2. Create a new AMI from the current EC2 instance that is running. Create an Amazon FSx for Lustre file system. Create an Auto Scaling group that extends across three Availability Zones and maintains a minimum size of three instances. Implement a user data script to join the instance to the AD domain and mount the FSx for Lustre file system.
  3. Create an Amazon FSx for Windows File Server file system. Create an Auto Scaling group that extends across three Availability Zones and maintains a minimum size of three instances. Implement a user data script to install the application and mount the FSx for Windows File Server file system. Perform a seamless domain join to join the instance to the AD domain.
  4. Create a new AMI from the current EC2 instance that is running. Create an Amazon Elastic File System (Amazon EFS) file system. Create an Auto Scaling group that extends across three Availability Zones and maintains a minimum size of three instances. Perform a seamless domain join to join the instance to the AD domain.

Answer(s): C

Explanation:

To meet the requirements of deploying the content management application in a highly available and fault-tolerant manner while maintaining synchronization of content across multiple EC2 instances, the recommended solution is:
C) Create an Amazon FSx for Windows File Server file system. Create an Auto Scaling group that extends across three Availability Zones and maintains a minimum size of three instances. Implement a user data script to install the application and mount the FSx for Windows File Server file system. Perform a seamless domain join to join the instance to the AD domain.
1. Amazon FSx for Windows File Server: This fully managed Windows file system is designed to work with Windows applications. It supports the SMB protocol, Active Directory integration, and Windows ACLs, making it suitable for applications that require shared access to file content across multiple instances while maintaining access control.
2. Auto Scaling Group: By creating an Auto Scaling group that spans multiple Availability Zones, the solution ensures high availability and fault tolerance. This setup allows the application to scale automatically with demand while maintaining a minimum of three instances for redundancy.
3. User Data Script: The user data script automates installation of the application and mounting of the file share, while a seamless domain join adds each instance to the Active Directory domain without manual intervention, reducing operational overhead.
4. Consistent Content Across Instances: Because all instances read and write the same content on the shared FSx for Windows File Server file system, they maintain exactly the same content at all times.
Overall, this solution is efficient, minimizes management overhead, and meets the requirements for high availability, fault tolerance, and proper access control through Windows ACLs.
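
For illustration only, the boto3 sketch below shows one way this design could be provisioned: a Multi-AZ FSx for Windows File Server file system joined to an existing AWS Managed Microsoft AD, a launch template whose user data maps the SMB share, and an Auto Scaling group spanning three Availability Zones. The subnet IDs, directory ID, AMI ID, and share DNS name are hypothetical placeholders, and the PowerShell user data is a simplified stand-in for the application install and seamless domain join steps.

    import base64
    import boto3

    fsx = boto3.client("fsx")
    ec2 = boto3.client("ec2")
    autoscaling = boto3.client("autoscaling")

    # Multi-AZ FSx for Windows File Server, joined to an existing managed AD.
    fsx.create_file_system(
        FileSystemType="WINDOWS",
        StorageCapacity=2048,  # GiB, sized to hold the existing 2 TB of content
        SubnetIds=["subnet-aaaa1111", "subnet-bbbb2222"],  # placeholder subnets
        WindowsConfiguration={
            "ActiveDirectoryId": "d-1234567890",      # placeholder directory ID
            "DeploymentType": "MULTI_AZ_1",
            "PreferredSubnetId": "subnet-aaaa1111",
            "ThroughputCapacity": 32,
        },
    )

    # User data (PowerShell) installs the application and maps the SMB share.
    user_data = """<powershell>
    # Install the application here, then map the FSx for Windows File Server share.
    net use Z: \\\\amznfsx1234.corp.example.com\\share /persistent:yes
    </powershell>"""

    ec2.create_launch_template(
        LaunchTemplateName="cms-app",
        LaunchTemplateData={
            "ImageId": "ami-0123456789abcdef0",  # placeholder Windows AMI
            "InstanceType": "m5.large",
            "UserData": base64.b64encode(user_data.encode()).decode(),
        },
    )

    # Auto Scaling group spanning three AZs with a minimum of three instances.
    autoscaling.create_auto_scaling_group(
        AutoScalingGroupName="cms-app-asg",
        LaunchTemplate={"LaunchTemplateName": "cms-app", "Version": "$Latest"},
        MinSize=3,
        MaxSize=6,
        DesiredCapacity=3,
        VPCZoneIdentifier="subnet-aaaa1111,subnet-bbbb2222,subnet-cccc3333",
    )
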



A software as a service (SaaS) company provides a case management solution to customers. As part of the solution, the company uses a standalone Simple Mail Transfer Protocol (SMTP) server to send email messages from an application. The application also stores an email template for acknowledgement email messages; the template is populated with customer data before the application sends the email message to the customer.

The company plans to migrate this messaging functionality to the AWS Cloud and needs to minimize operational overhead.

Which solution will meet these requirements MOST cost-effectively?

  1. Set up an SMTP server on Amazon EC2 instances by using an AMI from the AWS Marketplace. Store the email template in an Amazon S3 bucket. Create an AWS Lambda function to retrieve the template from the S3 bucket and to merge the customer data from the application with the template. Use an SDK in the Lambda function to send the email message.
  2. Set up Amazon Simple Email Service (Amazon SES) to send email messages. Store the email template in an Amazon S3 bucket. Create an AWS Lambda function to retrieve the template from the S3 bucket and to merge the customer data from the application with the template. Use an SDK in the Lambda function to send the email message.
  3. Set up an SMTP server on Amazon EC2 instances by using an AMI from the AWS Marketplace. Store the email template in Amazon Simple Email Service (Amazon SES) with parameters for the customer data. Create an AWS Lambda function to call the SES template and to pass customer data to replace the parameters. Use the AWS Marketplace SMTP server to send the email message.
  4. Set up Amazon Simple Email Service (Amazon SES) to send email messages. Store the email template on Amazon SES with parameters for the customer data. Create an AWS Lambda function to call the SendTemplatedEmail API operation and to pass customer data to replace the parameters and the email destination.

Answer(s): D

Explanation:

The most cost-effective solution for migrating the email functionality of the SaaS company to the AWS Cloud while minimizing operational overhead is:
D) Set up Amazon Simple Email Service (Amazon SES) to send email messages. Store the email template on Amazon SES with parameters for the customer data. Create an AWS Lambda function to call the SendTemplatedEmail API operation and to pass customer data to replace the parameters and the email destination.
1. Amazon SES: By using Amazon SES, the company leverages a fully managed email service designed for sending emails at scale, which reduces the complexity of managing a standalone SMTP server. This service also provides features like deliverability, tracking, and compliance out of the box.
2. Email Templates: Storing email templates directly within Amazon SES allows for easy management and use of templates, including dynamic parameterization for customer data. This means less overhead in managing external templates and simplifies the integration process.
3. AWS Lambda Function: The use of a Lambda function to call the SendTemplatedEmail API operation allows for serverless execution of the email sending process. It dynamically populates the templates with customer data, thus ensuring personalized communication without needing a dedicated server.
4. Operational Overhead: This solution minimizes operational overhead since Amazon SES handles the complexities of email delivery, and the serverless architecture of AWS Lambda eliminates the need for ongoing server management.
Overall, this combination effectively meets the company's requirements for sending acknowledgment emails to customers while being cost-effective and easy to manage.
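
As a minimal sketch of this flow, the Lambda handler below calls SendTemplatedEmail with boto3. The template name, sender address, placeholders, and event fields are hypothetical; the sender must be a verified SES identity, and the template only needs to be created once at deployment time, not on every invocation.

    import json
    import boto3

    ses = boto3.client("ses")

    def create_acknowledgement_template():
        # One-time setup (run at deployment): store the template with placeholders.
        ses.create_template(
            Template={
                "TemplateName": "AcknowledgementTemplate",
                "SubjectPart": "Case {{case_id}} received",
                "TextPart": "Hello {{name}}, we have received your case {{case_id}}.",
            }
        )

    def lambda_handler(event, context):
        # SendTemplatedEmail merges the customer data from the event into the template.
        ses.send_templated_email(
            Source="support@example.com",  # verified SES identity (placeholder)
            Destination={"ToAddresses": [event["email"]]},
            Template="AcknowledgementTemplate",
            TemplateData=json.dumps({"name": event["name"], "case_id": event["case_id"]}),
        )
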



A company is processing videos in the AWS Cloud by using Amazon EC2 instances in an Auto Scaling group. It takes 30 minutes to process a video. Several EC2 instances scale in and out depending on the number of videos in an Amazon Simple Queue Service (Amazon SQS) queue.

The company has configured the SQS queue with a redrive policy that specifies a target dead-letter queue and a maxReceiveCount of 1. The company has set the visibility timeout for the SQS queue to 1 hour. The company has set up an Amazon CloudWatch alarm to notify the development team when there are messages in the dead-letter queue.

Several times during the day, the development team receives notifications that messages are in the dead-letter queue and that videos have not been processed properly. An investigation finds no errors in the application logs.

How can the company solve this problem?

  1. Turn on termination protection for the EC2 instances.
  2. Update the visibility timeout for the SQS queue to 3 hours.
  3. Configure scale-in protection for the instances during processing.
  4. Update the redrive policy and set maxReceiveCount to 0.

Answer(s): C

Explanation:

The best solution for addressing the issue of videos not being processed properly and ending up in the dead-letter queue is:
C) Configure scale-in protection for the instances during processing.
1. Understanding the Problem: The visibility timeout for the SQS queue is 1 hour, which is longer than the time it takes to process a video (30 minutes). However, if an EC2 instance that is processing a video is terminated by a scale-in action before it finishes, the message becomes visible again in the queue after the visibility timeout. Because maxReceiveCount is set to 1, the message is moved to the dead-letter queue as soon as it is received a second time, even though the application itself produced no errors.
2. Scale-In Protection: By configuring scale-in protection for the instances during processing, the Auto Scaling group will not terminate instances that are currently working on video processing tasks. This ensures that the instances can complete their workload without being prematurely terminated, preventing the messages from being sent to the dead-letter queue.
3. Operational Efficiency: This solution allows the application to process videos reliably without requiring changes to the visibility timeout or the maxReceiveCount, maintaining the original design intent of the SQS and Auto Scaling configurations.
Therefore, implementing scale-in protection directly addresses the issue while optimizing resource utilization and maintaining operational efficiency.
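
The sketch below illustrates one way a worker on an instance could apply that protection around each job. The queue URL, Auto Scaling group name, and process_video stub are hypothetical placeholders; the instance ID is read from the instance metadata service (IMDSv2).

    import urllib.request
    import boto3

    sqs = boto3.client("sqs")
    autoscaling = boto3.client("autoscaling")

    QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/video-jobs"  # placeholder
    ASG_NAME = "video-processing-asg"                                          # placeholder

    def process_video(body):
        # Placeholder for the 30-minute video processing job.
        pass

    def get_instance_id():
        # IMDSv2: request a session token, then read this instance's ID.
        token_req = urllib.request.Request(
            "http://169.254.169.254/latest/api/token",
            method="PUT",
            headers={"X-aws-ec2-metadata-token-ttl-seconds": "300"},
        )
        token = urllib.request.urlopen(token_req).read().decode()
        id_req = urllib.request.Request(
            "http://169.254.169.254/latest/meta-data/instance-id",
            headers={"X-aws-ec2-metadata-token": token},
        )
        return urllib.request.urlopen(id_req).read().decode()

    def set_scale_in_protection(protected):
        autoscaling.set_instance_protection(
            InstanceIds=[get_instance_id()],
            AutoScalingGroupName=ASG_NAME,
            ProtectedFromScaleIn=protected,
        )

    def worker_loop():
        while True:
            messages = sqs.receive_message(
                QueueUrl=QUEUE_URL, MaxNumberOfMessages=1, WaitTimeSeconds=20
            ).get("Messages", [])
            for msg in messages:
                set_scale_in_protection(True)   # keep this instance alive during the job
                try:
                    process_video(msg["Body"])
                    sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
                finally:
                    set_scale_in_protection(False)  # allow scale-in again once idle
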



A company has developed APIs that use Amazon API Gateway with Regional endpoints. The APIs call AWS Lambda functions that use API Gateway authentication mechanisms. After a design review, a solutions architect identifies a set of APIs that do not require public access.

The solutions architect must design a solution to make the set of APIs accessible only from a VPC. All APIs need to be called with an authenticated user.

Which solution will meet these requirements with the LEAST amount of effort?

  1. Create an internal Application Load Balancer (ALB). Create a target group. Select the Lambda function to call. Use the ALB DNS name to call the API from the VPC.
  2. Remove the DNS entry that is associated with the API in API Gateway. Create a hosted zone in Amazon Route 53. Create a CNAME record in the hosted zone. Update the API in API Gateway with the CNAME record. Use the CNAME record to call the API from the VPC.
  3. Update the API endpoint from Regional to private in API Gateway. Create an interface VPC endpoint in the VPC. Create a resource policy, and attach it to the API. Use the VPC endpoint to call the API from the VPC.
  4. Deploy the Lambda functions inside the VPC. Provision an EC2 instance, and install an Apache server. From the Apache server, call the Lambda functions. Use the internal CNAME record of the EC2 instance to call the API from the VPC.

Answer(s): C

Explanation:

The optimal solution for making a set of APIs accessible only from a VPC while ensuring that all API calls are authenticated is:
C) Update the API endpoint from Regional to private in API Gateway. Create an interface VPC endpoint in the VPC. Create a resource policy, and attach it to the API. Use the VPC endpoint to call the API from the VPC.
1. Private API Gateway: By changing the API endpoint type from Regional to private in API Gateway, the APIs become accessible only from within the specified VPC. This ensures that only resources within the VPC can invoke the APIs.
2. Interface VPC Endpoint: Creating an interface VPC endpoint for API Gateway allows VPC resources to reach the API without traversing the public internet, keeping the traffic internal and adding a layer of security.
3. Resource Policy: Attaching a resource policy to the API provides finer-grained access control, for example restricting invocation to calls that arrive through the VPC endpoint, while the existing API Gateway authentication mechanisms continue to require an authenticated caller.
4. Least Effort: This solution requires minimal changes and no additional infrastructure compared to other options like creating an ALB or deploying an EC2 instance with an Apache server. It leverages existing services (API Gateway and VPC) and maintains the authentication mechanisms already in place.
By implementing this solution, the company can ensure secure access to its APIs while minimizing complexity and maintaining the required authentication protocols.
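
As a rough sketch under stated assumptions (hypothetical API ID, region, VPC, subnet, and security group IDs), the conversion could be performed with boto3 as follows. A new deployment of the API is still required afterwards for the endpoint and policy changes to take effect.

    import json
    import boto3

    ec2 = boto3.client("ec2")
    apigw = boto3.client("apigateway")

    REST_API_ID = "a1b2c3d4e5"  # placeholder

    # Interface VPC endpoint for API Gateway (execute-api) inside the VPC.
    endpoint = ec2.create_vpc_endpoint(
        VpcId="vpc-0123456789abcdef0",
        ServiceName="com.amazonaws.us-east-1.execute-api",
        VpcEndpointType="Interface",
        SubnetIds=["subnet-aaaa1111", "subnet-bbbb2222"],
        SecurityGroupIds=["sg-0123456789abcdef0"],
        PrivateDnsEnabled=True,
    )["VpcEndpoint"]

    # Resource policy: allow invocation in general, but deny any call that does
    # not arrive through the interface VPC endpoint created above.
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {"Effect": "Allow", "Principal": "*", "Action": "execute-api:Invoke",
             "Resource": "execute-api:/*"},
            {"Effect": "Deny", "Principal": "*", "Action": "execute-api:Invoke",
             "Resource": "execute-api:/*",
             "Condition": {"StringNotEquals": {"aws:SourceVpce": endpoint["VpcEndpointId"]}}},
        ],
    }

    # Switch the endpoint type from REGIONAL to PRIVATE and attach the policy.
    apigw.update_rest_api(
        restApiId=REST_API_ID,
        patchOperations=[
            {"op": "replace", "path": "/endpointConfiguration/types/REGIONAL", "value": "PRIVATE"},
            {"op": "replace", "path": "/policy", "value": json.dumps(policy)},
        ],
    )
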


