Free AWS-SOLUTIONS-ARCHITECT-PROFESSIONAL Exam Braindumps (page: 63)


A company is creating a REST API to share information with six of its partners based in the United States. The company has created an Amazon API Gateway Regional endpoint. Each of the six partners will access the API once per day to post daily sales figures.

After initial deployment, the company observes 1,000 requests per second originating from 500 different IP addresses around the world. The company believes this traffic is originating from a botnet and wants to secure its API while minimizing cost.

Which approach should the company take to secure its API?

  A. Create an Amazon CloudFront distribution with the API as the origin. Create an AWS WAF web ACL with a rule to block clients that submit more than five requests per day. Associate the web ACL with the CloudFront distribution. Configure CloudFront with an origin access identity (OAI) and associate it with the distribution. Configure API Gateway to ensure only the OAI can run the POST method.
  B. Create an Amazon CloudFront distribution with the API as the origin. Create an AWS WAF web ACL with a rule to block clients that submit more than five requests per day. Associate the web ACL with the CloudFront distribution. Add a custom header to the CloudFront distribution populated with an API key. Configure the API to require an API key on the POST method.
  C. Create an AWS WAF web ACL with a rule to allow access to the IP addresses used by the six partners. Associate the web ACL with the API. Create a resource policy with a request limit and associate it with the API. Configure the API to require an API key on the POST method.
  D. Create an AWS WAF web ACL with a rule to allow access to the IP addresses used by the six partners. Associate the web ACL with the API. Create a usage plan with a request limit and associate it with the API. Create an API key and add it to the usage plan.

Answer(s): D

Explanation:

D) An AWS WAF web ACL that allows only the IP addresses of the six partners ensures that only trusted callers can reach the API, which blocks the botnet traffic outright. Associating a usage plan with a request limit and attaching an API key to the plan caps how many requests can be made through the key (for example, a small daily quota, since each partner posts only once per day). This secures the API while minimizing cost: no CloudFront distribution or other extra infrastructure is needed for six known, fixed callers.
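For illustration, a minimal boto3 sketch of option D might look like the following. All names, CIDR ranges, ARNs, and the API ID are placeholders, and error handling is omitted:

```python
import boto3

wafv2 = boto3.client("wafv2", region_name="us-east-1")
apigw = boto3.client("apigateway", region_name="us-east-1")

# Allow-list the six partners' source IPs (placeholder CIDRs).
ip_set = wafv2.create_ip_set(
    Name="partner-allow-list",
    Scope="REGIONAL",  # matches the Regional API Gateway endpoint
    IPAddressVersion="IPV4",
    Addresses=["203.0.113.10/32", "203.0.113.11/32"],  # one entry per partner
)

web_acl = wafv2.create_web_acl(
    Name="partner-only",
    Scope="REGIONAL",
    DefaultAction={"Block": {}},  # anything not allow-listed is dropped
    Rules=[
        {
            "Name": "allow-partners",
            "Priority": 0,
            "Statement": {
                "IPSetReferenceStatement": {"ARN": ip_set["Summary"]["ARN"]}
            },
            "Action": {"Allow": {}},
            "VisibilityConfig": {
                "SampledRequestsEnabled": True,
                "CloudWatchMetricsEnabled": True,
                "MetricName": "allow-partners",
            },
        }
    ],
    VisibilityConfig={
        "SampledRequestsEnabled": True,
        "CloudWatchMetricsEnabled": True,
        "MetricName": "partner-only",
    },
)

# Attach the web ACL to the API stage (hypothetical API ID and stage).
wafv2.associate_web_acl(
    WebACLArn=web_acl["Summary"]["ARN"],
    ResourceArn="arn:aws:apigateway:us-east-1::/restapis/a1b2c3/stages/prod",
)

# Cap request volume with a usage plan, and attach an API key to it.
plan = apigw.create_usage_plan(
    name="partner-plan",
    apiStages=[{"apiId": "a1b2c3", "stage": "prod"}],
    quota={"limit": 10, "period": "DAY"},  # generous for one POST per day
)
key = apigw.create_api_key(name="partner-key", enabled=True)
apigw.create_usage_plan_key(usagePlanId=plan["id"], keyId=key["id"], keyType="API_KEY")
```

Because the web ACL's default action is Block, botnet requests are rejected before they ever invoke the API.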



A company uses an Amazon Aurora PostgreSQL DB cluster for applications in a single AWS Region. The company's database team must monitor all data activity on all the databases.

Which solution will achieve this goal?

  A. Set up an AWS Database Migration Service (AWS DMS) change data capture (CDC) task. Specify the Aurora DB cluster as the source. Specify Amazon Kinesis Data Firehose as the target. Use Kinesis Data Firehose to upload the data into an Amazon OpenSearch Service cluster for further analysis.
  B. Start a database activity stream on the Aurora DB cluster to capture the activity stream in Amazon EventBridge. Define an AWS Lambda function as a target for EventBridge. Program the Lambda function to decrypt the messages from EventBridge and to publish all database activity to Amazon S3 for further analysis.
  C. Start a database activity stream on the Aurora DB cluster to push the activity stream to an Amazon Kinesis data stream. Configure Amazon Kinesis Data Firehose to consume the Kinesis data stream and to deliver the data to Amazon S3 for further analysis.
  D. Set up an AWS Database Migration Service (AWS DMS) change data capture (CDC) task. Specify the Aurora DB cluster as the source. Specify Amazon Kinesis Data Firehose as the target. Use Kinesis Data Firehose to upload the data into an Amazon Redshift cluster. Run queries on the Amazon Redshift data to determine database activities on the Aurora database.

Answer(s): C

Explanation:

C) Starting a database activity stream on the Aurora DB cluster pushes all database activity to an Amazon Kinesis data stream in near real time; this is the purpose-built Aurora mechanism for monitoring data activity. Amazon Kinesis Data Firehose can then consume the Kinesis data stream and deliver the records to Amazon S3, which provides scalable, durable storage for further analysis. This lets the database team monitor all data activity efficiently while retaining flexibility in how the data is processed and analyzed.
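A minimal boto3 sketch of option C, assuming a KMS key already exists to encrypt the stream (all ARNs, names, and IAM roles below are placeholders):

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")
firehose = boto3.client("firehose", region_name="us-east-1")

# Start the activity stream; Aurora creates and writes to a Kinesis data
# stream for the cluster, encrypting every event with the given KMS key.
stream = rds.start_activity_stream(
    ResourceArn="arn:aws:rds:us-east-1:111122223333:cluster:sales-cluster",
    Mode="async",  # async mode keeps database sessions from blocking on audit writes
    KmsKeyId="arn:aws:kms:us-east-1:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab",
    ApplyImmediately=True,
)
stream_name = stream["KinesisStreamName"]  # e.g. "aws-rds-das-<cluster-resource-id>"

# Deliver the activity records to S3 through Kinesis Data Firehose.
firehose.create_delivery_stream(
    DeliveryStreamName="aurora-activity-to-s3",
    DeliveryStreamType="KinesisStreamAsSource",
    KinesisStreamSourceConfiguration={
        "KinesisStreamARN": f"arn:aws:kinesis:us-east-1:111122223333:stream/{stream_name}",
        "RoleARN": "arn:aws:iam::111122223333:role/firehose-kinesis-read",
    },
    ExtendedS3DestinationConfiguration={
        "RoleARN": "arn:aws:iam::111122223333:role/firehose-s3-write",
        "BucketARN": "arn:aws:s3:::db-activity-archive",
        "Prefix": "aurora-activity/",
    },
)
```

Note that activity stream records remain encrypted with the KMS data key end to end, so whatever tool analyzes the objects in S3 must decrypt them first.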



An entertainment company recently launched a new game. To ensure a good experience for players during the launch period, the company deployed a static quantity of 12 r6g.16xlarge (memory optimized) Amazon EC2 instances behind a Network Load Balancer. The company's operations team used the Amazon CloudWatch agent and a custom metric to include memory utilization in its monitoring strategy.

Analysis of the CloudWatch metrics from the launch period showed consumption at about one quarter of the CPU and memory that the company expected. Initial demand for the game has subsided and has become more variable. The company decides to use an Auto Scaling group that monitors the CPU and memory consumption to dynamically scale the instance fleet. A solutions architect needs to configure the Auto Scaling group to meet demand in the most cost-effective way.

Which solution will meet these requirements?

  A. Configure the Auto Scaling group to deploy c6g.4xlarge (compute optimized) instances. Configure a minimum capacity of 3, a desired capacity of 3, and a maximum capacity of 12.
  B. Configure the Auto Scaling group to deploy m6g.4xlarge (general purpose) instances. Configure a minimum capacity of 3, a desired capacity of 3, and a maximum capacity of 12.
  C. Configure the Auto Scaling group to deploy r6g.4xlarge (memory optimized) instances. Configure a minimum capacity of 3, a desired capacity of 3, and a maximum capacity of 12.
  D. Configure the Auto Scaling group to deploy r6g.8xlarge (memory optimized) instances. Configure a minimum capacity of 2, a desired capacity of 2, and a maximum capacity of 6.

Answer(s): C

Explanation:

C) The launch-period metrics showed consumption at about one quarter of the deployed capacity, so the equivalent of 3 r6g.16xlarge instances was doing the work of 12. Since one r6g.16xlarge equals four r6g.4xlarge in vCPUs and memory, an Auto Scaling group of r6g.4xlarge instances with a minimum and desired capacity of 3 and a maximum of 12 can scale from a small baseline up to the full observed launch-period capacity in fine-grained steps. Keeping the memory optimized r6g family preserves the CPU-to-memory ratio the application was sized for, making this the most cost-effective option.
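As a sketch of option C in boto3 (the launch template name, subnets, target group ARN, and target values are hypothetical, and the CloudWatch agent is assumed to publish mem_used_percent into the CWAgent namespace with an AutoScalingGroupName dimension):

```python
import boto3

autoscaling = boto3.client("autoscaling", region_name="us-east-1")
ASG_NAME = "game-fleet"

# 12 x r6g.4xlarge equals the capacity of 3 x r6g.16xlarge, which is all
# the capacity the launch-period traffic actually consumed.
autoscaling.create_auto_scaling_group(
    AutoScalingGroupName=ASG_NAME,
    LaunchTemplate={"LaunchTemplateName": "game-r6g-4xlarge"},
    MinSize=3,
    DesiredCapacity=3,
    MaxSize=12,
    TargetGroupARNs=[
        "arn:aws:elasticloadbalancing:us-east-1:111122223333:targetgroup/game/0123456789abcdef"
    ],
    VPCZoneIdentifier="subnet-0aaa,subnet-0bbb",
)

# Target tracking on CPU, using the predefined ASG metric.
autoscaling.put_scaling_policy(
    AutoScalingGroupName=ASG_NAME,
    PolicyName="cpu-target",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization"
        },
        "TargetValue": 60.0,
    },
)

# Target tracking on memory, using the CloudWatch agent's custom metric.
autoscaling.put_scaling_policy(
    AutoScalingGroupName=ASG_NAME,
    PolicyName="memory-target",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "CustomizedMetricSpecification": {
            "MetricName": "mem_used_percent",
            "Namespace": "CWAgent",
            "Dimensions": [{"Name": "AutoScalingGroupName", "Value": ASG_NAME}],
            "Statistic": "Average",
        },
        "TargetValue": 60.0,
    },
)
```

With two target tracking policies attached, the group scales out when either CPU or memory utilization rises above its target, which matches the company's decision to monitor both dimensions.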



A financial services company loaded millions of historical stock trades into an Amazon DynamoDB table. The table uses on-demand capacity mode. Once each day at midnight, a few million new records are loaded into the table. Application read activity against the table happens in bursts throughout the day, and a limited set of keys is repeatedly looked up. The company needs to reduce costs associated with DynamoDB.

Which strategy should a solutions architect recommend to meet this requirement?

  A. Deploy an Amazon ElastiCache cluster in front of the DynamoDB table.
  B. Deploy DynamoDB Accelerator (DAX). Configure DynamoDB auto scaling. Purchase Savings Plans in Cost Explorer.
  C. Use provisioned capacity mode. Purchase Savings Plans in Cost Explorer.
  D. Deploy DynamoDB Accelerator (DAX). Use provisioned capacity mode. Configure DynamoDB auto scaling.

Answer(s): D

Explanation:

D) Deploying DynamoDB Accelerator (DAX) can help reduce the read traffic cost by caching frequently accessed data, reducing the need to query the DynamoDB table repeatedly. Switching to provisioned capacity mode with auto scaling allows the system to scale based on actual demand, providing a cost-effective solution for handling periodic data loading and read bursts. This setup meets the need for cost reduction while maintaining performance.
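A sketch of the capacity side of option D with boto3 (the table name, capacity bounds, and targets are hypothetical; the DAX cluster itself would be created separately, and the application would read through the DAX cluster endpoint rather than the regional DynamoDB endpoint):

```python
import boto3

aas = boto3.client("application-autoscaling", region_name="us-east-1")
TABLE = "table/HistoricalTrades"  # placeholder table name

# With the table switched to provisioned capacity mode, register read and
# write capacity as scalable targets. Write headroom covers the nightly load.
for dimension, minimum, maximum in [
    ("dynamodb:table:ReadCapacityUnits", 25, 1000),
    ("dynamodb:table:WriteCapacityUnits", 25, 2000),
]:
    aas.register_scalable_target(
        ServiceNamespace="dynamodb",
        ResourceId=TABLE,
        ScalableDimension=dimension,
        MinCapacity=minimum,
        MaxCapacity=maximum,
    )

# Target tracking keeps consumed read capacity near 70% of what is
# provisioned; an analogous policy would cover write capacity.
aas.put_scaling_policy(
    PolicyName="trades-read-tracking",
    ServiceNamespace="dynamodb",
    ResourceId=TABLE,
    ScalableDimension="dynamodb:table:ReadCapacityUnits",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 70.0,
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "DynamoDBReadCapacityUtilization"
        },
    },
)
```

With DAX absorbing the repeated hot-key lookups, the provisioned read capacity (and therefore the bill) can sit well below the burst read rate the application actually generates.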






Post your comments and discuss the Amazon AWS-SOLUTIONS-ARCHITECT-PROFESSIONAL exam with other community members:

Zak commented on June 28, 2024 (Anonymous):
@AppleKid, I managed to pass this exam after failing once. Do not sit for your exam without memorizing these questions. These are what you will see in the real exam.

Apple Kid commented on June 26, 2024 (Anonymous):
Did anyone take the exam recently? Can you tell if these are good?

Captain commented on June 26, 2024 (Anonymous):
This is so helpful.

udaya commented on April 25, 2024 (Anonymous):
Still learning, and the questions seem to be helpful.

Jerry commented on February 18, 2024 (Hong Kong):
Very good for the exam!

AWS-Guy commented on February 16, 2024 (Canada):
Precise and to the point. I aced this exam and am now going for the next one. Very grateful to this site and its wonderful content.

Jerry commented on February 12, 2024 (Hong Kong):
Very good exam stuff.

travis head commented on November 16, 2023 (Anonymous):
I took the Amazon SAP-C02 tests and prepared from this site, as it has the latest mock tests available, which helped me evaluate my performance and score 919/1000.

Weed Flipper commented on October 07, 2020 (Canada):
This is good stuff, man.

IT-Guy commented on September 29, 2020 (France):
Xengine software is good and free. Too bad it is only in English, with no support for French.

pema commented on August 30, 2019 (Germany):
Can I have the latest version of this exam?

MrSimha commented on February 23, 2019 (Anonymous):
Thank you.

Phil C. commented on November 12, 2018 (Anonymous):
Too soon to tell, but I will be back to post a review after my exam.

MD EJAZ ALI TANWIR commented on August 20, 2017 (United States):
This is a valid dump in the US. Thank you guys for providing this.

flypig commented on June 02, 2017 (China):
The braindumps will shorten my prep time for this exam!