Free AWS-SOLUTIONS-ARCHITECT-PROFESSIONAL Exam Braindumps (page: 34)


A company is creating a sequel for a popular online game. A large number of users from all over the world will play the game within the first week after launch. Currently, the game consists of the following components deployed in a single AWS Region:

Amazon S3 bucket that stores game assets
Amazon DynamoDB table that stores player scores

A solutions architect needs to design a multi-Region solution that will reduce latency, improve reliability, and require the least effort to implement.

What should the solutions architect do to meet these requirements?

  1. Create an Amazon CloudFront distribution to serve assets from the S3 bucket. Configure S3 Cross-Region Replication. Create a new DynamoDB table in a new Region. Use the new table as a replica target for DynamoDB global tables.
  2. Create an Amazon CloudFront distribution to serve assets from the S3 bucket. Configure S3 Same-Region Replication. Create a new DynamoDB table in a new Region. Configure asynchronous replication between the DynamoDB tables by using AWS Database Migration Service (AWS DMS) with change data capture (CDC).
  3. Create another S3 bucket in a new Region, and configure S3 Cross-Region Replication between the buckets. Create an Amazon CloudFront distribution and configure origin failover with two origins accessing the S3 buckets in each Region. Configure DynamoDB global tables by enabling Amazon DynamoDB Streams, and add a replica table in a new Region.
  4. Create another S3 bucket in the same Region, and configure S3 Same-Region Replication between the buckets. Create an Amazon CloudFront distribution and configure origin failover with two origins accessing the S3 buckets. Create a new DynamoDB table in a new Region. Use the new table as a replica target for DynamoDB global tables.

Answer(s): C

Explanation:

C) Create another S3 bucket in a new Region, and configure S3 Cross-Region Replication between the buckets. Create an Amazon CloudFront distribution and configure origin failover with two origins accessing the S3 buckets in each Region. Configure DynamoDB global tables by enabling Amazon DynamoDB Streams, and add a replica table in a new Region.
This solution effectively addresses the requirements for reduced latency, improved reliability, and minimal implementation effort. Here’s how it meets each requirement:
- Reduced latency: Amazon CloudFront, a content delivery network (CDN), caches the game assets stored in S3 at edge locations around the world, significantly decreasing load times for users accessing the game from different geographical locations.
- Improved reliability: S3 Cross-Region Replication keeps a copy of the game assets in a second Region, and CloudFront origin failover automatically switches to the secondary origin if the primary bucket becomes unavailable.
- Minimal implementation effort: DynamoDB global tables simplify the management of player score data across Regions. Enabling DynamoDB Streams and adding a replica table in a new Region lets DynamoDB replicate the data automatically, which requires far less effort than building change data capture with AWS DMS.
Overall, this combination of services provides a robust multi-Region architecture that enhances performance and reliability for a global audience.
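
To make the moving parts of option C concrete, here is a minimal boto3 sketch of the three pieces: Cross-Region Replication between the asset buckets, the origin failover fragment of the CloudFront distribution, and the DynamoDB global table replica. The bucket names, table name, Regions, and replication role ARN are illustrative placeholders, not values from the question.

```python
"""Minimal boto3 sketch of option C. Bucket names, the table name, the Regions,
and the replication role ARN below are illustrative placeholders."""
import boto3

SOURCE_REGION = "us-east-1"
REPLICA_REGION = "eu-west-1"
SOURCE_BUCKET = "game-assets-primary"       # assumed bucket name
REPLICA_BUCKET = "game-assets-replica"      # assumed bucket name
REPLICATION_ROLE_ARN = "arn:aws:iam::111122223333:role/s3-crr-role"  # assumed role

# Cross-Region Replication requires versioning on both buckets.
for bucket, region in [(SOURCE_BUCKET, SOURCE_REGION), (REPLICA_BUCKET, REPLICA_REGION)]:
    boto3.client("s3", region_name=region).put_bucket_versioning(
        Bucket=bucket, VersioningConfiguration={"Status": "Enabled"}
    )

# Replicate game assets from the primary bucket to the bucket in the new Region.
boto3.client("s3", region_name=SOURCE_REGION).put_bucket_replication(
    Bucket=SOURCE_BUCKET,
    ReplicationConfiguration={
        "Role": REPLICATION_ROLE_ARN,
        "Rules": [
            {
                "ID": "replicate-game-assets",
                "Status": "Enabled",
                "Priority": 1,
                "Filter": {},
                "DeleteMarkerReplication": {"Status": "Disabled"},
                "Destination": {"Bucket": f"arn:aws:s3:::{REPLICA_BUCKET}"},
            }
        ],
    },
)

# Origin failover fragment for the CloudFront DistributionConfig; the other
# required fields (CallerReference, Origins, DefaultCacheBehavior, and so on)
# are omitted here for brevity.
origin_group_fragment = {
    "OriginGroups": {
        "Quantity": 1,
        "Items": [
            {
                "Id": "s3-failover-group",
                "FailoverCriteria": {"StatusCodes": {"Quantity": 2, "Items": [500, 503]}},
                "Members": {
                    "Quantity": 2,
                    "Items": [{"OriginId": "primary-s3-origin"}, {"OriginId": "replica-s3-origin"}],
                },
            }
        ],
    }
}

# DynamoDB global tables (version 2019.11.21): enable Streams, then add a replica.
# In practice, wait for the table to return to ACTIVE between the two updates.
dynamodb = boto3.client("dynamodb", region_name=SOURCE_REGION)
dynamodb.update_table(
    TableName="PlayerScores",  # assumed table name
    StreamSpecification={"StreamEnabled": True, "StreamViewType": "NEW_AND_OLD_IMAGES"},
)
dynamodb.update_table(
    TableName="PlayerScores",
    ReplicaUpdates=[{"Create": {"RegionName": REPLICA_REGION}}],
)
```

The origin group fragment plugs into the create_distribution DistributionConfig alongside the two S3 origins, one per Region.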



A company has an on-premises website application that provides real estate information for potential renters and buyers. The website uses a Java backend and a NoSQL MongoDB database to store subscriber data.

The company needs to migrate the entire application to AWS with a similar structure. The application must be deployed for high availability, and the company cannot make changes to the application.

Which solution will meet these requirements?

  1. Use an Amazon Aurora DB cluster as the database for the subscriber data. Deploy Amazon EC2 instances in an Auto Scaling group across multiple Availability Zones for the Java backend application.
  2. Use MongoDB on Amazon EC2 instances as the database for the subscriber data. Deploy EC2 instances in an Auto Scaling group in a single Availability Zone for the Java backend application.
  3. Configure Amazon DocumentDB (with MongoDB compatibility) with appropriately sized instances in multiple Availability Zones as the database for the subscriber data. Deploy Amazon EC2 instances in an Auto Scaling group across multiple Availability Zones for the Java backend application.
  4. Configure Amazon DocumentDB (with MongoDB compatibility) in on-demand capacity mode in multiple Availability Zones as the database for the subscriber data. Deploy Amazon EC2 instances in an Auto Scaling group across multiple Availability Zones for the Java backend application.

Answer(s): C

Explanation:

C) Configure Amazon DocumentDB (with MongoDB compatibility) with appropriately sized instances in multiple Availability Zones as the database for the subscriber data. Deploy Amazon EC2 instances in an Auto Scaling group across multiple Availability Zones for the Java backend application.
This solution effectively meets the company's requirements for migrating the application to AWS while ensuring high availability and minimal changes to the existing structure. Here’s how it meets the criteria:
- High Availability: By configuring Amazon DocumentDB in multiple Availability Zones (AZs), the solution provides built-in redundancy and failover capabilities, ensuring that the database remains available even in the event of an AZ failure. Deploying Amazon EC2 instances in an Auto Scaling group across multiple AZs further enhances availability by automatically distributing the load and maintaining performance during spikes.
- Compatibility with MongoDB: Amazon DocumentDB is designed to be compatible with MongoDB, allowing the existing application to connect without code changes. This is critical since the company cannot modify the application.
- Managed Service: Utilizing Amazon DocumentDB means that the company benefits from a fully managed database service, which simplifies operations such as backups, patching, and scaling, allowing the company to focus on its application instead of managing infrastructure.
Overall, this approach provides a scalable and robust architecture that aligns well with the existing application requirements and operational goals.
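
As a rough illustration of option C, the sketch below uses boto3 to create a DocumentDB cluster with an instance in each of three Availability Zones and an Auto Scaling group for the Java backend that spans matching subnets. The identifiers, instance class, password, subnet IDs, and launch template name are assumed placeholders rather than values taken from the question.

```python
"""Minimal boto3 sketch of option C. Identifiers, instance class, password,
subnets, and launch template name are assumed placeholders."""
import boto3

REGION = "us-east-1"

# Amazon DocumentDB (with MongoDB compatibility): one cluster with an instance
# in each of three Availability Zones for high availability.
docdb = boto3.client("docdb", region_name=REGION)
docdb.create_db_cluster(
    DBClusterIdentifier="subscriber-data",
    Engine="docdb",
    MasterUsername="dbadmin",
    MasterUserPassword="ChangeMe-ExamplePassword",  # store real credentials in Secrets Manager
)
for az in ["us-east-1a", "us-east-1b", "us-east-1c"]:
    docdb.create_db_instance(
        DBInstanceIdentifier=f"subscriber-data-{az}",
        DBInstanceClass="db.r6g.large",  # an "appropriately sized" instance class
        Engine="docdb",
        DBClusterIdentifier="subscriber-data",
        AvailabilityZone=az,
    )

# Java backend: an EC2 Auto Scaling group that spans subnets in multiple AZs.
autoscaling = boto3.client("autoscaling", region_name=REGION)
autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="java-backend",
    LaunchTemplate={"LaunchTemplateName": "java-backend-lt", "Version": "$Latest"},  # assumed template
    MinSize=2,
    MaxSize=6,
    DesiredCapacity=2,
    VPCZoneIdentifier="subnet-aaa111,subnet-bbb222,subnet-ccc333",  # one subnet per AZ
)
```

Passing one subnet per AZ in VPCZoneIdentifier is what makes the Auto Scaling group span multiple Availability Zones.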



A digital marketing company has multiple AWS accounts that belong to various teams. The creative team uses an Amazon S3 bucket in its AWS account to securely store images and media files that are used as content for the company’s marketing campaigns. The creative team wants to share the S3 bucket with the strategy team so that the strategy team can view the objects.

A solutions architect has created an IAM role that is named strategy_reviewer in the Strategy account. The solutions architect has also set up a custom AWS Key Management Service (AWS KMS) key in the Creative account and has associated the key with the S3 bucket. However, when users from the Strategy account assume the IAM role and try to access objects in the S3 bucket, they receive an Access Denied error.

The solutions architect must ensure that users in the Strategy account can access the S3 bucket. The solution must provide these users with only the minimum permissions that they need.

Which combination of steps should the solutions architect take to meet these requirements? (Choose three.)

  1. Create a bucket policy that includes read permissions for the S3 bucket. Set the principal of the bucket policy to the account ID of the Strategy account.
  2. Update the strategy_reviewer IAM role to grant full permissions for the S3 bucket and to grant decrypt permissions for the custom KMS key.
  3. Update the custom KMS key policy in the Creative account to grant decrypt permissions to the strategy_reviewer IAM role.
  4. Create a bucket policy that includes read permissions for the S3 bucket. Set the principal of the bucket policy to an anonymous user.
  5. Update the custom KMS key policy in the Creative account to grant encrypt permissions to the strategy_reviewer IAM role.
  6. Update the strategy_reviewer IAM role to grant read permissions for the S3 bucket and to grant decrypt permissions for the custom KMS key.

Answer(s): A,C,F

Explanation:

To ensure that users in the Strategy account can access the S3 bucket in the Creative account with the minimum necessary permissions, the solutions architect should take the following steps:
A) Create a bucket policy that includes read permissions for the S3 bucket. Set the principal of the bucket policy to the account ID of the Strategy account.
- This bucket policy grants the necessary permissions for users in the Strategy account to read (list and get) objects from the S3 bucket. By specifying the account ID as the principal, it ensures that only users in the Strategy account can access the bucket.
C) Update the custom KMS key policy in the Creative account to grant decrypt permissions to the strategy_reviewer IAM role.
- Since the S3 bucket uses a KMS key for encryption, users must also have permission to decrypt the objects. Updating the KMS key policy to include the strategy_reviewer IAM role allows users assuming this role to decrypt objects stored in the S3 bucket.
F) Update the strategy_reviewer IAM role to grant read permissions for the S3 bucket and to grant decrypt permissions for the custom KMS key.
- While the bucket policy provides access at the bucket level, granting permissions directly in the IAM role ensures that the users assuming this role have the necessary permissions for both the S3 bucket and the KMS key. This allows for a fine-grained approach to security, ensuring that users have exactly the permissions they need.
These steps collectively ensure that the users from the Strategy account can access the S3 bucket and its contents securely and with minimal permissions, adhering to best practices for security and access control in AWS.
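
The three selected steps map to three small pieces of configuration. Below is a hedged boto3/JSON sketch of what they might look like; the bucket name, account IDs, and KMS key ARN are placeholders, and the KMS statement is shown on its own because put_key_policy replaces the entire key policy, so it would need to be merged into the existing document.

```python
"""Minimal boto3 sketch of steps A, C, and F. The bucket name, account IDs, and
KMS key ARN are placeholders, not values from the question."""
import json

import boto3

CREATIVE_BUCKET = "creative-campaign-assets"  # assumed bucket name
STRATEGY_ACCOUNT_ID = "222233334444"          # assumed account ID
ROLE_ARN = f"arn:aws:iam::{STRATEGY_ACCOUNT_ID}:role/strategy_reviewer"
KMS_KEY_ARN = "arn:aws:kms:us-east-1:111122223333:key/EXAMPLE-KEY-ID"  # assumed key ARN

# A) Bucket policy in the Creative account: read-only access for the Strategy account.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowStrategyAccountRead",
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{STRATEGY_ACCOUNT_ID}:root"},
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{CREATIVE_BUCKET}",
                f"arn:aws:s3:::{CREATIVE_BUCKET}/*",
            ],
        }
    ],
}
boto3.client("s3").put_bucket_policy(Bucket=CREATIVE_BUCKET, Policy=json.dumps(bucket_policy))

# C) Statement to merge into the KMS key policy in the Creative account.
# (put_key_policy replaces the whole policy, so append this to the existing statements.)
kms_decrypt_statement = {
    "Sid": "AllowStrategyReviewerDecrypt",
    "Effect": "Allow",
    "Principal": {"AWS": ROLE_ARN},
    "Action": "kms:Decrypt",
    "Resource": "*",
}

# F) Inline policy on the strategy_reviewer role in the Strategy account:
# read access to the bucket and decrypt access to the key, nothing more.
role_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{CREATIVE_BUCKET}",
                f"arn:aws:s3:::{CREATIVE_BUCKET}/*",
            ],
        },
        {"Effect": "Allow", "Action": "kms:Decrypt", "Resource": KMS_KEY_ARN},
    ],
}
boto3.client("iam").put_role_policy(  # run with credentials from the Strategy account
    RoleName="strategy_reviewer",
    PolicyName="creative-bucket-read",
    PolicyDocument=json.dumps(role_policy),
)
```

All three pieces are needed: the bucket policy authorizes the cross-account request, the key policy allows decryption of the encrypted objects, and the role's own policy grants the permissions its users actually exercise.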



A life sciences company is using a combination of open source tools to manage data analysis workflows and Docker containers running on servers in its on-premises data center to process genomics data. Sequencing data is generated and stored on a local storage area network (SAN), and then the data is processed. The research and development teams are running into capacity issues and have decided to re-architect their genomics analysis platform on AWS to scale based on workload demands and reduce the turnaround time from weeks to days.

The company has a high-speed AWS Direct Connect connection. Sequencers will generate around 200 GB of data for each genome, and individual jobs can take several hours to process the data with ideal compute capacity. The end result will be stored in Amazon S3. The company is expecting 10-15 job requests each day.

Which solution meets these requirements?

  1. Use regularly scheduled AWS Snowball Edge devices to transfer the sequencing data into AWS. When AWS receives the Snowball Edge device and the data is loaded into Amazon S3, use S3 events to trigger an AWS Lambda function to process the data.
  2. Use AWS Data Pipeline to transfer the sequencing data to Amazon S3. Use S3 events to trigger an Amazon EC2 Auto Scaling group to launch custom-AMI EC2 instances running the Docker containers to process the data.
  3. Use AWS DataSync to transfer the sequencing data to Amazon S3. Use S3 events to trigger an AWS Lambda function that starts an AWS Step Functions workflow. Store the Docker images in Amazon Elastic Container Registry (Amazon ECR) and trigger AWS Batch to run the container and process the sequencing data.
  4. Use an AWS Storage Gateway file gateway to transfer the sequencing data to Amazon S3. Use S3 events to trigger an AWS Batch job that executes on Amazon EC2 instances running the Docker containers to process the data.

Answer(s): C

Explanation:

To meet the requirements of the life sciences company's genomics analysis platform on AWS, the following solution is proposed:
C) Use AWS DataSync to transfer the sequencing data to Amazon S3. Use S3 events to trigger an AWS Lambda function that starts an AWS Step Functions workflow. Store the Docker images in Amazon Elastic Container Registry (Amazon ECR) and trigger AWS Batch to run the container and process the sequencing data.
1. AWS DataSync: This service is designed for efficiently transferring large amounts of data to AWS. It is suitable for moving the sequencing data from the on-premises storage area network (SAN) into Amazon S3 with minimal operational overhead.
2. S3 Events: Once the data is in S3, leveraging S3 event notifications can trigger subsequent processes. This mechanism allows for automation, as it can initiate workflows immediately after data is available.
3. AWS Lambda and Step Functions: By using AWS Lambda to trigger an AWS Step Functions workflow, you can orchestrate complex workflows easily, such as data validation, processing, and additional steps as needed for the genomics data.
4. Amazon Elastic Container Registry (ECR): Storing Docker images in ECR allows for easy management and deployment of containerized applications, making it simpler to run the necessary data processing jobs.
5. AWS Batch: This service enables you to run batch computing jobs in a managed environment, which is particularly useful for handling the compute-intensive analysis of genomics data. AWS Batch will dynamically provision the required resources based on the job demands, thereby optimizing cost and performance.
This comprehensive solution effectively addresses the company's need to scale based on workload demands and reduces the turnaround time for processing genomics data from weeks to days while ensuring efficient data transfer and processing on AWS.
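
To illustrate how the pieces of option C fit together, here is a minimal sketch of the Lambda function that an S3 event notification could invoke, plus a helper showing how a Step Functions task might hand the work to AWS Batch. The state machine ARN, job queue, and job definition names are assumptions for illustration; the job definition would reference the container image stored in Amazon ECR.

```python
"""Minimal sketch of the event-driven glue in option C. The state machine ARN,
job queue, and job definition names are assumptions for illustration."""
import json
import os

import boto3

stepfunctions = boto3.client("stepfunctions")
STATE_MACHINE_ARN = os.environ["STATE_MACHINE_ARN"]  # assumed environment variable


def handler(event, context):
    """Lambda handler invoked by S3 ObjectCreated events from DataSync uploads;
    starts one Step Functions execution per new sequencing file."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        stepfunctions.start_execution(
            stateMachineArn=STATE_MACHINE_ARN,
            input=json.dumps({"bucket": bucket, "key": key}),
        )
    return {"started": len(event["Records"])}


def submit_genomics_job(bucket: str, key: str) -> str:
    """Illustrates the AWS Batch submission that a Step Functions task performs."""
    batch = boto3.client("batch")
    response = batch.submit_job(
        jobName="genome-analysis",
        jobQueue="genomics-job-queue",        # assumed job queue
        jobDefinition="genomics-pipeline:1",  # assumed job definition backed by an ECR image
        containerOverrides={
            "environment": [
                {"name": "INPUT_BUCKET", "value": bucket},
                {"name": "INPUT_KEY", "value": key},
            ]
        },
    )
    return response["jobId"]
```

In a production workflow, Step Functions can also call AWS Batch directly through its service integration (batch:submitJob.sync) rather than going through a Lambda helper.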






Post your Comments and Discuss Amazon AWS-SOLUTIONS-ARCHITECT-PROFESSIONAL exam with other Community members:

Zak commented on June 28, 2024
@AppleKid, I managed to pass this exam after failing once. Do not sit for your exam without memorizing these questions. These are what you will see in the real exam.
Anonymous

Apple Kid commented on June 26, 2024
Did anyone take the exam recently, and can you tell if these are good?
Anonymous

Captain commented on June 26, 2024
This is so helpful
Anonymous

udaya commented on April 25, 2024
Still learning, and the questions seem to be helpful.
Anonymous

Jerry commented on February 18, 2024
very good for exam !!!!
HONG KONG

AWS-Guy commented on February 16, 2024
Precise and to the point. I aced this exam and am now going for the next one. Very grateful to this site and its wonderful content.
CANADA

Jerry commented on February 12, 2024
very good exam stuff
HONG KONG

travis head commented on November 16, 2023
I took the Amazon SAP-C02 test and prepared from this site, as it has the latest mock tests available, which helped me evaluate my performance and score 919/1000.
Anonymous

Weed Flipper commented on October 07, 2020
This is good stuff man.
CANADA

IT-Guy commented on September 29, 2020
Xengine software is good and free. Too bad it is only in English, with no support for French.
FRANCE

pema commented on August 30, 2019
Can I have the latest version of this exam?
GERMANY

MrSimha commented on February 23, 2019
Thank you
Anonymous

Phil C. commented on November 12, 2018
Too soon to tell, but I will be back to post a review after my exam.
Anonymous

MD EJAZ ALI TANWIR commented on August 20, 2017
This is a valid dump in the US. Thank you guys for providing this.
UNITED STATES

flypig commented on June 02, 2017
The braindumps will shorten my prep time for this exam!
CHINA