Free SAP-C01 Exam Braindumps (page: 34)


A company is creating a sequel for a popular online game. A large number of users from all over the world will play the game within the first week after launch. Currently, the game consists of the following components deployed in a single AWS Region:

Amazon S3 bucket that stores game assets
Amazon DynamoDB table that stores player scores

A solutions architect needs to design a multi-Region solution that will reduce latency, improve reliability, and require the least effort to implement.

What should the solutions architect do to meet these requirements?

  1. Create an Amazon CloudFront distribution to serve assets from the S3 bucket. Configure S3 Cross-Region Replication. Create a new DynamoDB table in a new Region. Use the new table as a replica target for DynamoDB global tables.
  2. Create an Amazon CloudFront distribution to serve assets from the S3 bucket. Configure S3 Same-Region Replication. Create a new DynamoDB table in a new Region. Configure asynchronous replication between the DynamoDB tables by using AWS Database Migration Service (AWS DMS) with change data capture (CDC).
  3. Create another S3 bucket in a new Region, and configure S3 Cross-Region Replication between the buckets. Create an Amazon CloudFront distribution and configure origin failover with two origins accessing the S3 buckets in each Region. Configure DynamoDB global tables by enabling Amazon DynamoDB Streams, and add a replica table in a new Region.
  4. Create another S3 bucket in the same Region, and configure S3 Same-Region Replication between the buckets. Create an Amazon CloudFront distribution and configure origin failover with two origins accessing the S3 buckets. Create a new DynamoDB table in a new Region. Use the new table as a replica target for DynamoDB global tables.

Answer(s): C

Explanation:

C) Create another S3 bucket in a new Region, and configure S3 Cross-Region Replication between the buckets. Create an Amazon CloudFront distribution and configure origin failover with two origins accessing the S3 buckets in each Region. Configure DynamoDB global tables by enabling Amazon DynamoDB Streams, and add a replica table in a new Region.
This solution effectively addresses the requirements for reduced latency, improved reliability, and minimal implementation effort. Here’s how it meets each requirement:
- Reduced Latency: Amazon CloudFront, a content delivery network (CDN), caches the game assets stored in S3 at edge locations around the world, significantly decreasing load times for users who access the game from different geographic locations.
- Improved Reliability: S3 Cross-Region Replication keeps a copy of the game assets in a second Region, and CloudFront origin failover allows the distribution to serve content from that Region if the primary Region becomes unavailable.
- Minimal Implementation Effort: DynamoDB global tables handle the replication of player score data across Regions automatically. Enabling DynamoDB Streams and adding a replica table requires far less manual effort than implementing change data capture with AWS DMS.
Overall, this combination of services provides a robust multi-Region architecture that enhances performance and reliability for a global audience.
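For reference, a minimal boto3 sketch of the DynamoDB portion of answer C: enable DynamoDB Streams on the existing table and then add a replica Region to turn it into a global table. The table name and Region names are placeholders, not values from the question.

```python
import boto3

dynamodb = boto3.client("dynamodb", region_name="us-east-1")
TABLE = "PlayerScores"  # placeholder table name

# Global tables require DynamoDB Streams with new and old images.
# (Skip this call if Streams are already enabled on the table.)
dynamodb.update_table(
    TableName=TABLE,
    StreamSpecification={
        "StreamEnabled": True,
        "StreamViewType": "NEW_AND_OLD_IMAGES",
    },
)
# Wait for the table to return to ACTIVE before the next update.
dynamodb.get_waiter("table_exists").wait(TableName=TABLE)

# Add a replica in a second Region; DynamoDB keeps both copies in sync.
dynamodb.update_table(
    TableName=TABLE,
    ReplicaUpdates=[{"Create": {"RegionName": "eu-west-1"}}],
)
```

Because global tables manage replication natively, there is no DMS task or custom replication code to build or operate, which is what keeps the implementation effort low.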



A company has an on-premises website application that provides real estate information for potential renters and buyers. The website uses a Java backend and a NoSQL MongoDB database to store subscriber data.

The company needs to migrate the entire application to AWS with a similar structure. The application must be deployed for high availability, and the company cannot make changes to the application.

Which solution will meet these requirements?

  1. Use an Amazon Aurora DB cluster as the database for the subscriber data. Deploy Amazon EC2 instances in an Auto Scaling group across multiple Availability Zones for the Java backend application.
  2. Use MongoDB on Amazon EC2 instances as the database for the subscriber data. Deploy EC2 instances in an Auto Scaling group in a single Availability Zone for the Java backend application.
  3. Configure Amazon DocumentDB (with MongoDB compatibility) with appropriately sized instances in multiple Availability Zones as the database for the subscriber data. Deploy Amazon EC2 instances in an Auto Scaling group across multiple Availability Zones for the Java backend application.
  4. Configure Amazon DocumentDB (with MongoDB compatibility) in on-demand capacity mode in multiple Availability Zones as the database for the subscriber data. Deploy Amazon EC2 instances in an Auto Scaling group across multiple Availability Zones for the Java backend application.

Answer(s): C

Explanation:

C) Configure Amazon DocumentDB (with MongoDB compatibility) with appropriately sized instances in multiple Availability Zones as the database for the subscriber data. Deploy Amazon EC2 instances in an Auto Scaling group across multiple Availability Zones for the Java backend application.
This solution effectively meets the company's requirements for migrating the application to AWS while ensuring high availability and minimal changes to the existing structure. Here’s how it meets the criteria:
- High Availability: Configuring Amazon DocumentDB across multiple Availability Zones (AZs) provides built-in redundancy and automatic failover, so the database remains available even if an AZ fails. Deploying Amazon EC2 instances in an Auto Scaling group across multiple AZs keeps the Java backend available and able to absorb traffic spikes.
- Compatibility with MongoDB: Amazon DocumentDB is designed to be compatible with MongoDB, so the existing application can connect to it without code changes. This is critical because the company cannot modify the application.
- Managed Service: Amazon DocumentDB is a fully managed database service, which offloads operations such as backups, patching, and scaling and lets the company focus on its application instead of managing infrastructure.
Overall, this approach provides a scalable and robust architecture that aligns well with the existing application requirements and operational goals.
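A minimal boto3 sketch of the database portion of answer C: create a DocumentDB cluster and place one appropriately sized instance in each of several Availability Zones. The identifiers, instance class, AZs, and credentials shown are assumptions for illustration only.

```python
import boto3

docdb = boto3.client("docdb", region_name="us-east-1")
CLUSTER_ID = "subscriber-data"  # placeholder identifier

docdb.create_db_cluster(
    DBClusterIdentifier=CLUSTER_ID,
    Engine="docdb",
    MasterUsername="admin_user",
    MasterUserPassword="REPLACE_WITH_SECRET",  # retrieve from Secrets Manager in practice
)

# One instance per Availability Zone gives the cluster automatic failover targets.
for i, az in enumerate(["us-east-1a", "us-east-1b", "us-east-1c"], start=1):
    docdb.create_db_instance(
        DBInstanceIdentifier=f"{CLUSTER_ID}-{i}",
        DBInstanceClass="db.r5.large",
        Engine="docdb",
        DBClusterIdentifier=CLUSTER_ID,
        AvailabilityZone=az,
    )
```

Instance-based provisioning (answer C) rather than "on-demand capacity mode" (answer D) matches how DocumentDB is actually sized, which is why option D is incorrect as written.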



A digital marketing company has multiple AWS accounts that belong to various teams. The creative team uses an Amazon S3 bucket in its AWS account to securely store images and media files that are used as content for the company’s marketing campaigns. The creative team wants to share the S3 bucket with the strategy team so that the strategy team can view the objects.

A solutions architect has created an IAM role that is named strategy_reviewer in the Strategy account. The solutions architect also has set up a custom AWS Key Management Service (AWS KMS) key in the Creative account and has associated the key with the S3 bucket. However, when users from the Strategy account assume the IAM role and try to access objects in the S3 bucket, they receive an Access Denied error.

The solutions architect must ensure that users in the Strategy account can access the S3 bucket. The solution must provide these users with only the minimum permissions that they need.

Which combination of steps should the solutions architect take to meet these requirements? (Choose three.)

  1. Create a bucket policy that includes read permissions for the S3 bucket. Set the principal of the bucket policy to the account ID of the Strategy account.
  2. Update the strategy_reviewer IAM role to grant full permissions for the S3 bucket and to grant decrypt permissions for the custom KMS key.
  3. Update the custom KMS key policy in the Creative account to grant decrypt permissions to the strategy_reviewer IAM role.
  4. Create a bucket policy that includes read permissions for the S3 bucket. Set the principal of the bucket policy to an anonymous user.
  5. Update the custom KMS key policy in the Creative account to grant encrypt permissions to the strategy_reviewer IAM role.
  6. Update the strategy_reviewer IAM role to grant read permissions for the S3 bucket and to grant decrypt permissions for the custom KMS key.

Answer(s): A,C,F

Explanation:

To ensure that users in the Strategy account can access the S3 bucket in the Creative account with the minimum necessary permissions, the solutions architect should take the following steps:
A) Create a bucket policy that includes read permissions for the S3 bucket. Set the principal of the bucket policy to the account ID of the Strategy account.
- This bucket policy grants the necessary permissions for users in the Strategy account to read (list and get) objects from the S3 bucket. By specifying the account ID as the principal, it ensures that only users in the Strategy account can access the bucket.
C) Update the custom KMS key policy in the Creative account to grant decrypt permissions to the strategy_reviewer IAM role.
- Since the S3 bucket uses a KMS key for encryption, users must also have permission to decrypt the objects. Updating the KMS key policy to include the strategy_reviewer IAM role allows users assuming this role to decrypt objects stored in the S3 bucket.
F) Update the strategy_reviewer IAM role to grant read permissions for the S3 bucket and to grant decrypt permissions for the custom KMS key.
- While the bucket policy provides access at the bucket level, granting permissions directly in the IAM role ensures that the users assuming this role have the necessary permissions for both the S3 bucket and the KMS key. This allows for a fine-grained approach to security, ensuring that users have exactly the permissions they need.
These steps collectively ensure that the users from the Strategy account can access the S3 bucket and its contents securely and with minimal permissions, adhering to best practices for security and access control in AWS.
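A hedged sketch of the three policy pieces (A, C, F) expressed with boto3. The account ID, bucket name, and policy name are placeholders, and the KMS statement shown would need to be merged into the key's existing policy before being applied with kms.put_key_policy.

```python
import json
import boto3

CREATIVE_BUCKET = "creative-assets-bucket"          # placeholder
STRATEGY_ACCOUNT_ID = "222222222222"                # placeholder
STRATEGY_ROLE_ARN = f"arn:aws:iam::{STRATEGY_ACCOUNT_ID}:role/strategy_reviewer"

# A) Bucket policy in the Creative account: read-only access for the Strategy account.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": f"arn:aws:iam::{STRATEGY_ACCOUNT_ID}:root"},
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": [
            f"arn:aws:s3:::{CREATIVE_BUCKET}",
            f"arn:aws:s3:::{CREATIVE_BUCKET}/*",
        ],
    }],
}
boto3.client("s3").put_bucket_policy(
    Bucket=CREATIVE_BUCKET, Policy=json.dumps(bucket_policy)
)

# C) Statement to add to the KMS key policy in the Creative account:
# allow the strategy_reviewer role to decrypt objects encrypted with the key.
kms_statement = {
    "Effect": "Allow",
    "Principal": {"AWS": STRATEGY_ROLE_ARN},
    "Action": "kms:Decrypt",
    "Resource": "*",
}

# F) Inline policy on strategy_reviewer in the Strategy account: read + decrypt only.
role_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{CREATIVE_BUCKET}",
                f"arn:aws:s3:::{CREATIVE_BUCKET}/*",
            ],
        },
        {"Effect": "Allow", "Action": "kms:Decrypt", "Resource": "*"},
    ],
}
boto3.client("iam").put_role_policy(
    RoleName="strategy_reviewer",
    PolicyName="creative-bucket-read",
    PolicyDocument=json.dumps(role_policy),
)
```

Note that cross-account access requires both sides to grant permission: the resource policies in the Creative account (bucket policy and key policy) and the identity policy on the role in the Strategy account.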



A life sciences company is using a combination of open source tools to manage data analysis workflows and Docker containers running on servers in its on-premises data center to process genomics data. Sequencing data is generated and stored on a local storage area network (SAN), and then the data is processed. The research and development teams are running into capacity issues and have decided to re-architect their genomics analysis platform on AWS to scale based on workload demands and reduce the turnaround time from weeks to days.

The company has a high-speed AWS Direct Connect connection. Sequencers will generate around 200 GB of data for each genome, and individual jobs can take several hours to process the data with ideal compute capacity. The end result will be stored in Amazon S3. The company is expecting 10-15 job requests each day.

Which solution meets these requirements?

  1. Use regularly scheduled AWS Snowball Edge devices to transfer the sequencing data into AWS. When AWS receives the Snowball Edge device and the data is loaded into Amazon S3, use S3 events to trigger an AWS Lambda function to process the data.
  2. Use AWS Data Pipeline to transfer the sequencing data to Amazon S3. Use S3 events to trigger an Amazon EC2 Auto Scaling group to launch custom-AMI EC2 instances running the Docker containers to process the data.
  3. Use AWS DataSync to transfer the sequencing data to Amazon S3. Use S3 events to trigger an AWS Lambda function that starts an AWS Step Functions workflow. Store the Docker images in Amazon Elastic Container Registry (Amazon ECR) and trigger AWS Batch to run the container and process the sequencing data.
  4. Use an AWS Storage Gateway file gateway to transfer the sequencing data to Amazon S3. Use S3 events to trigger an AWS Batch job that executes on Amazon EC2 instances running the Docker containers to process the data.

Answer(s): C

Explanation:

To meet the requirements of the life sciences company's genomics analysis platform on AWS, the following solution is proposed:
C) Use AWS DataSync to transfer the sequencing data to Amazon S3. Use S3 events to trigger an AWS Lambda function that starts an AWS Step Functions workflow. Store the Docker images in Amazon Elastic Container Registry (Amazon ECR) and trigger AWS Batch to run the container and process the sequencing data.
1. AWS DataSync: This service is designed for efficiently transferring large amounts of data to AWS. It is suitable for moving the sequencing data from the on-premises storage area network (SAN) into Amazon S3 with minimal operational overhead.
2. S3 Events: Once the data is in S3, leveraging S3 event notifications can trigger subsequent processes. This mechanism allows for automation, as it can initiate workflows immediately after data is available.
3. AWS Lambda and Step Functions: By using AWS Lambda to trigger an AWS Step Functions workflow, you can orchestrate complex workflows easily, such as data validation, processing, and additional steps as needed for the genomics data.
4. Amazon Elastic Container Registry (ECR): Storing Docker images in ECR allows for easy management and deployment of containerized applications, making it simpler to run the necessary data processing jobs.
5. AWS Batch: This service enables you to run batch computing jobs in a managed environment, which is particularly useful for handling the compute-intensive analysis of genomics data. AWS Batch will dynamically provision the required resources based on the job demands, thereby optimizing cost and performance.
This comprehensive solution effectively addresses the company's need to scale based on workload demands and reduces the turnaround time for processing genomics data from weeks to days while ensuring efficient data transfer and processing on AWS.
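A minimal sketch of the glue between the S3 event and the Step Functions workflow described in answer C: a Lambda handler that starts one execution per uploaded sequencing object. The state machine ARN is assumed to arrive through an environment variable; it is not a value given in the question.

```python
import json
import os
import boto3

sfn = boto3.client("stepfunctions")

def handler(event, context):
    """Triggered by S3 ObjectCreated events for newly uploaded sequencing data."""
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Start the genomics workflow; downstream states submit the AWS Batch job
        # that runs the container image stored in Amazon ECR.
        sfn.start_execution(
            stateMachineArn=os.environ["STATE_MACHINE_ARN"],
            input=json.dumps({"bucket": bucket, "key": key}),
        )
    return {"statusCode": 200}
```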





