Free SAP-C01 Exam Braindumps (page: 76)

Page 76 of 134

An online survey company runs its application in the AWS Cloud. The application is distributed and consists of microservices that run in an automatically scaled Amazon Elastic Container Service (Amazon ECS) cluster. The ECS cluster is a target for an Application Load Balancer (ALB). The ALB is a custom origin for an Amazon CloudFront distribution.

The company has a survey that contains sensitive data. The sensitive data must be encrypted when it moves through the application. The application's data-handling microservice is the only microservice that should be able to decrypt the data.

Which solution will meet these requirements?

  1. Create a symmetric AWS Key Management Service (AWS KMS) key that is dedicated to the data-handling microservice. Create a field-level encryption profile and a configuration. Associate the KMS key and the configuration with the CloudFront cache behavior.
  2. Create an RSA key pair that is dedicated to the data-handling microservice. Upload the public key to the CloudFront distribution. Create a field-level encryption profile and a configuration. Add the configuration to the CloudFront cache behavior.
  3. Create a symmetric AWS Key Management Service (AWS KMS) key that is dedicated to the data-handling microservice. Create a Lambda@Edge function. Program the function to use the KMS key to encrypt the sensitive data.
  4. Create an RSA key pair that is dedicated to the data-handling microservice. Create a Lambda@Edge function. Program the function to use the private key of the RSA key pair to encrypt the sensitive data.

Answer(s): B

Explanation:

B) is the correct solution. Here's why:
- Field-level encryption in Amazon CloudFront protects sensitive information in web requests by encrypting specific fields at the edge before the request is forwarded to the origin. Only the designated fields are encrypted.
- The RSA key pair approach fits this scenario: the public key is uploaded to CloudFront, which uses it to encrypt the sensitive data fields in each request.
- The data-handling microservice is the only component that holds the private key of the RSA key pair, so it alone can decrypt the data.
The other options, using symmetric AWS KMS keys or Lambda@Edge functions, do not provide CloudFront field-level encryption. Field-level encryption with an RSA key pair is purpose-built for protecting this kind of sensitive data in transit.
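To make the decryption boundary concrete, here is a minimal sketch of the crypto model, written with the third-party `cryptography` package (the key size and padding choices are illustrative, not CloudFront's exact parameters): CloudFront holds only the public key, while the data-handling microservice keeps the private key.

```python
# Sketch: CloudFront encrypts a field with the uploaded RSA public key;
# only the data-handling microservice, holding the private key, can decrypt.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# The microservice generates the key pair and keeps the private half.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()  # this half is uploaded to CloudFront

# Illustrative padding choice; CloudFront's actual scheme may differ.
oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

def encrypt_field(value: str) -> bytes:
    """What the edge does with the public key before forwarding to the origin."""
    return public_key.encrypt(value.encode(), oaep)

def decrypt_field(ciphertext: bytes) -> str:
    """What only the data-handling microservice can do."""
    return private_key.decrypt(ciphertext, oaep).decode()
```

The other microservices see only the ciphertext field, which is why the requirement is satisfied without giving them any key material.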



A solutions architect is determining the DNS strategy for an existing VPC. The VPC is provisioned to use the 10.24.34.0/24 CIDR block. The VPC also uses Amazon Route 53 Resolver for DNS. New requirements mandate that DNS queries must use private hosted zones. Additionally, instances that have public IP addresses must receive corresponding public hostnames.

Which solution will meet these requirements to ensure that the domain names are correctly resolved within the VPC?

  1. Create a private hosted zone. Activate the enableDnsSupport attribute and the enableDnsHostnames attribute for the VPC. Update the VPC DHCP options set to include domain-name-servers=10.24.34.2.
  2. Create a private hosted zone. Associate the private hosted zone with the VPC. Activate the enableDnsSupport attribute and the enableDnsHostnames attribute for the VPC. Create a new VPC DHCP options set, and configure domain-name-servers=AmazonProvidedDNS. Associate the new DHCP options set with the VPC.
  3. Deactivate the enableDnsSupport attribute for the VPC. Activate the enableDnsHostnames attribute for the VPC. Create a new VPC DHCP options set, and configure domain-name-servers=10.24.34.2. Associate the new DHCP options set with the VPC.
  4. Create a private hosted zone. Associate the private hosted zone with the VPC. Activate the enableDnsSupport attribute for the VPC. Deactivate the enableDnsHostnames attribute for the VPC. Update the VPC DHCP options set to include domain-name-servers=AmazonProvidedDNS.

Answer(s): B

Explanation:

B) is the correct solution. Here's why:
1. Private Hosted Zone: To meet the requirement that DNS queries must use private hosted zones, create a private hosted zone and associate it with the VPC.
2. DNS Support and Hostnames:
   - The enableDnsSupport attribute must be activated so that instances in the VPC can reach the Amazon-provided DNS server.
   - The enableDnsHostnames attribute must also be activated so that instances with public IP addresses receive corresponding public DNS hostnames.
3. VPC DHCP Options Set: To ensure that domain names resolve correctly through the Amazon-provided DNS servers, create a new VPC DHCP options set with domain-name-servers set to AmazonProvidedDNS and associate it with the VPC. This ensures correct resolution of both public and private addresses within the VPC.
Other options either deactivate necessary attributes or provide incorrect DNS settings, which would not meet the requirements for both public hostnames and private hosted zones.
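A minimal sketch of the VPC-side steps, written against a boto3-style EC2 client (for example `boto3.client("ec2")`). The method names below are the standard EC2 API operations, but the exact sequence is illustrative; the private hosted zone itself would be created and associated through Route 53 (CreateHostedZone with a VPC association), which is omitted here.

```python
# Sketch of option B's VPC configuration: the client is passed in rather
# than created here, so the function can be exercised without AWS access.
def configure_vpc_dns(ec2, vpc_id: str) -> str:
    # Step 2: turn on both DNS attributes for the VPC.
    ec2.modify_vpc_attribute(VpcId=vpc_id, EnableDnsSupport={"Value": True})
    ec2.modify_vpc_attribute(VpcId=vpc_id, EnableDnsHostnames={"Value": True})

    # Step 3: create a new DHCP options set pointing at AmazonProvidedDNS
    # and associate it with the VPC.
    resp = ec2.create_dhcp_options(
        DhcpConfigurations=[
            {"Key": "domain-name-servers", "Values": ["AmazonProvidedDNS"]}
        ]
    )
    options_id = resp["DhcpOptions"]["DhcpOptionsId"]
    ec2.associate_dhcp_options(DhcpOptionsId=options_id, VpcId=vpc_id)
    return options_id
```

Note that AmazonProvidedDNS resolves to the VPC's reserved resolver address (the base of the VPC CIDR plus two, i.e. 10.24.34.2 here), which is why option 3's hard-coded address is unnecessary as well as incomplete.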



A data analytics company has an Amazon Redshift cluster that consists of several reserved nodes. The cluster is experiencing unexpected bursts of usage because a team of employees is compiling a deep audit analysis report. The queries to generate the report are complex read queries and are CPU intensive.

Business requirements dictate that the cluster must be able to service read and write queries at all times. A solutions architect must devise a solution that accommodates the bursts of usage.

Which solution meets these requirements MOST cost-effectively?

  1. Provision an Amazon EMR cluster. Offload the complex data processing tasks to it.
  2. Deploy an AWS Lambda function to add capacity to the Amazon Redshift cluster by using a classic resize operation when the cluster’s CPU metrics in Amazon CloudWatch reach 80%.
  3. Deploy an AWS Lambda function to add capacity to the Amazon Redshift cluster by using an elastic resize operation when the cluster’s CPU metrics in Amazon CloudWatch reach 80%.
  4. Turn on the Concurrency Scaling feature for the Amazon Redshift cluster.

Answer(s): D

Explanation:

D) Turning on the Concurrency Scaling feature for the Amazon Redshift cluster is the most cost-effective and appropriate solution for accommodating bursts in usage.
- Concurrency Scaling automatically adds and removes transient capacity in response to demand spikes, so read queries are serviced without affecting the main cluster's ongoing workloads. It absorbs sudden bursts in query volume without manually provisioning additional nodes or resizing the cluster, and each cluster earns up to one hour of free Concurrency Scaling credits per day, making it the most cost-efficient option here.
- This meets the requirement of servicing both read and write queries at all times, with no significant changes to the existing infrastructure.
Other options:
- A) Offloading complex queries to Amazon EMR would be more costly and complex than needed.
- B) and C) A Lambda-triggered classic or elastic resize introduces delays (and, for a classic resize, a read-only period) while the resize completes, so neither can absorb immediate bursts efficiently.
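As a sketch of how Concurrency Scaling is typically switched on: it is controlled per WLM queue through the parameter group's `wlm_json_configuration` parameter, with `concurrency_scaling` set to `auto`. The helper below (function and queue names are illustrative) just renders that JSON value:

```python
import json

def wlm_with_concurrency_scaling(queue_names):
    """Render a wlm_json_configuration value with Concurrency Scaling enabled.

    Illustrative helper: real configurations may also carry user_group,
    query_group, priority, and query monitoring rules per queue.
    """
    queues = [
        {
            "name": name,
            "queue_type": "auto",           # automatic WLM
            "concurrency_scaling": "auto",  # route eligible queries to scaling clusters
        }
        for name in queue_names
    ]
    return json.dumps(queues)
```

The rendered value would then be applied to the cluster's parameter group, e.g. with `aws redshift modify-cluster-parameter-group --parameter-group-name my-group --parameters ParameterName=wlm_json_configuration,ParameterValue='<json>'` (the parameter-group name is a placeholder).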



A research center is migrating to the AWS Cloud and has moved its on-premises 1 PB object storage to an Amazon S3 bucket. One hundred scientists are using this object storage to store their work-related documents. Each scientist has a personal folder on the object store. All the scientists are members of a single IAM user group.

The research center's compliance officer is worried that scientists will be able to access each other's work. The research center has a strict obligation to report on which scientist accesses which documents. The team that is responsible for these reports has little AWS experience and wants a ready-to-use solution that minimizes operational overhead.

Which combination of actions should a solutions architect take to meet these requirements? (Choose two.)

  1. Create an identity policy that grants the user read and write access. Add a condition that specifies that the S3 paths must be prefixed with ${aws:username}. Apply the policy to the scientists’ IAM user group.
  2. Configure a trail with AWS CloudTrail to capture all object-level events in the S3 bucket. Store the trail output in another S3 bucket. Use Amazon Athena to query the logs and generate reports.
  3. Enable S3 server access logging. Configure another S3 bucket as the target for log delivery. Use Amazon Athena to query the logs and generate reports.
  4. Create an S3 bucket policy that grants read and write access to users in the scientists’ IAM user group.
  5. Configure a trail with AWS CloudTrail to capture all object-level events in the S3 bucket and write the events to Amazon CloudWatch. Use the Amazon Athena CloudWatch connector to query the logs and generate reports.

Answer(s): A,B

Explanation:

To meet the research center's requirement of securing the scientists' work and reporting on access to the documents, the following steps should be taken:
A) Create an identity policy that grants the users access to their own documents. By adding a condition that specifies the S3 paths must be prefixed with the ${aws:username} policy variable, you ensure that each scientist can only access their personal folder. This enforces proper access control so that scientists cannot view or modify each other's work.
B) Configure AWS CloudTrail to capture all object-level events in the S3 bucket. CloudTrail can log details such as which user accessed or modified which objects in S3. Storing the output in another S3 bucket and using Amazon Athena to query the logs provides an easy, ready-to-use solution for generating reports on access activity. This minimizes operational overhead and gives the compliance team visibility into object-level events.
Other options:
- C) S3 server access logging can capture access events, but log delivery is best-effort and the logs lack the structured, per-API detail and easy querying that CloudTrail provides.
-D) A general S3 bucket policy would not restrict access on a per-user basis, so it would not meet the requirement of limiting access to personal folders.
-E) Writing CloudTrail events directly to CloudWatch introduces unnecessary complexity compared to storing them in S3 and querying with Athena.
Thus, A and B provide the best combination to meet both security and reporting requirements with minimal operational overhead.
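To illustrate the shape of option A's policy (the bucket name and statement IDs below are placeholders), the `${aws:username}` policy variable confines each user to the prefix matching their IAM user name:

```python
import json

# Hypothetical bucket name for illustration only.
BUCKET = "research-center-docs"

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "OwnFolderOnly",
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            # IAM substitutes the caller's user name at evaluation time.
            "Resource": f"arn:aws:s3:::{BUCKET}/${{aws:username}}/*",
        },
        {
            "Sid": "ListOwnPrefixOnly",
            "Effect": "Allow",
            "Action": "s3:ListBucket",
            "Resource": f"arn:aws:s3:::{BUCKET}",
            "Condition": {"StringLike": {"s3:prefix": "${aws:username}/*"}},
        },
    ],
}
policy_json = json.dumps(policy, indent=2)
```

Attached to the scientists' IAM user group, this single policy scales to all one hundred users without per-user statements, which is what keeps the operational overhead low.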





