Free SAP-C01 Exam Braindumps

Your company's policies require encryption of sensitive data at rest. You are considering the possible options for protecting data at rest on an EBS data volume attached to an EC2 instance.
Which of these options would allow you to encrypt your data at rest? (Choose 3)

  1. Implement third-party volume encryption tools
  2. Implement SSL/TLS for all services running on the server
  3. Encrypt data inside your applications before storing it on EBS
  4. Encrypt data using native data encryption drivers at the file system level
  5. Do nothing as EBS volumes are encrypted by default

Answer(s): 1, 3, 4
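As an illustration of option 3, here is a minimal sketch of application-level encryption, assuming the third-party Python `cryptography` package and a placeholder file name; a real deployment would source the key from AWS KMS or Secrets Manager rather than generating it in process.

```python
# Minimal sketch of option 3: encrypt data in the application before it
# is written to the EBS-backed file system (file name is a placeholder).
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, fetch this from KMS/Secrets Manager
cipher = Fernet(key)

plaintext = b"sensitive customer record"
ciphertext = cipher.encrypt(plaintext)

# Only ciphertext ever touches the volume.
with open("record.enc", "wb") as f:
    f.write(ciphertext)

# Decrypt in memory when reading back; plaintext is never persisted.
with open("record.enc", "rb") as f:
    assert cipher.decrypt(f.read()) == plaintext
```

Options 1 and 4 achieve the same at-rest guarantee one layer lower, with a volume- or file-system-level driver encrypting transparently beneath the application.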



A company runs a memory-intensive analytics application using On-Demand Amazon EC2 C5 compute-optimized instances. The application is used continuously, and demand doubles during working hours. The application currently scales based on CPU usage. When scaling in occurs, a lifecycle hook is used because the instance requires 4 minutes to clean up the application state before terminating.

Because users reported poor performance during working hours, scheduled scaling actions were implemented so additional instances would be added during working hours. The Solutions Architect has been asked to reduce the cost of the application.

Which solution is MOST cost-effective?

  1. Use the existing launch configuration that uses C5 instances, and update the application AMI to include the Amazon CloudWatch agent. Change the Auto Scaling policies to scale based on memory utilization. Use Reserved Instances for the number of instances required after working hours, and use Spot Instances to cover the increased demand during working hours.
  2. Update the existing launch configuration to use R5 instances, and update the application AMI to include SSM Agent. Change the Auto Scaling policies to scale based on memory utilization. Use Reserved Instances for the number of instances required after working hours, and use Spot Instances with On-Demand Instances to cover the increased demand during working hours.
  3. Use the existing launch configuration that uses C5 instances, and update the application AMI to include SSM Agent. Leave the Auto Scaling policies to scale based on CPU utilization. Use scheduled Reserved Instances for the number of instances required after working hours, and use Spot Instances to cover the increased demand during working hours.
  4. Create a new launch configuration using R5 instances, and update the application AMI to include the Amazon CloudWatch agent. Change the Auto Scaling policies to scale based on memory utilization. Use Standard Reserved Instances for the number of instances required after working hours, and use On-Demand Instances to cover the increased demand during working hours.

Answer(s): 4
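To show what scaling on memory looks like in practice, here is a hedged boto3 sketch of a target tracking policy on the CloudWatch agent's `mem_used_percent` metric; the Auto Scaling group name and target value are placeholder assumptions.

```python
# Sketch: target tracking on memory utilization, assuming the CloudWatch
# agent on each instance publishes mem_used_percent to the CWAgent
# namespace. Group name and target value are placeholders.
import boto3

autoscaling = boto3.client("autoscaling")

autoscaling.put_scaling_policy(
    AutoScalingGroupName="analytics-asg",      # placeholder
    PolicyName="memory-target-tracking",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "CustomizedMetricSpecification": {
            "MetricName": "mem_used_percent",
            "Namespace": "CWAgent",
            "Dimensions": [
                {"Name": "AutoScalingGroupName", "Value": "analytics-asg"}
            ],
            "Statistic": "Average",
        },
        "TargetValue": 75.0,                   # example threshold
    },
)
```

This is why option 4 bundles the CloudWatch agent into the AMI: memory is not a built-in EC2 metric, so the agent must publish it before any policy can scale on it.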



A company has a data center that must be migrated to AWS as quickly as possible. The data center has a 500 Mbps AWS Direct Connect link and a separate, fully available 1 Gbps ISP connection. A Solutions Architect must transfer 20 TB of data from the data center to an Amazon S3 bucket.
What is the FASTEST way to transfer the data?

  1. Upload the data to the S3 bucket using the existing DX link.
  2. Send the data to AWS using the AWS Import/Export service.
  3. Upload the data using an 80 TB AWS Snowball device.
  4. Upload the data to the S3 bucket using S3 Transfer Acceleration.

Answer(s): 2

Explanation:

AWS Import/Export supports importing and exporting data into and out of Amazon S3 buckets. For significant data sets, AWS Import/Export is often faster than Internet transfer and more cost-effective than upgrading your connectivity.
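A rough back-of-envelope check of the network options, assuming ideal line rates with no protocol overhead (real sustained throughput is typically well below this, which is what favors shipped devices for large transfers):

```python
# Transfer-time arithmetic for 20 TB at ideal line rate.
DATA_BITS = 20 * 10**12 * 8              # 20 TB in bits

for label, bps in [("500 Mbps Direct Connect", 500 * 10**6),
                   ("1 Gbps ISP link", 1 * 10**9)]:
    hours = DATA_BITS / bps / 3600
    print(f"{label}: {hours:.0f} hours ({hours / 24:.1f} days) at best")

# 500 Mbps -> ~89 hours (~3.7 days); 1 Gbps -> ~44 hours (~1.9 days),
# before accounting for contention, TCP overhead, and other traffic.
```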


Reference:

https://stackshare.io/stackups/aws-direct-connect-vs-aws-import-export



A company wants to host its website on AWS using serverless architecture design patterns for global customers. The company has outlined its requirements as follows:

The website should be responsive.
The website should offer minimal latency.
The website should be highly available.
Users should be able to authenticate through social identity providers such as Google, Facebook, and Amazon.
There should be baseline DDoS protections for spikes in traffic.

How can the design requirements be met?

  1. Use Amazon CloudFront with Amazon ECS for hosting the website. Use AWS Secrets Manager to provide user management and authentication functions. Use ECS Docker containers to build an API.
  2. Use Amazon Route 53 latency-based routing with an Application Load Balancer and AWS Fargate in different Regions for hosting the website. Use Amazon Cognito to provide user management and authentication functions. Use Amazon EKS containers.
  3. Use Amazon CloudFront with Amazon S3 for hosting static web resources. Use Amazon Cognito to provide user management and authentication functions. Use Amazon API Gateway with AWS Lambda to build an API.
  4. Use AWS Direct Connect with Amazon CloudFront and Amazon S3 for hosting static web resources. Use Amazon Cognito to provide user management and authentication functions. Use AWS Lambda to build an API.

Answer(s): 3
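To make the selected pattern concrete, below is a minimal, hypothetical Lambda handler for an API Gateway proxy integration; the response body is illustrative, and a Cognito user pool authorizer on the API stage is assumed to cover the social identity provider requirement.

```python
# Hypothetical API Gateway (proxy integration) -> Lambda handler.
# Authentication is assumed to be enforced upstream by a Cognito
# user pool authorizer attached to the API Gateway stage.
import json

def handler(event, context):
    # With a Cognito authorizer configured, API Gateway forwards the
    # caller's token claims in the request context.
    claims = (event.get("requestContext", {})
                   .get("authorizer", {})
                   .get("claims", {}))
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({
            "message": "hello from the serverless API",
            "user": claims.get("email", "anonymous"),
        }),
    }
```

CloudFront in front of S3 and API Gateway also brings the baseline DDoS protection (AWS Shield Standard) that the requirements call for.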



A company is currently using AWS CodeCommit for its source control and AWS CodePipeline for continuous integration. The pipeline has a build stage for building the artifacts, which are then staged in an Amazon S3 bucket.

The company has identified various improvement opportunities in the existing process, and a Solutions Architect has been given the following requirements:

Create a new pipeline to support feature development
Support feature development without impacting production applications
Incorporate continuous testing with unit tests
Isolate development and production artifacts
Support the capability to merge tested code into production code.

How should the Solutions Architect achieve these requirements?

  1. Trigger a separate pipeline from CodeCommit feature branches. Use AWS CodeBuild for running unit tests. Use CodeBuild to stage the artifacts within an S3 bucket in a separate testing account.
  2. Trigger a separate pipeline from CodeCommit feature branches. Use AWS Lambda for running unit tests. Use AWS CodeDeploy to stage the artifacts within an S3 bucket in a separate testing account.
  3. Trigger a separate pipeline from CodeCommit tags. Use Jenkins for running unit tests. Create a stage in the pipeline with S3 as the target for staging the artifacts in an S3 bucket in a separate testing account.
  4. Create a separate CodeCommit repository for feature development and use it to trigger the pipeline. Use AWS Lambda for running unit tests. Use AWS CodeBuild to stage the artifacts within different S3 buckets in the same production account.

Answer(s): 1
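As one possible wiring for option 1, here is a hypothetical sketch of an EventBridge-triggered Lambda that starts a separate feature pipeline only for `feature/*` branches; the pipeline name and branch prefix are placeholder assumptions.

```python
# Hypothetical glue: EventBridge routes CodeCommit "repository state
# change" events here; start the feature pipeline for feature/* branches.
import boto3

codepipeline = boto3.client("codepipeline")

FEATURE_PREFIX = "feature/"               # placeholder branch convention
FEATURE_PIPELINE = "app-feature-pipeline" # placeholder pipeline name

def handler(event, context):
    detail = event.get("detail", {})
    ref = detail.get("referenceName", "")
    if detail.get("referenceType") == "branch" and ref.startswith(FEATURE_PREFIX):
        # The feature pipeline runs unit tests in CodeBuild and stages
        # artifacts in the testing account's S3 bucket, keeping them
        # isolated from production artifacts.
        codepipeline.start_pipeline_execution(name=FEATURE_PIPELINE)
```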