Free Amazon SCS-C01 Exam Braindumps (page: 18)

A company needs its Amazon Elastic Block Store (Amazon EBS) volumes to be encrypted at all times. During a security incident, EBS snapshots of suspicious instances are shared to a forensics account for analysis. A security engineer attempting to share a suspicious EBS snapshot to the forensics account receives the following error:

"Unable to share snapshot: An error occurred (OperationNotPermitted) when calling the ModifySnapshotAttribute operation: Encrypted snapshots with EBS default key cannot be shared.

Which combination of steps should the security engineer take in the incident account to complete the sharing operation? (Select THREE.)

  A. Create a customer managed CMK. Copy the EBS snapshot, encrypting the destination snapshot with the new CMK.
  B. Allow forensics account principals to use the CMK by modifying its key policy.
  C. Create an Amazon EC2 instance. Attach the encrypted, suspicious EBS volume. Copy data from the suspicious volume to an unencrypted volume. Snapshot the unencrypted volume.
  D. Copy the EBS snapshot to a new unencrypted snapshot.
  E. Restore a volume from the suspicious EBS snapshot. Create an unencrypted EBS volume of the same size.
  F. Share the target EBS snapshot with the forensics account.

Answer(s): A,B,F
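
For reference, the three chosen steps map to a short sequence of API calls. Below is a minimal boto3 sketch of steps A, B, and F; the region, account ID, snapshot ID, and policy statement are hypothetical placeholders, and error handling is omitted:

```python
import json
import boto3

REGION = "us-east-1"                              # placeholder
FORENSICS_ACCOUNT_ID = "222222222222"             # placeholder
SOURCE_SNAPSHOT_ID = "snap-0123456789abcdef0"     # placeholder

kms = boto3.client("kms", region_name=REGION)
ec2 = boto3.client("ec2", region_name=REGION)

# Step A: create a customer managed CMK and copy the snapshot so the
# copy is encrypted with it (snapshots encrypted with the default EBS
# key cannot be shared, as the error message states).
key_id = kms.create_key(Description="Forensics sharing key")["KeyMetadata"]["KeyId"]
copy = ec2.copy_snapshot(
    SourceRegion=REGION,
    SourceSnapshotId=SOURCE_SNAPSHOT_ID,
    Encrypted=True,
    KmsKeyId=key_id,
)
target_snapshot_id = copy["SnapshotId"]
ec2.get_waiter("snapshot_completed").wait(SnapshotIds=[target_snapshot_id])

# Step B: modify the CMK's key policy so forensics account principals
# can use the key to read the shared snapshot.
policy = json.loads(kms.get_key_policy(KeyId=key_id, PolicyName="default")["Policy"])
policy["Statement"].append({
    "Sid": "AllowForensicsAccountUseOfTheKey",
    "Effect": "Allow",
    "Principal": {"AWS": f"arn:aws:iam::{FORENSICS_ACCOUNT_ID}:root"},
    "Action": ["kms:Decrypt", "kms:DescribeKey", "kms:CreateGrant"],
    "Resource": "*",
})
kms.put_key_policy(KeyId=key_id, PolicyName="default", Policy=json.dumps(policy))

# Step F: share the re-encrypted target snapshot with the forensics account.
ec2.modify_snapshot_attribute(
    SnapshotId=target_snapshot_id,
    Attribute="createVolumePermission",
    OperationType="add",
    UserIds=[FORENSICS_ACCOUNT_ID],
)
```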



A company's data lake uses Amazon S3 and Amazon Athena. The company's security engineer has been asked to design an encryption solution that meets the company's data protection requirements. The encryption solution must work with Amazon S3 and keys managed by the company. The encryption solution must be protected in a hardware security module that is validated under Federal Information Processing Standards (FIPS) 140-2 Level 3.

Which solution meets these requirements?

  A. Use client-side encryption with an AWS KMS customer managed key implemented with the AWS Encryption SDK.
  B. Use AWS CloudHSM to store the keys and perform cryptographic operations. Save the encrypted text in Amazon S3.
  C. Use an AWS KMS customer managed key that is backed by a custom key store using AWS CloudHSM.
  D. Use an AWS KMS customer managed key with the bring your own key (BYOK) feature to import a key stored in AWS CloudHSM.

Answer(s): C

Explanation:

AWS CloudHSM uses hardware security modules that are validated at FIPS 140-2 Level 3. A KMS customer managed key backed by a custom key store in CloudHSM keeps the key material in those HSMs while still integrating with Amazon S3 server-side encryption (SSE-KMS), so Athena queries continue to work.
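
For reference, a minimal boto3 sketch of option C, assuming an already-initialized, active CloudHSM cluster; the cluster ID, trust anchor certificate file, password, and bucket name are hypothetical placeholders:

```python
import boto3

kms = boto3.client("kms")
s3 = boto3.client("s3")

CLUSTER_ID = "cluster-1a23b4cdefg"                      # placeholder
TRUST_ANCHOR = open("customerCA.crt").read()            # from cluster init

# Create a KMS custom key store backed by the CloudHSM cluster.
store = kms.create_custom_key_store(
    CustomKeyStoreName="datalake-key-store",
    CloudHsmClusterId=CLUSTER_ID,
    TrustAnchorCertificate=TRUST_ANCHOR,
    KeyStorePassword="kmsuser-password",                # placeholder
)

# Connecting is asynchronous; wait until DescribeCustomKeyStores
# reports CONNECTED before creating keys in the store.
kms.connect_custom_key_store(CustomKeyStoreId=store["CustomKeyStoreId"])

# Create a customer managed key whose key material lives in the
# FIPS 140-2 Level 3 validated CloudHSM cluster.
key = kms.create_key(
    Description="Data lake key backed by CloudHSM",
    Origin="AWS_CLOUDHSM",
    CustomKeyStoreId=store["CustomKeyStoreId"],
)

# S3 server-side encryption references the key as usual, so Athena
# queries over the encrypted objects keep working.
s3.put_object(
    Bucket="example-data-lake-bucket",                  # placeholder
    Key="raw/events.json",
    Body=b"{}",
    ServerSideEncryption="aws:kms",
    SSEKMSKeyId=key["KeyMetadata"]["Arn"],
)
```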



A company's information security team wants to do near-real-time anomaly detection on Amazon EC2 performance and usage statistics. Log aggregation is the responsibility of a security engineer. To do the analysis, the engineer needs to gather logs from all of the company's AWS accounts in a single place.

How should the Security Engineer go about doing this?

  A. Log in to each account four times a day and filter the AWS CloudTrail log data, then copy and paste the logs into the Amazon S3 bucket in the destination account.
  B. Set up Amazon CloudWatch to stream data to an Amazon S3 bucket in each source account. Set up bucket replication for each source account into a centralized bucket owned by the security engineer.
  C. Set up an AWS Config aggregator to collect AWS configuration data from multiple sources.
  D. Set up Amazon CloudWatch cross-account log data sharing with subscriptions in each account. Send the logs to Amazon Kinesis Data Firehose in the security engineer's account.

Answer(s): D

Explanation:

Read the prerequisites in the question carefully. The solution must support near-real-time analysis of the log data. CloudWatch does not stream logs to S3; it supports exporting them to S3 with an expected delay of up to 12 hours:

https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/S3Export.html

"Log data can take up to 12 hours to become available for export. For near real-time analysis of log data, see Analyzing log data with CloudWatch Logs Insights or Real-time processing of log data with subscriptions instead."
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/Subscriptions.html

"You can use subscriptions to get access to a real-time feed of log events from CloudWatch Logs and have it delivered to other services such as an Amazon Kinesis stream, an Amazon Kinesis Data Firehose stream, or IAM Lambda for custom processing, analysis, or loading to other systems. When log events are sent to the receiving service, they are Base64 encoded and compressed with the gzip format." https://docs.IAM.amazon.com/AmazonCloudWatch/latest/logs/CrossAccountSubscriptions.
html
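
For reference, a minimal boto3 sketch of the cross-account subscription setup that answer D describes; the account IDs, role name, Firehose stream, and log group name are hypothetical placeholders:

```python
import json
import boto3

REGION = "us-east-1"
SECURITY_ACCOUNT_ID = "111111111111"                   # destination account
SOURCE_ACCOUNT_IDS = ["222222222222", "333333333333"]  # placeholders

logs = boto3.client("logs", region_name=REGION)

# --- In the security engineer's account ---------------------------------
# A CloudWatch Logs destination fronts the Kinesis Data Firehose stream;
# the role must allow CloudWatch Logs to write to the stream.
dest = logs.put_destination(
    destinationName="central-ec2-logs",
    targetArn=f"arn:aws:firehose:{REGION}:{SECURITY_ACCOUNT_ID}:deliverystream/central-logs",
    roleArn=f"arn:aws:iam::{SECURITY_ACCOUNT_ID}:role/CWLtoFirehoseRole",
)

# Destination policy that lets the source accounts subscribe.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": SOURCE_ACCOUNT_IDS},
        "Action": "logs:PutSubscriptionFilter",
        "Resource": dest["destination"]["arn"],
    }],
}
logs.put_destination_policy(
    destinationName="central-ec2-logs",
    accessPolicy=json.dumps(policy),
)

# --- In each source account ----------------------------------------------
# Run with that account's credentials: subscribe a log group to the
# cross-account destination (an empty pattern forwards every event).
logs.put_subscription_filter(
    logGroupName="/ec2/performance",                   # placeholder
    filterName="to-central-account",
    filterPattern="",
    destinationArn=f"arn:aws:logs:{REGION}:{SECURITY_ACCOUNT_ID}:destination:central-ec2-logs",
)
```

As the quoted documentation notes, events arrive Base64 encoded and gzip compressed, so downstream consumers must decompress them before analysis.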



A company has a VPC with several Amazon EC2 instances behind a NAT gateway. The company's security policy states that all network traffic must be logged and must include the original source and destination IP addresses. The existing VPC Flow Logs do not include this information. A security engineer needs to recommend a solution.

Which combination of steps should the security engineer recommend? (Select TWO.)

  A. Edit the existing VPC Flow Logs. Change the log format of the VPC Flow Logs from the Amazon default format to a custom format.
  B. Delete and recreate the existing VPC Flow Logs. Change the log format of the VPC Flow Logs from the Amazon default format to a custom format.
  C. Change the destination to Amazon CloudWatch Logs.
  D. Include the pkt-srcaddr and pkt-dstaddr fields in the log format.
  E. Include the subnet-id and instance-id fields in the log format.

Answer(s): B,D

Explanation:

A flow log's configuration, including its log format, cannot be changed after creation, so the existing flow logs must be deleted and recreated with a custom format. The pkt-srcaddr and pkt-dstaddr fields record the original (pre-NAT) source and destination IP addresses of the traffic; subnet-id and instance-id do not.
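
For reference, a minimal boto3 sketch of steps B and D; the VPC ID and S3 bucket are hypothetical placeholders:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
VPC_ID = "vpc-0123456789abcdef0"  # placeholder

# Flow log settings cannot be edited in place, so find and delete the
# existing flow logs for the VPC ...
existing = ec2.describe_flow_logs(
    Filters=[{"Name": "resource-id", "Values": [VPC_ID]}]
)["FlowLogs"]
if existing:
    ec2.delete_flow_logs(FlowLogIds=[fl["FlowLogId"] for fl in existing])

# ... then recreate them with a custom format whose pkt-srcaddr and
# pkt-dstaddr fields capture the original IPs behind the NAT gateway.
ec2.create_flow_logs(
    ResourceIds=[VPC_ID],
    ResourceType="VPC",
    TrafficType="ALL",
    LogDestinationType="s3",
    LogDestination="arn:aws:s3:::example-flow-log-bucket",  # placeholder
    LogFormat=(
        "${version} ${account-id} ${interface-id} ${srcaddr} ${dstaddr} "
        "${pkt-srcaddr} ${pkt-dstaddr} ${srcport} ${dstport} ${protocol} "
        "${packets} ${bytes} ${start} ${end} ${action} ${log-status}"
    ),
)
```

With the custom format in place, pkt-srcaddr and pkt-dstaddr show the addresses of the instances behind the NAT gateway, while srcaddr and dstaddr show the NAT gateway's own interface addresses.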


