A company with several AWS accounts is using AWS Organizations and service control policies (SCPs). An administrator created the following SCP and has attached it to an organizational unit (OU) that contains AWS account 1111-1111-1111: Developers working in account 1111-1111-1111 complain that they cannot create Amazon S3 buckets. How should the administrator address this problem?
Answer(s): C
C) Instruct the developers to add Amazon S3 permissions to their IAM entities is the correct answer.

An SCP (service control policy) defines the maximum permissions available within an AWS account, but it does not grant permissions on its own; it acts as a boundary. In this case, the SCP does not explicitly deny the ability to create Amazon S3 buckets, so the issue is that the developers' IAM users or roles have not been granted the permissions needed to create S3 buckets.

To resolve this, the developers need the necessary S3 permissions (for example, s3:CreateBucket) in their IAM policies. Once those permissions are added, they will be able to create S3 buckets, because the SCP does not restrict that action.
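As a sketch, an identity-based IAM policy granting bucket creation might look like the following (the action list is minimal and the wildcard resource is illustrative, not a recommendation):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowBucketCreation",
      "Effect": "Allow",
      "Action": "s3:CreateBucket",
      "Resource": "arn:aws:s3:::*"
    }
  ]
}
```

Attached to the developers' IAM users or roles, this supplies the "Allow" that the SCP alone never provides; the SCP then simply does not block it.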
A company has a monolithic application that is critical to the company's business. The company hosts the application on an Amazon EC2 instance that runs Amazon Linux 2. The company's application team receives a directive from the legal department to back up the data from the instance's encrypted Amazon Elastic Block Store (Amazon EBS) volume to an Amazon S3 bucket. The application team does not have the administrative SSH key pair for the instance. The application must continue to serve the users. Which solution will meet these requirements?
Answer(s): A
A) Attach a role to the instance with permission to write to Amazon S3. Use the AWS Systems Manager Session Manager option to gain access to the instance and run commands to copy data into Amazon S3 is the correct answer.

This solution allows you to securely access the EC2 instance without the SSH key pair by using AWS Systems Manager Session Manager. Once access is gained through Session Manager, the necessary commands can be run to copy data from the EBS volume to the Amazon S3 bucket. Attaching an IAM role with S3 write permissions to the instance ensures that the instance can upload the data to S3.

This approach does not interrupt the running application, so the application continues to serve users while meeting the backup requirement from the legal department.
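The steps above can be sketched with the AWS CLI as follows. The instance ID, profile name, data path, and bucket name are placeholders, and this assumes the instance already runs the SSM agent and its role includes both Session Manager permissions (e.g. the AmazonSSMManagedInstanceCore managed policy) and s3:PutObject on the target bucket:

```
# Attach an instance profile whose role allows S3 writes and SSM access.
aws ec2 associate-iam-instance-profile \
    --instance-id i-0123456789abcdef0 \
    --iam-instance-profile Name=BackupInstanceProfile

# Open an interactive shell on the instance -- no SSH key pair needed.
aws ssm start-session --target i-0123456789abcdef0

# Inside the session, copy the data from the mounted EBS volume to S3.
aws s3 cp /var/app/data s3://my-backup-bucket/backup/ --recursive
```

Because the copy runs on the live filesystem, the application keeps serving users throughout.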
A solutions architect needs to copy data from an Amazon S3 bucket in an AWS account to a new S3 bucket in a new AWS account. The solutions architect must implement a solution that uses the AWS CLI. Which combination of steps will successfully copy the data? (Choose three.)
Answer(s): B,D,F
The correct answers are:

B) Create a bucket policy to allow a user in the destination account to list the source bucket's contents and read the source bucket's objects. Attach the bucket policy to the source bucket. This step ensures that the destination account has the necessary permissions to access and read the objects in the source bucket.

D) Create an IAM policy in the destination account. Configure the policy to allow a user in the destination account to list contents and get objects in the source bucket, and to list contents, put objects, and set object ACLs in the destination bucket. Attach the policy to the user. This step grants the user in the destination account permissions to interact with both the source and destination buckets.

F) Run the aws s3 sync command as a user in the destination account, specifying the source and destination buckets. Running aws s3 sync from the destination account copies the data from the source bucket to the new bucket, relying on the permissions set up in the previous steps.

Together, these steps copy the data from the source bucket in one AWS account to the destination bucket in another account using the AWS CLI, with the appropriate cross-account permissions in place.
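A minimal sketch of the source-bucket policy from step B might look like this. The account ID, user name, and bucket names are placeholders:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowDestinationAccountList",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::222222222222:user/copy-user" },
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::source-bucket"
    },
    {
      "Sid": "AllowDestinationAccountRead",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::222222222222:user/copy-user" },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::source-bucket/*"
    }
  ]
}
```

With the bucket policy and the destination-account IAM policy in place, the user in the destination account then runs:

```
aws s3 sync s3://source-bucket s3://destination-bucket
```

Running the sync from the destination account matters: objects written by the destination-account user are owned by the destination account by default.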
A company built an application based on AWS Lambda deployed in an AWS CloudFormation stack. The last production release of the web application introduced an issue that resulted in an outage lasting several minutes. A solutions architect must adjust the deployment process to support a canary release. Which solution will meet these requirements?
Answer(s): A

A) Create an alias for every newly deployed version of the Lambda function. Use the AWS CLI update-alias command with the routing-config parameter to distribute the load is the correct answer.

This approach implements a canary release using AWS Lambda's versioning and aliases. By pointing an alias at the current version and using the update-alias command with the routing-config parameter, you can gradually shift traffic to the new version of the Lambda function. This lets you test the new version with a small percentage of users before fully rolling it out, which is the defining aspect of a canary release.

This method ensures that you can detect and mitigate issues with a new Lambda function version before it affects all users, minimizing the risk of outages during deployment.
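A sketch of the canary flow with the AWS CLI follows. The function name, alias name, and version number are placeholders:

```
# Publish the new code as an immutable version (suppose this returns version 5).
aws lambda publish-version --function-name my-function

# Route 10% of traffic on the "live" alias to version 5; the other 90%
# continues to hit the alias's primary version.
aws lambda update-alias \
    --function-name my-function \
    --name live \
    --routing-config '{"AdditionalVersionWeights": {"5": 0.1}}'

# Once the canary looks healthy, promote version 5 and clear the split.
aws lambda update-alias \
    --function-name my-function \
    --name live \
    --function-version 5 \
    --routing-config '{}'
```

If errors appear during the canary window, clearing the routing config while leaving the primary version unchanged rolls all traffic back instantly.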
A finance company hosts a data lake in Amazon S3. The company receives financial data records over SFTP each night from several third parties. The company runs its own SFTP server on an Amazon EC2 instance in a public subnet of a VPC. After the files are uploaded, they are moved to the data lake by a cron job that runs on the same instance. The SFTP server is reachable at the DNS name sftp.example.com through Amazon Route 53. What should a solutions architect do to improve the reliability and scalability of the SFTP solution?
Answer(s): B
B) Migrate the SFTP server to AWS Transfer for SFTP. Update the DNS record sftp.example.com in Route 53 to point to the server endpoint hostname is the correct answer.

AWS Transfer for SFTP is a fully managed service that scales automatically and is highly reliable compared with managing an SFTP server on an EC2 instance. This migration offloads the operational burden of running the SFTP server while providing enhanced scalability, availability, and built-in integration with Amazon S3 for direct data transfer to the data lake. By updating the DNS record in Route 53 to point to the AWS Transfer SFTP endpoint, the company ensures a smooth transition without requiring changes from the third parties uploading the data.

This solution improves both reliability and scalability without manual instance management or custom scaling configurations.
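A sketch of the migration with the AWS CLI follows. The hosted zone ID, Region, and server endpoint hostname are placeholders (the real endpoint hostname is returned when the server is created), and user setup for the third parties is omitted:

```
# Create a managed SFTP endpoint with service-managed users, backed by S3.
aws transfer create-server \
    --protocols SFTP \
    --identity-provider-type SERVICE_MANAGED

# Point sftp.example.com at the Transfer Family endpoint with a CNAME,
# so third parties keep using the same hostname.
aws route53 change-resource-record-sets \
    --hosted-zone-id Z0000000000000 \
    --change-batch '{
      "Changes": [{
        "Action": "UPSERT",
        "ResourceRecordSet": {
          "Name": "sftp.example.com",
          "Type": "CNAME",
          "TTL": 300,
          "ResourceRecords": [{"Value": "s-1234567890abcdef0.server.transfer.us-east-1.amazonaws.com"}]
        }
      }]
    }'
```

Because uploads land directly in S3, the cron job that moved files off the EC2 instance is no longer needed.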