Free Amazon AWS-Certified-Big-Data-Specialty Exam Braindumps (page: 63)

When an EC2 instance that is backed by an S3-based (instance store-backed) AMI is terminated, what happens to the data on the root volume?

  1. Data is unavailable until the instance is restarted
  2. Data is automatically deleted
  3. Data is automatically saved as an EBS snapshot
  4. Data is automatically saved as an EBS volume

Answer(s): B
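The distinction above can be sketched as a small decision helper. This is a minimal illustration, not part of any AWS SDK; the function name and return strings are assumptions made for this example. It encodes the rule that instance store-backed (S3-based) roots are ephemeral, while EBS-backed roots follow the DeleteOnTermination flag.

```python
# Hypothetical helper (not an AWS API): summarizes what happens to root-volume
# data at termination, given the AMI's root device type as EC2 reports it
# ("instance-store" or "ebs").
def root_data_on_termination(root_device_type: str, delete_on_termination: bool = True) -> str:
    if root_device_type == "instance-store":
        # S3-based (instance store-backed) AMIs use ephemeral storage:
        # root-volume data is automatically deleted at termination (answer B).
        return "deleted"
    if root_device_type == "ebs":
        # EBS-backed roots honor the DeleteOnTermination flag instead.
        return "deleted" if delete_on_termination else "preserved as an EBS volume"
    raise ValueError(f"unknown root device type: {root_device_type}")

print(root_data_on_termination("instance-store"))  # deleted
```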



A user has created a launch configuration for Auto Scaling with CloudWatch detailed monitoring disabled. The user now wants to enable detailed monitoring. How can the user achieve this?

  1. Update the Launch config with CLI to set InstanceMonitoringDisabled = false
  2. The user should change the Auto Scaling group from the AWS console to enable detailed monitoring
  3. Update the Launch config with CLI to set InstanceMonitoring.Enabled = true
  4. Create a new Launch Config with detailed monitoring enabled and update the Auto Scaling group

Answer(s): D
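The reason D is correct is that launch configurations are immutable: you cannot edit monitoring on an existing one, so you create a replacement and point the Auto Scaling group at it. A minimal sketch of that workflow, building request parameters in the shape boto3's Auto Scaling client uses (`InstanceMonitoring={"Enabled": True}`); the config names and AMI ID here are illustrative assumptions.

```python
# Launch configurations cannot be modified in place, so enabling detailed
# monitoring means creating a new one and updating the Auto Scaling group.
def replacement_launch_config(old: dict, new_name: str) -> dict:
    params = dict(old)                                 # copy the existing settings
    params["LaunchConfigurationName"] = new_name       # a new config needs a new name
    params["InstanceMonitoring"] = {"Enabled": True}   # turn on detailed (1-minute) monitoring
    return params

old = {
    "LaunchConfigurationName": "web-lc-v1",            # hypothetical existing config
    "ImageId": "ami-12345678",
    "InstanceType": "m2.xlarge",
    "InstanceMonitoring": {"Enabled": False},          # basic (5-minute) monitoring
}
new = replacement_launch_config(old, "web-lc-v2")
# The Auto Scaling group would then be updated to reference "web-lc-v2",
# e.g. via update_auto_scaling_group(LaunchConfigurationName="web-lc-v2", ...).
print(new["InstanceMonitoring"])  # {'Enabled': True}
```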



A web startup runs its very successful social news application on Amazon EC2 with an Elastic Load Balancer, an Auto Scaling group of Java/Tomcat application servers, and DynamoDB as the data store. The main web application runs best on m2.xlarge instances since it is highly memory-bound. Each new deployment requires semi-automated creation and testing of a new AMI for the application servers, which takes quite a while and is therefore only done once per week.
Recently, a new chat feature has been implemented in Node.js and waits to be integrated into the architecture. First tests show that the new component is CPU-bound. Because the company has some experience with Chef, they decided to streamline the deployment process and use AWS OpsWorks as an application lifecycle tool to simplify management of the application and reduce the deployment cycles.
What configuration in AWS OpsWorks is necessary to integrate the new chat module in the most cost-efficient and flexible way?

  1. Create one AWS OpsWorks stack, create one AWS OpsWorks layer, create one custom recipe
  2. Create one AWS OpsWorks stack, create two AWS OpsWorks layers, create one custom recipe
  3. Create two AWS OpsWorks stacks, create two AWS OpsWorks layers, create one custom recipe
  4. Create two AWS OpsWorks stacks, create two AWS OpsWorks layers, create two custom recipes

Answer(s): C
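The layout in the keyed answer (two stacks, two layers, one shared custom recipe) can be sketched as plain dicts shaped roughly like the parameters of the OpsWorks `create_stack`/`create_layer` APIs. All stack, layer, and recipe names and the region below are illustrative assumptions, not values from the question.

```python
# Sketch of the keyed answer: one stack/layer pair for the memory-bound
# Java/Tomcat app, one for the CPU-bound Node.js chat component, and a single
# custom Chef recipe deployed on the chat layer.
CHAT_RECIPE = "chat::deploy"  # hypothetical name of the one custom recipe

stacks = [
    {"Name": "web-stack", "Region": "us-east-1"},   # Java/Tomcat web application
    {"Name": "chat-stack", "Region": "us-east-1"},  # Node.js chat component
]

layers = [
    {"StackName": "web-stack", "Type": "java-app", "Name": "tomcat-layer"},
    {
        "StackName": "chat-stack",
        "Type": "custom",
        "Name": "chat-layer",
        # The single custom recipe runs during the Deploy lifecycle event.
        "CustomRecipes": {"Deploy": [CHAT_RECIPE]},
    },
]

recipes = {r for layer in layers for r in layer.get("CustomRecipes", {}).get("Deploy", [])}
print(len(stacks), len(layers), len(recipes))  # 2 2 1
```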



Your firm has uploaded a large amount of aerial image data to S3. In the past, in your on-premises environment, you used a dedicated group of servers to batch process this data and used RabbitMQ, an open-source messaging system, to get job information to the servers. Once processed, the data would go to tape and be shipped offsite. Your manager told you to stay with the current design and leverage AWS archival storage and messaging services to minimize cost. Which design is correct?

  1. Use SQS for passing job messages, use CloudWatch alarms to terminate EC2 worker instances when they become idle. Once data is processed, change the storage class of the S3 objects to Reduced Redundancy Storage.
  2. Setup Auto-Scaled workers triggered by queue depth that use spot instances to process messages in SQS. Once data is processed, change the storage class of the S3 objects to Reduced Redundancy Storage.
  3. Setup Auto-Scaled workers triggered by queue depth that use spot instances to process messages in SQS. Once data is processed, change the storage class of the S3 objects to Glacier.
  4. Use SNS to pass job messages, use CloudWatch alarms to terminate spot worker instances when they become idle. Once data is processed, change the storage class of the S3 objects to Glacier.

Answer(s): D
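The archival half of this design can also be automated with an S3 lifecycle rule that transitions processed objects to the Glacier storage class, rather than rewriting each object by hand. A minimal sketch in the shape boto3's `put_bucket_lifecycle_configuration` expects; the rule ID, prefix, and transition delay are illustrative assumptions.

```python
# S3 lifecycle configuration that archives processed imagery to Glacier.
# The "processed/" prefix is a hypothetical convention for this example.
lifecycle = {
    "Rules": [
        {
            "ID": "archive-processed-imagery",
            "Filter": {"Prefix": "processed/"},
            "Status": "Enabled",
            # Move objects to the GLACIER storage class shortly after processing.
            "Transitions": [{"Days": 1, "StorageClass": "GLACIER"}],
        }
    ]
}

rule = lifecycle["Rules"][0]
print(rule["Transitions"][0]["StorageClass"])  # GLACIER
```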





