Free DAS-C01 Exam Braindumps (page: 21)


An online retailer needs to deploy a product sales reporting solution. The source data is exported from an external online transaction processing (OLTP) system for reporting. Roll-up data is calculated each day for the previous day's activities. The reporting system has the following requirements:
Have the daily roll-up data readily available for 1 year.
After 1 year, archive the daily roll-up data for occasional but immediate access.
Retain the source data exports stored in the reporting system for 5 years; query access will be needed only for re-evaluation, which may occur within the first 90 days.
Which combination of actions will meet these requirements while keeping storage costs to a minimum? (Choose two.)

  1. Store the source data initially in the Amazon S3 Standard-Infrequent Access (S3 Standard-IA) storage class. Apply a lifecycle configuration that changes the storage class to Amazon S3 Glacier Deep Archive 90 days after creation, and then deletes the data 5 years after creation.
  2. Store the source data initially in the Amazon S3 Glacier storage class. Apply a lifecycle configuration that changes the storage class from Amazon S3 Glacier to Amazon S3 Glacier Deep Archive 90 days after creation, and then deletes the data 5 years after creation.
  3. Store the daily roll-up data initially in the Amazon S3 Standard storage class. Apply a lifecycle configuration that changes the storage class to Amazon S3 Glacier Deep Archive 1 year after data creation.
  4. Store the daily roll-up data initially in the Amazon S3 Standard storage class. Apply a lifecycle configuration that changes the storage class to Amazon S3 Standard-Infrequent Access (S3 Standard-IA) 1 year after data creation.
  5. Store the daily roll-up data initially in the Amazon S3 Standard-Infrequent Access (S3 Standard-IA) storage class. Apply a lifecycle configuration that changes the storage class to Amazon S3 Glacier 1 year after data creation.

Answer(s): A,D
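Answers A and D translate directly into a single S3 lifecycle configuration. Below is a minimal boto3 sketch, assuming a hypothetical bucket named reporting-bucket with source/ and rollup/ prefixes; it also assumes the source exports are uploaded with StorageClass="STANDARD_IA", so the first rule only handles the later transitions.

```python
# Hypothetical sketch of lifecycle rules implementing answers A and D.
# Bucket and prefix names are assumptions, not from the question.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="reporting-bucket",  # hypothetical bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                # Answer A: source exports start in S3 Standard-IA
                # (StorageClass="STANDARD_IA" set at upload), move to
                # Glacier Deep Archive after the 90-day query window,
                # and expire after 5 years (1825 days).
                "ID": "source-exports",
                "Filter": {"Prefix": "source/"},
                "Status": "Enabled",
                "Transitions": [{"Days": 90, "StorageClass": "DEEP_ARCHIVE"}],
                "Expiration": {"Days": 1825},
            },
            {
                # Answer D: daily roll-ups stay in S3 Standard for the
                # first year, then move to Standard-IA, which keeps
                # millisecond (immediate) access at a lower storage cost.
                "ID": "daily-rollups",
                "Filter": {"Prefix": "rollup/"},
                "Status": "Enabled",
                "Transitions": [{"Days": 365, "StorageClass": "STANDARD_IA"}],
            },
        ]
    },
)
```

Standard-IA rather than Glacier is what satisfies "occasional but immediate access": the Glacier classes require a restore or retrieval step before the data can be read.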



A company needs to store objects containing log data in JSON format. The objects are generated by eight applications running in AWS. Six of the applications generate a total of 500 KiB of data per second, and two of the applications can generate up to 2 MiB of data per second. A data engineer wants to implement a scalable solution to capture and store usage data in an Amazon S3 bucket. The usage data objects need to be reformatted, converted to .csv format, and then compressed before they are stored in Amazon S3. The company requires the solution to include the least custom code possible and has authorized the data engineer to request a service quota increase if needed.
Which solution meets these requirements?

  1. Configure an Amazon Kinesis Data Firehose delivery stream for each application. Write AWS Lambda functions to read log data objects from the stream for each application. Have the function perform reformatting and .csv conversion. Enable compression on all the delivery streams.
  2. Configure an Amazon Kinesis data stream with one shard per application. Write an AWS Lambda function to read usage data objects from the shards. Have the function perform .csv conversion, reformatting, and compression of the data. Have the function store the output in Amazon S3.
  3. Configure an Amazon Kinesis data stream for each application. Write an AWS Lambda function to read usage data objects from the stream for each application. Have the function perform .csv conversion, reformatting, and compression of the data. Have the function store the output in Amazon S3.
  4. Store usage data objects in an Amazon DynamoDB table. Configure a DynamoDB stream to copy the objects to an S3 bucket. Configure an AWS Lambda function to be triggered when objects are written to the S3 bucket. Have the function convert the objects into .csv format.

Answer(s): A
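The transformation step in answer A follows the standard Kinesis Data Firehose record-transformation contract: the Lambda function receives base64-encoded records and returns each one with a result of "Ok". A minimal sketch is below, assuming hypothetical field names (timestamp, app, bytes); compression (for example GZIP) would be enabled on the delivery stream itself, not in the function.

```python
# Minimal sketch of a Firehose transformation Lambda for answer A.
# The JSON field names are assumptions for illustration only.
import base64
import json

def lambda_handler(event, context):
    output = []
    for record in event["records"]:
        # Firehose delivers each record base64-encoded.
        payload = json.loads(base64.b64decode(record["data"]))
        # Reformat the JSON object into a single CSV line.
        csv_line = ",".join(
            str(payload.get(key, "")) for key in ("timestamp", "app", "bytes")
        ) + "\n"
        output.append({
            "recordId": record["recordId"],
            "result": "Ok",
            "data": base64.b64encode(csv_line.encode("utf-8")).decode("utf-8"),
        })
    # Firehose buffers, compresses, and writes the results to S3.
    return {"records": output}
```

Firehose handles scaling, buffering, and S3 delivery natively, which is why this option involves the least custom code; a raw Kinesis data stream (options B and C) would also require consumer and delivery code.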



A data analytics specialist is building an automated ETL ingestion pipeline using AWS Glue to ingest compressed files that have been uploaded to an Amazon S3 bucket. The ingestion pipeline should support incremental data processing.
Which AWS Glue feature should the data analytics specialist use to meet this requirement?

  1. Workflows
  2. Triggers
  3. Job bookmarks
  4. Classifiers

Answer(s): C


Reference:

https://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/build-an-etl-service-pipeline-to-load-data-incrementally-from-amazon-s3-to-amazon-redshift-using-aws-glue.html
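Job bookmarks persist state between job runs so that only files not yet processed are read on each run. Below is a minimal sketch of a bookmark-aware Glue ETL script, run with --job-bookmark-option job-bookmark-enable; the S3 paths are hypothetical.

```python
# Sketch of a Glue ETL script relying on job bookmarks (answer C).
# The transformation_ctx values let Glue track which S3 files have
# already been processed; bucket paths here are hypothetical.
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# With bookmarks enabled, only new files are read on each run.
source = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://ingest-bucket/compressed/"]},
    format="json",
    transformation_ctx="source",  # required for bookmark tracking
)

glue_context.write_dynamic_frame.from_options(
    frame=source,
    connection_type="s3",
    connection_options={"path": "s3://processed-bucket/output/"},
    format="parquet",
    transformation_ctx="sink",
)

# job.commit() persists the bookmark state for the next run.
job.commit()
```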



A telecommunications company is looking for an anomaly-detection solution to identify fraudulent calls. The company currently uses Amazon Kinesis to stream voice call records in JSON format from its on-premises database to Amazon S3. The existing dataset contains voice call records with 200 columns. To detect fraudulent calls, the solution needs to look at only 5 of these columns. The company is interested in a cost-effective AWS solution that requires minimal effort and minimal experience with anomaly-detection algorithms.
Which solution meets these requirements?

  1. Use an AWS Glue job to transform the data from JSON to Apache Parquet. Use AWS Glue crawlers to discover the schema and build the AWS Glue Data Catalog. Use Amazon Athena to create a table with a subset of columns. Use Amazon QuickSight to visualize the data and then use Amazon QuickSight machine learning-powered anomaly detection.
  2. Use Kinesis Data Firehose to detect anomalies on a data stream from Kinesis by running SQL queries, which compute an anomaly score for all calls and store the output in Amazon RDS. Use Amazon Athena to build a dataset and Amazon QuickSight to visualize the results.
  3. Use an AWS Glue job to transform the data from JSON to Apache Parquet. Use AWS Glue crawlers to discover the schema and build the AWS Glue Data Catalog. Use Amazon SageMaker to build an anomaly detection model that can detect fraudulent calls by ingesting data from Amazon S3.
  4. Use Kinesis Data Analytics to detect anomalies on a data stream from Kinesis by running SQL queries, which compute an anomaly score for all calls. Connect Amazon QuickSight to Kinesis Data Analytics to visualize the anomaly scores.

Answer(s): A
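The column-subsetting step in answer A can be expressed as a single Athena DDL statement over the cataloged Parquet table; QuickSight's ML-powered anomaly detection would then be pointed at the resulting view through an Athena dataset. A sketch using boto3 follows, with hypothetical database, table, column, and bucket names.

```python
# Sketch of the Athena step from answer A: expose only the 5 columns
# the fraud analysis needs. All names here are assumptions.
import boto3

athena = boto3.client("athena")

athena.start_query_execution(
    QueryString="""
        CREATE OR REPLACE VIEW call_fraud_features AS
        SELECT caller_id, callee_id, call_duration, call_time, call_cost
        FROM voice_calls_parquet
    """,
    QueryExecutionContext={"Database": "telecom"},
    ResultConfiguration={"OutputLocation": "s3://athena-results-bucket/"},
)
```

This keeps the solution serverless end to end and avoids building or tuning a model, which is what makes it lower effort than the SageMaker option.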


