Free DAS-C01 Exam Braindumps (page: 20)


A bank operates in a regulated environment. The compliance requirements for the country in which the bank operates say that customer data for each state should only be accessible by the bank's employees located in the same state. Bank employees in one state should NOT be able to access data for customers who have provided a home address in a different state. The bank's marketing team has hired a data analyst to gather insights from customer data for a new campaign being launched in certain states. Currently, data linking each customer account to its home state is stored in a tabular .csv file within a single Amazon S3 folder in a private S3 bucket. The total size of the S3 folder is 2 GB uncompressed. Due to the country's compliance requirements, the marketing team is not able to access this folder.
The data analyst is responsible for ensuring that the marketing team gets one-time access to customer data for their campaign analytics project, while being subject to all the compliance requirements and controls.
Which solution should the data analyst implement to meet the desired requirements with the LEAST amount of setup effort?

  A. Rearrange the data in Amazon S3 so that customer data for each state is stored in a different S3 folder within the same bucket. Set up S3 bucket policies to provide marketing employees with appropriate data access under compliance controls. Delete the bucket policies after the project.
  B. Load the tabular data from Amazon S3 into an Amazon EMR cluster using s3DistCp. Implement a custom Hadoop-based row-level security solution on the Hadoop Distributed File System (HDFS) to provide marketing employees with appropriate data access under compliance controls. Terminate the EMR cluster after the project.
  C. Load the tabular data from Amazon S3 into Amazon Redshift with the COPY command. Use the built-in row-level security feature in Amazon Redshift to provide marketing employees with appropriate data access under compliance controls. Delete the Amazon Redshift tables after the project.
  D. Load the tabular data from Amazon S3 into Amazon QuickSight Enterprise edition by directly importing it as a data source. Use the built-in row-level security feature in Amazon QuickSight to provide marketing employees with appropriate data access under compliance controls. Delete the Amazon QuickSight data sources after the project is complete.

Answer(s): D
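QuickSight Enterprise edition provides the required row-level filtering with no cluster or custom security code to build, which is why it is the least-effort option. Its row-level security is driven by a separate permissions dataset: a rules file mapping each user or group to the rows it may see, here keyed on the customer's home state. The sketch below, using boto3 with hypothetical account ID, data source, and dataset names, shows how such a rules dataset could be attached when the customer data is imported:

```python
import boto3

# Minimal sketch, assuming hypothetical account ID, data source, and dataset
# names. The RLS rules dataset is a separate CSV uploaded to QuickSight
# first, e.g.:
#   UserName,state
#   marketing-analyst-ny,NY
quicksight = boto3.client("quicksight", region_name="us-east-1")
ACCOUNT_ID = "111122223333"  # hypothetical

quicksight.create_data_set(
    AwsAccountId=ACCOUNT_ID,
    DataSetId="customer-data",  # hypothetical
    Name="Customer data (campaign)",
    ImportMode="SPICE",
    PhysicalTableMap={
        "customers": {
            "S3Source": {
                # Assumes an S3 data source was already created over the bucket.
                "DataSourceArn": f"arn:aws:quicksight:us-east-1:{ACCOUNT_ID}:datasource/customer-s3",
                "InputColumns": [
                    {"Name": "account_id", "Type": "STRING"},
                    {"Name": "state", "Type": "STRING"},
                ],
            }
        }
    },
    # Attach the rules dataset: QuickSight then filters every query so each
    # reader sees only the rows (states) their rules grant.
    RowLevelPermissionDataSet={
        "Arn": f"arn:aws:quicksight:us-east-1:{ACCOUNT_ID}:dataset/rls-rules",
        "PermissionPolicy": "GRANT_ACCESS",
    },
)
```

Deleting the dataset (and its rules dataset) after the campaign revokes the access, which fits the one-time scope of the project.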



An online gaming company is using an Amazon Kinesis Data Analytics SQL application with a Kinesis data stream as its source. The source sends three non-null fields to the application: player_id, score, and us_5_digit_zip_code. A data analyst has a .csv mapping file that maps a small number of us_5_digit_zip_code values to a territory code. The data analyst needs to include the territory code, if one exists, as an additional output of the Kinesis Data Analytics application.
How should the data analyst meet this requirement while minimizing costs?

  A. Store the contents of the mapping file in an Amazon DynamoDB table. Preprocess the records as they arrive in the Kinesis Data Analytics application with an AWS Lambda function that fetches the mapping and supplements each record to include the territory code, if one exists. Change the SQL query in the application to include the new field in the SELECT statement.
  B. Store the mapping file in an Amazon S3 bucket and configure the reference data column headers for the .csv file in the Kinesis Data Analytics application. Change the SQL query in the application to include a join to the file's S3 Amazon Resource Name (ARN), and add the territory code field to the SELECT columns.
  C. Store the mapping file in an Amazon S3 bucket and configure it as a reference data source for the Kinesis Data Analytics application. Change the SQL query in the application to include a join to the reference table and add the territory code field to the SELECT columns.
  D. Store the contents of the mapping file in an Amazon DynamoDB table. Change the Kinesis Data Analytics application to send its output to an AWS Lambda function that fetches the mapping and supplements each record to include the territory code, if one exists. Forward the record from the Lambda function to the original application destination.

Answer(s): C
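An S3 object configured as a reference data source becomes an in-application table that the SQL can join directly, so no Lambda preprocessing or DynamoDB table is needed. A minimal sketch, assuming hypothetical application, bucket, and IAM role names, of registering the mapping file and the join it enables:

```python
import boto3

# Minimal sketch, assuming hypothetical application, bucket, and role names.
kda = boto3.client("kinesisanalytics")  # v1 API used by SQL applications

app = kda.describe_application(ApplicationName="gaming-app")  # hypothetical
version = app["ApplicationDetail"]["ApplicationVersionId"]

# Register the S3 object as an in-application reference table.
kda.add_application_reference_data_source(
    ApplicationName="gaming-app",
    CurrentApplicationVersionId=version,
    ReferenceDataSource={
        "TableName": "ZIP_TO_TERRITORY",
        "S3ReferenceDataSource": {
            "BucketARN": "arn:aws:s3:::mapping-bucket",  # hypothetical
            "FileKey": "zip_to_territory.csv",           # hypothetical
            "ReferenceRoleARN": "arn:aws:iam::111122223333:role/kda-s3-read",
        },
        "ReferenceSchema": {
            "RecordFormat": {
                "RecordFormatType": "CSV",
                "MappingParameters": {
                    "CSVMappingParameters": {
                        "RecordRowDelimiter": "\n",
                        "RecordColumnDelimiter": ",",
                    }
                },
            },
            "RecordColumns": [
                {"Name": "us_5_digit_zip_code", "SqlType": "VARCHAR(5)"},
                {"Name": "territory_code", "SqlType": "VARCHAR(16)"},
            ],
        },
    },
)

# The application SQL can then LEFT JOIN the reference table, e.g.:
#   SELECT STREAM s."player_id", s."score", s."us_5_digit_zip_code",
#                 r."territory_code"
#   FROM "SOURCE_SQL_STREAM_001" AS s
#   LEFT JOIN "ZIP_TO_TERRITORY" AS r
#     ON s."us_5_digit_zip_code" = r."us_5_digit_zip_code";
```

Because the mapping is small and static, loading it once as reference data avoids per-record Lambda invocations, which is what keeps the cost down.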



A company has collected more than 100 TB of log files in the last 24 months. The files are stored as raw text in a dedicated Amazon S3 bucket. Each object has a key of the form year-month-day_log_HHmmss.txt where HHmmss represents the time the log file was initially created. A table was created in Amazon Athena that points to the S3 bucket. One-time queries are run against a subset of columns in the table several times an hour.
A data analyst must make changes to reduce the cost of running these queries. Management wants a solution with minimal maintenance overhead.
Which combination of steps should the data analyst take to meet these requirements? (Choose three.)

  A. Convert the log files to Apache Avro format.
  B. Add a key prefix of the form date=year-month-day/ to the S3 objects to partition the data.
  C. Convert the log files to Apache Parquet format.
  D. Add a key prefix of the form year-month-day/ to the S3 objects to partition the data.
  E. Drop and recreate the table with the PARTITIONED BY clause. Run the ALTER TABLE ADD PARTITION statement.
  F. Drop and recreate the table with the PARTITIONED BY clause. Run the MSCK REPAIR TABLE statement.

Answer(s): B,C,F


Reference:

https://docs.aws.amazon.com/athena/latest/ug/msck-repair-table.html
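Parquet plus Hive-style date= prefixes lets Athena read only the queried columns and partitions, which is what cuts the per-query scan cost, and MSCK REPAIR TABLE discovers all existing partitions in a single statement instead of one ALTER TABLE per day. A minimal sketch of the resulting DDL and repair step, assuming hypothetical bucket, database, and column names:

```python
import boto3

# Minimal sketch, assuming a hypothetical database, log bucket, and results
# location; the table columns are illustrative.
athena = boto3.client("athena")

def run(query: str) -> str:
    """Submit a query to Athena and return its execution ID."""
    resp = athena.start_query_execution(
        QueryString=query,
        QueryExecutionContext={"Database": "logs_db"},  # hypothetical
        ResultConfiguration={"OutputLocation": "s3://query-results-bucket/"},
    )
    return resp["QueryExecutionId"]

# Recreate the table over the Parquet copies of the logs, partitioned by the
# date= key prefix (Hive-style partitioning).
run("""
CREATE EXTERNAL TABLE IF NOT EXISTS logs (
  message string
)
PARTITIONED BY (`date` string)
STORED AS PARQUET
LOCATION 's3://log-bucket/'
""")

# Load every date=... partition already present in S3 in one statement.
# Because the keys use Hive-style prefixes, no per-partition ALTER TABLE
# ADD PARTITION statements (and no ongoing maintenance) are needed.
run("MSCK REPAIR TABLE logs")
```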



A company has an application that ingests streaming data. The company needs to analyze this stream over a 5-minute timeframe to evaluate the stream for anomalies with Random Cut Forest (RCF) and summarize the current count of status codes. The source and summarized data should be persisted for future use.
Which approach would enable the desired outcome while keeping data persistence costs low?

  A. Ingest the data stream with Amazon Kinesis Data Streams. Have an AWS Lambda consumer evaluate the stream, collect the count of status codes, and evaluate the data against a previously trained RCF model. Persist the source and results as a time series to Amazon DynamoDB.
  B. Ingest the data stream with Amazon Kinesis Data Streams. Have a Kinesis Data Analytics application evaluate the stream over a 5-minute window using the RCF function and summarize the count of status codes. Persist the source and results to Amazon S3 through output delivery to Kinesis Data Firehose.
  C. Ingest the data stream with Amazon Kinesis Data Firehose with a delivery frequency of 1 minute or 1 MB into Amazon S3. Have Amazon S3 trigger an event to invoke an AWS Lambda consumer that evaluates the batch data, collects the count of status codes, and evaluates the data against a previously trained RCF model. Persist the source and results as a time series to Amazon DynamoDB.
  D. Ingest the data stream with Amazon Kinesis Data Firehose with a delivery frequency of 5 minutes or 1 MB into Amazon S3. Have a Kinesis Data Analytics application evaluate the stream over a 1-minute window using the RCF function and summarize the count of status codes. Persist the results to Amazon S3 through a Kinesis Data Analytics output to an AWS Lambda integration.

Answer(s): B
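Option B keeps everything streaming: the Kinesis Data Analytics SQL application applies the built-in RANDOM_CUT_FOREST function plus a 5-minute tumbling window, and S3 is the low-cost store for both the source and the results. A minimal sketch of what the in-application SQL could look like, assuming the default input stream name SOURCE_SQL_STREAM_001 and a hypothetical application name (input/output wiring omitted for brevity):

```python
import boto3

# Minimal sketch of the in-application SQL. RANDOM_CUT_FOREST assigns an
# anomaly score to each record; STEP(... BY INTERVAL '5' MINUTE) buckets rows
# into 5-minute tumbling windows for the status-code counts.
APPLICATION_CODE = """
CREATE OR REPLACE STREAM "ANOMALY_STREAM" (
    "status_code"   INTEGER,
    "ANOMALY_SCORE" DOUBLE
);
CREATE OR REPLACE PUMP "ANOMALY_PUMP" AS
    INSERT INTO "ANOMALY_STREAM"
    SELECT STREAM "status_code", "ANOMALY_SCORE"
    FROM TABLE(RANDOM_CUT_FOREST(
        CURSOR(SELECT STREAM "status_code" FROM "SOURCE_SQL_STREAM_001")));

CREATE OR REPLACE STREAM "STATUS_COUNT_STREAM" (
    "status_code"  INTEGER,
    "status_count" INTEGER
);
CREATE OR REPLACE PUMP "COUNT_PUMP" AS
    INSERT INTO "STATUS_COUNT_STREAM"
    SELECT STREAM "status_code", COUNT(*) AS "status_count"
    FROM "SOURCE_SQL_STREAM_001"
    GROUP BY "status_code",
             STEP("SOURCE_SQL_STREAM_001".ROWTIME BY INTERVAL '5' MINUTE);
"""

kda = boto3.client("kinesisanalytics")
kda.create_application(
    ApplicationName="stream-anomaly-app",  # hypothetical
    ApplicationCode=APPLICATION_CODE,
    # Inputs (the Kinesis data stream) and Outputs (the Firehose delivery
    # stream that lands results in S3) are omitted here for brevity.
)
```

In practice the application's outputs would be wired to a Kinesis Data Firehose delivery stream so both the raw source records and these derived streams are persisted to S3.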


