Amazon BDS-C00 Exam
AWS Certified Big Data - Specialty (Page 18)

Updated On: 9-Feb-2026

A company that provides economics data dashboards needs to be able to develop software to display rich, interactive, data-driven graphics that run in web browsers and leverage the full stack of web standards (HTML, SVG, and CSS).
Which technology is the most appropriate for this requirement?

  1. D3.js
  2. Python/Jupyter
  3. R Studio
  4. Hue

Answer(s): A
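For context, this is the style of browser-native rendering D3.js enables: data bound directly to HTML/SVG elements and styled with CSS. A minimal sketch, with invented data values and element sizes:

```typescript
// Minimal D3.js sketch: bind an array of values to SVG rectangles so the
// browser renders a small bar chart. Data values and sizes are invented.
// Assumes d3 v7 (npm install d3) running in a bundled browser app.
import * as d3 from "d3";

const values = [12, 34, 21, 8, 45]; // hypothetical data points

const svg = d3
  .select("body")
  .append("svg")
  .attr("width", 300)
  .attr("height", 120);

svg
  .selectAll("rect")
  .data(values)
  .join("rect") // data-driven creation of one <rect> per value
  .attr("x", (_d, i) => i * 55)
  .attr("y", (d) => 120 - d * 2)
  .attr("width", 50)
  .attr("height", (d) => d * 2)
  .attr("fill", "steelblue");
```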



A customer needs to determine the optimal distribution strategy for the ORDERS fact table in its Redshift schema.
The ORDERS table has foreign key relationships with multiple dimension tables in this schema.
How should the customer determine the most appropriate distribution key for the ORDERS table?

  1. Identify the largest and most frequently joined dimension table and ensure that it and the ORDERS table both have EVEN distribution
  2. Identify the target dimension table and designate the key of this dimension table as the distribution key of the ORDERS table
  3. Identify the smallest dimension table and designate the key of this dimension table as the distribution key of the ORDERS table
  4. Identify the largest and most frequently joined dimension table and designate the key of this dimension table as the distribution key for the ORDERS table

Answer(s): D
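A rough DDL sketch of that strategy: the ORDERS table and a hypothetical CUSTOMER dimension (assumed here to be the largest, most frequently joined one) share the same DISTKEY so joined rows are collocated on the same slice. The statements are submitted through the Redshift Data API; cluster, database, and column names are invented:

```typescript
// Sketch: give ORDERS the same DISTKEY as its largest, most frequently
// joined dimension (a hypothetical CUSTOMER table). All names are placeholders.
import {
  RedshiftDataClient,
  BatchExecuteStatementCommand,
} from "@aws-sdk/client-redshift-data";

const client = new RedshiftDataClient({ region: "us-east-1" });

const createCustomer = `
  CREATE TABLE customer (
    customer_id   BIGINT DISTKEY,
    customer_name VARCHAR(100)
  );`;

const createOrders = `
  CREATE TABLE orders (
    order_id    BIGINT,
    customer_id BIGINT DISTKEY,  -- same key as the joined dimension
    order_total DECIMAL(12,2)
  );`;

async function main(): Promise<void> {
  await client.send(
    new BatchExecuteStatementCommand({
      ClusterIdentifier: "analytics-cluster", // hypothetical cluster
      Database: "dev",
      DbUser: "admin",
      Sqls: [createCustomer, createOrders],
    })
  );
}

main();
```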



A company has several teams of analysts. Each team has its own Amazon EMR cluster. The teams need to run SQL queries using Hive, Spark SQL, and Presto on Amazon EMR. The company needs a centralized metadata layer that exposes its Amazon S3 objects as tables to the analysts. Which approach meets the requirement for a centralized metadata layer?

  1. EMRFS consistent view with a common Amazon DynamoDB table
  2. Bootstrap action to change the Hive Metastore to an Amazon RDS database
  3. s3distcp with the outputManifest option to generate RDS DDL
  4. Naming scheme support with automatic partition discovery from Amazon S3

Answer(s): B
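The usual way to centralize the metastore is to point every cluster's hive-site at the same external Amazon RDS database; on recent EMR releases this is typically done with a configuration classification rather than a hand-rolled bootstrap script. A minimal sketch, with the RDS endpoint, credentials, roles, and instance sizes all invented:

```typescript
// Sketch: launch an EMR cluster whose Hive metastore lives in a shared
// Amazon RDS (MySQL-compatible) database, so Hive, Spark SQL, and Presto on
// every team's cluster see the same table definitions over the S3 data.
import { EMRClient, RunJobFlowCommand } from "@aws-sdk/client-emr";

const emr = new EMRClient({ region: "us-east-1" });

async function main(): Promise<void> {
  await emr.send(
    new RunJobFlowCommand({
      Name: "analyst-team-cluster",
      ReleaseLabel: "emr-5.36.0",
      Applications: [{ Name: "Hive" }, { Name: "Spark" }, { Name: "Presto" }],
      ServiceRole: "EMR_DefaultRole",
      JobFlowRole: "EMR_EC2_DefaultRole",
      Instances: {
        MasterInstanceType: "m5.xlarge",
        SlaveInstanceType: "m5.xlarge",
        InstanceCount: 3,
        KeepJobFlowAliveWhenNoSteps: true,
      },
      Configurations: [
        {
          Classification: "hive-site",
          Properties: {
            // Point the metastore at the shared RDS database (placeholder endpoint).
            "javax.jdo.option.ConnectionURL":
              "jdbc:mysql://shared-metastore.example.rds.amazonaws.com:3306/hive?createDatabaseIfNotExist=true",
            "javax.jdo.option.ConnectionDriverName": "org.mariadb.jdbc.Driver",
            "javax.jdo.option.ConnectionUserName": "hive_user",
            "javax.jdo.option.ConnectionPassword": "replace-me",
          },
        },
      ],
    })
  );
}

main();
```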



The department of transportation for a major metropolitan area has placed sensors on roads at key locations around the city. The goal is to analyze the flow of traffic and notifications from emergency services to identify potential issues and to help planners correct trouble spots.
A data engineer needs a scalable and fault-tolerant solution that allows planners to respond to issues within 30 seconds of their occurrence.
Which solution should the data engineer choose?

  1. Collect the sensor data with Amazon Kinesis Firehose and store it in Amazon Redshift for analysis. Collect emergency services events with Amazon SQS and store in Amazon DynamoDB for analysis
  2. Collect the sensor data with Amazon SQS and store it in Amazon DynamoDB for analysis. Collect emergency services events with Amazon Kinesis Firehose and store them in Amazon Redshift for analysis
  3. Collect both sensor data and emergency services events with Amazon Kinesis Streams and use Amazon DynamoDB for analysis
  4. Collect both sensor data and emergency services events with Amazon Kinesis Firehose and use Amazon Redshift for analysis

Answer(s): C
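Kinesis Data Streams fits the 30-second requirement because records are available to consumers within seconds of being written, whereas Firehose buffers deliveries (historically a minimum of about 60 seconds when targeting Amazon S3). A minimal producer sketch; the stream name and record shape are invented:

```typescript
// Sketch: push a traffic-sensor reading into a Kinesis data stream so a
// consumer (KCL app, Lambda, etc.) can react within seconds.
import { KinesisClient, PutRecordCommand } from "@aws-sdk/client-kinesis";

const kinesis = new KinesisClient({ region: "us-east-1" });

interface SensorReading {
  sensorId: string;
  vehicleCount: number;
  observedAt: string;
}

async function publishReading(reading: SensorReading): Promise<void> {
  await kinesis.send(
    new PutRecordCommand({
      StreamName: "traffic-sensor-stream",    // hypothetical stream
      PartitionKey: reading.sensorId,         // spreads load across shards
      Data: new TextEncoder().encode(JSON.stringify(reading)),
    })
  );
}

publishReading({
  sensorId: "sensor-042",
  vehicleCount: 17,
  observedAt: new Date().toISOString(),
}).catch(console.error);
```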



An online photo album app has a key design feature to support multiple screens (e.g., desktop, mobile phone, and tablet) with high-quality displays. Multiple versions of each image must be saved in different resolutions and layouts.
The image-processing Java program takes an average of five seconds per upload, depending on the image size and format. Each image upload captures the following image metadata: user, album, photo label, and upload timestamp.
The app should support the following requirements:
• Hundreds of user image uploads per second
• Maximum image size of 10 MB
• Maximum image metadata size of 1 KB
• Images displayed in optimized resolution on all supported screens no later than one minute after image upload
Which strategy should be used to meet these requirements?

  1. Write images and metadata to Amazon Kinesis. Use a Kinesis Client Library (KCL) application to run the image processing and save the image output to Amazon S3 and the metadata to the app repository DB
  2. Write images and metadata to Amazon RDS with a BLOB data type. Use AWS Data Pipeline to run the image processing and save the image output to Amazon S3 and the metadata to the app repository DB
  3. Upload images with metadata to Amazon S3. Use an AWS Lambda function to run the image processing and save the image output to Amazon S3 and the metadata to the app repository DB
  4. Write images and metadata to Amazon Kinesis. Use Amazon Elastic MapReduce (EMR) with Spark Streaming to run the image processing and save the image output to Amazon S3 and the metadata to the app repository DB

Answer(s): C
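Kinesis is ruled out for carrying the images themselves because a stream record tops out at 1 MB, well under the 10 MB maximum image size, which is why the S3-plus-Lambda path works. A rough sketch of such a Lambda handler follows; the sharp resizing library, the bucket names, and the DynamoDB table standing in for the app repository DB are assumptions for illustration:

```typescript
// Sketch: an S3 upload event triggers this handler, which writes one resized
// rendition back to S3 and records the metadata. Names are placeholders.
import { S3Client, GetObjectCommand, PutObjectCommand } from "@aws-sdk/client-s3";
import { DynamoDBClient, PutItemCommand } from "@aws-sdk/client-dynamodb";
import type { S3Event } from "aws-lambda";
import sharp from "sharp";

const s3 = new S3Client({});
const dynamo = new DynamoDBClient({});

export const handler = async (event: S3Event): Promise<void> => {
  for (const record of event.Records) {
    const bucket = record.s3.bucket.name;
    const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, " "));

    // Fetch the original upload.
    const original = await s3.send(new GetObjectCommand({ Bucket: bucket, Key: key }));
    const body = Buffer.from(await original.Body!.transformToByteArray());

    // Produce one optimized rendition (real code would loop over target sizes).
    const resized = await sharp(body).resize({ width: 1080 }).jpeg().toBuffer();
    await s3.send(
      new PutObjectCommand({
        Bucket: "photo-renditions-bucket", // hypothetical output bucket
        Key: `1080/${key}`,
        Body: resized,
        ContentType: "image/jpeg",
      })
    );

    // Record metadata in the app repository DB (assumed here to be DynamoDB).
    await dynamo.send(
      new PutItemCommand({
        TableName: "photo-metadata",       // hypothetical table
        Item: {
          photoKey: { S: key },
          uploadedAt: { S: new Date().toISOString() },
        },
      })
    );
  }
};
```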





