Free AZ-305 Exam Braindumps (page: 26)


You have an app named App1 that uses an on-premises Microsoft SQL Server database named DB1.

You plan to migrate DB1 to an Azure SQL managed instance.

You need to enable customer-managed Transparent Data Encryption (TDE) for the instance. The solution must maximize encryption strength.

Which type of encryption algorithm and key length should you use for the TDE protector?

  1. RSA 3072
  2. AES 256
  3. RSA 4096
  4. RSA 2048

Answer(s): A

Explanation:

The TDE protector can only be an asymmetric RSA or RSA HSM key. The supported key lengths are 2048 bits and 3072 bits, so RSA 3072 provides the maximum encryption strength among the supported options.
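A minimal Python sketch of the selection logic: among the answer choices, only asymmetric RSA keys of supported lengths qualify as a TDE protector, and the longest supported key wins. The option table and `strongest_supported` helper below are illustrative scaffolding, not an Azure API.

```python
# Supported TDE protector keys: asymmetric RSA or RSA HSM keys of
# 2048 or 3072 bits. The option values mirror the answer choices;
# AES is symmetric and therefore not valid as a TDE protector.
OPTIONS = {
    "RSA 2048": {"kty": "RSA", "bits": 2048},
    "RSA 3072": {"kty": "RSA", "bits": 3072},
    "RSA 4096": {"kty": "RSA", "bits": 4096},
    "AES 256":  {"kty": "AES", "bits": 256},
}
SUPPORTED_RSA_BITS = {2048, 3072}

def strongest_supported(options):
    """Return the option name with the longest supported RSA key."""
    valid = {name: o for name, o in options.items()
             if o["kty"] == "RSA" and o["bits"] in SUPPORTED_RSA_BITS}
    return max(valid, key=lambda name: valid[name]["bits"])

print(strongest_supported(OPTIONS))  # RSA 3072
```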


Reference:

https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-overview



You are planning an Azure IoT Hub solution that will include 50,000 IoT devices.

Each device will stream data, including temperature, device ID, and time data. Approximately 50,000 records will be written every second. The data will be visualized in near real time.

You need to recommend a service to store and query the data.

Which two services can you recommend? Each correct answer presents a complete solution.

NOTE: Each correct selection is worth one point.

  1. Azure Table Storage
  2. Azure Event Grid
  3. Azure Cosmos DB for NoSQL
  4. Azure Time Series Insights

Answer(s): C,D

Explanation:

D: Azure Time Series Insights is a fully managed service for time series data. In an IoT analytics architecture, Time Series Insights performs the roles of stream processing, data store, and analytics and reporting. It accepts streaming data from either IoT Hub or Event Hubs and stores, processes, analyzes, and displays the data in near real time.

C: Azure Cosmos DB for NoSQL can also serve as the store: in such an architecture, the processed data is written to an analytical data store such as Azure Data Explorer, HBase, Azure Cosmos DB, Azure Data Lake, or Blob Storage.
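As a hedged sketch of the record shape the scenario describes (device ID, temperature, and time data), suitable as a JSON document for Cosmos DB for NoSQL. The field names and the `make_telemetry_record` helper are assumptions for illustration, not a required schema.

```python
import json
from datetime import datetime, timezone

def make_telemetry_record(device_id, temperature_c, ts=None):
    """Build one telemetry document with the fields the scenario
    describes: device ID, temperature, and time data."""
    return {
        "deviceId": device_id,  # natural partition key candidate
        "temperature": temperature_c,
        "ts": (ts or datetime.now(timezone.utc)).isoformat(),
    }

record = make_telemetry_record("device-0042", 21.5)
print(json.dumps(record))
```

Partitioning on `deviceId` is a common design choice here, since writes arriving from 50,000 distinct devices spread evenly across partition key values at ~50,000 records per second.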


Reference:

https://docs.microsoft.com/en-us/azure/architecture/data-guide/scenarios/time-series



HOTSPOT (Drag and Drop is not supported)
You are planning an Azure Storage solution for sensitive data. The data will be accessed daily. The dataset is less than 10 GB.

You need to recommend a storage solution that meets the following requirements:

- All the data written to storage must be retained for five years.
- Once the data is written, the data can only be read. Modifications and deletion must be prevented.
- After five years, the data can be deleted, but never modified.
- Data access charges must be minimized.

What should you recommend? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

  1. See Explanation section for answer.

Answer(s): A

Explanation:



Box 1: General purpose v2 with Hot access tier for blobs
Note:
* All the data written to storage must be retained for five years.
* Data access charges must be minimized

Hot tier has higher storage costs, but lower access and transaction costs.

Incorrect:
Not premium block blobs: higher cost. Premium block blobs is a premium storage account type for block blobs and append blobs, recommended for scenarios with high transaction rates, smaller objects, or a requirement for consistently low storage latency.

Not Cool: Lower storage costs, but higher access and transaction costs.
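The hot-versus-cool trade-off can be made concrete with a back-of-the-envelope cost model. All prices and the access pattern below are hypothetical placeholders, not current Azure rates; the point is only that daily reads of the full 10 GB dataset make cool's retrieval and transaction charges dominate its lower storage price.

```python
# Hypothetical monthly prices (illustrative only, NOT current Azure rates):
# hot costs more per GB stored but less per read; cool also charges
# a per-GB data retrieval fee that hot does not.
HOT  = {"per_gb_stored": 0.018, "per_10k_reads": 0.004, "per_gb_retrieved": 0.00}
COOL = {"per_gb_stored": 0.010, "per_10k_reads": 0.010, "per_gb_retrieved": 0.01}

def monthly_cost(tier, gb_stored, reads, gb_retrieved):
    return (gb_stored * tier["per_gb_stored"]
            + reads / 10_000 * tier["per_10k_reads"]
            + gb_retrieved * tier["per_gb_retrieved"])

# Assumed access pattern: the 10 GB dataset is read in full daily,
# ~100,000 read operations and ~300 GB retrieved per month.
hot_cost = monthly_cost(HOT, 10, 100_000, 300)
cool_cost = monthly_cost(COOL, 10, 100_000, 300)
print(f"hot: ${hot_cost:.2f}  cool: ${cool_cost:.2f}")  # hot: $0.22  cool: $3.20
```

Under these assumed numbers, the hot tier's lower access charges outweigh its higher storage price for a small, daily-accessed dataset, which is the reasoning behind Box 1.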

Box 2: Storage account resource lock
As an administrator, you can lock a subscription, resource group, or resource to prevent other users in your organization from accidentally deleting or modifying critical resources. The lock overrides any permissions the user might have.


Reference:

https://docs.microsoft.com/en-us/azure/storage/blobs/access-tiers-overview
https://docs.microsoft.com/en-us/azure/azure-resource-manager/management/lock-resources



HOTSPOT (Drag and Drop is not supported)
You are designing a data analytics solution that will use Azure Synapse and Azure Data Lake Storage Gen2.

You need to recommend Azure Synapse pools to meet the following requirements:

- Ingest data from Data Lake Storage into hash-distributed tables.
- Query and update data in Delta Lake.

What should you recommend for each requirement? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

  1. See Explanation section for answer.

Answer(s): A

Explanation:



Box 1: A dedicated SQL pool
Ingest data from Data Lake Storage into hash-distributed tables.

Guidance for designing distributed tables using dedicated SQL pool in Azure Synapse Analytics
You can design hash-distributed and round-robin distributed tables in dedicated SQL pools.
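A dedicated SQL pool spreads a hash-distributed table's rows across 60 distributions using a deterministic hash of the distribution column. The sketch below illustrates the idea in Python; `crc32` merely stands in for Synapse's internal hash function and is not the actual algorithm.

```python
from collections import Counter
import zlib

DISTRIBUTION_COUNT = 60  # a dedicated SQL pool always has 60 distributions

def distribution_for(key):
    """Map a distribution-column value to one of the 60 distributions.
    crc32 is only an illustrative stand-in for Synapse's internal hash."""
    return zlib.crc32(str(key).encode()) % DISTRIBUTION_COUNT

# With many distinct key values, rows spread roughly evenly.
keys = [f"device-{i:05d}" for i in range(50_000)]
spread = Counter(distribution_for(k) for k in keys)
print(len(spread), "distributions used; max skew:",
      max(spread.values()) - min(spread.values()))
```

The same-input-same-distribution property is what makes hash distribution effective for large fact tables: joins and aggregations on the distribution column avoid data movement.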

Box 2: A serverless SQL pool
Query and update data in Delta Lake.

You can query Delta Lake files by using a serverless SQL pool in Azure Synapse Analytics. Delta Lake is an open-source storage layer that brings ACID (atomicity, consistency, isolation, and durability) transactions to Apache Spark and big data workloads.

The serverless SQL pool in Synapse workspace enables you to read the data stored in Delta Lake format, and serve it to reporting tools. A serverless SQL pool can read Delta Lake files that are created using Apache Spark, Azure Databricks, or any other producer of the Delta Lake format.
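A serverless SQL pool reads Delta Lake with `OPENROWSET(... FORMAT = 'DELTA')`. The snippet below just holds such a query as a Python string; the storage URL is a placeholder, and actually submitting it (for example via pyodbc against the workspace's serverless SQL endpoint) is left out.

```python
# A minimal serverless SQL pool query over a Delta Lake folder,
# held as a string. The storage account and container names are
# placeholders, not real endpoints.
delta_query = """
SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://<storage-account>.dfs.core.windows.net/<container>/delta-table/',
    FORMAT = 'DELTA'
) AS rows;
"""
print(delta_query.strip())
```

The `FORMAT = 'DELTA'` argument is what tells the serverless pool to read the folder's Delta transaction log rather than treat it as plain Parquet files.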


Reference:

https://learn.microsoft.com/en-us/azure/synapse-analytics/sql-data-warehouse/sql-data-warehouse-tables-distribute
https://learn.microsoft.com/en-us/azure/synapse-analytics/sql/query-delta-lake-format





