Free DP-300 Exam Braindumps (page: 32)

Page 32 of 76

You have an Azure Stream Analytics job.

You need to ensure that the job has enough streaming units provisioned. You configure monitoring of the SU % Utilization metric.
Which two additional metrics should you monitor? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

  1. Late Input Events
  2. Out-of-Order Events
  3. Backlogged Input Events
  4. Watermark Delay
  5. Function Events

Answer(s): 3, 4

Explanation:

To react to increased workloads and increase streaming units, consider setting an alert at 80% on the SU % Utilization metric. You can also use the Watermark Delay and Backlogged Input Events metrics to see whether there is an impact.

Note: Backlogged Input Events is the number of input events that are backlogged. A non-zero value for this metric implies that your job isn't able to keep up with the number of incoming events. If this value is slowly increasing or consistently non-zero, you should scale out your job by increasing the number of SUs.
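As an illustration, the scale-out rule described above can be sketched in Python. This is a hypothetical helper (the function name and thresholds are assumptions, not part of any Azure SDK) that applies the 80% SU % Utilization alert threshold and the "consistently non-zero or slowly increasing backlog" rule:

```python
def should_scale_out(su_percent, backlogged_samples):
    """Decide whether a Stream Analytics job likely needs more streaming units.

    su_percent: latest SU % Utilization reading (0-100).
    backlogged_samples: recent Backlogged Input Events readings, oldest first.
    """
    # Rule of thumb from the docs: alert when SU % Utilization reaches 80%.
    if su_percent >= 80:
        return True
    # A consistently non-zero backlog means the job cannot keep up
    # with the number of incoming events.
    if backlogged_samples and all(s > 0 for s in backlogged_samples):
        return True
    # A slowly increasing backlog points the same way.
    if len(backlogged_samples) >= 2 and backlogged_samples[-1] > backlogged_samples[0]:
        return True
    return False
```

In practice the readings would come from Azure Monitor; the sketch only captures the decision logic the explanation describes.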


Reference:

https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-monitoring



You have an Azure Databricks resource.
You need to log actions that relate to changes in compute for the Databricks resource. Which Databricks services should you log?

  1. clusters
  2. jobs
  3. DBFS
  4. SSH
  5. workspace

Answer(s): 1

Explanation:

Cluster logs capture the actions that change compute for an Azure Databricks resource, such as creating, starting, resizing, editing, and terminating clusters.

Databricks audit logging allows security and admin teams to demonstrate conformance to data governance standards within or from a Databricks workspace. Customers, especially in regulated industries, also need records of activities such as:

- User access control to cloud data storage
- Cloud Identity and Access Management roles
- User access to cloud network and compute

Azure Databricks offers three distinct workloads on several VM instances tailored for your data analytics workflow: the Jobs Compute and Jobs Light Compute workloads make it easy for data engineers to build and execute jobs, and the All-Purpose Compute workload makes it easy for data scientists to explore, visualize, manipulate, and share data and insights interactively.


Reference:

https://databricks.com/blog/2020/03/25/trust-but-verify-with-databricks.html



Your company uses Azure Stream Analytics to monitor devices.
The company plans to double the number of devices that are monitored.

You need to monitor a Stream Analytics job to ensure that there are enough processing resources to handle the additional load.

Which metric should you monitor?

  1. Input Deserialization Errors
  2. Late Input Events
  3. Early Input Events
  4. Watermark delay

Answer(s): 4

Explanation:

The Watermark delay metric is computed as the wall clock time of the processing node minus the largest watermark it has seen so far.

The watermark delay metric can rise due to:
- Not enough processing resources in Stream Analytics to handle the volume of input events.
- Not enough throughput within the input event brokers, so they are throttled.
- Output sinks that are not provisioned with enough capacity, so they are throttled.
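The definition of the metric can be made concrete with a short Python sketch. The timestamps below are simulated values, not readings from the Azure SDK; only the arithmetic (wall clock time minus the largest watermark seen so far) mirrors the docs:

```python
from datetime import datetime, timedelta


def watermark_delay(wall_clock, watermarks_seen):
    """Watermark delay = wall clock time of the processing node
    minus the largest watermark it has seen so far."""
    return wall_clock - max(watermarks_seen)


# Simulated readings: the node's clock is at 12:00:10, but the newest
# watermark it has produced is 12:00:02, so processing is 8 seconds behind.
now = datetime(2024, 1, 1, 12, 0, 10)
marks = [datetime(2024, 1, 1, 12, 0, 0), datetime(2024, 1, 1, 12, 0, 2)]
delay = watermark_delay(now, marks)
```

A delay that keeps growing under a doubled device count is the signal that one of the three bottlenecks listed above is in play.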


Reference:

https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-time-handling



You manage an enterprise data warehouse in Azure Synapse Analytics.

Users report slow performance when they run commonly used queries. Users do not report performance changes for infrequently used queries.

You need to monitor resource utilization to determine the source of the performance issues. Which metric should you monitor?

  1. Local tempdb percentage
  2. DWU percentage
  3. Data Warehouse Units (DWU) used
  4. Cache hit percentage

Answer(s): 1

Explanation:

Tempdb is used to hold intermediate results during query execution. High utilization of the tempdb database can lead to slow query performance.

Note: If you have a query that is consuming a large amount of memory, or you have received an error message related to allocation of tempdb, it could be due to a very large CREATE TABLE AS SELECT (CTAS) or INSERT SELECT statement that is failing in the final data movement operation.
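Because tempdb pressure shows up on the busiest queries first, a monitoring check might flag sustained high Local tempdb percentage readings. A minimal sketch, with a hypothetical function name and assumed thresholds (70% for at least three consecutive samples, which are not values from Microsoft's documentation):

```python
def tempdb_pressure(samples, threshold=70.0, min_consecutive=3):
    """Flag sustained local tempdb pressure.

    samples: Local tempdb percentage readings, oldest first.
    Returns True when the last `min_consecutive` readings all exceed
    `threshold`, suggesting intermediate results from commonly used
    queries are filling tempdb and slowing them down.
    """
    if len(samples) < min_consecutive:
        return False
    return all(s > threshold for s in samples[-min_consecutive:])
```

A single spike returns False here by design; only sustained pressure, matching the "commonly used queries are slow" symptom, triggers the flag.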


Reference:

https://docs.microsoft.com/en-us/azure/synapse-analytics/sql-data-warehouse/sql-data-warehouse-manage-monitor#monitor-tempdb





