Free DP-700 Exam Braindumps (page: 8)


You have a Fabric workspace that contains an eventstream named Eventstream1. Eventstream1 processes data from a thermal sensor by using event stream processing, and then stores the data in a lakehouse.
You need to modify Eventstream1 to include the standard deviation of the temperature.
Which transform operator should you include in the Eventstream1 logic?

  1. Expand
  2. Group by
  3. Union
  4. Aggregate

Answer(s): B
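In an eventstream, the Aggregate operator supports only sum, minimum, maximum, and average, whereas the Group by operator exposes the full set of aggregation functions, including standard deviation, over a time window. For illustration only, the windowed logic is comparable to the following PySpark sketch; the column names (sensor_id, temperature, event_time) and the table names are assumptions, not values from the scenario.

```python
# Illustration only: the Group by operator is configured in the eventstream editor,
# but the windowed aggregation it performs is comparable to this PySpark sketch.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, stddev, window

spark = SparkSession.builder.getOrCreate()

# Assumed streaming source: a Delta table of raw sensor readings.
readings = spark.readStream.table("SensorReadings")

temperature_stats = (
    readings
    .withWatermark("event_time", "10 minutes")
    .groupBy(window(col("event_time"), "5 minutes"), col("sensor_id"))
    .agg(stddev(col("temperature")).alias("temperature_stddev"))
)

# Assumed destination: a lakehouse table for the aggregated results.
query = (
    temperature_stats.writeStream
    .outputMode("append")
    .option("checkpointLocation", "Files/checkpoints/temperature_stats")
    .toTable("TemperatureStdDev")
)
```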



You have a Fabric notebook named Notebook1 that has been executing successfully for the last week.
During the last run, Notebook1 executed nine jobs.
You need to view the jobs in a timeline chart.
What should you use?

  1. Real-Time hub
  2. Monitoring hub
  3. the job history from the application run
  4. Spark History Server
  5. the run series from the details of the application run

Answer(s): E



You have a Fabric workspace named Workspace1 that contains a lakehouse named Lakehouse1. Lakehouse1 contains the following tables:
Orders
Customer
Employee
The Employee table contains Personally Identifiable Information (PII).
A data engineer is building a workflow that requires writing data to the Customer table; however, the data engineer does NOT have the elevated permissions required to view the contents of the Employee table.
You need to ensure that the data engineer can write data to the Customer table without reading data from the Employee table.
Which three actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

  1. Share Lakehouse1 with the data engineer.
  2. Assign the data engineer the Contributor role for Workspace2.
  3. Assign the data engineer the Viewer role for Workspace2.
  4. Assign the data engineer the Contributor role for Workspace1.
  5. Migrate the Employee table from Lakehouse1 to Lakehouse2.
  6. Create a new workspace named Workspace2 that contains a new lakehouse named Lakehouse2.
  7. Assign the data engineer the Viewer role for Workspace1.

Answer(s): D,E,F
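A minimal sketch of the migration step (option 5), assuming it is run by a user who already has access to both workspaces. The OneLake abfss paths follow the documented pattern but are assumptions for this scenario, not values from it.

```python
# Sketch only: copies the PII table to Lakehouse2 in Workspace2, then removes it
# from Lakehouse1 so that the Contributor role on Workspace1 no longer exposes it.
# The OneLake paths below are assumed, not given in the scenario.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

src = "abfss://Workspace1@onelake.dfs.fabric.microsoft.com/Lakehouse1.Lakehouse/Tables/Employee"
dst = "abfss://Workspace2@onelake.dfs.fabric.microsoft.com/Lakehouse2.Lakehouse/Tables/Employee"

employee_df = spark.read.format("delta").load(src)
employee_df.write.format("delta").mode("overwrite").save(dst)

# Drop the table from the notebook's default lakehouse (assumed to be Lakehouse1).
spark.sql("DROP TABLE IF EXISTS Employee")
```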



You have an Azure event hub. Each event contains the following fields:
BikepointID
Street
Neighbourhood
Latitude
Longitude
No_Bikes
No_Empty_Docks
You need to ingest the events. The solution must only retain events that have a Neighbourhood value of Chelsea, and then store the retained events in a Fabric lakehouse.
What should you use?

  1. a KQL queryset
  2. an eventstream
  3. a streaming dataset
  4. Apache Spark Structured Streaming

Answer(s): B
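For illustration only, the retention rule that the eventstream's Filter operator applies (Neighbourhood equals Chelsea) is equivalent to the following PySpark filter; the sample rows and the destination table name are assumptions, not from the question.

```python
# Illustration only: the eventstream's Filter operator is configured in the Fabric UI,
# but the logic it applies is equivalent to this PySpark filter on the event fields.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.getOrCreate()

sample = spark.createDataFrame(
    [
        (101, "King's Road", "Chelsea", 51.487, -0.170, 6, 14),
        (202, "Baker Street", "Marylebone", 51.523, -0.157, 9, 3),
    ],
    ["BikepointID", "Street", "Neighbourhood", "Latitude", "Longitude", "No_Bikes", "No_Empty_Docks"],
)

# Retain only the Chelsea events, matching the eventstream Filter condition.
chelsea_events = sample.filter(col("Neighbourhood") == "Chelsea")

# The eventstream routes the filtered output to a lakehouse destination; the
# equivalent write in a notebook would be:
chelsea_events.write.mode("append").saveAsTable("BikePointEvents")  # assumed table name
```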





