Free DP-700 Exam Braindumps (page: 7)


You have a Fabric workspace that contains a lakehouse named Lakehouse1. Data is ingested into Lakehouse1 as one flat table. The table contains the following columns.
You plan to load the data into a dimensional model and implement a star schema. From the original flat table, you create two tables named FactSales and DimProduct. You will track changes in DimProduct.
You need to prepare the data.
Which three columns should you include in the DimProduct table? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

  1. Date
  2. ProductName
  3. ProductColor
  4. TransactionID
  5. SalesAmount
  6. ProductID

Answer(s): B, C, F (ProductName, ProductColor, ProductID)
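A dimension table holds descriptive attributes keyed by the business key, while the fact table keeps measures and foreign keys. The split below is a minimal Python sketch using hypothetical sample rows (the column names come from the question; the data values are invented for illustration):

```python
# Hypothetical flat rows with the columns named in the question.
flat = [
    {"Date": "2024-01-01", "TransactionID": 1, "SalesAmount": 9.99,
     "ProductID": 100, "ProductName": "Widget", "ProductColor": "Red"},
    {"Date": "2024-01-02", "TransactionID": 2, "SalesAmount": 4.50,
     "ProductID": 100, "ProductName": "Widget", "ProductColor": "Red"},
]

# DimProduct: one row per product, descriptive attributes only.
# ProductID is the business key used for tracking changes over time.
dim_product = {r["ProductID"]: {"ProductID": r["ProductID"],
                                "ProductName": r["ProductName"],
                                "ProductColor": r["ProductColor"]}
               for r in flat}

# FactSales: measures and keys, referencing DimProduct via ProductID.
fact_sales = [{"Date": r["Date"], "TransactionID": r["TransactionID"],
               "SalesAmount": r["SalesAmount"], "ProductID": r["ProductID"]}
              for r in flat]
```

Note how the two duplicate product rows in the flat table collapse into a single DimProduct row, while both transactions remain in FactSales.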



HOTSPOT (Drag and Drop is not supported)
You have an Azure Event Hubs data source that contains weather data.
You ingest the data from the data source by using an eventstream named Eventstream1. Eventstream1 uses a lakehouse as the destination.
You need to batch ingest only rows from the data source where the City attribute has a value of Kansas. The filter must be added before the destination. The solution must minimize development effort.
What should you use for the data processor and filtering? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

  1. See Explanation section for answer.

Answer(s): A

Explanation:
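The answer-area image is not reproduced in this dump. Conceptually, the transformation applied before the destination keeps only events whose City attribute equals Kansas. A minimal Python sketch of that filter, using invented sample events:

```python
# Hypothetical weather events; only the City filter mirrors the
# eventstream transformation described in the question.
events = [
    {"City": "Kansas", "TempF": 71},
    {"City": "Austin", "TempF": 88},
    {"City": "Kansas", "TempF": 64},
]

# Keep only rows where City == "Kansas", as the filter step would.
kansas_only = [e for e in events if e["City"] == "Kansas"]
```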



You have a Fabric warehouse named DW1 that loads data by using a data pipeline named Pipeline1. Pipeline1 uses a Copy data activity with a dynamic SQL source. Pipeline1 is scheduled to run every 15 minutes.
You discover that Pipeline1 keeps failing.
You need to identify which SQL query was executed when the pipeline failed.
What should you do?

  1. From Monitoring hub, select the latest failed run of Pipeline1, and then view the output JSON.
  2. From Monitoring hub, select the latest failed run of Pipeline1, and then view the input JSON.
  3. From Real-time hub, select Fabric events, and then review the details of Microsoft.Fabric.ItemReadFailed.
  4. From Real-time hub, select Fabric events, and then review the details of Microsoft.Fabric.ItemUpdateFailed.

Answer(s): B
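The input JSON of a failed Copy data activity run records the parameters the activity was invoked with, including the resolved dynamic SQL query, which is why option B identifies the executed statement. A small Python sketch of pulling the query out of such a payload (the JSON shape here is illustrative, not the exact Fabric schema):

```python
import json

# Hypothetical input JSON for a Copy data activity run, as it might
# appear in the Monitoring hub run details. The field names below are
# assumptions for illustration.
run_input = json.dumps({
    "source": {
        "type": "DataWarehouseSource",
        "sqlReaderQuery": "SELECT OrderID, Amount FROM dbo.Orders",
    },
})

# Parse the payload and extract the query that was actually executed.
payload = json.loads(run_input)
executed_query = payload["source"]["sqlReaderQuery"]
```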



You have a Fabric workspace named Workspace1 that contains a notebook named Notebook1.
In Workspace1, you create a new notebook named Notebook2.
You need to ensure that you can attach Notebook2 to the same Apache Spark session as Notebook1.
What should you do?

  1. Enable high concurrency for notebooks.
  2. Enable dynamic allocation for the Spark pool.
  3. Change the runtime version.
  4. Increase the number of executors.

Answer(s): A





