Free DP-500 Exam Braindumps (page: 21)

Page 21 of 46

DRAG DROP
You are configuring Azure Synapse Analytics pools to support the Azure Active Directory groups shown in the following table.




Which type of pool should each group use? To answer, drag the appropriate pool types to the groups. Each pool type may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

NOTE: Each correct selection is worth one point.
Select and Place:

  1. See Explanation section for answer.

Answer(s): A

Explanation:



Box 1: Apache Spark pool
An Apache Spark pool provides open-source big data compute capabilities. After you've created an Apache Spark pool in your Synapse workspace, data can be loaded, modeled, processed, and distributed for faster analytic insight.

Box 2: Dedicated SQL Pool
Dedicated SQL Pool - Data is stored in relational tables

Incorrect:
Serverless SQL pool - Data is stored in the data lake

Box 3: Serverless SQL pool
Serverless SQL pool - Cost is incurred for the data processed per query

Incorrect:
Dedicated SQL pool - Cost is incurred for the resources reserved
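
As a rough illustration of the Spark pool workload described above, the minimal PySpark sketch below reads files from the data lake into a DataFrame, aggregates them, and writes the result back for downstream analytics. The storage account, container, and column names are hypothetical placeholders.

  from pyspark.sql import SparkSession
  from pyspark.sql import functions as F

  # In a Synapse notebook a Spark session already exists; getOrCreate() reuses it.
  spark = SparkSession.builder.getOrCreate()

  # Hypothetical ADLS Gen2 path - substitute your own storage account and container.
  raw_path = "abfss://raw@contosodatalake.dfs.core.windows.net/sales/*.parquet"

  # Load and model the data using the Spark pool's distributed compute.
  sales = spark.read.parquet(raw_path)
  daily_totals = (
      sales.groupBy("order_date")
           .agg(F.sum("amount").alias("total_amount"))
  )

  # Persist the curated result back to the data lake.
  daily_totals.write.mode("overwrite").parquet(
      "abfss://curated@contosodatalake.dfs.core.windows.net/sales_daily/"
  )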


Reference:

https://docs.microsoft.com/en-us/azure/synapse-analytics/quickstart-create-apache-spark-pool-portal
https://www.royalcyber.com/blog/data-services/dedicated-sql-pool-vs-serverless-sql/



DRAG DROP
You have a Power BI dataset. The dataset contains data that is updated frequently.

You need to improve the performance of the dataset by using incremental refreshes.

Which four actions should you perform in sequence to enable the incremental refreshes? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Select and Place:

  1. See Explanation section for answer.

Answer(s): A

Explanation:



Step 1: Create RangeStart and RangeEnd parameters.
Create parameters
In Power Query Editor, create RangeStart and RangeEnd parameters with default values. The default values apply only when filtering the data to be loaded into the model in Power BI Desktop; enter values that include only a small amount of the most recent data from your data source. When the model is published to the service, these values are overridden by the incremental refresh policy.

Step 2: Apply a custom Date/Time filter to the data.
Filter data
With the RangeStart and RangeEnd parameters defined, apply a custom filter on the table's Date/Time column so that only rows with dates between RangeStart and RangeEnd are loaded (make one boundary inclusive and the other exclusive so that rows are not loaded twice).

Before continuing with this task, verify that the source table has a date column of the Date/Time data type.

Step 3: Define the incremental refresh policy for the table.
Define policy
After you've defined RangeStart and RangeEnd parameters, and filtered data based on those parameters, you define an incremental refresh policy. The policy is applied only after the model is published to the service and a manual or scheduled refresh operation is performed.

Step 4: Publish the model to the Power BI service.
Save and publish to the service
When your RangeStart and RangeEnd parameters, filtering, and refresh policy settings are complete, be sure to save your model, and then publish to the service.
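
Once the model is published, the refresh that actually applies the incremental refresh policy can be started from the service, on a schedule, or through the Power BI REST API. The sketch below uses the Refresh Dataset In Group endpoint; the workspace and dataset IDs are placeholders, and acquiring the Azure AD access token is assumed to happen elsewhere.

  import requests

  # Hypothetical IDs - substitute your own workspace (group) and dataset IDs.
  workspace_id = "00000000-0000-0000-0000-000000000000"
  dataset_id = "11111111-1111-1111-1111-111111111111"
  access_token = "<Azure AD access token>"  # assumed to be acquired separately

  url = (
      f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}"
      f"/datasets/{dataset_id}/refreshes"
  )

  # POSTing to the refreshes collection queues a refresh; with an incremental
  # refresh policy in place, only the configured partitions are processed.
  response = requests.post(
      url,
      headers={"Authorization": f"Bearer {access_token}"},
      json={"notifyOption": "NoNotification"},
  )
  response.raise_for_status()
  print("Refresh queued, status code:", response.status_code)  # expect 202 Accepted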


Reference:

https://docs.microsoft.com/en-us/power-bi/connect-data/incremental-refresh-configure



You develop a solution that uses a Power BI Premium capacity. The capacity contains a dataset that is expected to consume 50 GB of memory.

Which two actions should you perform to ensure that you can publish the model successfully to the Power BI service? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

  A. Restart the capacity.
  B. Publish an initial dataset that is less than 10 GB.
  C. Increase the Max Offline Dataset Size setting.
  D. Invoke a refresh to load historical data based on the incremental refresh policy.
  E. Publish the complete dataset.

Answer(s): D,E

Explanation:

Enable large datasets
The steps below describe enabling large datasets for a new model published to the service. For an existing dataset, only the step that turns on the large dataset storage format is necessary.

Create a model in Power BI Desktop. If your dataset will become larger and progressively consume more memory, be sure to configure Incremental refresh.

(E) Publish the model as a dataset to the service.

In the service > dataset > Settings, expand Large dataset storage format, set the slider to On, and then select Apply.


(D) Invoke a refresh to load historical data based on the incremental refresh policy. The first refresh could take a while to load the history. Subsequent refreshes should be faster, depending on your incremental refresh policy.
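
Because that first refresh loads all of the historical data, it can run for a long time. One way to keep an eye on it is to poll the dataset's refresh history through the Get Refresh History In Group endpoint, as in this sketch (the IDs and token handling are placeholders):

  import requests

  workspace_id = "00000000-0000-0000-0000-000000000000"  # hypothetical workspace ID
  dataset_id = "11111111-1111-1111-1111-111111111111"    # hypothetical dataset ID
  access_token = "<Azure AD access token>"               # assumed to be acquired separately

  url = (
      f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}"
      f"/datasets/{dataset_id}/refreshes?$top=1"
  )

  # The most recent history entry reports whether the historical load is still
  # running (status "Unknown"), has completed, or has failed.
  history = requests.get(
      url, headers={"Authorization": f"Bearer {access_token}"}
  ).json()

  latest = history["value"][0]
  print(latest["status"], latest.get("startTime"), latest.get("endTime"))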


Reference:

https://docs.microsoft.com/en-us/power-bi/enterprise/service-premium-large-models



You have a Power BI workspace named Workspace1 in a Premium capacity. Workspace1 contains a dataset.

During a scheduled refresh, you receive the following error message: “Unable to save the changes since the new dataset size of 11,354 MB exceeds the limit of 10,240 MB.”

You need to ensure that you can refresh the dataset.

What should you do?

  A. Change License mode to Premium per user.
  B. Turn on Large dataset storage format.
  C. Connect Workspace1 to an Azure Data Lake Storage Gen2 account.
  D. Change the location of the Premium capacity.

Answer(s): B
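
The error indicates the dataset has hit the 10-GB cap of the default (small) storage format; turning on Large dataset storage format lets the model grow up to the memory limit of the Premium capacity. Besides the slider in the dataset settings, the storage format can also be switched programmatically. The sketch below assumes the Datasets - Update Dataset In Group REST endpoint accepts a targetStorageMode of "PremiumFiles" (check the current API reference before relying on it); the IDs and token are placeholders.

  import requests

  workspace_id = "00000000-0000-0000-0000-000000000000"  # hypothetical workspace ID
  dataset_id = "11111111-1111-1111-1111-111111111111"    # hypothetical dataset ID
  access_token = "<Azure AD access token>"               # assumed to be acquired separately

  url = f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}/datasets/{dataset_id}"

  # Assumption: Update Dataset In Group accepts a PATCH whose body sets
  # targetStorageMode to "PremiumFiles" (large dataset storage format);
  # "Abf" would switch the model back to the small storage format.
  response = requests.patch(
      url,
      headers={"Authorization": f"Bearer {access_token}"},
      json={"targetStorageMode": "PremiumFiles"},
  )
  response.raise_for_status()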


Reference:

https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-large-models






Post your Comments and Discuss Microsoft DP-500 exam with other Community members:

Summer commented on July 28, 2024
Wonderful site. It helped me pass my exam. Way to go guys!

Siyya commented on January 19, 2024
might help me to prepare for the exam

siyaa commented on January 19, 2024
helped me understand the material better.

Bunny commented on June 19, 2023
Good Content

Demetrius commented on June 01, 2023
Important and useful

Kartoos commented on April 06, 2023
The practice exam was an important part of my preparation and helped me understand the material better.