Free DP-500 Exam Braindumps (page: 8)


You have an Azure Synapse Analytics dedicated SQL pool.

You need to ensure that the SQL pool is scanned by Azure Purview.

What should you do first?

  A. Create a data policy.
  B. Create a data share connection.
  C. Search the data catalog.
  D. Register a data source.

Answer(s): D

Explanation:

Steps to register a data source in Azure Purview:

1. Go to your Microsoft Purview account.
2. On the left pane, select Sources.
3. Select Register.
4. Under Register sources, select Azure Synapse Analytics (multiple).
5. Select Continue.
6. On the Register sources (Azure Synapse Analytics) page, do the following:

   a) Enter a Name for the data source to be listed in the data catalog.
   b) Optionally, choose a subscription to filter down to.
   c) In the Workspace name dropdown list, select the workspace that you're working with.
   d) In the endpoints dropdown lists, the SQL endpoints are automatically filled in based on your workspace selection.
   e) In the Select a collection dropdown list, choose the collection you're working with or, optionally, create a new one.
   f) Select Register to finish registering the data source.
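
The same registration can also be scripted. The following is a minimal sketch that calls the Microsoft Purview scanning REST API from Python; the account name, workspace endpoints, collection name, and api-version are placeholder assumptions and should be checked against the current Purview REST API reference.

# Minimal sketch: register an Azure Synapse workspace as a Purview data source
# through the scanning data plane REST API. All names, endpoints, and the
# api-version below are hypothetical placeholders.
import requests
from azure.identity import DefaultAzureCredential

PURVIEW_ACCOUNT = "contoso-purview"          # hypothetical Purview account name
DATA_SOURCE_NAME = "synapse-dedicated-pool"  # name shown in the data catalog
API_VERSION = "2022-02-01-preview"           # assumed data plane API version

# Acquire a token for the Purview data plane.
credential = DefaultAzureCredential()
token = credential.get_token("https://purview.azure.net/.default").token

url = (
    f"https://{PURVIEW_ACCOUNT}.purview.azure.com/scan/datasources/"
    f"{DATA_SOURCE_NAME}?api-version={API_VERSION}"
)

body = {
    "kind": "AzureSynapseWorkspace",
    "properties": {
        # Placeholder SQL endpoints taken from the Synapse workspace.
        "dedicatedSqlEndpoint": "contoso-synapse.sql.azuresynapse.net",
        "serverlessSqlEndpoint": "contoso-synapse-ondemand.sql.azuresynapse.net",
        "resourceName": "contoso-synapse",
        "collection": {"referenceName": "my-collection", "type": "CollectionReference"},
    },
}

resp = requests.put(url, json=body, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()
print(resp.json())

Note that registering the source is only the first step; a scan still has to be created and run before the dedicated SQL pool appears in the data catalog.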


Reference:

https://docs.microsoft.com/en-us/azure/purview/register-scan-azure-multiple-sources



You have a deployment pipeline for a Power BI workspace. The workspace contains two datasets that use import storage mode.

A database administrator reports a drastic increase in the number of queries sent from the Power BI service to an Azure SQL database since the creation of the deployment pipeline.

An investigation into the issue identifies the following:

* One of the datasets is larger than 1 GB and has a fact table that contains more than 500 million rows.
* When publishing dataset changes to development, test, or production pipelines, a refresh is triggered against the entire dataset.

You need to recommend a solution to reduce the size of the queries sent to the database when the dataset changes are published to development, test, or production.

What should you recommend?

  A. Turn off auto refresh when publishing the dataset changes to the Power BI service.
  B. In the dataset, change the fact table from an import table to a hybrid table.
  C. Enable the large dataset storage format for the workspace.
  D. Create a dataset parameter to reduce the fact table row count in the development and test pipelines.

Answer(s): B

Explanation:

Hybrid tables
Hybrid tables are tables with incremental refresh that can have both import and DirectQuery partitions. During a clean deployment, both the refresh policy and the hybrid table partitions are copied. When deploying to a pipeline stage that already has hybrid table partitions, only the refresh policy is copied. To update the partitions, refresh the table.

Refreshes are faster: only the most recent data that has changed needs to be refreshed, so far fewer queries reach the Azure SQL source.

Incorrect:
Not D: A dataset parameter does not reduce fact table row count.

Note: Microsoft recommends using parameters to store connection details such as instance names and database names, instead of using a static connection string.
Parameters have additional uses, such as making changes to queries, filters, and the text displayed in the report.
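
To illustrate the effect, the sketch below triggers an enhanced refresh through the Power BI REST API with the incremental refresh policy applied, so only the recent import partitions of the hybrid fact table query the Azure SQL database. The workspace ID, dataset ID, and table name are hypothetical; verify the request body fields against the enhanced refresh documentation.

# Minimal sketch: enhanced (asynchronous) refresh that honors the incremental
# refresh policy of a hybrid table. IDs and the table name are placeholders.
import requests
from azure.identity import DefaultAzureCredential

GROUP_ID = "00000000-0000-0000-0000-000000000000"    # hypothetical workspace ID
DATASET_ID = "11111111-1111-1111-1111-111111111111"  # hypothetical dataset ID

credential = DefaultAzureCredential()
token = credential.get_token("https://analysis.windows.net/powerbi/api/.default").token

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/datasets/{DATASET_ID}/refreshes"
)

body = {
    "type": "full",               # refresh data in the targeted objects
    "applyRefreshPolicy": True,   # honor the incremental refresh policy
    "objects": [
        {"table": "FactSales"}    # hypothetical hybrid fact table
    ],
}

resp = requests.post(url, json=body, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()
print("Refresh request accepted, HTTP status:", resp.status_code)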


Reference:

https://docs.microsoft.com/en-us/power-bi/create-reports/deployment-pipelines-best-practices



You have a Power BI Premium capacity.

You need to increase the number of virtual cores associated to the capacity.

Which role do you need?

  A. Power BI workspace admin
  B. capacity admin
  C. Power Platform admin
  D. Power BI admin

Answer(s): D

Explanation:

Change capacity size
Power BI admins and global administrators can change the size (number of v-cores) of a Power BI Premium capacity. Capacity admins who are not also a Power BI admin or global administrator do not have this option.
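
The resize itself is performed in the Power BI admin portal, but an admin can inspect the current capacity SKUs programmatically. The sketch below is a minimal example that calls the Power BI admin REST API to list capacities and their SKUs; it assumes the caller holds the Power BI admin role and has a token with the appropriate scope.

# Minimal sketch: list Premium capacities and their SKUs with the Power BI
# admin REST API. The SKU (e.g. P1, P2) determines the number of v-cores.
import requests
from azure.identity import DefaultAzureCredential

credential = DefaultAzureCredential()
token = credential.get_token("https://analysis.windows.net/powerbi/api/.default").token

resp = requests.get(
    "https://api.powerbi.com/v1.0/myorg/admin/capacities",
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()

for cap in resp.json().get("value", []):
    print(cap.get("displayName"), cap.get("sku"), cap.get("state"))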


Reference:

https://docs.microsoft.com/en-us/power-bi/enterprise/service-admin-premium-manage



You are attempting to configure certification for a Power BI dataset and discover that the certification setting for the dataset is unavailable.

What are two possible causes of the issue? Each correct answer presents a complete solution.

NOTE: Each correct selection is worth one point.

  A. The workspace is in shared capacity.
  B. You have insufficient permissions.
  C. Dataset certification is disabled for the Power BI tenant.
  D. The sensitivity level for the dataset is set to Highly Confidential.
  E. Row-level security (RLS) is missing from the dataset.

Answer(s): B,C

Explanation:

C: As a Power BI admin, you are responsible for enabling and setting up the certification process for your organization. This means:

* Enabling certification on your tenant.
* Defining a list of security groups whose members will be authorized to certify content.
* Providing a URL that points to the documentation for the organization's content certification process, if such documentation exists.

B: Get write permissions on the workspace where the content you want to certify is located. You can request these permissions from the content owner or from anyone with admin permissions on the workspace.
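
As a quick way to check cause B, the sketch below lists the users and roles on the workspace that hosts the dataset via the Power BI REST API; the certification option requires, among other things, write permission on the workspace (Admin, Member, or Contributor). The workspace ID is a hypothetical placeholder.

# Minimal sketch: list workspace users and their access rights to confirm
# whether the current user has write permission on the workspace.
import requests
from azure.identity import DefaultAzureCredential

GROUP_ID = "00000000-0000-0000-0000-000000000000"  # hypothetical workspace ID

credential = DefaultAzureCredential()
token = credential.get_token("https://analysis.windows.net/powerbi/api/.default").token

resp = requests.get(
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/users",
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()

for user in resp.json().get("value", []):
    # groupUserAccessRight is Admin, Member, Contributor, or Viewer; the first
    # three include write permission on workspace content.
    print(user.get("emailAddress") or user.get("identifier"),
          user.get("groupUserAccessRight"))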


Reference:

https://docs.microsoft.com/en-us/power-bi/admin/service-admin-setup-certification
https://docs.microsoft.com/en-us/power-bi/collaborate-share/service-endorse-content


