Free DP-500 Exam Braindumps (page: 24)


You have a Power BI workspace that contains one dataset and four reports that connect to the dataset.

The dataset uses import storage mode and contains the following data sources:

-A CSV file in an Azure Storage account.
-An Azure Database for PostgreSQL database.

You plan to use deployment pipelines to promote the content from development to test to production. There will be different data source locations for each stage.

What should you include in the deployment pipeline to ensure that the appropriate data source locations are used during each stage?

  A. auto-binding across pipelines
  B. data source rules
  C. selective deployment
  D. parameter rules

Answer(s): D

Explanation:

You can configure two kinds of deployment rules: data source rules and parameter rules. Which rule types are available depends on the type of Power BI item you configure the rule for.

Note: Create deployment rules
When working in a deployment pipeline, different stages may have different configurations. For example, each stage can have different databases or different query parameters. The development stage might query sample data from the database, while the test and production stages query the entire database.

When you deploy content between pipeline stages, configuring deployment rules lets the deployed content change while keeping some settings intact. For example, if you want a dataset in the production stage to point to a production database, you can define a rule for this. The rule is defined in the production stage, under the appropriate dataset. Once the rule is defined, content deployed from test to production inherits the value defined in the deployment rule, and the rule always applies as long as it is unchanged and valid.

Incorrect:
Not B: Data source rules only work when changing between data sources of the same type.
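Parameter rules work because the dataset's connection details are driven by Power Query parameters (for example, a server-name parameter) whose values the pipeline overrides per stage. Outside of pipelines, the same dataset parameters can also be set programmatically through the Power BI REST API's UpdateParameters endpoint. A minimal sketch, assuming placeholder workspace/dataset IDs, parameter names, and token (none of these come from the question):

```python
# Sketch: override dataset parameters via the Power BI REST API
# (Datasets - Update Parameters In Group). All IDs, names, and the
# bearer token are placeholders for illustration only.
import json
import urllib.request


def build_update_parameters_payload(parameters: dict) -> dict:
    """Shape a {name: value} dict into the updateDetails body
    that the UpdateParameters endpoint expects."""
    return {
        "updateDetails": [
            {"name": name, "newValue": value}
            for name, value in parameters.items()
        ]
    }


def update_parameters(group_id: str, dataset_id: str,
                      token: str, parameters: dict) -> None:
    """POST the new parameter values to the Power BI service."""
    url = (
        "https://api.powerbi.com/v1.0/myorg/groups/"
        f"{group_id}/datasets/{dataset_id}/Default.UpdateParameters"
    )
    body = json.dumps(build_update_parameters_payload(parameters)).encode()
    req = urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    urllib.request.urlopen(req)  # raises on a non-2xx response
```

In a deployment pipeline you would not call this yourself; the parameter rule defined on the stage performs the equivalent override automatically after each deployment.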



You are planning a Power BI solution for a customer.

The customer will have 200 Power BI users. The customer identifies the following requirements:

-Ensure that all the users can create paginated reports.
-Ensure that the users can create reports containing AI visuals.
-Provide autoscaling of the CPU resources during heavy usage spikes.

You need to recommend a Power BI solution for the customer. The solution must minimize costs.

What should you recommend?

  A. Power BI Premium per capacity
  B. Power BI Report Server
  C. Power BI Premium per user
  D. Power BI Pro per user

Answer(s): C

Explanation:

Announcing Power BI Premium Per User general availability and autoscale preview for Gen2.

Power BI Premium per user features and capabilities
* Pixel-perfect paginated reports, based on SSRS technology, are available for operational reporting. Users can create highly formatted reports in formats such as PDF and PPT that are embeddable in applications and designed to be printed or shared.
* Automated machine learning (AutoML) in Power BI enables business users to build ML models to predict outcomes without having to write any code.
* Etc.

Note:
Power BI empowers every business user and business analyst to get amazing insights with AI infused experiences. With Power BI Premium, we enable business analysts to not only analyze and visualize their data, but to also build an end-to-end data platform through drag and drop experiences. Everything from ingesting and transforming data at scale, to building automated machine learning models, and analyzing massive volumes of data is now possible for our millions of business analysts.


Reference:

https://powerbi.microsoft.com/nl-be/blog/announcing-power-bi-premium-per-user-general-availability-and-autoscale-preview-for-gen2/



HOTSPOT (Drag and Drop is not supported)
You need to configure a source control solution for Azure Synapse Analytics. The solution must meet the following requirements:

-Code must always be merged to the main branch before being published, and the main branch must be used for publishing resources.
-The workspace templates must be stored in the publish branch.
-A branch named dev123 will be created to support the development of a new feature.

What should you do? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

  A. See Explanation section for answer.

Answer(s): A

Explanation:

Box 1: main
Code must always be merged to the main branch before being published, and the main branch must be used for publishing resources.

Collaboration branch - the Azure Repos branch used for publishing. By default, this is master (or main); change this setting if you want to publish resources from another branch. You can select an existing branch or create a new one.
Each Git repository that's associated with a Synapse Studio has a collaboration branch. (main or master is the default collaboration branch).

Box 2: workspace_publish
The workspace templates must be stored in the publish branch.

Creating feature branches
Users can also create feature branches by clicking + New Branch in the branch dropdown.

By default, Synapse Studio generates the workspace templates and saves them into a branch called workspace_publish. To configure a custom publish branch, add a publish_config.json file to the root folder in the collaboration branch.
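As a sketch of that mechanism, a minimal publish_config.json would look like the following (the branch name shown is just the default; any branch name can be substituted):

```json
{
    "publishBranch": "workspace_publish"
}
```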


Reference:

https://docs.microsoft.com/en-us/azure/synapse-analytics/cicd/source-control



You have five Power BI reports that contain R script data sources and R visuals.

You need to publish the reports to the Power BI service and configure a daily refresh of datasets.

What should you include in the solution?

  A. a Power BI Embedded capacity
  B. a workspace that connects to an Azure Data Lake Storage Gen2 account
  C. an on-premises data gateway (personal mode)
  D. an on-premises data gateway (standard mode)

Answer(s): C

Explanation:

To schedule refresh of a dataset that uses R scripts or R visuals, enable scheduled refresh and install an on-premises data gateway (personal mode) on the computer where R and the source files are installed; standard-mode gateways do not support R script data sources.
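For comparison, Power BI's Python script connector behaves the same way as the R connector here (refresh also requires a personal-mode gateway): the script simply defines data frames at top level, and Power BI loads each one as a table. A minimal sketch of such a script body (the table name and values are illustrative):

```python
# Minimal sketch of a Power BI script data source body.
# Power BI's Python connector (like the R connector) picks up every
# top-level data frame defined by the script as a queryable table.
import pandas as pd

sales = pd.DataFrame(
    {
        "region": ["East", "West", "North"],
        "revenue": [1200.0, 950.0, 1100.0],
    }
)
```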


Reference:

https://docs.microsoft.com/en-us/power-bi/connect-data/desktop-r-in-query-editor





