Free Microsoft DP-700 Exam Questions (page: 5)

You have a Fabric warehouse named DW1. DW1 contains a table that stores sales data and is used by multiple sales representatives.
You plan to implement row-level security (RLS).
You need to ensure that the sales representatives can see only their respective data.
Which warehouse object do you require to implement RLS?

  A. STORED PROCEDURE
  B. CONSTRAINT
  C. SCHEMA
  D. FUNCTION

Answer(s): D

Explanation:

To implement row-level security (RLS) in a Fabric warehouse, you create a function (an inline table-valued function) that defines the filtering logic based on the user's identity or role. The function is then bound to a table through a security policy, which applies the predicate to control which rows each user can read.
For the sales representatives, the function would filter on a column such as SalesRepID or SalesRepName, ensuring that each representative can see only their own rows.
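A minimal T-SQL sketch of the pattern described above. The schema, table, and column names (rls, dbo.Sales, SalesRepEmail) are illustrative assumptions, not taken from the question:

```sql
-- Assumed objects: a dbo.Sales table with a SalesRepEmail column
-- holding each row's owning sales representative.
CREATE SCHEMA rls;
GO

-- Inline table-valued function: returns a row (i.e., allows access)
-- only when the caller's identity matches the row's sales rep.
CREATE FUNCTION rls.fn_SalesRepFilter(@SalesRepEmail AS VARCHAR(128))
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN
    SELECT 1 AS fn_result
    WHERE @SalesRepEmail = USER_NAME();
GO

-- Security policy that binds the predicate function to the table.
CREATE SECURITY POLICY rls.SalesRepPolicy
ADD FILTER PREDICATE rls.fn_SalesRepFilter(SalesRepEmail)
ON dbo.Sales
WITH (STATE = ON);
```

With the policy enabled, a `SELECT * FROM dbo.Sales` issued by a representative returns only the rows whose SalesRepEmail matches their identity; the function alone does nothing until the security policy attaches it to the table, which is why the function is the required warehouse object here.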



HOTSPOT (Drag and Drop is not supported)
You have a Fabric workspace named Workspace1_DEV that contains the following items:
10 reports
Four notebooks
Three lakehouses
Two data pipelines
Two Dataflow Gen1 dataflows
Three Dataflow Gen2 dataflows
Five semantic models that each has a scheduled refresh policy
You create a deployment pipeline named Pipeline1 to move items from Workspace1_DEV to a new workspace named Workspace1_TEST.
You deploy all the items from Workspace1_DEV to Workspace1_TEST.
For each of the following statements, select Yes if the statement is true. Otherwise, select No.
Note: Each correct selection is worth one point.
Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:



Data from the semantic models will be deployed to the target stage: No.
While the semantic models themselves (structure and definitions) are deployed, the data they hold (the results of previous refreshes) is not transferred with them. You must refresh the models in the target workspace after deployment.
The Dataflow Gen1 dataflows will be deployed to the target stage: Yes.
Dataflows, including Dataflow Gen1 dataflows, are supported deployment pipeline content, so the pipeline moves them to the target stage (Workspace1_TEST) along with the rest of the solution.
The scheduled refresh policies will be deployed to the target stage: No.
Deployment pipelines move content such as reports, notebooks, and dataflows, but item settings such as scheduled refresh policies are not transferred with the deployment. You must configure the refresh schedules manually in Workspace1_TEST after deployment.



You have a Fabric deployment pipeline that uses three workspaces named Dev, Test, and Prod.
You need to deploy an eventhouse as part of the deployment process.
What should you use to add the eventhouse to the deployment process?

  A. GitHub Actions
  B. a deployment pipeline
  C. an Azure DevOps pipeline
  D. an eventstream

Answer(s): C

Explanation:

Correct:
* an Azure DevOps pipeline. At the time this question was written, Fabric deployment pipelines did not support eventhouse items, so an eventhouse could not be moved between the Dev, Test, and Prod stages that way. An Azure DevOps pipeline can automate the deployment instead, for example through Fabric Git integration or the Fabric REST APIs.
Incorrect:
* a deployment pipeline: it did not support deploying eventhouses.
* an eventstream: an eventstream ingests and routes real-time events; it is not a deployment mechanism.
* GitHub Actions



You have a Fabric workspace named Workspace1 that contains a warehouse named Warehouse1.
You plan to deploy Warehouse1 to a new workspace named Workspace2.
As part of the deployment process, you need to verify whether Warehouse1 contains invalid references. The solution must minimize development effort.
What should you use?

  1. a database project
  2. a deployment pipeline
  3. a Python script
  4. a T-SQL script

Answer(s): B

Explanation:

A deployment pipeline in Fabric lets you deploy items such as warehouses, semantic models, and reports between workspaces (here, from Workspace1 to Workspace2). A key feature of a deployment pipeline is that it can check for invalid references before deployment, flagging issues such as broken links or dependencies so the deployment succeeds without introducing errors. Because this check is built in, it verifies the references with minimal development effort.





