Microsoft DP-600 Exam (page: 5)
Implementing Analytics Solutions Using Microsoft Fabric
Updated on: 15-Feb-2026

You have a Fabric tenant that contains a semantic model. You need to modify object-level security (OLS) for the model. What should you use?

  1. the Fabric service
  2. Microsoft Power BI Desktop
  3. ALM Toolkit
  4. Tabular Editor

Answer(s): D

Explanation:

Microsoft Fabric security: object-level security (OLS)
To define OLS rules on a semantic model, create the roles in Power BI Desktop and configure the object-level permissions with an external tool such as Tabular Editor:

1. In Power BI Desktop, create the model and the roles that will define your OLS rules.
2. On the External Tools ribbon, select Tabular Editor. If you don't see the Tabular Editor button, install the program. When opened, Tabular Editor automatically connects to your model.
3. In the Model view, select the drop-down menu under Roles. The roles you created in step 1 appear.
4. Select the role for which you want to enable an OLS definition, and expand Table Permissions.
5. Set the permission for the table or column to None or Read.
6. After you define object-level security for the roles, save your changes.
7. In Power BI Desktop, publish your semantic model to the Power BI service.
8. In the Power BI service, open the Security page from the semantic model's More options menu and assign members or groups to their appropriate roles.
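The same table-permission change can also be scripted against the model's XMLA endpoint. The following is only a minimal sketch, assuming the pythonnet package and a locally installed Tabular Object Model (TOM) assembly; the DLL path, workspace URL, role name, and table name are placeholders for illustration, not values from the question.

```python
# Illustrative sketch: set an OLS table permission on an existing role by
# connecting to the workspace XMLA endpoint with the .NET Tabular Object Model,
# loaded through pythonnet. All names below are placeholders.
import clr  # provided by the pythonnet package

clr.AddReference(r"C:\libs\Microsoft.AnalysisServices.Tabular.dll")  # assumed local path
from Microsoft.AnalysisServices.Tabular import Server, TablePermission, MetadataPermission

server = Server()
# An interactive Azure AD sign-in is prompted when no credentials are supplied.
server.Connect("powerbi://api.powerbi.com/v1.0/myorg/<workspace name>")

model = server.Databases.GetByName("Model1").Model
role = model.Roles.Find("SalesRole")               # role created earlier in Power BI Desktop

permission = TablePermission()
permission.Table = model.Tables.Find("Employees")  # hypothetical table to hide
# 'None' is a Python keyword, so the enum member is fetched with getattr.
permission.MetadataPermission = getattr(MetadataPermission, "None")
role.TablePermissions.Add(permission)

model.SaveChanges()                                # push the OLS change to the service
server.Disconnect()
```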


Reference:

https://learn.microsoft.com/en-us/fabric/security/service-admin-object-level-security



HOTSPOT (Drag and Drop is not supported)
You have a Fabric tenant that contains a workspace named Enterprise. Enterprise contains a semantic model named Model1. Model1 contains a date parameter named Date1 that was created in Power Query.

You build a deployment pipeline named Enterprise Data that includes two stages named Development and Test. You assign the Enterprise workspace to the Development stage.

You need to perform the following actions:
Create a workspace named Enterprise [Test] and assign the workspace to the Test stage.
Configure a rule that will modify the value of Date1 when changes are deployed to the Test stage.

Which two settings should you use? To answer, select the appropriate settings in the answer area.

NOTE: Each correct answer is worth one point.

Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:




Box 1: Add workspace button (with a + sign)
Create a workspace named Enterprise [Test] and assign the workspace to the Test stage.

Box 2: Deployment rules (upper-right corner)
Configure a rule that will modify the value of Date1 when changes are deployed to the Test stage. In the pipeline stage for which you want to create a deployment rule, select Deployment rules.
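For completeness, the workspace creation and stage assignment can also be scripted with the Power BI REST API; the deployment rule itself is still defined in the deployment pipelines UI as described above. A minimal sketch, assuming a valid Azure AD access token and a known pipeline ID (both placeholders):

```python
# Minimal sketch: create the Enterprise [Test] workspace and assign it to the
# Test stage of the deployment pipeline using the Power BI REST API.
import requests

ACCESS_TOKEN = "<azure-ad-access-token>"       # placeholder, e.g. acquired via MSAL
PIPELINE_ID = "<enterprise-data-pipeline-id>"  # placeholder
BASE = "https://api.powerbi.com/v1.0/myorg"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# 1. Create the new workspace.
resp = requests.post(f"{BASE}/groups", headers=HEADERS, json={"name": "Enterprise [Test]"})
resp.raise_for_status()
workspace_id = resp.json()["id"]

# 2. Assign it to the Test stage (stage order 1 when Development is order 0).
resp = requests.post(
    f"{BASE}/pipelines/{PIPELINE_ID}/stages/1/assignWorkspace",
    headers=HEADERS,
    json={"workspaceId": workspace_id},
)
resp.raise_for_status()
```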


Reference:

https://learn.microsoft.com/en-us/fabric/cicd/deployment-pipelines/create-rules



You have a Fabric tenant that contains a workspace named Workspace1 and a user named User1. Workspace1 contains a warehouse named DW1.

You share DW1 with User1 and assign User1 the default permissions for DW1. What can User1 do?

  1. Build reports by using the default dataset.
  2. Read data from the tables in DW1.
  3. Connect to DW1 via the Azure SQL Analytics endpoint.
  4. Read the underlying Parquet files from OneLake.

Answer(s): A

Explanation:

By default, when a user is granted access to a Microsoft Fabric warehouse (DW1), they receive the Viewer role. The Viewer role allows users to:
Build reports using the default dataset associated with the warehouse.

Read data from the dataset, but not directly from the tables unless additional permissions are explicitly granted.



HOTSPOT (Drag and Drop is not supported)
You have a Fabric tenant that contains a workspace named Workspace1 and a user named DBUser. Workspace1 contains a lakehouse named Lakehouse1. DBUser does NOT have access to the tenant.

You grant DBUser access to Lakehouse1 as shown in the following exhibit.



Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic.

NOTE: Each correct selection is worth one point.

Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:




DBUser can read the data in Lakehouse1 by using the OneLake endpoint.

DBUser can query the data in Lakehouse1 by using the OneLake file explorer.

DBUser has been granted permission to "Read all Apache Spark" data but not permission to read SQL endpoint data or build reports using the default semantic model.

Since SQL endpoint access is not granted, DBUser cannot use TDS (Tabular Data Stream) or SSMS (SQL Server Management Studio) for querying.

DBUser can still access OneLake data directly, which is why OneLake endpoint and OneLake file explorer are the correct choices.
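Direct OneLake access of the kind granted here can be illustrated with the Azure Storage SDK, which works against OneLake's DFS endpoint. A minimal sketch, assuming the azure-identity and azure-storage-file-datalake packages and the standard workspace/item folder layout:

```python
# Minimal sketch: list files in Lakehouse1 through the OneLake DFS endpoint using
# the ADLS Gen2 SDK. DBUser authenticates with their own Azure AD identity.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://onelake.dfs.fabric.microsoft.com",
    credential=DefaultAzureCredential(),
)

# In OneLake the workspace acts as the container and each item is a folder.
file_system = service.get_file_system_client("Workspace1")
for item in file_system.get_paths(path="Lakehouse1.Lakehouse/Files"):
    print(item.name)
```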



DRAG DROP (Drag and Drop is not supported)
You have a Fabric workspace named Workspace1.

You have three groups named Group1, Group2, and Group3.

You need to assign a workspace role to each group. The solution must follow the principle of least privilege and meet the following requirements:
Group1 must be able to write data to Workspace1, but be unable to add members to Workspace1.
Group2 must be able to configure and maintain the settings of Workspace1.

Group3 must be able to write data and add members to Workspace1, but be unable to delete Workspace1.

Which workspace role should you assign to each group? To answer, drag the appropriate roles to the correct groups. Each role may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

NOTE: Each correct selection is worth one point.

Select and Place:

  1. See Explanation section for answer.

Answer(s): A

Explanation:




Group1 (Can write data but cannot add members) → Contributor
Contributors can write, edit, and manage data, but cannot manage workspace settings or add/remove users.

Group2 (Can configure and maintain workspace settings) → Admin
Admins have full control over the workspace, including configuring settings, managing permissions, and maintaining security policies.

Group3 (Can write data and add members but cannot delete the workspace) → Member
Members can add/remove members and write data, but they cannot delete the workspace or configure settings at the admin level.
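These role assignments can also be scripted. A minimal sketch using the Power BI REST API's add-group-user operation; the access token, workspace ID, and group object IDs are placeholders:

```python
# Minimal sketch: assign workspace roles to security groups via the
# Power BI REST API. All IDs and the token are placeholders.
import requests

ACCESS_TOKEN = "<azure-ad-access-token>"
WORKSPACE_ID = "<workspace1-id>"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

assignments = {
    "<group1-object-id>": "Contributor",  # write data, no member management
    "<group2-object-id>": "Admin",        # configure and maintain settings
    "<group3-object-id>": "Member",       # write data and add members
}

for group_id, role in assignments.items():
    resp = requests.post(
        f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/users",
        headers=HEADERS,
        json={
            "identifier": group_id,
            "principalType": "Group",
            "groupUserAccessRight": role,
        },
    )
    resp.raise_for_status()
```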



HOTSPOT (Drag and Drop is not supported)
You have a Fabric workspace named Workspace1 that uses the Premium Per User (PPU) license mode and contains a semantic model named Model1.

Large semantic model storage format is selected for Model1.

You need to ensure that tables imported into Model1 are written automatically to Delta tables in OneLake.

What should you do for Model1 and Workspace1? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:




Box 1 (Model1): Enable OneLake integration.
With Microsoft OneLake integration for semantic models, data imported into model tables can also be written automatically to Delta tables in OneLake. The Delta format is the unified table format across all compute engines in Microsoft Fabric. OneLake integration exports the data with all key performance features enabled to provide more seamless data access with higher performance.

Box 2 (Workspace1): Change the license mode to Fabric capacity.
OneLake integration for semantic models is supported only on Power BI Premium P and Microsoft Fabric F SKUs. It is not supported on Power BI Pro, Premium Per User, or Power BI Embedded A/EM SKUs.

Before enabling OneLake integration, you must have one or more import semantic models in a workspace on a Power BI Premium or Fabric capacity. An import semantic model is a data model whose data is fully loaded into Power BI's in-memory storage, allowing fast and efficient querying.
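Once the workspace is on a Fabric capacity and OneLake integration is enabled for Model1, the exported Delta tables can be read by any Fabric engine. A minimal sketch from a Fabric notebook, assuming a table named Sales and the standard item path for the semantic model (both placeholders; check the model's OneLake path in your tenant):

```python
# Minimal sketch: read a Delta table exported by OneLake integration from a
# Fabric notebook, where a SparkSession named `spark` is provided automatically.
path = (
    "abfss://Workspace1@onelake.dfs.fabric.microsoft.com/"
    "Model1.SemanticModel/Tables/Sales"   # placeholder item path and table name
)
df = spark.read.format("delta").load(path)
df.show(5)
```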


Reference:

https://learn.microsoft.com/en-us/power-bi/enterprise/onelake-integration-overview



You have a Fabric tenant that contains the workspaces shown in the following table.



You have a deployment pipeline named Pipeline1 that deploys items from Workspace_DEV to Workspace_TEST. In Pipeline1, all items that have matching names are paired.

You deploy the contents of Workspace_DEV to Workspace_TEST by using Pipeline1. What will the contents of Workspace_TEST be once the deployment is complete?

  1. Lakehouse2 Notebook2 SemanticModel1
  2. Lakehouse1 Notebook1 Pipeline1 SemanticModel1
  3. Lakehouse1 Lakehouse2 Notebook1 Notebook2 Pipeline1 SemanticModel1
  4. Lakehouse2 Notebook2 Pipeline1 SemanticModel1

Answer(s): C

Explanation:

Lakehouse1, Notebook1, and Pipeline1 are copied to Workspace_TEST. SemanticModel1 is copied and replaces the existing SemanticModel1 in Workspace_TEST.

NOTE:
Fabric deployment pipelines copy pipelines, lakehouses, notebooks, and semantic models between workspace stages. They use a "paired" approach, establishing a link between items of the same name and type in different stages of the deployment process.

This process involves assigning a workspace to a specific stage in the pipeline and then either deploying new, unpaired content or pairing existing content. During deployment, the metadata of reports, dashboards, and semantic models (but not the data itself) is copied to the target workspace. For lakehouses, the structure and definitions are copied; data can be moved using pipelines or dataflows.

Incorrect:
Not A: Pipeline1 should also be present in Workspace_TEST.
During deployment, Fabric copies pipelines (and other supported content) from the source stage to the target stage while maintaining the connections between the copied items, so the destination workspace ends up with a replica of the source, ready for testing or production.
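The same deployment can also be triggered programmatically. A minimal sketch using the Power BI REST API's Deploy All operation for pipelines; the access token and pipeline ID are placeholders:

```python
# Minimal sketch: deploy all supported items from the Development stage (order 0)
# to the next stage of a deployment pipeline. Token and pipeline ID are placeholders.
import requests

ACCESS_TOKEN = "<azure-ad-access-token>"
PIPELINE_ID = "<pipeline1-id>"

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/pipelines/{PIPELINE_ID}/deployAll",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={
        "sourceStageOrder": 0,  # Development stage
        "options": {"allowCreateArtifact": True, "allowOverwriteArtifact": True},
    },
)
resp.raise_for_status()
print(resp.status_code)  # 202: the deployment runs as a long-running operation
```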


Reference:

https://learn.microsoft.com/en-us/fabric/cicd/deployment-pipelines/understand-the-deployment-process



You have a Fabric tenant that contains a workspace named Workspace1 and a user named User1. Workspace1 contains a warehouse named DW1.

You share DW1 with User1 and assign User1 the default permissions for DW1. What can User1 do?

  1. Read data from the tables in DW1.
  2. Build reports by using the default dataset.
  3. Read the underlying Parquet files from OneLake.
  4. Connect to DW1 via the TDS (Tabular Data Stream) endpoint.

Answer(s): D

Explanation:

Assigning a user the default permission ("Read") on a warehouse in a Fabric workspace enables them to connect to the SQL analytics endpoint, which is the equivalent of CONNECT permissions in SQL Server. However, this permission only allows them to connect; it doesn't grant them the ability to query tables, views, functions, or stored procedures within the warehouse unless they're also given access to those specific objects through T-SQL GRANT statements.

In essence, the default "Read" permission provides the necessary connectivity to the warehouse's SQL analytics endpoint, but it does not automatically grant access to the data and objects within the warehouse itself. Further permissions need to be granted through T-SQL or Fabric's workspace roles and item permissions system.

NOTE:
Default Read Permission:
When sharing a warehouse, the default permission assigned to a user is "Read".

Connectivity:
This "Read" permission allows the user to connect to the SQL analytics endpoint of the warehouse.

Limited Object Access:
The "Read" permission itself does not grant access to specific objects within the warehouse (tables, views, functions, etc.). To query these objects, the user needs to be granted the appropriate permissions using T-SQL GRANT statements.


Reference:

https://learn.microsoft.com/en-us/fabric/data-warehouse/share-warehouse-manage-permissions


