Microsoft DP-600 Exam (page: 5)
Microsoft Implementing Analytics Solutions Using Fabric
Updated on: 15-Dec-2025

Viewing Page 5 of 26

You have a Fabric tenant that contains a semantic model. You need to modify object-level security (OLS) for the model. What should you use?

  1. the Fabric service
  2. Microsoft Power BI Desktop
  3. ALM Toolkit
  4. Tabular Editor

Answer(s): D

Explanation:

Microsoft Fabric security: object-level security (OLS)
To create roles on Power BI Desktop semantic models, use an external tool such as Tabular Editor. To configure object-level security by using Tabular Editor:

1. In Power BI Desktop, create the model and the roles that will define your OLS rules.

2. On the External Tools ribbon, select Tabular Editor. If you don't see the Tabular Editor button, install the program. When open, Tabular Editor automatically connects to your model.

3. In the Model view, select the drop-down menu under Roles. The roles you created in step 1 will appear.

4. Select the role you want to enable an OLS definition for, and expand Table Permissions.

5. Set the permissions for the table or column to None or Read.

6. After you define object-level security for the roles, save your changes.

7. In Power BI Desktop, publish your semantic model to the Power BI service.

8. In the Power BI service, navigate to the Security page by selecting the More options menu on the semantic model, and assign members or groups to their appropriate roles.
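
For context, the OLS definitions that Tabular Editor edits are ordinary tabular (TMSL) metadata on the model's roles. The sketch below is purely illustrative, written as a Python dict with hypothetical role, table, and column names; the property names mirror the tabular metadata that stores OLS rules.

```python
# Illustrative only: the shape of a role carrying OLS rules in tabular (TMSL)
# metadata, expressed as a Python dict. Role/table/column names are hypothetical.
ols_role = {
    "name": "SalesReader",
    "modelPermission": "read",
    "tablePermissions": [
        # Hide an entire table from members of this role.
        {"name": "Employee", "metadataPermission": "none"},
        # Keep the table visible but hide a single column.
        {
            "name": "Customer",
            "columnPermissions": [
                {"name": "Email", "metadataPermission": "none"},
            ],
        },
    ],
}
```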


Reference:

https://learn.microsoft.com/en-us/fabric/security/service-admin-object-level-security



HOTSPOT (Drag and Drop is not supported)
You have a Fabric tenant that contains a workspace named Enterprise. Enterprise contains a semantic model named Model1. Model1 contains a date parameter named Date1 that was created in Power Query.

You build a deployment pipeline named Enterprise Data that includes two stages named Development and Test. You assign the Enterprise workspace to the Development stage.

You need to perform the following actions:
Create a workspace named Enterprise [Test] and assign the workspace to the Test stage.
Configure a rule that will modify the value of Date1 when changes are deployed to the Test stage.

Which two settings should you use? To answer, select the appropriate settings in the answer area.

NOTE: Each correct answer is worth one point.

Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:




Box 1: The Add workspace button (with a + sign)
Create a workspace named Enterprise [Test] and assign the workspace to the Test stage.

Box 2: Deployment rules (upper-right corner of the stage)
Configure a rule that will modify the value of Date1 when changes are deployed to the Test stage. In the pipeline stage for which you want to create a deployment rule, select Deployment rules.
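
If you script this setup, the workspace-to-stage assignment can also be done through the Power BI REST API. A minimal sketch, assuming placeholder IDs and an Azure AD access token with the appropriate scopes (the deployment rule itself is still configured in the pipeline UI as described above):

```python
import requests

PIPELINE_ID = "<Enterprise-Data-pipeline-id>"    # placeholder
WORKSPACE_ID = "<Enterprise-Test-workspace-id>"  # placeholder
TOKEN = "<aad-access-token>"                     # placeholder

# Pipelines - Assign Workspace: attach the Enterprise [Test] workspace to the
# Test stage (stage order 1 in a two-stage Development/Test pipeline).
resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/pipelines/{PIPELINE_ID}/stages/1/assignWorkspace",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"workspaceId": WORKSPACE_ID},
)
resp.raise_for_status()
```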


Reference:

https://learn.microsoft.com/en-us/fabric/cicd/deployment-pipelines/create-rules



You have a Fabric tenant that contains a workspace named Workspace1 and a user named User1. Workspace1 contains a warehouse named DW1.

You share DW1 with User1 and assign User1 the default permissions for DW1. What can User1 do?

  1. Build reports by using the default dataset.
  2. Read data from the tables in DW1.
  3. Connect to DW1 via the Azure SQL Analytics endpoint.
  4. Read the underlying Parquet files from OneLake.

Answer(s): A

Explanation:

By default, when a user is granted access to a Microsoft Fabric warehouse such as DW1, they receive the Viewer role. The Viewer role allows users to:

Build reports using the default dataset associated with the warehouse.

Read data from the dataset, but not directly from the tables unless explicitly granted additional permissions.
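
As a practical aside, a user with Build permission on that default dataset can also read it programmatically from a Fabric notebook via semantic link. A minimal sketch, assuming the sempy package is available and a hypothetical table named Sales:

```python
import sempy.fabric as fabric  # semantic link, available in Fabric notebooks

# Read a table from the warehouse's default semantic model into a FabricDataFrame.
# "DW1" is the dataset name from the scenario; "Sales" is a hypothetical table.
df = fabric.read_table(dataset="DW1", table="Sales")
print(df.head())
```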



HOTSPOT (Drag and Drop is not supported)
You have a Fabric tenant that contains a workspace named Workspace1 and a user named DBUser. Workspace1 contains a lakehouse named Lakehouse1. DBUser does NOT have access to the tenant.

You grant DBUser access to Lakehouse1 as shown in the following exhibit.



Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic.

NOTE: Each correct selection is worth one point.

Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:




DBUser can read the data in Lakehouse1 by using the OneLake endpoint.

DBUser can query the data in Lakehouse1 by using the OneLake file explorer.

DBUser has been granted the "Read all Apache Spark" permission, but not permission to read SQL analytics endpoint data or to build reports on the default semantic model.

Since SQL endpoint access is not granted, DBUser cannot use TDS (Tabular Data Stream) or SSMS (SQL Server Management Studio) for querying.

DBUser can still access OneLake data directly, which is why OneLake endpoint and OneLake file explorer are the correct choices.
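
A minimal sketch of that direct access, assuming the Azure SDK for Python and placeholder file names; the OneLake endpoint speaks the same DFS API as ADLS Gen2, with the workspace acting as the filesystem:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Connect to the OneLake endpoint as DBUser (OneLake is ADLS Gen2-compatible).
service = DataLakeServiceClient(
    account_url="https://onelake.dfs.fabric.microsoft.com",
    credential=DefaultAzureCredential(),
)

# The workspace maps to a filesystem; items sit under <name>.<item type>.
fs = service.get_file_system_client("Workspace1")
file = fs.get_file_client("Lakehouse1.Lakehouse/Files/sample.csv")  # placeholder path
data = file.download_file().readall()
```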



DRAG DROP (Drag and Drop is not supported)
You have a Fabric workspace named Workspace1.

You have three groups named Group1, Group2, and Group3.

You need to assign a workspace role to each group. The solution must follow the principle of least privilege and meet the following requirements:
Group1 must be able to write data to Workspace1, but be unable to add members to Workspace1.
Group2 must be able to configure and maintain the settings of Workspace1.

Group3 must be able to write data and add members to Workspace1, but be unable to delete Workspace1.

Which workspace role should you assign to each group? To answer, drag the appropriate roles to the correct groups. Each role may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

NOTE: Each correct selection is worth one point.

Select and Place:

  1. See Explanation section for answer.

Answer(s): A

Explanation:




Group1 (Can write data but cannot add members) → Contributor
Contributors can write, edit, and manage data, but cannot manage workspace settings or add/remove users.

Group2 (Can configure and maintain workspace settings) → Admin
Admins have full control over the workspace, including configuring settings, managing permissions, and maintaining security policies.

Group3 (Can write data and add members but cannot delete the workspace) → Member
Members can add/remove members and write data, but they cannot delete the workspace or configure settings at the admin level.
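
If these assignments are scripted rather than clicked through, the same roles can be granted with the Power BI REST API. A minimal sketch with placeholder object IDs and token:

```python
import requests

WORKSPACE_ID = "<Workspace1-id>"  # placeholder
TOKEN = "<aad-access-token>"      # placeholder

# Groups - Add Group User: grant each security group its workspace role.
assignments = [
    ("<Group1-object-id>", "Contributor"),  # writes data, cannot manage members
    ("<Group2-object-id>", "Admin"),        # configures and maintains settings
    ("<Group3-object-id>", "Member"),       # writes data and adds members
]
for group_id, role in assignments:
    resp = requests.post(
        f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/users",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={
            "identifier": group_id,
            "principalType": "Group",
            "groupUserAccessRight": role,
        },
    )
    resp.raise_for_status()
```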



HOTSPOT (Drag and Drop is not supported)
You have a Fabric workspace named Workspace1 that uses the Premium Per User (PPU) license mode and contains a semantic model named Model1.

Large semantic model storage format is selected for Model1.

You need to ensure that tables imported into Model1 are written automatically to Delta tables in OneLake.

What should you do for Model1 and Workspace1? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:




Box 1 (Model1): Enable OneLake integration.
With Microsoft OneLake integration for semantic models, data imported into model tables can also be automatically written to Delta tables in OneLake. The Delta format is the unified table format across all compute engines in Microsoft Fabric. OneLake integration exports the data with all key performance features enabled to provide more seamless data access with higher performance.

Box 2 (Workspace1): Change the license mode to Fabric capacity.
OneLake integration for semantic models is supported on Power BI Premium P and Microsoft Fabric F SKUs only. It's not supported on Power BI Pro, Premium Per User, or Power BI Embedded A/EM SKUs.

Before enabling OneLake integration, you must have one or more import semantic models in a workspace on a Power BI Premium or Fabric capacity. Import semantic model is a type of data model where data is fully imported into Power BI's in-memory storage, allowing fast and efficient querying.
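
Once integration is enabled and the tables have been exported, any Fabric engine can read the resulting Delta tables. A minimal sketch from a Fabric notebook, where `spark` is predefined; the abfss path pattern and table name are assumptions for illustration:

```python
# Assumed path pattern: workspace as container, semantic model item under it,
# exported Delta tables in its Tables folder. Names are placeholders.
path = (
    "abfss://Workspace1@onelake.dfs.fabric.microsoft.com/"
    "Model1.SemanticModel/Tables/Sales"
)
df = spark.read.format("delta").load(path)  # `spark` session exists in Fabric notebooks
df.show(5)
```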


Reference:

https://learn.microsoft.com/en-us/power-bi/enterprise/onelake-integration-overview



You have a Fabric tenant that contains the workspaces shown in the following table.



You have a deployment pipeline named Pipeline1 that deploys items from Workspace_DEV to Workspace_TEST. In Pipeline1, all items that have matching names are paired.

You deploy the contents of Workspace_DEV to Workspace_TEST by using Pipeline1. What will the contents of Workspace_TEST be once the deployment is complete?

  1. Lakehouse2 Notebook2 SemanticModel1
  2. Lakehouse1 Notebook1 Pipeline1 SemanticModel1
  3. Lakehouse1 Lakehouse2 Notebook1 Notebook2 Pipeline1 SemanticModel1
  4. Lakehouse2 Notebook2 Pipeline1 SemanticModel1

Answer(s): C

Explanation:

Lakehouse1, Notebook1, and Pipeline1 are copied to Workspace_TEST. SemanticModel1 is copied and replaces the existing SemanticModel1 in Workspace_TEST. Lakehouse2 and Notebook2, which already exist in Workspace_TEST, remain unchanged.

NOTE:
In a Fabric deployment pipeline, lakehouses, notebooks, data pipelines, and semantic models are copied between workspaces by the deployment process. Deployment pipelines use a "paired" approach, meaning they establish a link between items of the same name and type in different stages of the deployment process.

This process involves assigning a workspace to a specific stage in the pipeline and then either deploying new, unpaired content or pairing existing content. During deployment, the metadata of the reports, dashboards, and semantic models (but not the data itself) is copied to the new workspace. For lakehouses, the entire structure and definitions are copied, and data can be moved using pipelines or dataflows.

Incorrect:
Not A: Pipeline1 should be in Workspace_TEST.
In a Fabric deployment pipeline, deployment typically copies pipelines (and other content) from the source workspace to the destination workspace. This copying happens during the deployment phase: Fabric makes a copy of the items from the source stage to the target stage, maintaining connections between the copied items. This ensures that the destination workspace has a replica of the source, ready for testing or production.
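
For completeness, the same deployment can be triggered programmatically. A minimal sketch using the Power BI REST API's deploy-all operation, with placeholder ID and token:

```python
import requests

# Pipelines - Deploy All: deploy everything from the Development stage (order 0)
# to the next stage. The option names below are the documented deploy options.
resp = requests.post(
    "https://api.powerbi.com/v1.0/myorg/pipelines/<Pipeline1-id>/deployAll",
    headers={"Authorization": "Bearer <aad-access-token>"},
    json={
        "sourceStageOrder": 0,
        "options": {
            "allowCreateArtifact": True,     # create unpaired items in the target
            "allowOverwriteArtifact": True,  # overwrite paired items (SemanticModel1)
        },
    },
)
resp.raise_for_status()
```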


Reference:

https://learn.microsoft.com/en-us/fabric/cicd/deployment-pipelines/understand-the-deployment-process



You have a Fabric tenant that contains a workspace named Workspace1 and a user named User1. Workspace1 contains a warehouse named DW1.

You share DW1 with User1 and assign User1 the default permissions for DW1. What can User1 do?

  1. Read data from the tables in DW1.
  2. Build reports by using the default dataset.
  3. Read the underlying Parquet files from OneLake.
  4. Connect to DW1 via the TDS (Tabular Data Stream) endpoint.

Answer(s): D

Explanation:

Assigning a user the default permission ("Read") to a warehouse in a Fabric workspace enables them to connect to the SQL analytics endpoint, which is the equivalent of CONNECT permissions in SQL Server. However, this permission only allows them to connect; it doesn't grant them the ability to query tables, views, functions, or stored procedures within the warehouse unless they're also given access to those specific objects through T-SQL GRANT statements.

In essence, the default "Read" permission provides the necessary connectivity to the warehouse's SQL analytics endpoint, but it does not automatically grant access to the data and objects within the warehouse itself. Further permissions need to be granted through T-SQL or Fabric's workspace roles and item permissions system.

NOTE:
Default Read Permission:
When sharing a warehouse, the default permission assigned to a user is "Read".

Connectivity:
This "Read" permission allows the user to connect to the SQL analytics endpoint of the warehouse.

Limited Object Access:
The "Read" permission itself does not grant access to specific objects within the warehouse (tables, views, functions, etc.). To query these objects, the user needs to be granted the appropriate permissions using T-SQL GRANT statements.


Reference:

https://learn.microsoft.com/en-us/fabric/data-warehouse/share-warehouse-manage-permissions





