Microsoft DP-600 Exam (page: 4)
Microsoft Implementing Analytics Solutions Using Fabric
Updated on: 12-Feb-2026


You are the administrator of a Fabric workspace that contains a lakehouse named Lakehouse1. Lakehouse1 contains the following tables:
- Table1: A Delta table created by using a shortcut
- Table2: An external table created by using Spark
- Table3: A managed table
You plan to connect to Lakehouse1 by using its SQL endpoint. What will you be able to do after connecting to Lakehouse1?

  A. Read Table3.
  B. Update the data in Table3.
  C. Read Table2.
  D. Update the data in Table1.

Answer(s): A
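
For context on the answer: the SQL analytics endpoint of a lakehouse is read-only. Delta tables, such as the managed Table3, can be queried with T-SQL, but data cannot be inserted, updated, or deleted through the endpoint, and Spark external tables such as Table2 are not exposed by it. A minimal T-SQL sketch (the column name in the UPDATE is hypothetical):

    -- Succeeds: Table3 is a managed Delta table exposed by the SQL analytics endpoint.
    SELECT TOP (10) * FROM Lakehouse1.dbo.Table3;

    -- Fails: the endpoint is read-only, so DML statements are rejected.
    UPDATE Lakehouse1.dbo.Table3 SET Amount = 0;  -- Amount is a hypothetical column

    -- Fails: Table2 is a Spark external table and is not exposed by the endpoint.
    SELECT * FROM Lakehouse1.dbo.Table2;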



HOTSPOT (Drag and Drop is not supported)
You have a Fabric tenant that contains a workspace named Workspace1. Workspace1 contains a warehouse named DW1. DW1 contains two tables named Employees and Sales. All users have read access to DW1.

You need to implement access controls to meet the following requirements:

- For the Sales table, ensure that the users can see only the sales data from their respective region.
- For the Employees table, restrict access to all Personally Identifiable Information (PII).
- Maintain access to unrestricted data for all the users.

What should you use for each table? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Hot Area:

  A. See Explanation section for answer.

Answer(s): A

Explanation:




Box 1: Column-level security
For the Employees table, restrict access to all Personally Identifiable Information (PII).

Synapse Analytics, Column-level security
Column-level security allows customers to control access to table columns based on the user's execution context or group membership.

Column-level security simplifies the design and coding of security in your application, allowing you to restrict column access to protect sensitive data. For example, ensuring that specific users can access only certain columns of a table pertinent to their department.

Use cases
Some examples of how column-level security is being used today:
A financial services firm allows only account managers to have access to customer social security numbers (SSN), phone numbers, and other personal data.

A health care provider allows only doctors and nurses to have access to sensitive medical records while preventing members of the billing department from viewing this data.
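
As a minimal T-SQL sketch of column-level security on the Employees table (the role name and all column names are hypothetical, since the question does not define the schema), access is granted by listing the permitted columns:

    -- Hypothetical role for users who may see PII.
    CREATE ROLE HRReaders;

    -- Grant everyone only the non-PII columns by listing them explicitly.
    GRANT SELECT ON dbo.Employees (EmployeeId, DisplayName, Department) TO PUBLIC;

    -- Members of HRReaders may additionally read the PII columns.
    GRANT SELECT ON dbo.Employees (SSN, DateOfBirth) TO HRReaders;

If the users already hold table-level SELECT on Employees (as in this scenario, where everyone can read DW1), a column-level DENY on the PII columns achieves the same restriction without revoking the broader grant.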

Box 2: Row-level security (RLS)
For the Sales table, ensure that the users can see only the sales data from their respective region.

SQL Server, Row-level security
Row-level security (RLS) enables you to use group membership or execution context to control access to rows in a database table.

Row-level security simplifies the design and coding of security in your application. RLS helps you implement restrictions on data row access. For example, you can ensure that workers access only those data rows that
are pertinent to their department. Another example is to restrict customers' data access to only the data relevant to their company.

Use cases
Here are design examples of how row-level security (RLS) can be used:
A hospital can create a security policy that allows nurses to view data rows for their patients only.

A bank can create a policy to restrict access to financial data rows based on an employee's business division or role in the company.

A multitenant application can create a policy to enforce a logical separation of each tenant's data rows from every other tenant's rows. Efficiencies are achieved by the storage of data for many tenants in a single table. Each tenant can see only its data rows.
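
A minimal T-SQL sketch of RLS for the Sales scenario, assuming a Region column on dbo.Sales and a hypothetical dbo.UserRegion table that maps user names to regions (neither is given in the question):

    -- Hypothetical schema to hold the security objects.
    CREATE SCHEMA Security;
    GO

    -- Predicate function: a row is visible when the calling user is mapped to its region.
    CREATE FUNCTION Security.fn_SalesRegionPredicate (@Region AS varchar(50))
        RETURNS TABLE
        WITH SCHEMABINDING
    AS
        RETURN
            SELECT 1 AS fn_result
            FROM dbo.UserRegion AS ur               -- hypothetical user-to-region mapping
            WHERE ur.Region = @Region
              AND ur.UserName = USER_NAME();
    GO

    -- Bind the predicate to dbo.Sales so that every query is silently filtered.
    CREATE SECURITY POLICY Security.SalesRegionFilter
        ADD FILTER PREDICATE Security.fn_SalesRegionPredicate(Region)
        ON dbo.Sales
        WITH (STATE = ON);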


Reference:

https://learn.microsoft.com/en-us/azure/synapse-analytics/sql-data-warehouse/column-level-security
https://learn.microsoft.com/en-us/sql/relational-databases/security/row-level-security



You have a Fabric tenant that contains a workspace named Workspace1 and a user named User1. User1 is assigned the Contributor role for Workspace1.

You plan to configure Workspace1 to use an Azure DevOps repository for version control. You need to ensure that User1 can commit items to the repository.

Which two settings should you enable for User1? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

  A. Users can sync workspace items with GitHub repositories
  B. Users can create and use Data workflows
  C. Users can create Fabric items
  D. Users can synchronize workspace items with their Git repositories

Answer(s): C,D

Explanation:

To integrate Git with your Microsoft Fabric workspace, you need to set up the following prerequisites for both Fabric and Git.

Fabric prerequisites
To access the Git integration feature, you need a Fabric capacity. A Fabric capacity is required to use all supported Fabric items.
In addition, the following tenant switches must be enabled from the Admin portal:
* (C) Users can create Fabric items
* (D) Users can synchronize workspace items with their Git repositories
* For GitHub users only: Users can synchronize workspace items with GitHub repositories
These switches can be enabled by the tenant admin, capacity admin, or workspace admin, depending on your organization's settings.


Reference:

https://learn.microsoft.com/en-us/fabric/cicd/git-integration/git-get-started



You have a Fabric tenant that contains a workspace named Workspace1. Workspace1 contains a data pipeline named Pipeline1 and a lakehouse named Lakehouse1.

You perform the following actions:

- Create a workspace named Workspace2.
- Create a deployment pipeline named DeployPipeline1 that will deploy items from Workspace1 to Workspace2.
- Add a folder named Folder1 to Workspace1.
- Move Lakehouse1 to Folder1.
- Run DeployPipeline1.
Which structure will Workspace2 have when DeployPipeline1 is complete?

  A. \Folder1\Pipeline1
     \Folder1\Lakehouse1
  B. \Pipeline1
     \Lakehouse1
  C. \Pipeline1
     \Folder1\Lakehouse1
  D. \Folder1\Lakehouse1

Answer(s): D

Explanation:

The folder structure is copied.

Note 1:
Folders in deployment pipelines
Folders enable users to efficiently organize and manage workspace items in a familiar way. When you deploy content that contains folders to a different stage, the folder hierarchy of the applied items is automatically applied.

In deployment pipelines, folders are considered part of an item's name (an item name includes its full path).

Note 2: Microsoft Fabric, the deployment pipelines process
The deployment process lets you clone content from one stage in the deployment pipeline to another, typically from development to test, and from test to production.

During deployment, Microsoft Fabric copies the content from the source stage to the target stage. The connections between the copied items are kept during the copy process.

Deploying content from a working production pipeline to a stage that has an existing workspace includes the following steps:

- Deploying new content as an addition to the content already there.
- Deploying updated content to replace some of the content already there.


Reference:

https://learn.microsoft.com/en-us/fabric/cicd/deployment-pipelines/understand-the-deployment-process



Your company has a finance department.

You have a Fabric tenant, an Azure Storage account named storage1, and a Microsoft Entra group named
Group1. Group1 contains the users in the finance department.

You need to create a new workspace named Workspace1 in the tenant. The solution must meet the following requirements:

- Ensure that the finance department users can create and edit items in Workspace1.
- Ensure that Workspace1 can securely access storage1 to read and write data.
- Ensure that you are the only admin of Workspace1.
- Minimize administrative effort.

You create Workspace1.

Which two actions should you perform next? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

  A. Assign the Contributor role to Group1.
  B. Create a workspace identity.
  C. Assign the Admin role to yourself.
  D. Assign the Contributor role to each finance department user.

Answer(s): A,B

Explanation:

A: Assigning the Contributor role to Group1 ensures that the finance department users can create and edit items in Workspace1, because Group1 contains the users in the finance department.

B: Ensure that Workspace1 can securely access storage1 to read and write data.

A Fabric workspace identity is an automatically managed service principal that can be associated with a Fabric workspace. Fabric workspaces with a workspace identity can securely read or write to firewall-enabled Azure Data Lake Storage Gen2 accounts through trusted workspace access for OneLake shortcuts. Fabric items can use the identity when connecting to resources that support Microsoft Entra authentication. Fabric uses workspace identities to obtain Microsoft Entra tokens without the customer having to manage any credentials.

Workspace identities can be created in the workspace settings of any workspace except My workspaces. A workspace identity is automatically assigned the workspace contributor role and has access to workspace items.

Incorrect:
Not D: Assigning the Contributor role to each finance department user individually requires more administrative effort than assigning it once to Group1.


Reference:

https://learn.microsoft.com/en-us/fabric/security/workspace-identity



You have a Fabric tenant that contains the workspaces shown in the following table.



You have a deployment pipeline named Pipeline1 that deploys items from Workspace_DEV to Workspace_TEST. In Pipeline1, all items that have matching names are paired.

You deploy the contents of Workspace_DEV to Workspace_TEST by using Pipeline1. What will the contents of Workspace_TEST be once the deployment is complete?

  A. Lakehouse1, Lakehouse2, Notebook1, Notebook2, Pipeline1, SemanticModel1
  B. Lakehouse1, Notebook1, Pipeline1, SemanticModel1
  C. Lakehouse2, Notebook2, SemanticModel1
  D. Lakehouse2, Notebook2, Pipeline1, SemanticModel1

Answer(s): A

Explanation:

The items in Workspace_DEV are added to Workspace_TEST. The items already in Workspace_TEST are kept.

NOTE: Microsoft Fabric, The deployment pipelines process
The deployment process lets you clone content from one stage in the deployment pipeline to another, typically from development to test, and from test to production.

During deployment, Microsoft Fabric copies the content from the source stage to the target stage. The connections between the copied items are kept during the copy process.

Deploying content from a working production pipeline to a stage that has an existing workspace includes the following steps:

- Deploying new content as an addition to the content already there.
- Deploying updated content to replace some of the content already there.


Reference:

https://learn.microsoft.com/en-us/fabric/cicd/deployment-pipelines/understand-the-deployment-process



HOTSPOT (Drag and Drop is not supported)
You have a Fabric tenant that contains a workspace named Workspace_DEV. Workspace_DEV contains the semantic models shown in the following table.



Workspace_DEV contains the dataflows shown in the following table.



You create a new workspace named Workspace_TEST.

You create a deployment pipeline named Pipeline1 to move items from Workspace_DEV to Workspace_TEST.

You run Pipeline1.

For each of the following statements, select Yes if the statement is true. Otherwise, select No.

NOTE: Each correct selection is worth one point.

Hot Area:

  A. See Explanation section for answer.

Answer(s): A

Explanation:




Box 1: No
No - DF1 will not be deployed to Workspace_TEST.

DF1 is a Dataflow Gen1 item configured with a scheduled refresh policy. Dataflow Gen1 is not supported by deployment pipelines, so DF1 is not copied to the target stage. (Unlike Dataflow Gen2, Gen1 dataflows also cannot be integrated into data pipelines.)

NOTE: Microsoft Fabric Data Factory, Getting from Dataflow Generation 1 to Dataflow Generation 2
Dataflow Gen2 is the new generation of dataflows. The new generation resides alongside the Power BI Dataflow (Gen1) and brings new features and improved experiences. The referenced documentation provides a comparison between Dataflow Gen1 and Dataflow Gen2.



Box 2: Yes
Yes - Data from Model1 will be deployed to Workspace_TEST.

Assign a workspace to an empty stage
When you assign content to an empty stage, a new workspace is created on a capacity for the stage you deploy to. All the metadata in the reports, dashboards, and semantic models of the original workspace is copied to the new workspace in the stage you're deploying to.

After the deployment is complete, refresh the semantic models so that you can use the newly copied content. The semantic model refresh is required because data isn't copied from one stage to another.

Box 3: Yes
Yes - The scheduled refresh policy for Model1 will be deployed to Workspace_TEST.


Reference:

https://learn.microsoft.com/en-us/fabric/data-factory/dataflows-gen2-overview



You have a Fabric tenant.

You are creating a Fabric Data Factory pipeline.

You have a stored procedure that returns the number of active customers and their average sales for the current month.

You need to add an activity that will execute the stored procedure in a warehouse. The returned values must be available to the downstream activities of the pipeline.

Which type of activity should you add?

  A. Get metadata
  B. Switch
  C. Lookup
  D. Append variable

Answer(s): C

Explanation:

The Fabric Lookup activity can retrieve a dataset from any of the data sources supported by Microsoft Fabric. You can use it to dynamically determine which objects to operate on in a subsequent activity, instead of hard-coding the object name. Some object examples are files and tables.

The Lookup activity reads and returns the content of a configuration file or table. It also returns the result of executing a query or stored procedure. The output can be a singleton value or an array of attributes, which can be consumed in subsequent copy, transformation, or control-flow activities such as the ForEach activity.
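
As a sketch of the warehouse side, the stored procedure below returns both values in a single row, which the Lookup activity exposes to downstream activities as output.firstRow (the procedure, table, and column names are hypothetical):

    -- Hypothetical stored procedure returning one summary row for the current month.
    CREATE PROCEDURE dbo.GetMonthlyCustomerStats
    AS
    BEGIN
        SELECT COUNT(DISTINCT CustomerId) AS ActiveCustomers,
               AVG(SaleAmount)            AS AverageSales
        FROM dbo.Sales
        WHERE SaleDate >= DATEFROMPARTS(YEAR(GETDATE()), MONTH(GETDATE()), 1);
    END;

    -- In the pipeline, point a Lookup activity at this procedure; a downstream
    -- activity can then reference the values with an expression such as
    -- @activity('Lookup1').output.firstRow.ActiveCustomers (activity name illustrative).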


Reference:

https://learn.microsoft.com/en-us/fabric/data-factory/lookup-activity


