Microsoft DP-600 Exam (page: 4)
Implementing Analytics Solutions Using Microsoft Fabric
Updated on: 15-Dec-2025

Viewing Page 4 of 26

You are the administrator of a Fabric workspace that contains a lakehouse named Lakehouse1. Lakehouse1 contains the following tables:
Table1: A Delta table created by using a shortcut
Table2: An external table created by using Spark
Table3: A managed table
You plan to connect to Lakehouse1 by using its SQL endpoint. What will you be able to do after connecting to Lakehouse1?

  A. Read Table3.
  B. Update the data in Table3.
  C. Read Table2.
  D. Update the data in Table1.

Answer(s): A
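
As a rough T-SQL sketch (the column name in the commented UPDATE is hypothetical, not taken from the question), the SQL analytics endpoint allows reads of the managed Delta table but rejects writes, because the endpoint is read-only:

-- Reading the managed Delta table through the SQL analytics endpoint succeeds.
SELECT TOP (10) *
FROM Lakehouse1.dbo.Table3;

-- Write operations are rejected: the lakehouse SQL analytics endpoint is read-only.
-- (SomeColumn is a hypothetical column name.)
-- UPDATE Lakehouse1.dbo.Table3 SET SomeColumn = 'value';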



HOTSPOT (Drag and Drop is not supported)
You have a Fabric tenant that contains a workspace named Workspace1. Workspace1 contains a warehouse named DW1. DW1 contains two tables named Employees and Sales. All users have read access to DW1.

You need to implement access controls to meet the following requirements:
For the Sales table, ensure that the users can see only the sales data from their respective region.
For the Employees table, restrict access to all Personally Identifiable Information (PII).
Maintain access to unrestricted data for all the users.

What should you use for each table? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:




Box 1: Column-level security
For the Employees table, restrict access to all Personally Identifiable Information (PII).

Synapse Analytics, Column-level security
Column-level security allows customers to control access to table columns based on the user's execution context or group membership.

Column-level security simplifies the design and coding of security in your application, allowing you to restrict column access to protect sensitive data. For example, ensuring that specific users can access only certain columns of a table pertinent to their department.

Use cases
Some examples of how column-level security is being used today:
A financial services firm allows only account managers to have access to customer social security numbers (SSN), phone numbers, and other personal data.

A health care provider allows only doctors and nurses to have access to sensitive medical records while preventing members of the billing department from viewing this data.
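
As a minimal T-SQL sketch (the role name and column list are assumptions, not taken from the question), column-level security can be implemented with a column-scoped GRANT so that members of the role never see the PII columns:

-- Create a role for users who must not see PII (the role name is illustrative).
CREATE ROLE HumanResourcesReaders;

-- Grant SELECT only on the non-PII columns of the Employees table.
-- Columns omitted from the list (for example, SSN or PhoneNumber) remain inaccessible to the role.
GRANT SELECT ON dbo.Employees (EmployeeID, Department, Region) TO HumanResourcesReaders;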

Box 2: Row-level security (RLS)
For the Sales table, ensure that the users can see only the sales data from their respective region.

SQL Server, Row-level security
Row-level security (RLS) enables you to use group membership or execution context to control access to rows in a database table.

Row-level security simplifies the design and coding of security in your application. RLS helps you implement restrictions on data row access. For example, you can ensure that workers access only those data rows that
are pertinent to their department. Another example is to restrict customers' data access to only the data relevant to their company.

Use cases
Here are design examples of how row-level security (RLS) can be used:
A hospital can create a security policy that allows nurses to view data rows for their patients only.

A bank can create a policy to restrict access to financial data rows based on an employee's business division or role in the company.

A multitenant application can create a policy to enforce a logical separation of each tenant's data rows from every other tenant's rows. Efficiencies are achieved by the storage of data for many tenants in a single table. Each tenant can see only its data rows.
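
As a minimal T-SQL sketch (the Region column and the dbo.UserRegions mapping table are assumptions, not taken from the question), row-level security on the Sales table is built from an inline table-valued predicate function and a security policy:

CREATE SCHEMA Security;
GO

-- Inline table-valued function that returns a row only when the Region value
-- matches the region mapped to the current user in the (hypothetical) dbo.UserRegions table.
CREATE FUNCTION Security.fn_securitypredicate(@Region AS varchar(50))
    RETURNS TABLE
    WITH SCHEMABINDING
AS
    RETURN SELECT 1 AS fn_securitypredicate_result
           FROM dbo.UserRegions AS ur
           WHERE ur.Region = @Region
             AND ur.UserName = USER_NAME();
GO

-- Bind the predicate to the Sales table as a filter predicate,
-- so each user sees only the rows for their own region.
CREATE SECURITY POLICY SalesRegionFilter
    ADD FILTER PREDICATE Security.fn_securitypredicate(Region) ON dbo.Sales
    WITH (STATE = ON);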


Reference:

https://learn.microsoft.com/en-us/azure/synapse-analytics/sql-data-warehouse/column-level-security
https://learn.microsoft.com/en-us/sql/relational-databases/security/row-level-security



You have a Fabric tenant that contains a workspace named Workspace1 and a user named User1. User1 is assigned the Contributor role for Workspace1.

You plan to configure Workspace1 to use an Azure DevOps repository for version control. You need to ensure that User1 can commit items to the repository.

Which two settings should you enable for User1? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

  A. Users can sync workspace items with GitHub repositories
  B. Users can create and use Data workflows
  C. Users can create Fabric items
  D. Users can synchronize workspace items with their Git repositories

Answer(s): C,D

Explanation:

To integrate Git with your Microsoft Fabric workspace, you need to set up the following prerequisites for both Fabric and Git.

Fabric prerequisites
To access the Git integration feature, you need a Fabric capacity. A Fabric capacity is required to use all supported Fabric items.
In addition, the following tenant switches must be enabled from the Admin portal:
* (C) Users can create Fabric items
* (D) Users can synchronize workspace items with their Git repositories
* For GitHub users only: Users can synchronize workspace items with GitHub repositories
These switches can be enabled by the tenant admin, capacity admin, or workspace admin, depending on your organization's settings.


Reference:

https://learn.microsoft.com/en-us/fabric/cicd/git-integration/git-get-started



You have a Fabric tenant that contains a workspace named Workspace1. Workspace1 contains a data pipeline named Pipeline1 and a lakehouse named Lakehouse1.

You perform the following actions:

- Create a workspace named Workspace2.
- Create a deployment pipeline named DeployPipeline1 that will deploy items from Workspace1 to Workspace2.
- Add a folder named Folder1 to Workspace1.
- Move Lakehouse1 to Folder1.
- Run DeployPipeline1.
Which structure will Workspace2 have when DeployPipeline1 is complete?

  A. \Folder1\Pipeline1
    \Folder1\Lakehouse1
  B. \Pipeline1
    \Lakehouse1
  C. \Pipeline1
    \Folder1\Lakehouse1
  D. \Folder1\Lakehouse1

Answer(s): D

Explanation:

The folder structure is copied.

Note 1:
Folders in deployment pipelines
Folders enable users to efficiently organize and manage workspace items in a familiar way. When you deploy content that contains folders to a different stage, the folder hierarchy of the deployed items is automatically applied.

In Deployment pipelines, folders are considered part of an item’s name (an item name includes its full path).

Note 2: Microsoft Fabric, The deployment pipelines process
The deployment process lets you clone content from one stage in the deployment pipeline to another, typically from development to test, and from test to production.

During deployment, Microsoft Fabric copies the content from the source stage to the target stage. The connections between the copied items are kept during the copy process.

Deploying content from a working production pipeline to a stage that has an existing workspace includes the following steps:
- Deploying new content as an addition to the content already there.
- Deploying updated content to replace some of the content already there.


Reference:

https://learn.microsoft.com/en-us/fabric/cicd/deployment-pipelines/understand-the-deployment-process



Your company has a finance department.

You have a Fabric tenant, an Azure Storage account named storage1, and a Microsoft Entra group named
Group1. Group1 contains the users in the finance department.

You need to create a new workspace named Workspace1 in the tenant. The solution must meet the following requirements:
Ensure that the finance department users can create and edit items in Workspace1.
Ensure that Workspace1 can securely access storage1 to read and write data.
Ensure that you are the only admin of Workspace1.
Minimize administrative effort.

You create Workspace1.

Which two actions should you perform next? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

  A. Assign the Contributor role to Group1.
  B. Create a workspace identity.
  C. Assign the Admin role to yourself.
  D. Assign the Contributor role to each finance department user.

Answer(s): A,B

Explanation:

A: Assigning the Contributor role to Group1 ensures that the finance department users can create and edit items in Workspace1, because Group1 contains the users in the finance department.

B: Creating a workspace identity ensures that Workspace1 can securely access storage1 to read and write data.

A Fabric workspace identity is an automatically managed service principal that can be associated with a Fabric workspace. Fabric workspaces with a workspace identity can securely read or write to firewall-enabled Azure Data Lake Storage Gen2 accounts through trusted workspace access for OneLake shortcuts. Fabric items can use the identity when connecting to resources that support Microsoft Entra authentication. Fabric uses workspace identities to obtain Microsoft Entra tokens without the customer having to manage any credentials.

Workspace identities can be created in the workspace settings of any workspace except My workspaces. A workspace identity is automatically assigned the workspace contributor role and has access to workspace items.

Incorrect:
Not D: Assigning the Contributor role to each finance department user individually requires more administrative effort than assigning the role to Group1.


Reference:

https://learn.microsoft.com/en-us/fabric/security/workspace-identity



You have a Fabric tenant that contains the workspaces shown in the following table.



You have a deployment pipeline named Pipeline1 that deploys items from Workspace_DEV to Workspace_TEST. In Pipeline1, all items that have matching names are paired.

You deploy the contents of Workspace_DEV to Workspace_TEST by using Pipeline1. What will the contents of Workspace_TEST be once the deployment is complete?

  A. Lakehouse1, Lakehouse2, Notebook1, Notebook2, Pipeline1, SemanticModel1
  B. Lakehouse1, Notebook1, Pipeline1, SemanticModel1
  C. Lakehouse2, Notebook2, SemanticModel1
  D. Lakehouse2, Notebook2, Pipeline1, SemanticModel1

Answer(s): A

Explanation:

The items in Workspace_DEV are added to Workspace_TEST. The items already in Workspace_TEST are kept.

NOTE: Microsoft Fabric, The deployment pipelines process
The deployment process lets you clone content from one stage in the deployment pipeline to another, typically from development to test, and from test to production.

During deployment, Microsoft Fabric copies the content from the source stage to the target stage. The connections between the copied items are kept during the copy process.

Deploying content from a working production pipeline to a stage that has an existing workspace includes the following steps:
- Deploying new content as an addition to the content already there.
- Deploying updated content to replace some of the content already there.


Reference:

https://learn.microsoft.com/en-us/fabric/cicd/deployment-pipelines/understand-the-deployment-process



HOTSPOT (Drag and Drop is not supported)
You have a Fabric tenant that contains a workspace named Workspace_DEV. Workspace_DEV contains the semantic models shown in the following table.



Workspace_DEV contains the dataflows shown in the following table.



You create a new workspace named Workspace_TEST.

You create a deployment pipeline named Pipeline1 to move items from Workspace_DEV to Workspace_TEST.

You run Pipeline1.

For each of the following statements, select Yes if the statement is true. Otherwise, select No.

NOTE: Each correct selection is worth one point.

Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:




Box 1: No
DF1 will not be deployed to Workspace_TEST.

DF1 is a Dataflow Gen1 and is configured with a scheduled refresh policy. Gen1 dataflows cannot be integrated into data pipelines.

NOTE: Microsoft Fabric Data Factory, Getting from Dataflow Generation 1 to Dataflow Generation 2
Dataflow Gen2 is the new generation of dataflows. The new generation of dataflows resides alongside the Power BI Dataflow (Gen1) and brings new features and improved experiences. The following section provides a comparison between Dataflow Gen1 and Dataflow Gen2.



Box 2: Yes
Model1 will be deployed to Workspace_TEST: its metadata is copied, although the data itself is not copied and must be refreshed after deployment.

Assign a workspace to an empty stage
When you assign content to an empty stage, a new workspace is created on a capacity for the stage you deploy to. All the metadata in the reports, dashboards, and semantic models of the original workspace is copied to the new workspace in the stage you're deploying to.

After the deployment is complete, refresh the semantic models so that you can use the newly copied content. The semantic model refresh is required because data isn't copied from one stage to another.

Box 3: Yes
The scheduled refresh policy for Model1 will be deployed to Workspace_TEST.


Reference:

https://learn.microsoft.com/en-us/fabric/data-factory/dataflows-gen2-overview



You have a Fabric tenant.

You are creating a Fabric Data Factory pipeline.

You have a stored procedure that returns the number of active customers and their average sales for the current month.

You need to add an activity that will execute the stored procedure in a warehouse. The returned values must be available to the downstream activities of the pipeline.

Which type of activity should you add?

  A. Get metadata
  B. Switch
  C. Lookup
  D. Append variable

Answer(s): C

Explanation:

The Fabric Lookup activity can retrieve a dataset from any of the data sources supported by Microsoft Fabric. You can use it to dynamically determine which objects to operate on in a subsequent activity, instead of hard coding the object name. Some object examples are files and tables.

Lookup activity reads and returns the content of a configuration file or table. It also returns the result of executing a query or stored procedure. The output can be a singleton value or an array of attributes, which can be consumed in subsequent copy, transformation, or control flow activities such as the ForEach activity.
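
As an illustrative sketch (the procedure, table, and column names are hypothetical), the stored procedure could return both values in a single row, and a Lookup activity configured with the First row only option exposes them to downstream activities through its output:

-- Hypothetical stored procedure executed by the Lookup activity in the warehouse.
CREATE PROCEDURE dbo.GetCurrentMonthSummary
AS
BEGIN
    SELECT
        COUNT(DISTINCT CustomerId) AS ActiveCustomers,
        AVG(SalesAmount)           AS AverageSales
    FROM dbo.Sales
    WHERE SaleDate >= DATEFROMPARTS(YEAR(GETDATE()), MONTH(GETDATE()), 1);
END;

-- A downstream activity can then reference the returned values with expressions such as:
--   @activity('Lookup1').output.firstRow.ActiveCustomers
--   @activity('Lookup1').output.firstRow.AverageSales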


Reference:

https://learn.microsoft.com/en-us/fabric/data-factory/lookup-activity





