Microsoft DP-700 Exam (page: 5)
Microsoft Implementing Data Engineering Solutions Using Fabric
Updated on: 13-Dec-2025

Viewing Page 5 of 25

You have an Azure Data Lake Storage Gen2 account named storage1 and an Amazon S3 bucket named storage2.
You have the Delta Parquet files shown in the following table.

Name        | Location             | Size
ProductFile | storage1 (ADLS Gen2) | 50 MB
StoreFile   | storage2 (Amazon S3) | 25 MB
TripsFile   | storage2 (Amazon S3) | 2 GB

You have a Fabric workspace named Workspace1 that has the cache for shortcuts enabled. Workspace1 contains a lakehouse named Lakehouse1. Lakehouse1 has the following shortcuts:
A shortcut to ProductFile aliased as Products
A shortcut to StoreFile aliased as Stores
A shortcut to TripsFile aliased as Trips
For which shortcuts will the data be retrieved from the cache?

  A. Trips and Stores only
  B. Products and Stores only
  C. Stores only
  D. Products only
  E. Products, Stores, and Trips

Answer(s): B

Explanation:

When the cache for shortcuts is enabled in Fabric, recently accessed shortcut data is kept in a local cache and served from there instead of being re-read from the remote source, subject to per-file size limits and a retention window based on when the data was last accessed. Here's a breakdown based on each file's location and size:

Products: ProductFile is stored in Azure Data Lake Storage Gen2 (storage1). Azure Data Lake Storage is a supported storage system in Fabric, and the file is small (50 MB), so this data is most likely cached and can be retrieved from the cache.

Stores: StoreFile is stored in Amazon S3 (storage2). Even though it sits in a different cloud provider, Fabric can cache data read through S3 shortcuts when caching is enabled, and at 25 MB the file fits comfortably within the cache limits, so this data is likely cached and retrievable.

Trips: TripsFile is also stored in Amazon S3 (storage2) but is significantly larger (2 GB) than the other files. Although Fabric can cache data from Amazon S3, a file of this size exceeds the cache's per-file size limit, so TripsFile is retrieved directly from the source instead of from the cache.
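For context, a shortcut such as Stores can also be created programmatically. The sketch below is a minimal, illustrative call to the Fabric REST API's Create Shortcut endpoint, not a definitive implementation: the workspace, lakehouse, and connection GUIDs, the bucket URL, and the token handling are all placeholders, and the request-body field names should be verified against the current API reference.

    import requests

    # Placeholders: substitute real values from your tenant.
    WORKSPACE_ID = "<workspace1-guid>"      # Workspace1
    LAKEHOUSE_ID = "<lakehouse1-guid>"      # Lakehouse1
    CONNECTION_ID = "<s3-connection-guid>"  # saved Fabric connection to storage2
    TOKEN = "<entra-access-token>"          # bearer token for the Fabric API

    # Create Shortcut endpoint (Fabric REST API, v1).
    url = (
        "https://api.fabric.microsoft.com/v1/workspaces/"
        f"{WORKSPACE_ID}/items/{LAKEHOUSE_ID}/shortcuts"
    )

    # A shortcut named Stores that points at StoreFile in the S3 bucket.
    body = {
        "path": "Tables",
        "name": "Stores",
        "target": {
            "amazonS3": {
                "connectionId": CONNECTION_ID,
                "location": "https://storage2.s3.amazonaws.com",
                "subpath": "/StoreFile",
            }
        },
    }

    resp = requests.post(url, json=body, headers={"Authorization": f"Bearer {TOKEN}"})
    resp.raise_for_status()
    print(resp.json())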



HOTSPOT (Drag and Drop is not supported)
You have a Fabric workspace named Workspace1 that contains the items shown in the following table.

Name      | Type
Notebook1 | Notebook
Notebook2 | Notebook
Model1    | Semantic model
Pipeline1 | Data pipeline

For Model1, the Keep your Direct Lake data up to date option is disabled.
You need to configure the execution of the items to meet the following requirements:
Notebook1 must execute every weekday at 8:00 AM.
Notebook2 must execute when a file is saved to an Azure Blob Storage container.
Model1 must refresh when Notebook1 has executed successfully.
How should you orchestrate each item? To answer, select the appropriate options in the answer area.
Note: Each correct selection is worth one point.
Hot Area:

  A. See Explanation section for answer.

Answer(s): A

Explanation:



Notebook1 - Add Notebook1 to Pipeline1.
To run Notebook1 every weekday at 8:00 AM, add it to Pipeline1 and let the pipeline's schedule drive it. Pipeline schedules in Fabric support recurrence on selected days of the week, so the pipeline can run Monday through Friday at 8:00 AM.

Notebook2 - From Real-Time hub, configure the execution of Notebook2.
Notebook2 must run when a file is saved to an Azure Blob Storage container, so the trigger belongs in the Real-Time hub. The Real-Time hub is designed for event-driven scenarios like this, where an external event (for example, a blob being created) triggers the execution of an item such as a notebook.

Pipeline1 - Configure the execution of Pipeline1 by using a schedule.
Scheduling Pipeline1 for 8:00 AM on weekdays satisfies the Notebook1 requirement, while the activity dependencies inside the pipeline determine what runs after Notebook1 completes.

Model1 - Add Model1 to Pipeline1.
Add a refresh of Model1 to Pipeline1 and make it depend on the successful completion of the Notebook1 activity. Because the Keep your Direct Lake data up to date option is disabled for Model1, the refresh is controlled entirely by the pipeline, so Model1 refreshes only after Notebook1 has executed successfully.
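To make the dependency concrete, here is a trimmed fragment of what Pipeline1's definition could look like, written as a Python dict for readability. The activity names are invented; TridentNotebook is the notebook activity type used in Fabric pipelines, but the refresh activity's type string here is illustrative. The dependsOn / dependencyConditions pattern is what expresses on-success ordering.

    # A trimmed fragment of Pipeline1's definition, shown as a Python dict.
    # Activity names are invented; verify type strings against the JSON that
    # Fabric generates for your pipeline.
    pipeline_fragment = {
        "activities": [
            {
                "name": "RunNotebook1",
                "type": "TridentNotebook",  # notebook activity
            },
            {
                "name": "RefreshModel1",
                "type": "SemanticModelRefresh",  # illustrative type name
                "dependsOn": [
                    {
                        # Run only when RunNotebook1 finishes successfully,
                        # which is exactly the Model1 requirement.
                        "activity": "RunNotebook1",
                        "dependencyConditions": ["Succeeded"],
                    }
                ],
            },
        ]
    }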



Your company has a sales department that uses two Fabric workspaces named Workspace1 and Workspace2.
The company decides to implement a domain strategy to organize the workspaces.
You need to ensure that a user can perform the following tasks:
Create a new domain for the sales department.
Create two subdomains: one for the east region and one for the west region.
Assign Workspace1 to the east region subdomain.
Assign Workspace2 to the west region subdomain.
The solution must follow the principle of least privilege.
Which role should you assign to the user?

  A. Workspace Admin
  B. Domain admin
  C. Domain contributor
  D. Fabric admin

Answer(s): D

Explanation:

Only a Fabric admin can create a new domain; the domain admin and domain contributor roles apply to domains that already exist, and a workspace Admin has no domain permissions at all. Because the user must create the sales domain itself, in addition to creating the subdomains and assigning the workspaces, Fabric admin is the least-privileged role that covers every task.
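As background, domains are created through the admin-scoped Fabric REST API, which is only available to Fabric administrators. The sketch below is a hedged example assuming the admin Create Domain endpoint; the token is a placeholder, and the exact request shape should be checked against the current API reference.

    import requests

    TOKEN = "<entra-access-token>"  # token for a Fabric administrator
    BASE = "https://api.fabric.microsoft.com/v1/admin/domains"
    HEADERS = {"Authorization": f"Bearer {TOKEN}"}

    # Create the sales domain; this endpoint is admin-scoped, which is why
    # the Fabric admin role is required.
    sales = requests.post(BASE, json={"displayName": "Sales"}, headers=HEADERS)
    sales.raise_for_status()
    sales_id = sales.json()["id"]

    # Create the two regional subdomains under it.
    for name in ("Sales - East", "Sales - West"):
        sub = requests.post(
            BASE,
            json={"displayName": name, "parentDomainId": sales_id},
            headers=HEADERS,
        )
        sub.raise_for_status()

Assigning Workspace1 and Workspace2 to the subdomains is then a separate admin call per subdomain, or can be done from the admin portal.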



You have a Fabric workspace named Workspace1 that contains a warehouse named DW1 and a data pipeline named Pipeline1.
You plan to add a user named User3 to Workspace1.
You need to ensure that User3 can perform the following actions:
View all the items in Workspace1.
Update the tables in DW1.
The solution must follow the principle of least privilege.
You already assigned the appropriate object-level permissions to DW1.
Which workspace role should you assign to User3?

  A. Admin
  B. Member
  C. Viewer
  D. Contributor

Answer(s): D

Explanation:

To ensure User3 can view all items in Workspace1 and update the tables in DW1, the most appropriate workspace role to assign is the Contributor role. This role allows User3 to:
1. View all items in Workspace1: The Contributor role provides the ability to view all objects within the workspace, such as data pipelines, warehouses, and other resources.
2. Update the tables in DW1: The Contributor role allows User3 to modify or update resources within the workspace, including the tables in DW1, assuming that appropriate object-level permissions are set for the warehouse.
This role adheres to the principle of least privilege, as it provides the necessary permissions without granting broader administrative rights.
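For reference, the role assignment itself can be granted through the Fabric REST API's workspace role assignment endpoint. This is a minimal sketch; the workspace ID, User3's Microsoft Entra object ID, and the token are placeholders.

    import requests

    TOKEN = "<entra-access-token>"
    WORKSPACE_ID = "<workspace1-guid>"
    USER3_ID = "<user3-entra-object-id>"  # Microsoft Entra object ID of User3

    # Add Workspace Role Assignment (Fabric REST API, v1).
    url = (
        "https://api.fabric.microsoft.com/v1/workspaces/"
        f"{WORKSPACE_ID}/roleAssignments"
    )
    body = {
        "principal": {"id": USER3_ID, "type": "User"},
        "role": "Contributor",  # enough to write, without Member/Admin rights
    }

    resp = requests.post(url, json=body, headers={"Authorization": f"Bearer {TOKEN}"})
    resp.raise_for_status()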



You have a Fabric capacity that contains a workspace named Workspace1. Workspace1 contains a lakehouse named Lakehouse1, a data pipeline, a notebook, and several Microsoft Power BI reports.
A user named User1 wants to use SQL to analyze the data in Lakehouse1.
You need to configure access for User1. The solution must meet the following requirements:
Provide User1 with read access to the table data in Lakehouse1.
Prevent User1 from using Apache Spark to query the underlying files in Lakehouse1.
Prevent User1 from accessing other items in Workspace1.
What should you do?

  A. Share Lakehouse1 with User1 directly and select Read all SQL endpoint data.
  B. Assign User1 the Viewer role for Workspace1. Share Lakehouse1 with User1 and select Read all SQL endpoint data.
  C. Share Lakehouse1 with User1 directly and select Build reports on the default semantic model.
  D. Assign User1 the Member role for Workspace1. Share Lakehouse1 with User1 and select Read all SQL endpoint data.

Answer(s): A

Explanation:

Sharing Lakehouse1 directly with the Read all SQL endpoint data permission gives User1 read access to the table data through the SQL analytics endpoint only. Because User1 is granted no workspace role, User1 cannot access the other items in Workspace1, and because the share does not include the Read all Apache Spark permission, User1 cannot use Spark to query the underlying files. Options B and D fail because any workspace role exposes the other items in Workspace1, and option C grants report building rather than SQL access.
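Once the lakehouse is shared this way, User1 can query the SQL analytics endpoint like any SQL Server-compatible source. A minimal sketch with pyodbc follows, assuming ODBC Driver 18 and interactive Microsoft Entra sign-in; the server name and table name are placeholders.

    import pyodbc

    # Placeholder: copy the real server name from the SQL analytics endpoint's
    # "SQL connection string" property in Fabric.
    SERVER = "<endpoint>.datawarehouse.fabric.microsoft.com"

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 18 for SQL Server};"
        f"SERVER={SERVER};"
        "DATABASE=Lakehouse1;"
        "Authentication=ActiveDirectoryInteractive;"  # signs in as User1
        "Encrypt=yes;"
    )

    # Read all SQL endpoint data allows SELECT over the tables; the underlying
    # Delta files remain inaccessible to User1 through Spark.
    for row in conn.execute("SELECT TOP 5 * FROM dbo.SomeTable;"):  # placeholder table
        print(row)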






Share your comments for Microsoft DP-700 exam with other users:

Vavz 11/2/2023 6:51:00 AM

questions are accurate
Anonymous


Su 11/23/2023 4:34:00 AM

i need questions/dumps for this exam.
Anonymous


LuvSN 7/16/2023 11:19:00 AM

i need this exam. when will it be uploaded?
ROMANIA


Mihai 7/19/2023 12:03:00 PM

i need the dumps !
Anonymous


Wafa 11/13/2023 3:06:00 AM

very helpful
Anonymous


Alokit 7/3/2023 2:13:00 PM

good source
Anonymous


Show-Stopper 7/27/2022 11:19:00 PM

my 3rd test and passed on first try. hats off to this brain dumps site.
UNITED STATES


Michelle 6/23/2023 4:06:00 AM

please upload it
Anonymous


Lele 11/20/2023 11:55:00 AM

does anybody know if these are real exam questions?
EUROPEAN UNION


Girish Jain 10/9/2023 12:01:00 PM

are these questions similar to actual questions in the exam? because they seem to be too easy
Anonymous


Phil 12/8/2022 11:16:00 PM

i have a lot of experience, but what comes in the exam is totally different from the practical day-to-day tasks. so i thought i would rather rely on these brain dumps than fail the exam.
GERMANY


BV 6/8/2023 4:35:00 AM

good questions
NETHERLANDS


krishna 12/19/2023 2:05:00 AM

valid exam dumps. they were very helpful and i got a pretty good score. i am very grateful for this service and the exam questions
Anonymous


Pie 9/3/2023 4:56:00 AM

will it help?
INDIA


Lucio 10/6/2023 1:45:00 PM

very useful to verify knowledge before exam
POLAND


Ajay 5/17/2023 4:54:00 AM

good stuffs
Anonymous


TestPD1 8/10/2023 12:19:00 PM

question 17: aren't the responses b and c?
EUROPEAN UNION


Nhlanhla 12/13/2023 5:26:00 AM

just passed the exam on my first try using these dumps.
Anonymous


Rizwan 1/6/2024 2:18:00 AM

very helpful
INDIA


Yady 5/24/2023 10:40:00 PM

these questions look good.
SINGAPORE


Kettie 10/12/2023 1:18:00 AM

this is very helpful content
Anonymous


SB 7/21/2023 3:18:00 AM

please provide the dumps
UNITED STATES


David 8/2/2023 8:20:00 AM

it is amazing
Anonymous


User 8/3/2023 3:32:00 AM

question 178 about "a banking system that predicts whether a loan will be repaid is an example of the": the answer is classification, not regression. you should fix it.
EUROPEAN UNION


quen 7/26/2023 10:39:00 AM

please upload apache spark dumps
Anonymous


Erineo 11/2/2023 5:34:00 PM

q14 is b & c: to reduce mail, you switch off mail for every single alert and switch on the daily digest to get one mail per day. you might even skip the empty digest mail, but i see that as part of the daily digest adjustment.
Anonymous


Paul 10/21/2023 8:25:00 AM

i think it is good question
Anonymous


Unknown 8/15/2023 5:09:00 AM

good for students who wish to give certification.
INDIA


Ch 11/20/2023 10:56:00 PM

is there a google drive link to the images? the links in questions are not working.
AUSTRALIA


Joey 5/16/2023 5:25:00 AM

very promising, looks great, so much wow!
Anonymous


alaska 10/24/2023 5:48:00 AM

i scored 87% on the az-204 exam. thanks! i always trust
GERMANY


nnn 7/9/2023 11:09:00 PM

good need more
Anonymous


User-sfdc 12/29/2023 7:21:00 AM

sample questions seem good
Anonymous


Tamer dam 8/4/2023 10:21:00 AM

huawei is ok
UNITED STATES