Microsoft DP-203 Exam (Page 5 of 75)
Microsoft Data Engineering on Azure
Updated on: 12-Jan-2026

HOTSPOT (Drag and Drop is not supported)
You have an Azure Synapse Analytics dedicated SQL pool that contains the users shown in the following table.


User1 executes a query on the database, and the query returns the results shown in the following exhibit.


User1 is the only user who has access to the unmasked data.

Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic.

NOTE: Each correct selection is worth one point.
Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:



Box 1: 0
The YearlyIncome column is of the money data type. The default masking function applies full masking according to the data type of the designated field: numeric data types (bigint, bit, decimal, int, money, numeric, smallint, smallmoney, tinyint, float, real) are masked with a zero value.

Box 2: the values stored in the database
Users with administrator privileges, and users who have been granted the UNMASK permission, are excluded from masking and always see the original data.
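
A minimal T-SQL sketch of a setup that produces this behavior. The table name dbo.DimCustomer is hypothetical (the exhibits are not reproduced here); the YearlyIncome column and User1 come from the question:

    -- Apply the default mask; money and other numeric types are masked as 0.
    ALTER TABLE dbo.DimCustomer
    ALTER COLUMN YearlyIncome ADD MASKED WITH (FUNCTION = 'default()');

    -- Administrators bypass masking automatically; granting UNMASK is the
    -- explicit way to let a specific user see the stored values.
    GRANT UNMASK TO User1;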


Reference:

https://docs.microsoft.com/en-us/azure/azure-sql/database/dynamic-data-masking-overview



You have an enterprise data warehouse in Azure Synapse Analytics.

Using PolyBase, you create an external table named [Ext].[Items] to query Parquet files stored in Azure Data Lake Storage Gen2 without importing the data to the data warehouse.

The external table has three columns.
You discover that the Parquet files have a fourth column named ItemID.

Which command should you run to add the ItemID column to the external table?





Answer(s): C

Explanation:

Incorrect Answers:
A, D: ALTER TABLE is not supported on external tables. Only these Data Definition Language (DDL) statements are allowed on external tables:

- CREATE TABLE and DROP TABLE
- CREATE STATISTICS and DROP STATISTICS
- CREATE VIEW and DROP VIEW

To add the ItemID column, the external table must therefore be dropped and re-created, as sketched below.
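
A minimal T-SQL sketch of the drop-and-re-create approach. Only the table name [Ext].[Items] and the ItemID column come from the question; the other column definitions and the WITH options are hypothetical placeholders:

    -- Dropping an external table removes only metadata;
    -- the underlying Parquet files are untouched.
    DROP EXTERNAL TABLE [Ext].[Items];

    CREATE EXTERNAL TABLE [Ext].[Items]
    (
        [ItemName]  NVARCHAR(50),   -- hypothetical original columns
        [ItemType]  NVARCHAR(20),
        [ItemPrice] MONEY,
        [ItemID]    INT             -- the newly discovered fourth column
    )
    WITH
    (
        LOCATION    = '/Items/',            -- hypothetical folder path
        DATA_SOURCE = MyAdlsGen2Source,     -- hypothetical external data source
        FILE_FORMAT = MyParquetFileFormat   -- hypothetical Parquet file format
    );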


Reference:

https://docs.microsoft.com/en-us/sql/t-sql/statements/create-external-table-transact-sql



HOTSPOT (Drag and Drop is not supported)
You have two Azure Storage accounts named Storage1 and Storage2. Each account holds one container and has the hierarchical namespace enabled. The containers store files that contain data in the Apache Parquet format.

You need to copy folders and files from Storage1 to Storage2 by using a Data Factory copy activity. The solution must meet the following requirements:

- No transformations may be performed.
- The original folder structure must be retained.
- The time required to perform the copy activity must be minimized.

How should you configure the copy activity? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.
Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:



Box 1: Parquet
For Parquet datasets, the type property of the copy activity source must be set to ParquetSource.

Box 2: PreserveHierarchy
PreserveHierarchy (default): Preserves the file hierarchy in the target folder. The relative path of the source file to the source folder is identical to the relative path of the target file to the target folder.

Incorrect Answers:
FlattenHierarchy: All files from the source folder are in the first level of the target folder. The target files have autogenerated names.
MergeFiles: Merges all files from the source folder to one file. If the file name is specified, the merged file name is the specified name. Otherwise, it's an autogenerated file name.
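
As a sketch, the relevant fragment of the copy activity JSON could look like the following, assuming Parquet datasets on both ADLS Gen2 accounts (dataset references and the rest of the pipeline definition are omitted):

    "source": {
        "type": "ParquetSource",
        "storeSettings": {
            "type": "AzureBlobFSReadSettings",
            "recursive": true
        }
    },
    "sink": {
        "type": "ParquetSink",
        "storeSettings": {
            "type": "AzureBlobFSWriteSettings",
            "copyBehavior": "PreserveHierarchy"
        }
    }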


Reference:

https://docs.microsoft.com/en-us/azure/data-factory/format-parquet
https://docs.microsoft.com/en-us/azure/data-factory/connector-azure-data-lake-storage



You have an Azure Data Lake Storage Gen2 container that contains 100 TB of data.

You need to ensure that the data in the container is available for read workloads in a secondary region if an outage occurs in the primary region. The solution must minimize costs.

Which type of data redundancy should you use?

  A. geo-redundant storage (GRS)
  B. read-access geo-redundant storage (RA-GRS)
  C. zone-redundant storage (ZRS)
  D. locally-redundant storage (LRS)

Answer(s): A



You plan to implement an Azure Data Lake Storage Gen2 account.

You need to ensure that the data lake will remain available if a data center fails in the primary Azure region. The solution must minimize costs.

Which type of replication should you use for the storage account?

  A. geo-redundant storage (GRS)
  B. geo-zone-redundant storage (GZRS)
  C. locally-redundant storage (LRS)
  D. zone-redundant storage (ZRS)

Answer(s): D

Explanation:

Zone-redundant storage (ZRS) copies your data synchronously across three Azure availability zones in the primary region, so the data remains available if a single data center (availability zone) fails.

Incorrect Answers:
A, B: GRS and GZRS would also survive a data-center failure, but they replicate to a secondary region and cost more than ZRS.
C: Locally redundant storage (LRS) copies your data synchronously three times within a single physical location in the primary region. LRS is the least expensive replication option, but it does not protect against the failure of a data center and is not recommended for applications that require high availability or durability.
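
As an illustration, an ARM template resource for such an account could look like the following sketch (the account name and apiVersion are illustrative, not from the question):

    {
        "type": "Microsoft.Storage/storageAccounts",
        "apiVersion": "2023-01-01",
        "name": "datalake001",
        "location": "[resourceGroup().location]",
        "sku": { "name": "Standard_ZRS" },
        "kind": "StorageV2",
        "properties": { "isHnsEnabled": true }
    }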


Reference:

https://docs.microsoft.com/en-us/azure/storage/common/storage-redundancy





