Microsoft DP-300 Exam (page: 11)
Microsoft Administering Azure SQL Solutions
Updated on: 01-Aug-2025

Viewing Page 11 of 61

Testlet 1
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.

To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.

At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study
To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the question button to return to the question.

Overview
Litware, Inc. is a renewable energy company that has a main office in Boston. The main office hosts a sales department and the primary datacenter for the company.

Physical Locations

Existing Environment
Litware has a manufacturing office and a research office in separate locations near Boston. Each office has its own datacenter and internet connection.

The manufacturing and research datacenters connect to the primary datacenter by using a VPN.

Network Environment
The primary datacenter has an ExpressRoute connection that uses both Microsoft peering and private peering. The private peering connects to an Azure virtual network named HubVNet.

Identity Environment
Litware has a hybrid Azure Active Directory (Azure AD) deployment that uses a domain named litwareinc.com. All Azure subscriptions are associated with the litwareinc.com Azure AD tenant.

Database Environment
The sales department has the following database workload:

-An on-premises server named SERVER1 hosts an instance of Microsoft SQL Server 2012 and two 1-TB databases.
-A logical server named SalesSrv01A contains a geo-replicated Azure SQL database named SalesSQLDb1. SalesSQLDb1 is in an elastic pool named SalesSQLDb1Pool. SalesSQLDb1 uses database firewall rules and contained database users.
-An application named SalesSQLDb1App1 uses SalesSQLDb1.

The manufacturing office contains two on-premises SQL Server 2016 servers named SERVER2 and SERVER3. The servers are nodes in the same Always On availability group. The availability group contains a database named ManufacturingSQLDb1.

Database administrators have two Azure virtual machines in HubVNet named VM1 and VM2 that run Windows Server 2019 and are used to manage all the Azure databases.

Licensing Agreement
Litware is a Microsoft Volume Licensing customer that has License Mobility through Software Assurance.

Current Problems
SalesSQLDb1 experiences performance issues that are likely due to out-of-date statistics and frequent blocking queries.

Requirements

Planned Changes
Litware plans to implement the following changes:

-Implement 30 new databases in Azure, which will be used by time-sensitive manufacturing apps that have varying usage patterns. Each database will be approximately 20 GB.
-Create a new Azure SQL database named ResearchDB1 on a logical server named ResearchSrv01. ResearchDB1 will contain Personally Identifiable Information (PII) data.
-Develop an app named ResearchApp1 that will be used by the research department to populate and access ResearchDB1.
-Migrate ManufacturingSQLDb1 to the Azure virtual machine platform.
-Migrate the SERVER1 databases to the Azure SQL Database platform.

Technical Requirements
Litware identifies the following technical requirements:

-Maintenance tasks must be automated.
-The 30 new databases must scale automatically.
-The use of an on-premises infrastructure must be minimized.
-Azure Hybrid Use Benefits must be leveraged for Azure SQL Database deployments.
-All SQL Server and Azure SQL Database metrics related to CPU and storage usage and limits must be analyzed by using Azure built-in functionality.

Security and Compliance Requirements
Litware identifies the following security and compliance requirements:

-Store encryption keys in Azure Key Vault.
-Retain backups of the PII data for two months.
-Encrypt the PII data at rest, in transit, and in use.
-Use the principle of least privilege whenever possible.
-Authenticate database users by using Active Directory credentials.
-Protect Azure SQL Database instances by using database-level firewall rules.
-Ensure that all databases hosted in Azure are accessible from VM1 and VM2 without relying on public endpoints.

Business Requirements
Litware identifies the following business requirements:

-Meet an SLA of 99.99% availability for all Azure deployments.
-Minimize downtime during the migration of the SERVER1 databases.
-Use the Azure Hybrid Use Benefits when migrating workloads to Azure.
-Once all requirements are met, minimize costs whenever possible.

You need to provide an implementation plan to configure data retention for ResearchDB1. The solution must meet the security and compliance requirements.

What should you include in the plan?

  A. Configure the Deleted databases settings for ResearchSrv01.
  B. Deploy and configure an Azure Backup server.
  C. Configure the Advanced Data Security settings for ResearchDB1.
  D. Configure the Manage Backups settings for ResearchSrv01.

Answer(s): D


Reference:

https://docs.microsoft.com/en-us/azure/azure-sql/database/long-term-backup-retention-configure
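The Manage Backups settings configure long-term retention (LTR), which expresses retention periods as ISO 8601 duration strings (for example, P8W for eight weeks or P2M for two months). A minimal, purely illustrative Python sketch of building the duration string that the two-month PII requirement would translate to (the `monthly_retention` key name mirrors the LTR policy parameter but is an assumption here):

```python
def ltr_duration(value: int, unit: str) -> str:
    """Build an ISO 8601 duration string of the kind used by
    Azure SQL long-term retention policies (e.g. P8W, P2M, P5Y)."""
    units = {"weeks": "W", "months": "M", "years": "Y"}
    if unit not in units:
        raise ValueError(f"unsupported unit: {unit}")
    return f"P{value}{units[unit]}"

# Two-month retention for ResearchDB1's PII backups.
policy = {"monthly_retention": ltr_duration(2, "months")}
print(policy)  # {'monthly_retention': 'P2M'}
```

In practice the same value would be passed to the portal's Manage Backups blade or to a management API rather than built by hand; the helper only shows the duration format.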




HOTSPOT (Drag and Drop is not supported)
You need to recommend a configuration for ManufacturingSQLDb1 after the migration to Azure. The solution must meet the business requirements.

What should you include in the recommendation? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.
Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:



Box 1: Node majority with witness
As a general rule when you configure a quorum, the voting elements in the cluster should be an odd number. Therefore, if the cluster contains an even number of voting nodes, you should configure a disk witness or a file share witness.

Note: Mode: Node majority with witness (disk or file share)
Nodes have votes. In addition, a quorum witness has a vote. The cluster quorum is the majority of voting nodes in the active cluster membership plus a witness vote. A quorum witness can be a designated disk witness or a designated file share witness.
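The majority rule above can be sketched in a few lines of Python to show why a two-node cluster (such as SERVER2 and SERVER3 after migration) needs a witness: with only two votes, losing one node loses quorum, while a witness makes the vote count odd and lets one node fail.

```python
def has_quorum(active_votes: int, total_voting_elements: int) -> bool:
    """A cluster keeps quorum while a strict majority of voting
    elements (nodes plus any witness) remains online."""
    return active_votes > total_voting_elements // 2

# Two nodes, no witness: one node failing (1 of 2 votes) loses quorum.
print(has_quorum(1, 2))  # False
# Two nodes plus a witness (3 votes): one node can fail safely.
print(has_quorum(2, 3))  # True
```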

Box 2: Azure Standard Load Balancer
Microsoft guarantees that a Load Balanced Endpoint using Azure Standard Load Balancer, serving two or more Healthy Virtual Machine Instances, will be available 99.99% of the time.

Scenario: Business Requirements
Litware's business requirements include meeting an SLA of 99.99% availability for all Azure deployments.

Incorrect Answers:
Basic Load Balancer: No SLA is provided for the Basic Load Balancer.

Note: There are two main options for setting up your listener: external (public) or internal. The external (public) listener uses an internet facing load balancer and is associated with a public Virtual IP (VIP) that is accessible over the internet. An internal listener uses an internal load balancer and only supports clients within the same Virtual Network.


Reference:

https://technet.microsoft.com/windows-server-docs/failover-clustering/deploy-cloud-witness
https://azure.microsoft.com/en-us/support/legal/sla/load-balancer/v1_0/




What should you do after a failover of SalesSQLDb1 to ensure that the database remains accessible to SalesSQLDb1App1?

  A. Configure SalesSQLDb1 as writable.
  B. Update the connection strings of SalesSQLDb1App1.
  C. Update the firewall rules of SalesSQLDb1.
  D. Update the users in SalesSQLDb1.

Answer(s): B

Explanation:

Scenario: SalesSQLDb1 uses database firewall rules and contained database users.
Incorrect:
Not C: When using public network access for connecting to the database, we recommend using database-level IP firewall rules for geo-replicated databases.
These rules are replicated with the database, which ensures that all geo-secondaries have the same IP firewall rules as the primary. This approach eliminates the need for customers to manually configure and maintain firewall rules on servers hosting the primary and secondary databases.


Reference:

https://docs.microsoft.com/en-us/azure/azure-sql/database/active-geo-replication-overview#keeping-credentials-and-firewall-rules-in-sync
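Because SalesSQLDb1 uses geo-replication (not a failover group, which would provide a stable listener endpoint), the application must be repointed at the new primary after a failover. A purely illustrative Python sketch of the connection-string change — the server names below are hypothetical placeholders, not real Litware endpoints:

```python
def connection_string(server: str, database: str) -> str:
    """Illustrative ADO.NET-style Azure SQL connection string.
    Server and database names are placeholders for this sketch."""
    return (f"Server=tcp:{server}.database.windows.net,1433;"
            f"Database={database};"
            f"Authentication=Active Directory Default;")

# Before failover the app targets the original primary...
before = connection_string("salessrv01a", "SalesSQLDb1")
# ...after failover to the geo-secondary, the app's connection
# string must point at the new primary's logical server.
after = connection_string("salessrv01a-secondary", "SalesSQLDb1")
```

A failover group would avoid this manual step by exposing a read-write listener endpoint that follows the primary automatically, but that is not what the scenario describes.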



HOTSPOT (Drag and Drop is not supported)
You have an Azure Data Factory instance named ADF1 and two Azure Synapse Analytics workspaces named WS1 and WS2.

ADF1 contains the following pipelines:

P1: Uses a copy activity to copy data from a nonpartitioned table in a dedicated SQL pool of WS1 to an Azure Data Lake Storage Gen2 account.
P2: Uses a copy activity to copy data from text-delimited files in an Azure Data Lake Storage Gen2 account to a nonpartitioned table in a dedicated SQL pool of WS2.

You need to configure P1 and P2 to maximize parallelism and performance.

Which dataset settings should you configure for the copy activity of each pipeline? To answer, select the appropriate options in the answer area.

Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:




P1: Set the Partition option to Dynamic Range.
The SQL Server connector in copy activity provides built-in data partitioning to copy data in parallel.

P2: Set the Copy method to PolyBase
PolyBase is the most efficient way to move data into Azure Synapse Analytics. Use the staging blob feature to achieve high load speeds from all types of data stores, including Azure Blob storage and Data Lake Store. (PolyBase supports Azure Blob storage and Azure Data Lake Store by default.)
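Dynamic-range partitioning splits a table's key range into contiguous sub-ranges so the copy activity can read them in parallel. A small illustrative Python sketch of that splitting logic (the function name and range math are this sketch's own, not Data Factory internals):

```python
def dynamic_range_partitions(low: int, high: int, n: int):
    """Split the integer key range [low, high] into n contiguous
    sub-ranges, mimicking how dynamic-range partitioning fans a
    table scan out across parallel copy readers."""
    size, rem = divmod(high - low + 1, n)
    ranges, start = [], low
    for i in range(n):
        # Spread any remainder across the first `rem` partitions.
        end = start + size - 1 + (1 if i < rem else 0)
        ranges.append((start, end))
        start = end + 1
    return ranges

print(dynamic_range_partitions(1, 100, 4))
# [(1, 25), (26, 50), (51, 75), (76, 100)]
```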


Reference:

https://docs.microsoft.com/en-us/azure/data-factory/connector-azure-sql-data-warehouse
https://docs.microsoft.com/en-us/azure/data-factory/load-azure-sql-data-warehouse



You have the following Azure Data Factory pipelines:
-Ingest Data from System1
-Ingest Data from System2
-Populate Dimensions
-Populate Facts

Ingest Data from System1 and Ingest Data from System2 have no dependencies. Populate Dimensions must execute after Ingest Data from System1 and Ingest Data from System2. Populate Facts must execute after the Populate Dimensions pipeline. All the pipelines must execute every eight hours.

What should you do to schedule the pipelines for execution?

  A. Add a schedule trigger to all four pipelines.
  B. Add an event trigger to all four pipelines.
  C. Create a parent pipeline that contains the four pipelines and use an event trigger.
  D. Create a parent pipeline that contains the four pipelines and use a schedule trigger.

Answer(s): D


Reference:

https://www.mssqltips.com/sqlservertip/6137/azure-data-factory-control-flow-activities-overview/
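The dependency graph the parent pipeline must encode can be sketched as a topological ordering: both ingestion pipelines run first, then Populate Dimensions, then Populate Facts. In Data Factory the parent would chain Execute Pipeline activities with success dependencies and attach a single eight-hour schedule trigger; the sketch below only verifies the ordering logic.

```python
from graphlib import TopologicalSorter

# Dependencies from the scenario: Populate Dimensions waits on both
# ingestion pipelines; Populate Facts waits on Populate Dimensions.
deps = {
    "Populate Dimensions": {"Ingest Data from System1",
                            "Ingest Data from System2"},
    "Populate Facts": {"Populate Dimensions"},
}

# A valid execution order: every pipeline appears after everything
# it depends on.
order = list(TopologicalSorter(deps).static_order())
print(order)
```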


