Microsoft DP-300 Exam (page: 1)
Microsoft Administering Azure SQL Solutions
Updated on: 13-Dec-2025

Case Study

This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.

To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.

At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study

To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs.
When you are ready to answer a question, click the Question button to return to the question.

Overview

Litware, Inc. is a renewable energy company that has a main office in Boston. The main office hosts a sales department and the primary datacenter for the company.

Physical Locations

Litware has a manufacturing office and a research office in separate locations near Boston. Each office has its own datacenter and internet connection.

Existing Environment

Network Environment

The manufacturing and research datacenters connect to the primary datacenter by using a VPN.

The primary datacenter has an ExpressRoute connection that uses both Microsoft peering and private peering. The private peering connects to an Azure virtual network named HubVNet.

Identity Environment

Litware has a hybrid Microsoft Entra ID deployment that uses a domain named litwareinc.com. All Azure subscriptions are associated to the litwareinc.com Microsoft Entra tenant.

Database Environment

The sales department has the following database workload:

- An on-premises server named SERVER1 hosts an instance of Microsoft SQL Server 2012 and two 1-TB databases.
- A logical server named SalesSrv01A contains a geo-replicated Azure SQL database named SalesSQLDb1.
- SalesSQLDb1 is in an elastic pool named SalesSQLDb1Pool. SalesSQLDb1 uses database firewall rules and contained database users (see the sketch after this list).
- An application named SalesSQLDb1App1 uses SalesSQLDb1.
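
For context, contained database users and database-level firewall rules of the kind described above are typically created with T-SQL like the following minimal sketch; the user name, password, and IP range are illustrative placeholders, not values from the case study.

    -- Run inside SalesSQLDb1 itself: contained users and database-level
    -- firewall rules are stored in the database, so they travel with it
    -- during geo-replication failover.
    CREATE USER SalesApp1User WITH PASSWORD = '<strong password here>';
    ALTER ROLE db_datareader ADD MEMBER SalesApp1User;

    -- Database-scoped firewall rule (as opposed to a server-level rule).
    EXECUTE sp_set_database_firewall_rule
        @name = N'SalesApp1 clients',
        @start_ip_address = '203.0.113.10',
        @end_ip_address = '203.0.113.20';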

The manufacturing office contains two on-premises SQL Server 2016 servers named SERVER2 and SERVER3. The servers are nodes in the same Always On availability group. The availability group contains a database named ManufacturingSQLDb1.

Database administrators have two Azure virtual machines in HubVNet named VM1 and VM2 that run Windows Server 2019 and are used to manage all the Azure databases.

Licensing Agreement

Litware is a Microsoft Volume Licensing customer that has License Mobility through Software Assurance.

Current Problems

SalesSQLDb1 experiences performance issues that are likely due to out-of-date statistics and frequent blocking queries.
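
For context, out-of-date statistics and blocking are typically investigated with standard T-SQL such as the following sketch; the diagnostic query is a generic pattern, not something prescribed by the case study.

    -- Refresh out-of-date statistics across the database.
    EXEC sp_updatestats;

    -- List requests that are currently blocked, their blockers, and the
    -- statement each blocked session is running.
    SELECT r.session_id,
           r.blocking_session_id,
           r.wait_type,
           r.wait_time,
           t.text AS blocked_statement
    FROM sys.dm_exec_requests AS r
    CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
    WHERE r.blocking_session_id <> 0;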

Requirements

Planned Changes

Litware plans to implement the following changes:

- Implement 30 new databases in Azure, which will be used by time-sensitive manufacturing apps that have varying usage patterns. Each database will be approximately 20 GB.
- Create a new Azure SQL database named ResearchDB1 on a logical server named ResearchSrv01. ResearchDB1 will contain Personally Identifiable Information (PII) data.
- Develop an app named ResearchApp1 that will be used by the research department to populate and access ResearchDB1.
- Migrate ManufacturingSQLDb1 to the Azure virtual machine platform.
- Migrate the SERVER1 databases to the Azure SQL Database platform.

Technical Requirements

Litware identifies the following technical requirements:

Maintenance tasks must be automated.

The 30 new databases must scale automatically.

The use of an on-premises infrastructure must be minimized.

Azure Hybrid Use Benefits must be leveraged for Azure SQL Database deployments.

All SQL Server and Azure SQL Database metrics related to CPU and storage usage and limits must be analyzed by using Azure built-in functionality.

Security and Compliance Requirements

Litware identifies the following security and compliance requirements:

- Store encryption keys in Azure Key Vault.
- Retain backups of the PII data for two months.
- Encrypt the PII data at rest, in transit, and in use.
- Use the principle of least privilege whenever possible.
- Authenticate database users by using Active Directory credentials.
- Protect Azure SQL Database instances by using database-level firewall rules.
- Ensure that all databases hosted in Azure are accessible from VM1 and VM2 over the Azure backbone network.

Business Requirements

Litware identifies the following business requirements:

- Meet an SLA of 99.99% availability for all Azure deployments.
- Minimize downtime during the migration of the SERVER1 databases.
- Use the Azure Hybrid Use Benefits when migrating workloads to Azure.
- Once all requirements are met, minimize costs whenever possible.

HOTSPOT (Drag and Drop is not supported)

You are planning the migration of the SERVER1 databases. The solution must meet the business requirements.

What should you include in the migration plan? To answer, select the appropriate options in the answer area.

Note: Each correct selection is worth one point.

Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:





Scenario:
Existing
An on-premises server named SERVER1 hosts an instance of Microsoft SQL Server 2012 and two 1-TB databases.

Planned changes
Migrate the SERVER1 databases to the Azure SQL Database platform.

Business Requirements
Minimize downtime during the migration of the SERVER1 databases.

Box 1: Premium 4-vCore
Azure Database Migration Service price tier

To minimize migration time with Azure Database Migration Service (DMS), use the Premium pricing tier. The Premium tier, with its higher vCore options (for example, 4 vCores), allows faster data transfer and parallel processing, leading to quicker migration completion.
While the Standard tier is free and suitable for offline migrations, it can take longer for large databases because of its lower processing speed.

Incorrect:
* Standard 2-vCore
* Standard 4-vCore

Box 2: A VPN gateway
Required Azure resource.

A VPN gateway (for a Site-to-Site VPN) or an ExpressRoute gateway is generally needed when migrating on-premises Microsoft SQL Server databases to Azure SQL Database by using the Azure Database Migration Service (DMS). This is because DMS needs a secure and reliable connection between the on-premises SQL Server and the Azure environment where the Azure SQL database resides.

Security:
A VPN gateway provides a secure and encrypted tunnel for data transmission between your on-premises network and Azure.

Connectivity:
DMS relies on a network connection to access the source database during the migration process. A VPN gateway ensures this connectivity.

Hybrid Environments:
Many migration scenarios involve migrating data between on-premises and cloud environments, requiring a VPN or ExpressRoute to establish a connection between them.


Reference:

https://azure.microsoft.com/en-us/pricing/details/database-migration/
https://learn.microsoft.com/en-us/azure/dms/faq




Case Study

This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.

To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.

At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study

To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs.
When you are ready to answer a question, click the Question button to return to the question.

Overview

ADatum Corporation is a financial services company that has a main office in New York City.

Existing Environment. Licensing Agreement

ADatum has a Microsoft Volume Licensing agreement that includes Software Assurance.

Existing Environment. Network Infrastructure

ADatum has an on-premises datacenter and an Azure subscription named Sub1.

Sub1 contains a virtual network named Network1 in the East US Azure region.

The datacenter is connected to Network1 by using a Site-to-Site (S2S) VPN.

Existing Environment. Identity Environment

The on-premises network contains an Active Directory Domain Services (AD DS) forest.

The forest contains a single domain named corp.adatum.com.

The corp.adatum.com domain syncs with a Microsoft Entra tenant named adatum.com.

Existing Environment. Database Environment

The datacenter contains the servers shown in the following table.

[Server table not reproduced in this dump. Per the explanation text later on this page: SVR1 runs Windows Server 2016 with SQL Server 2016 Enterprise and hosts an Always On availability group containing DB1 and DB2; SVR3 hosts DB3.]
DB1 and DB2 are used for transactional and analytical workloads by an application named App1.

App1 runs on Microsoft Entra hybrid joined servers that run Windows Server 2022. App1 uses Kerberos authentication.

DB3 stores compliance data used by two applications named App2 and App3.

DB3 performance is monitored by using Extended Events sessions, with the event_file target set to a file share on a local disk of SVR3.
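
For reference, an Extended Events session with an event_file target of the kind described above is typically defined with T-SQL along these lines; the session name, event, and file path are illustrative assumptions, not details from the case study.

    -- Minimal sketch of an Extended Events session writing to a file share.
    CREATE EVENT SESSION DB3_Monitoring ON SERVER
    ADD EVENT sqlserver.sql_batch_completed
        (WHERE (sqlserver.database_name = N'DB3'))
    ADD TARGET package0.event_file
        (SET filename = N'\\SVR3\XEvents\DB3_Monitoring.xel')
    WITH (STARTUP_STATE = ON);

    ALTER EVENT SESSION DB3_Monitoring ON SERVER STATE = START;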

Resource allocation for DB3 is managed by using Resource Governor.

Requirements. Planned Changes

ADatum plans to implement the following changes:

Deploy an Azure SQL managed instance named Instance1 to Network1.

Migrate DB1 and DB2 to Instance1.

Migrate DB3 to Azure SQL Database.

Following the migration of DB1 and DB2, hand over database development to remote developers who use Microsoft Entra joined Windows 11 devices.

Following the migration of DB3, configure the database to be part of an auto-failover group.

Requirements. Availability Requirements

ADatum identifies the following post-migration availability requirements:

For DB1 and DB2, offload analytical workloads to a read-only database replica in the same Azure region.

Ensure that if a regional disaster occurs, DB1 and DB2 can be recovered from backups.

After the migration, App1 must maintain access to DB1 and DB2.

For DB3, manage potential performance issues caused by resource demand changes by App2 and App3.

Ensure that DB3 will still be accessible following a planned failover.

Ensure that DB3 can be restored if the logical server is deleted.

Minimize downtime during the migration of DB1 and DB2.

Requirements. Security Requirements

ADatum identifies the following security requirements for after the migration:

Ensure that only designated developers who use Microsoft Entra joined Windows 11 devices can access DB1 and DB2 remotely.

Ensure that all changes to DB3, including ones within individual transactions, are audited and recorded.

Requirements. Management Requirements

ADatum identifies the following post-migration management requirements:

Continue using Extended Events to monitor DB3.

In Azure SQL Database, automate the management of DB3 by using elastic jobs that have database-scoped credentials.
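
For context, the database-scoped credential referenced in this requirement is typically created in the elastic job database with T-SQL like the following; this is a minimal sketch, and the credential name, identity, and secrets are hypothetical placeholders.

    -- Run in the job database. All names and secrets below are placeholders.
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password here>';

    CREATE DATABASE SCOPED CREDENTIAL JobRunCredential
        WITH IDENTITY = 'jobuser',
             SECRET = '<strong password here>';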

Requirements. Business Requirements

ADatum identifies the following business requirements:

Minimize costs whenever possible, without affecting other requirements.

Minimize administrative effort.

HOTSPOT (Drag and Drop is not supported)

You need to recommend which service and target endpoint to use when migrating the databases from SVR1 to Instance1. The solution must meet the availability requirements.

What should you recommend? To answer, select the appropriate options in the answer area.

Note: Each correct selection is worth one point.

Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:





Box 1: Managed Instance link
Migration service

The Managed Instance link feature enables near real-time data replication between SQL Server and Azure SQL Managed Instance. The link provides hybrid flexibility and database mobility, unlocking scenarios such as scaling read-only workloads, offloading analytics and reporting to Azure, and migrating to Azure. With SQL Server 2022, the link also enables online disaster recovery with failback to SQL Server (currently in preview), as well as configuring the link from SQL Managed Instance to SQL Server 2022.

Incorrect:
* Log Replay Service (LRS)
Log Replay Service (LRS) can be used to migrate databases from SQL Server to Azure SQL Managed Instance. LRS is a free cloud service that is available for Azure SQL Managed Instance and is based on SQL Server log-shipping technology.

Consider using LRS in cases such as when:
- You need more control over your database migration project.
- There's little tolerance for downtime during migration cutover.

However, per the documentation tip: if you require the database to be read-only accessible during the migration, with a much longer time frame for performing the migration and with minimal downtime, consider the Azure SQL Managed Instance link feature as the recommended migration solution.

* SQL Data Sync
Azure SQL Data Sync does not support Azure SQL Managed Instance or Azure Synapse Analytics at this time.

Box 2: A VNet-local endpoint
Target endpoint

Only a VNet-local endpoint is supported to establish a link with SQL Managed Instance.

Scenario:
Availability Requirements
Minimize downtime during the migration of DB1 and DB2.

Planned Changes
Deploy an Azure SQL managed instance named Instance1 to Network1.
Migrate DB1 and DB2 to Instance1.

Existing Environment. Network Infrastructure
ADatum has an on-premises datacenter and an Azure subscription named Sub1. Sub1 contains a virtual network named Network1 in the East US Azure region. The datacenter is connected to Network1 by using a Site-to-Site (S2S) VPN.

Existing Environment. Database Environment
SVR1 runs Windows Server 2016 with SQL Server 2016 Enterprise and hosts an Always On availability group containing the databases DB1 and DB2.


Reference:

https://learn.microsoft.com/en-us/azure/azure-sql/managed-instance/managed-instance-link-feature-overview




Case Study

This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.

To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.

At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study

To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs.
When you are ready to answer a question, click the Question button to return to the question.

Overview

ADatum Corporation is a financial services company that has a main office in New York City.

Existing Environment. Licensing Agreement

ADatum has a Microsoft Volume Licensing agreement that includes Software Assurance.

Existing Environment. Network Infrastructure

ADatum has an on-premises datacenter and an Azure subscription named Sub1.

Sub1 contains a virtual network named Network1 in the East US Azure region.

The datacenter is connected to Network1 by using a Site-to-Site (S2S) VPN.

Existing Environment. Identity Environment

The on-premises network contains an Active Directory Domain Services (AD DS) forest.

The forest contains a single domain named corp.adatum.com.

The corp.adatum.com domain syncs with a Microsoft Entra tenant named adatum.com.

Existing Environment. Database Environment

The datacenter contains the servers shown in the following table.

[Server table not reproduced in this dump. Per the explanation text later on this page: SVR1 runs Windows Server 2016 with SQL Server 2016 Enterprise and hosts an Always On availability group containing DB1 and DB2; SVR3 hosts DB3.]
DB1 and DB2 are used for transactional and analytical workloads by an application named App1.

App1 runs on Microsoft Entra hybrid joined servers that run Windows Server 2022. App1 uses Kerberos authentication.

DB3 stores compliance data used by two applications named App2 and App3.

DB3 performance is monitored by using Extended Events sessions, with the event_file target set to a file share on a local disk of SVR3.

Resource allocation for DB3 is managed by using Resource Governor.

Requirements. Planned Changes

ADatum plans to implement the following changes:

Deploy an Azure SQL managed instance named Instance1 to Network1.

Migrate DB1 and DB2 to Instance1.

Migrate DB3 to Azure SQL Database.

Following the migration of DB1 and DB2, hand over database development to remote developers who use Microsoft Entra joined Windows 11 devices.

Following the migration of DB3, configure the database to be part of an auto-failover group.

Requirements. Availability Requirements

ADatum identifies the following post-migration availability requirements:

For DB1 and DB2, offload analytical workloads to a read-only database replica in the same Azure region.

Ensure that if a regional disaster occurs, DB1 and DB2 can be recovered from backups.

After the migration, App1 must maintain access to DB1 and DB2.

For DB3, manage potential performance issues caused by resource demand changes by App2 and App3.

Ensure that DB3 will still be accessible following a planned failover.

Ensure that DB3 can be restored if the logical server is deleted.

Minimize downtime during the migration of DB1 and DB2.

Requirements. Security Requirements

ADatum identifies the following security requirements for after the migration:

Ensure that only designated developers who use Microsoft Entra joined Windows 11 devices can access DB1 and DB2 remotely.

Ensure that all changes to DB3, including ones within individual transactions, are audited and recorded.

Requirements. Management Requirements

ADatum identifies the following post-migration management requirements:

Continue using Extended Events to monitor DB3.

In Azure SQL Database, automate the management of DB3 by using elastic jobs that have database-scoped credentials.

Requirements. Business Requirements

ADatum identifies the following business requirements:

Minimize costs whenever possible, without affecting other requirements.

Minimize administrative effort.

HOTSPOT (Drag and Drop is not supported)

You need to recommend a service tier and a method to offload analytical workloads for the databases migrated from SVR1. The solution must meet the availability and business requirements.

What should you recommend? To answer, select the appropriate options in the answer area.

Note: Each correct selection is worth one point.

Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:





Box 1: Premium
Service tier

The read scale-out feature allows you to offload read-only workloads to the compute capacity of one of the read-only replicas, instead of running them on the read-write replica. This way, some read-only workloads can be isolated from the read-write workloads and don't affect their performance. The feature is intended for applications that include logically separated read-only workloads, such as analytics. In the Premium and Business Critical service tiers, applications can gain performance benefits from this additional capacity at no extra cost.
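
As a brief illustration, an application opts into read scale-out through its connection string, and the replica it reached can be verified with a documented T-SQL check; nothing below is specific to this case study.

    -- Add to the application's connection string to target a read-only replica:
    --   ApplicationIntent=ReadOnly
    -- Then verify which replica the session landed on:
    SELECT DATABASEPROPERTYEX(DB_NAME(), 'Updateability');
    -- Returns READ_ONLY on the replica, READ_WRITE on the primary.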

Incorrect:
* Business Critical is more expensive.

Box 2: Read scale-out

Incorrect:
* Geo-replicated secondary replicas
Scenario: For DB1 and DB2, offload analytical workloads to a read-only database replica in the *same Azure region*.

* A failover group read-only listener

Scenario:
Requirements. Availability Requirements
ADatum identifies the following post-migration availability requirements:
For DB1 and DB2, offload analytical workloads to a read-only database replica in the same Azure region.

Business Requirements
Minimize costs whenever possible, without affecting other requirements.
Minimize administrative effort.

Existing Environment. Database Environment
SVR1 runs Windows Server 2016 with SQL Server 2016 Enterprise and hosts an Always On availability group containing the databases DB1 and DB2.
DB1 and DB2 are used for transactional and analytical workloads by an application named App1.


Reference:

https://learn.microsoft.com/en-us/azure/azure-sql/database/read-scale-out



You have 20 Azure SQL databases provisioned by using the vCore purchasing model.

You plan to create an Azure SQL Database elastic pool and add the 20 databases.

Which three metrics should you use to size the elastic pool to meet the demands of your workload? Each correct answer presents part of the solution.

Note: Each correct selection is worth one point.

  1. total size of all the databases
  2. geo-replication support
  3. number of concurrently peaking databases * peak CPU utilization per database
  4. maximum number of concurrent sessions for all the databases
  5. total number of databases * average CPU utilization per database

Answer(s): A,C,E

Explanation:

C, E: Estimate the vCores needed for the pool as follows:
For the vCore-based purchasing model: MAX(<total number of DBs × average vCore utilization per DB>, <number of concurrently peaking DBs × peak vCore utilization per DB>)
A: Estimate the storage space needed for the pool by adding the number of bytes needed for all the databases in the pool.
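
As a worked illustration of the MAX formula above, with hypothetical utilization numbers (not values from the question):

    -- 20 databases at an assumed average of 0.5 vCores each -> 20 x 0.5 = 10
    -- 4 databases assumed to peak concurrently at 2 vCores  -> 4 x 2   = 8
    -- Pool compute = MAX(10, 8) = 10 vCores; pool storage = sum of all DB sizes.
    DECLARE @total_dbs int = 20, @avg_vcores_per_db decimal(4, 1) = 0.5;
    DECLARE @peaking_dbs int = 4, @peak_vcores_per_db decimal(4, 1) = 2.0;

    SELECT pool_vcores =
        IIF(@total_dbs * @avg_vcores_per_db >= @peaking_dbs * @peak_vcores_per_db,
            @total_dbs * @avg_vcores_per_db,
            @peaking_dbs * @peak_vcores_per_db);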


Reference:

https://docs.microsoft.com/en-us/azure/azure-sql/database/elastic-pool-overview



DRAG DROP (Drag and Drop is not supported)

You have a resource group named App1Dev that contains an Azure SQL Database server named DevServer1. DevServer1 contains an Azure SQL database named DB1. The schema and permissions for DB1 are saved in a Microsoft SQL Server Data Tools (SSDT) database project.

You need to populate a new resource group named App1Test with the DB1 database and an Azure SQL Server named TestServer1. The resources in App1Test must have the same configurations as the resources in App1Dev.

Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

Select and Place:

  1. See Explanation section for answer.

Answer(s): A

Explanation:



You have a SQL pool in Azure Synapse that contains a table named dbo.Customers. The table contains a column named Email.

You need to prevent nonadministrative users from seeing the full email addresses in the Email column. The users must see values in a format of aXXX@XXXX.com instead.

What should you do?

  1. From the Azure portal, set a mask on the Email column.
  2. From the Azure portal, set a sensitivity classification of Confidential for the Email column.
  3. From Microsoft SQL Server Management Studio, set an email mask on the Email column.
  4. From Microsoft SQL Server Management Studio, grant the SELECT permission to the users for all the columns in the dbo.Customers table except Email.

Answer(s): A
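
For reference, setting the mask in the Azure portal corresponds to T-SQL like the following; the built-in email() masking function yields the aXXX@XXXX.com-style output described, and the role name in the GRANT is a hypothetical example.

    -- Mask the Email column with the built-in email() masking function.
    ALTER TABLE dbo.Customers
    ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');

    -- Users without the UNMASK permission see masked values; grant UNMASK
    -- only to principals who genuinely need the real data.
    GRANT UNMASK TO DataAnalysts;  -- hypothetical role, for illustration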



You have an Azure Databricks workspace named workspace1 in the Standard pricing tier. Workspace1 contains an all-purpose cluster named cluster1.

You need to reduce the time it takes for cluster1 to start and scale up. The solution must minimize costs.

What should you do first?

  1. Upgrade workspace1 to the Premium pricing tier.
  2. Configure a global init script for workspace1.
  3. Create a pool in workspace1.
  4. Create a cluster policy in workspace1.

Answer(s): C

Explanation:

You can use Databricks pools to speed up your data pipelines and scale clusters quickly.
A Databricks pool is a managed cache of virtual machine instances that enables clusters to start and scale up to four times faster.


Reference:

https://databricks.com/blog/2019/11/11/databricks-pools-speed-up-data-pipelines.html



Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have an Azure Synapse Analytics dedicated SQL pool that contains a table named Table1.

You have files that are ingested and loaded into an Azure Data Lake Storage Gen2 container named container1.

You plan to insert data from the files into Table1 and transform the data. Each row of data in the files will produce one row in the serving layer of Table1.

You need to ensure that when the source data files are loaded to container1, the DateTime is stored as an additional column in Table1.

Solution: In an Azure Synapse Analytics pipeline, you use a Get Metadata activity that retrieves the DateTime of the files.

Does this meet the goal?

  1. Yes
  2. No

Answer(s): A

Explanation:

You can use the Get Metadata activity to retrieve the metadata of any data in Azure Data Factory or a Synapse pipeline. You can use the output from the Get Metadata activity in conditional expressions to perform validation, or consume the metadata in subsequent activities.


Reference:

https://docs.microsoft.com/en-us/azure/data-factory/control-flow-get-metadata-activity


