Microsoft DP-100 Exam (page: 20)
Microsoft Designing and Implementing a Data Science Solution on Azure
Updated on: 16-Feb-2026

Viewing Page 20 of 102

HOTSPOT (Drag and Drop is not supported)
You create an Azure Machine Learning workspace and set up a development environment. You plan to train a deep neural network (DNN) by using the TensorFlow framework and to submit training scripts by using estimators.
You must optimize computation speed for training runs.
You need to choose the appropriate estimator to use as well as the appropriate training compute target configuration.
Which values should you use? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:


Box 1: TensorFlow
The TensorFlow class (azureml.train.dnn.TensorFlow) represents an estimator for training in TensorFlow experiments.
Box 2: 12 vCPUs, 112 GB memory, 2 GPUs, …
Use a GPU-enabled compute target to train the deep neural network.
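For reference, a minimal sketch (Azure Machine Learning SDK for Python) of how the two selections fit together; the workspace configuration, cluster name, script name, and node count are placeholders rather than part of the question:

from azureml.core import Workspace, Experiment
from azureml.core.compute import AmlCompute, ComputeTarget
from azureml.train.dnn import TensorFlow

ws = Workspace.from_config()

# GPU-enabled compute target; STANDARD_NC12 offers 12 vCPUs, 112 GB memory, 2 GPUs
compute_config = AmlCompute.provisioning_configuration(vm_size="STANDARD_NC12",
                                                       max_nodes=2)
gpu_cluster = ComputeTarget.create(ws, "gpu-cluster", compute_config)
gpu_cluster.wait_for_completion(show_output=True)

# TensorFlow estimator that submits the training script to the GPU cluster
estimator = TensorFlow(source_directory="./train",      # placeholder folder
                       entry_script="train_dnn.py",     # placeholder script
                       compute_target=gpu_cluster,
                       use_gpu=True)

run = Experiment(ws, "dnn-training").submit(estimator)
run.wait_for_completion(show_output=True)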


Reference:

https://docs.microsoft.com/en-us/python/api/azureml-train-core/azureml.train.dnn



HOTSPOT (Drag and Drop is not supported)
You have an Azure Machine Learning workspace named workspace1 that is accessible from a public endpoint. The workspace contains an Azure Blob storage datastore named store1 that represents a blob container in an Azure storage account named account1. You configure workspace1 and account1 to be accessible by using private endpoints in the same virtual network.
You must be able to access the contents of store1 by using the Azure Machine Learning SDK for Python. You must be able to preview the contents of store1 by using Azure Machine Learning studio.
You need to configure store1.
What should you do? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:


Box 1: Regenerate the keys of account1.
Azure Blob storage supports authentication through an account key or a SAS token.
To authenticate your access to the underlying storage service, you can provide your account key, a shared access signature (SAS) token, or a service principal.
Box 2: Update the authentication for store1.
For Azure Machine Learning studio users, several features rely on the ability to read data from a dataset, such as dataset previews, profiles, and automated machine learning. For these features to work with storage behind virtual networks, use a workspace managed identity in the studio to allow Azure Machine Learning to access the storage account from outside the virtual network.
Note: Some of the studio's features are disabled by default in a virtual network. To re-enable these features, you must enable managed identity for the storage accounts you intend to use in the studio.
The following operations are disabled by default in a virtual network:
- Preview data in the studio.
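As a hedged illustration with the Azure Machine Learning SDK for Python (the container name and key value are placeholders): after regenerating the keys of account1, re-register store1 with the new key and allow the studio to use the workspace managed identity so it can preview data behind the virtual network:

from azureml.core import Workspace, Datastore

ws = Workspace.from_config()

store1 = Datastore.register_azure_blob_container(
    workspace=ws,
    datastore_name="store1",
    container_name="container1",        # placeholder blob container in account1
    account_name="account1",
    account_key="<new-account-key>",    # key obtained after regeneration
    overwrite=True,                     # update the existing store1 registration
    grant_workspace_access=True)        # let studio use the workspace managed identity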


Reference:

https://docs.microsoft.com/en-us/azure/machine-learning/how-to-access-data



HOTSPOT (Drag and Drop is not supported)
You are using an Azure Machine Learning workspace. You set up an environment for model testing and an environment for production.
The compute target for testing must minimize cost and deployment effort. The compute target for production must provide a fast response time and autoscaling of the deployed service, and must support real-time inferencing.
You need to configure compute targets for model testing and production.
Which compute targets should you use? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

  1. See Explanation section for answer.

Answer(s): A

Explanation:


Box 1: Local web service
A local web service compute target is used for testing and debugging. Use it for limited testing and troubleshooting. Hardware acceleration depends on the libraries installed on the local system.
Box 2: Azure Kubernetes Service (AKS)
Azure Kubernetes Service (AKS) is used for real-time inference and is recommended for production workloads.
Use it for high-scale production deployments. It provides fast response times and autoscaling of the deployed service.
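A minimal sketch of both deployment targets with the Azure Machine Learning SDK for Python; the model, environment, scoring script, and AKS cluster names are placeholders, not part of the question:

from azureml.core import Workspace, Model, Environment
from azureml.core.compute import AksCompute
from azureml.core.model import InferenceConfig
from azureml.core.webservice import LocalWebservice, AksWebservice

ws = Workspace.from_config()
model = Model(ws, "my-model")                    # placeholder registered model
env = Environment.get(ws, "AzureML-Minimal")     # any environment with the scoring dependencies
inference_config = InferenceConfig(entry_script="score.py", environment=env)

# Testing: local web service (minimal cost and deployment effort)
local_config = LocalWebservice.deploy_configuration(port=8890)
local_service = Model.deploy(ws, "test-service", [model], inference_config, local_config)

# Production: AKS (real-time inference, fast response time, autoscaling)
aks_target = AksCompute(ws, "aks-cluster")       # existing AKS compute target
aks_config = AksWebservice.deploy_configuration(cpu_cores=1, memory_gb=2,
                                                autoscale_enabled=True)
aks_service = Model.deploy(ws, "prod-service", [model], inference_config,
                           aks_config, deployment_target=aks_target)
aks_service.wait_for_deployment(show_output=True)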


Reference:

https://docs.microsoft.com/en-us/azure/machine-learning/concept-compute-target



DRAG DROP (Drag and Drop is not supported)
You are using a Git repository to track work in an Azure Machine Learning workspace.
You need to authenticate a Git account by using SSH.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Select and Place:

  1. See Explanation section for answer.

Answer(s): A

Explanation:



Authenticate your Git Account with SSH:
Step 1: Generating a public/private key pair
Generate a new SSH key
1. Open a terminal window in the Azure Machine Learning Notebooks tab.
2. Paste the text below, substituting in your email address.
ssh-keygen -t rsa -b 4096 -C "your_email@example.com"
This creates a new SSH key, using the provided email address as a label.
> Generating public/private rsa key pair.
Step 2: Add the public key to the Git Account
In your terminal window, copy the contents of your public key file.
Step 3: Clone the Git repository by using an SSH repository URL
1. Copy the SSH Git clone URL from the Git repo.
2. Paste the URL into the git clone command below to use your SSH Git repo URL. It will look something like: git clone git@example.com:GitUser/azureml-example.git
> Cloning into 'azureml-example'.


Reference:

https://docs.microsoft.com/en-us/azure/machine-learning/concept-train-model-git-integration



You use Azure Machine Learning to train a model based on a dataset named dataset1.
You define a dataset monitor and create a dataset named dataset2 that contains new data.
You need to compare dataset1 and dataset2 by using the Azure Machine Learning SDK for Python.
Which method of the DataDriftDetector class should you use?

  1. run
  2. get
  3. backfill
  4. update

Answer(s): C

Explanation:

A backfill run executes the monitor over a specified start and end date, comparing the baseline dataset (dataset1) with the target dataset (dataset2) to see how the data changes over time.
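A short sketch with the Azure Machine Learning SDK for Python, assuming a dataset monitor named "dataset1-monitor" has already been created for the workspace (the monitor name and dates are placeholders):

from datetime import datetime
from azureml.core import Workspace
from azureml.datadrift import DataDriftDetector

ws = Workspace.from_config()
monitor = DataDriftDetector.get_by_name(ws, "dataset1-monitor")

# backfill() runs the monitor over the given start and end dates, comparing the
# baseline dataset (dataset1) with the target dataset (dataset2) for that window.
backfill_run = monitor.backfill(datetime(2023, 1, 1), datetime(2023, 1, 31))
backfill_run.wait_for_completion(wait_post_processing=True)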


Reference:

https://docs.microsoft.com/en-us/python/api/azureml-datadrift/azureml.datadrift.datadriftdetector.datadriftdetector





