Explanation:

Box 1: SAP Database Migration Option (DMO) with System Move
Database Migration Option (DMO)
DMO of Software Update Manager (SUM) in a nutshell
* SUM not only updates software components, it can also execute a database migration (for ABAP-based systems).
* DMO is the combination of a software update with the database migration. To migrate only the database (without a software update), the approach "DMO without system update" can be used.
* The standard DMO is an in-place procedure: it keeps the application server level unchanged and only changes the database (host).
* To combine the database migration with a move of the system to a different environment, "DMO with system move" can be used.
* Specifically for the conversion to SAP S/4HANA, DMOVE2S4 can be used as an alternative to "DMO with system move".
* DMO allows an easy reset of the procedure at any point in time.
* DMO allows migration to SAP HANA DB and (depending on the scenario) to other database types as well.
* SUM 1.0 (for target systems based on SAP BASIS 7.40 and lower) allows executing a Unicode conversion as part of the migration.
Incorrect:
* Azure Database Migration Service
Azure Database Migration Service enables seamless migrations from multiple database sources to Azure Data platforms with minimal downtime. The service uses the Data Migration Assistant to generate assessment reports that provide recommendations to guide you through the changes required before performing a migration. When you're ready to begin the migration process, Azure Database Migration Service performs all the required steps.
* Data Migration Assistant (DMA)
The Data Migration Assistant (DMA) helps you upgrade to a modern data platform by detecting compatibility issues that can impact database functionality in your new version of SQL Server or Azure SQL Database.
Note: With Microsoft Azure, you can migrate your existing SAP application running on IBM Db2 for Linux, UNIX, and Windows (LUW) to Azure virtual machines. With SAP on IBM Db2 for LUW, administrators and developers
can still use the same development and administration tools, which are available on-premises.
Box 2: AzCopy
The size of the files, 50 GB, makes them suitable for AzCopy rather than for Data Box.
Use AzCopy from a Windows or Linux command line to easily copy data to and from Blob Storage, Azure File Storage, and Azure Table Storage with optimal performance. AzCopy supports concurrency and parallelism, and the ability to resume copy operations when interrupted. You can also use AzCopy to copy data from AWS to Azure. For programmatic access, the Microsoft Azure Storage Data Movement Library is the core framework that powers AzCopy. It's provided as a .NET Core library.
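As a rough illustration of the programmatic route mentioned above, the sketch below uploads a local folder of export files to a blob container using the Python azure-storage-blob SDK rather than the .NET Data Movement Library named in the passage; the connection string, container name, and local path are placeholder assumptions.

import os
from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

# Placeholder values -- substitute your own storage account connection string,
# target container, and local directory of exported files.
CONNECTION_STRING = "<storage-account-connection-string>"
CONTAINER_NAME = "sap-export"
LOCAL_DIR = r"C:\exports"

def upload_directory(conn_str: str, container: str, local_dir: str) -> None:
    # Upload every file under local_dir to the container, preserving the
    # relative path as the blob name.
    service = BlobServiceClient.from_connection_string(conn_str)
    container_client = service.get_container_client(container)
    for root, _dirs, files in os.walk(local_dir):
        for name in files:
            path = os.path.join(root, name)
            blob_name = os.path.relpath(path, local_dir).replace(os.sep, "/")
            with open(path, "rb") as data:
                # overwrite=True keeps re-runs idempotent, loosely mirroring
                # AzCopy's ability to resume an interrupted copy.
                container_client.upload_blob(name=blob_name, data=data, overwrite=True)

if __name__ == "__main__":
    upload_directory(CONNECTION_STRING, CONTAINER_NAME, LOCAL_DIR)

For a one-off 50 GB transfer, the AzCopy command line itself (azcopy copy <source> <destination-URL-with-SAS> --recursive) is usually the simpler choice, since concurrency and resume handling come built in.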
Incorrect:
* Azure Migrate
The Azure Migrate: Discovery and assessment tool discovers and assesses on-premises VMware VMs, Hyper-V VMs, and physical servers for migration to Azure. Here's what the tool does: Azure readiness: assesses whether on-premises servers, SQL Server instances, and web apps are ready for migration to Azure.
* Azure Data Box
The Microsoft Azure Data Box cloud solution lets you send terabytes of data into and out of Azure in a quick, inexpensive, and reliable way. The secure data transfer is accelerated by shipping you a proprietary Data Box storage device. Each storage device has a maximum usable storage capacity of 80 TB and is transported to your datacenter through a regional carrier. The device has a rugged casing to protect and secure data during the transit.
You can order the Data Box device via the Azure portal to import or export data from Azure. Once the device is received, you can quickly set it up using the local web UI. Depending on whether you will import or export data, copy the data from your servers to the device or from the device to your servers, and ship the device back to Azure. If importing data to Azure, in the Azure datacenter, your data is automatically uploaded from the device to Azure. The entire process is tracked end-to-end by the Data Box service in the Azure portal.
Use cases
Data Box is ideally suited to transferring data sizes larger than 40 TB in scenarios with limited or no network connectivity. The data movement can be one-time, periodic, or an initial bulk data transfer followed by periodic transfers.