Streamline your migration to Azure Cosmos DB for MongoDB (vCore-based) with a tool built for efficiency, reliability, and ease of use. Whether you're migrating data online or offline, this tool delivers a seamless experience tailored to your requirements. It can either use `mongodump` and `mongorestore` for data movement or employ the MongoDB driver to read data from the source and write it to the target. You don't need to learn or use these command-line tools yourself.
- **Flexible Migration Options**: Supports both online and offline migrations to suit your business requirements. It can either use `mongodump` and `mongorestore` for data movement or employ the MongoDB driver to read data from the source and write it to the target.
- **User-Friendly Interface**: No steep learning curve; simply provide your connection strings and specify the collections to migrate.
- **Automatic Resume**: Migration resumes automatically in case of connection loss, ensuring uninterrupted reliability.
- **Private Deployment**: Deploy the tool within your private virtual network (vNet) for enhanced security. (Update `main.bicep` for vNet configuration.)
- **Standalone Solution**: Operates independently, with no dependencies on other Azure resources.
- **Scalable Performance**: Select your Azure Web App pricing plan based on performance requirements:
  - Default: B1
  - Recommended for large workloads: Premium v3 P1V3 (update `main.bicep` accordingly)
- **Customizable**: Modify the provided C# code to suit your specific use cases.
Effortlessly migrate your MongoDB collections while maintaining control, security, and scalability. Begin your migration today and unlock the full potential of Azure Cosmos DB!
You can deploy the utility either by compiling the source files or by using the precompiled binaries.
Ensure you have:
- An Azure subscription
- Azure CLI installed
- PowerShell
This option involves cloning the repository and building the C# project source files locally on a Windows machine. If you’re not comfortable working with code, consider using Option 2 below.
- Install the .NET SDK.
- Clone the repository: https://github.com/AzureCosmosDB/MongoMigrationWebBasedUtility
- Open PowerShell.
- Navigate to the cloned project folder.
- Run the following commands in PowerShell:
```powershell
# Variables to be updated
$resourceGroupName = "<Replace with Existing Resource Group Name>"
$webAppName = "<Replace with Web App Name>"
$projectFolderPath = "<Replace with path to cloned repo on local>"

# Paths - No changes required
$projectFilePath = "$projectFolderPath\MongoMigrationWebApp\MongoMigrationWebApp.csproj"
$publishFolder = "$projectFolderPath\publish"
$zipPath = "$publishFolder\app.zip"

# Login to Azure
az login

# Set subscription (optional)
# az account set --subscription "your-subscription-id"

# Deploy Azure Web App
Write-Host "Deploying Azure Web App..."
az deployment group create --resource-group $resourceGroupName --template-file main.bicep --parameters location=WestUs3 webAppName=$webAppName

# Configure the NuGet source. Execute only once on a machine.
dotnet nuget add source https://api.nuget.org/v3/index.json -n nuget.org

# Build the Blazor app
Write-Host "Building Blazor app..."
dotnet publish $projectFilePath -c Release -o $publishFolder -warnaserror:none --nologo

# Delete the existing zip file if it exists
if (Test-Path $zipPath) { Remove-Item $zipPath -Force }

# Archive published files
Compress-Archive -Path "$publishFolder\*" -DestinationPath $zipPath -Update

# Deploy files to Azure Web App
Write-Host "Deploying to Azure Web App..."
az webapp deploy --resource-group $resourceGroupName --name $webAppName --src-path $zipPath --type zip

Write-Host "Deployment completed successfully!"
```
- Open `https://<WebAppName>.azurewebsites.net` to access the tool.
- Enable the use of a single public IP for consistent firewall rules, or enable a Private Endpoint if required.
- Download `app.zip` from the latest release at https://github.com/AzureCosmosDB/MongoMigrationWebBasedUtility/releases
- Open PowerShell.
- Run the following commands in PowerShell:
```powershell
# Variables to be updated
$resourceGroupName = "<Replace with Existing Resource Group Name>"
$webAppName = "<Replace with Web App Name>"
$zipPath = "<Replace with full path of downloaded zip file on local>"

# Login to Azure
az login

# Set subscription (optional)
# az account set --subscription "your-subscription-id"

# Deploy Azure Web App
Write-Host "Deploying Azure Web App..."
az deployment group create --resource-group $resourceGroupName --template-file main.bicep --parameters location=WestUs3 webAppName=$webAppName

# Deploy files to Azure Web App
Write-Host "Deploying to Azure Web App..."
az webapp deploy --resource-group $resourceGroupName --name $webAppName --src-path $zipPath --type zip

Write-Host "Deployment completed successfully!"
```
- Open `https://<WebAppName>.azurewebsites.net` to access the tool.
- Enable the use of a single public IP for consistent firewall rules, or enable a Private Endpoint if required.
- Create a Virtual Network:
  - Go to Create a resource in the Azure Portal.
  - Search for Virtual Network and click Create.
  - Provide a Name for the VNet.
  - Choose the desired Region (ensure it matches the Web App's region for integration).
  - Define the Address Space (e.g., `10.0.0.0/16`).
- Add a Subnet:
  - In the Subnet section, create a new subnet.
  - Provide a Name (e.g., `WebAppSubnet`).
  - Define the Subnet Address Range (e.g., `10.0.1.0/24`).
  - Set the Subnet Delegation to `Microsoft.Web` for VNet integration.
- Create the VNet:
  - Click Review + Create and then Create.
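If you prefer scripting over the portal, the same VNet and delegated subnet can be created with the Azure CLI. This is a sketch; the resource group, VNet name, region, and address ranges below are assumptions, so substitute your own values.

```shell
# Assumed values -- replace with your own
RG="MyResourceGroup"     # existing resource group
VNET="MigrationVNet"
LOCATION="westus3"       # should match the Web App's region

# Create the VNet with an address space and an initial subnet
az network vnet create \
  --resource-group "$RG" \
  --name "$VNET" \
  --location "$LOCATION" \
  --address-prefix 10.0.0.0/16 \
  --subnet-name WebAppSubnet \
  --subnet-prefix 10.0.1.0/24

# Delegate the subnet for regional VNet integration; the CLI expresses
# the Microsoft.Web delegation as Microsoft.Web/serverFarms
az network vnet subnet update \
  --resource-group "$RG" \
  --vnet-name "$VNET" \
  --name WebAppSubnet \
  --delegations Microsoft.Web/serverFarms
```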
- Go to the Web App:
  - Navigate to your Azure Web App in the Azure Portal.
- Enable VNet Integration:
  - In the left-hand menu, select Networking.
  - Under Outbound Traffic, click VNet Integration.
  - Click Add VNet and choose an existing VNet and subnet.
  - Ensure the subnet is delegated to `Microsoft.Web`.
- Save the configuration.
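The same integration can be enabled from the Azure CLI. A minimal sketch, assuming the web app, VNet, and subnet names shown (replace them with yours):

```shell
RG="MyResourceGroup"
WEBAPP="MyMigrationWebApp"
VNET="MigrationVNet"

# Attach the web app's outbound traffic to the delegated subnet
az webapp vnet-integration add \
  --resource-group "$RG" \
  --name "$WEBAPP" \
  --vnet "$VNET" \
  --subnet WebAppSubnet
```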
- Create a Public IP Address:
  - Go to Create a resource in the Azure Portal.
  - Search for Public IP Address and click Create.
  - Assign a name and ensure it's set to Static.
  - Complete the setup.
- Create a NAT Gateway:
  - Go to Create a resource in the Azure Portal.
  - Search for NAT Gateway and click Create.
  - Assign a name and link it to the Public IP Address created earlier.
  - Attach the NAT Gateway to the same subnet used for the Web App.
- Save the configuration.
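The steps above can also be scripted with the Azure CLI. The resource names below are assumptions for illustration:

```shell
RG="MyResourceGroup"
LOCATION="westus3"

# Static public IP (the Standard SKU is required by NAT Gateway)
az network public-ip create \
  --resource-group "$RG" \
  --name MigrationNatIp \
  --sku Standard \
  --allocation-method Static

# NAT Gateway bound to that public IP
az network nat gateway create \
  --resource-group "$RG" \
  --name MigrationNatGw \
  --location "$LOCATION" \
  --public-ip-addresses MigrationNatIp

# Attach the NAT Gateway to the Web App's integration subnet
az network vnet subnet update \
  --resource-group "$RG" \
  --vnet-name MigrationVNet \
  --name WebAppSubnet \
  --nat-gateway MigrationNatGw
```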
- In your firewall (e.g., Azure Firewall, third-party), allow traffic from the single public IP address used by the NAT Gateway.
- Verify that you can reach the MongoDB source and target after adding the new public IP to their firewall rules.
Ensure you have:
- An existing Azure Virtual Network (VNet).
- A subnet dedicated to private endpoints (e.g., `PrivateEndpointSubnet`).
- Permissions to configure networking and private endpoints in Azure.
- Open the Azure Portal.
- In the left-hand menu, select App Services.
- Click the desired web app.
- In the web app's blade, select Networking.
- Under the Private Endpoint section, click + Private Endpoint.
Follow these steps in the Add Private Endpoint Advanced wizard:
- Name: Enter a name for the private endpoint (e.g., `WebAppPrivateEndpoint`).
- Region: Ensure it matches your VNet's region.
- Resource Type: Select `Microsoft.Web/sites` for the web app.
- Resource: Choose your web app.
- Target Sub-resource: Select `sites`.
- Virtual Network: Select your VNet.
- Subnet: Choose the `PrivateEndpointSubnet`.
- Integrate with private DNS zone: Enable this to link the private endpoint with an Azure Private DNS zone (recommended).
- Click Next to review the configuration.
- Click Create to finalize.
- Return to the Networking tab of your web app.
- Verify that the private endpoint status under Private Endpoint Connections is Approved.
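For a scripted alternative, a private endpoint for the web app can be created with the Azure CLI. This is a sketch; the resource names are assumptions:

```shell
RG="MyResourceGroup"
WEBAPP="MyMigrationWebApp"
VNET="MigrationVNet"

# Look up the resource ID of the web app
WEBAPP_ID=$(az webapp show --resource-group "$RG" --name "$WEBAPP" --query id -o tsv)

# Create the private endpoint against the 'sites' sub-resource
az network private-endpoint create \
  --resource-group "$RG" \
  --name WebAppPrivateEndpoint \
  --vnet-name "$VNET" \
  --subnet PrivateEndpointSubnet \
  --private-connection-resource-id "$WEBAPP_ID" \
  --group-id sites \
  --connection-name WebAppPeConnection
```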
If using a private DNS zone:
- Validate DNS Resolution: Run the following command to ensure DNS resolves to the private IP:

  ```shell
  nslookup <WebAppName>.azurewebsites.net
  ```

- Update Custom DNS Servers (if applicable): Configure custom DNS servers to resolve the private endpoint using Azure Private DNS.
- Deploy a VM in the same VNet.
- From the VM, access the web app via its URL (e.g., `https://<WebAppName>.azurewebsites.net`).
- Confirm that the web app is accessible only within the VNet.
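A quick way to run this check from the VM's shell (the web app name is a placeholder to substitute):

```shell
WEBAPP="MyMigrationWebApp"   # assumed name -- replace with yours

# DNS should resolve to a private address via the private DNS zone
nslookup "$WEBAPP.azurewebsites.net"

# From inside the VNet, the request should succeed; from outside, it should not
curl -s -o /dev/null -w "%{http_code}\n" "https://$WEBAPP.azurewebsites.net"
```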
- From the home page (`https://<WebAppName>.azurewebsites.net`), select New Job.
- In the "New Job Details" pop-up, provide the necessary details and select OK.
- Choose the migration tool: either "Mongo Dump/Restore" or "Mongo Driver".
- The job will automatically start if no other jobs are running.
Migrations can be done in two ways:
- Offline Migration: A snapshot-based bulk copy from source to target. Data added, updated, or deleted on the source after the snapshot isn't copied to the target. The required application downtime depends on how long the bulk copy takes to complete.
- Online Migration: In addition to the bulk copy performed in an offline migration, a change stream monitors all additions, updates, and deletes. After the bulk copy completes, the changes captured by the change stream are applied to the target, ensuring that updates made during the migration are also transferred. The required application downtime is minimal.
For online jobs, ensure that the oplog retention size of the source MongoDB is large enough to store operations for at least the duration of both the download and upload activities. If the oplog retention size is too small and there is a high volume of write operations, the online migration may fail or be unable to read all documents from the change stream in time.
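You can inspect the source's current oplog size and the time window it covers with `mongosh`. A sketch, assuming `mongosh` is installed and the connection string below is replaced with your source's:

```shell
# Assumed connection string -- replace with your source replica set URI
SOURCE_URI="mongodb://user:pass@source-host:27017/?replicaSet=rs0"

# Prints configured oplog size, used size, and the time span it covers
mongosh "$SOURCE_URI" --quiet --eval 'rs.printReplicationInfo()'
```

If the reported log length (the oplog window) is shorter than your expected bulk copy duration, increase the oplog retention before starting an online job.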
The job processes collections in the order they are added. Since larger collections take more time to migrate, it’s best to arrange the collections in descending order of their size or document count.
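To decide on an order, you can list collection sizes up front. A sketch using `mongosh` (the URI and database name are assumptions; `stats()` is a convenience shell helper):

```shell
SOURCE_URI="mongodb://user:pass@source-host:27017"
DB="mydb"   # assumed database name

# Print collections in descending order of data size (bytes)
mongosh "$SOURCE_URI/$DB" --quiet --eval '
  db.getCollectionNames()
    .map(function (n) { return { name: n, size: db.getCollection(n).stats().size }; })
    .sort(function (a, b) { return b.size - a.size; })
    .forEach(function (c) { print(c.size + "\t" + c.name); })
'
```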
- From the home page, select the eye icon corresponding to the job you want to view.
- On the Job Details page, you will see the collections to be migrated and their status in a tabular format.
- Depending on the job's status, one or more of the following buttons will be visible:
  - Resume Job: Visible if the migration is paused. Select this to resume the current job. The app may prompt you for the connection strings if the cache has expired.
  - Pause Job: Visible if the current job is running. Select this to pause the current job. You can resume it later.
  - Update Collections: Select this to add or remove collections in the current job. You can only update collections for a paused job. Removing a collection that is partially or fully migrated discards its migration details, and you will need to remigrate it from the start.
  - Cut Over: Select this to cut over an online job once the source and target are completely synced. Before cutting over, stop write traffic to the source and wait for the Change Stream Lag to reach zero for all collections. Once cut over is performed, there is no rollback.
- The Monitor section lists the current actions being performed.
- The Logs section displays system-generated logs for debugging. You can download the logs by selecting the download icon next to the header.
Note: An offline job will automatically terminate once the data is copied. However, an online job requires a manual cut over to complete.
Change Stream Lag refers to the time difference between the timestamp of the last processed change and the current time. During an online migration, the lag will be high immediately after the upload completes, but it should decrease as change stream processing starts, eventually reaching zero. If the lag does not reduce, consider the following:
- Ensure the job is not paused and is processing requests. Resume the job if necessary.
- Monitor for new write operations on the source. If no new changes are detected, the lag will increase. However, this is not an issue since all changes have already been processed.
- Check if the transactions per second on the source are very high; in this case, you may need a larger app service plan or a dedicated web app for the collection.
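To estimate the source's write rate when judging whether the app service plan is undersized, you can sample the server's operation counters twice and take the difference. A sketch with `mongosh` (the connection string is an assumption):

```shell
SOURCE_URI="mongodb://user:pass@source-host:27017"

# Sample write opcounters 10 seconds apart to approximate writes/second
mongosh "$SOURCE_URI" --quiet --eval '
  function writes() {
    var o = db.serverStatus().opcounters;
    return o.insert + o.update + o.delete;
  }
  var before = writes();
  sleep(10000);
  var after = writes();
  print("approx writes/sec: " + (after - before) / 10);
'
```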
- From the home page, select the bin icon corresponding to the job you want to remove.
- From the home page, select the download icon next to the job title to download the job details as JSON. This may be used for debugging purposes.