Azure migration across international waters

Several years ago I set up an Azure subscription in Australia and became a BizSpark member. The BizSpark benefits are an excellent way to get something built on a shoestring budget, and I’d highly recommend the program to new startups. See here for more info on BizSpark.

I recently received a friendly email letting me know my BizSpark membership was expiring, along with options for continued service – the time had come to pay my own way. Given I now live in the USA and didn’t want to deal with currency conversion fees by associating a US credit card with the Australian subscription, I would need to move my data to a US subscription.

While it’s possible to transfer an Azure subscription to a new owner at the click of a button, it’s not possible to transfer one to an owner in a different country. It seemed I would have to migrate the data and services across manually.

Step 0: Plan resource groups

The original Azure subscription was, to say the least, a hot mess: it followed no logical naming pattern for resource groups or services, and resources weren’t grouped appropriately either.

This made it very difficult to work out which resource was Application Insights vs. the web app vs. the storage account vs. any number of other services the application was leveraging across Azure.

I wanted to do better with the new subscription, and fortunately the Patterns & Practices team have published a handy set of naming guidelines, which I followed religiously and would highly recommend to anyone reading this.
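To give a flavour of the convention, the names I ended up with looked roughly like this (illustrative examples in the spirit of the guidelines, not taken from them verbatim):

myapp-prod-rg (resource group)
myapp-prod-web (App Service Web App)
myappprodstorage (storage account – no hyphens allowed)
myapp-prod-ai (Application Insights)

Having the application, environment and resource type in the name makes it immediately obvious what each resource is for when you’re staring at a long list in the portal.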

Step 1: Move the data

My original Azure subscription had several storage accounts, using Blob, Table and Queue storage.

While I didn’t have a huge amount of data to transfer, I did want to avoid the additional step of downloading it locally (or to a VM) then uploading it to the new subscription where possible.

Fortunately, there’s a nice tool called AzCopy that does 90% of this!

The missing 10% is that you can’t copy an entire storage account in one go, so you need to copy each container individually. You also can’t copy tables directly from one account to the other; instead you need to export them to blobs/disk first.

Blobs
Given AzCopy can’t copy an entire storage account, you’ll have to copy containers one-by-one. I wrote a little code to generate the AzCopy commands, which saved a lot of time; here’s a snippet:

var sb = new StringBuilder();
var sourceUri = "https://<srcstoragename>.blob.core.windows.net/"; //TODO: replace <srcstoragename> with the correct string
var destinationUri = "https://<deststoragename>.blob.core.windows.net/"; //TODO: replace <deststoragename> with the correct string
var sourceKey = ""; //TODO: source account key
var destinationKey = ""; //TODO: destination account key

//connect to the source account and list all blob containers
var sourceAccount = new CloudStorageAccount(new StorageCredentials("<srcstoragename>", sourceKey), useHttps: true); //TODO: replace <srcstoragename>
var blobClient = sourceAccount.CreateCloudBlobClient();
var containers = blobClient.ListContainers();

sb.AppendLine("=Containers=");
sb.AppendLine();

//generate one AzCopy command per container
foreach (var container in containers)
{
    var azCopy = $"azcopy /XO /Source:{sourceUri}{container.Name} /SourceKey:{sourceKey} /Dest:{destinationUri}{container.Name} /DestKey:{destinationKey}";

    sb.AppendLine(azCopy);
    sb.AppendLine();
}

System.IO.File.WriteAllText(@"C:\Users\<youraccount>\Documents\azcopy.txt", sb.ToString()); //TODO: replace <youraccount>

The above will result in a text file being generated with commands you can copy and paste into the console one-by-one.

They’ll look something like this:

azcopy /XO /Source:https://<srcstoragename>.blob.core.windows.net/www /SourceKey:<srckey> /Dest:https://<deststoragename>.blob.core.windows.net/www /DestKey:<destKey>

Note that the /XO flag excludes a source resource from the copy if its last modified time is the same as or older than the destination’s.

Tables
Tables are a little more work to copy across. AzCopy doesn’t provide a way to copy directly to a new account like it does with blobs, so you’ll need to export somewhere (Blob, Azure VM, local PC, etc) then import from there to the new subscription.

This time we’ll have two commands for each table:

azcopy /Source:https://<srcstoragename>.table.core.windows.net/<tablename> /Manifest:<tablename>.manifest /SourceKey:<srcKey> /Dest:C:\tables
azcopy /Source:C:\tables /Manifest:<tablename>.manifest /Dest:https://<deststoragename>.table.core.windows.net/<tablename> /DestKey:<destKey> /EntityOperation:"InsertOrReplace"

The first command exports the source table to a local folder; the second takes the exported table and imports it into the new subscription.
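I used the same command-generation trick for the tables. A minimal sketch, assuming a CloudTableClient built against the source account in the same way as the blob client above:

var tableClient = sourceAccount.CreateCloudTableClient();

//generate an export and an import command per table
foreach (var table in tableClient.ListTables())
{
    sb.AppendLine($"azcopy /Source:https://<srcstoragename>.table.core.windows.net/{table.Name} /Manifest:{table.Name}.manifest /SourceKey:{sourceKey} /Dest:C:\\tables");
    sb.AppendLine($"azcopy /Source:C:\\tables /Manifest:{table.Name}.manifest /Dest:https://<deststoragename>.table.core.windows.net/{table.Name} /DestKey:{destinationKey} /EntityOperation:\"InsertOrReplace\"");
    sb.AppendLine();
}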

Step 2: Migrate Cloud Service to App Service

Given I needed to shift things around anyway, I took this as an opportunity to evaluate whether continuing to use Cloud Services made sense, given there have been few features/improvements for them lately.

After some light reading, I decided App Service was the path forward.

Fortunately, Cloud Service Web Roles map nicely to App Service. The major difference is the location of the app settings: App Service Web Apps just use web.config as usual, instead of the .csdef/.cscfg files in the Cloud Service.

Migration turned out to be some simple renaming, from:

var someSetting = RoleEnvironment.GetConfigurationSettingValue("TheSetting");

To:

var someSetting = System.Configuration.ConfigurationManager.AppSettings["TheSetting"];
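The setting itself then lives in web.config (or in the App Service’s application settings in the portal, which override it) rather than in the Cloud Service configuration – something like this, with "TheSetting" standing in for a real key:

<appSettings>
  <add key="TheSetting" value="some-value" />
</appSettings>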

The Cloud Service Worker Role took a little more effort to migrate – in the end I decided to port it to a WebJob, although it probably could have been hosted within a Web App as well.

Because my Worker Role manages many tasks (sending email, checking for actions from a queue, etc) it was designed to run some work, then sleep for 15 minutes to reduce transactions and cost to run.

Out of the box, the WebJobs SDK doesn’t support time-based triggers, only queue/blob triggers, which would have meant a lot more work to re-architect.

After some searching I discovered the NuGet package Microsoft.Azure.WebJobs.Extensions, which includes a TimerTrigger – exactly what I needed!

It looks something like this:

public static Task RunWorkAsync([TimerTrigger("00:15:00", RunOnStartup = true)] TimerInfo timerInfo)
{
    //re-use the existing Worker Role code, now fired every 15 minutes
    var worker = new WorkerRole();
    worker.OnStart();
    return worker.RunAsync();
}

Now every 15 minutes the WebJob will call into my WorkerRole code and everything functions essentially the same as it did when it was a Cloud Service.
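One wiring detail worth noting: the TimerTrigger only fires if the timers extension is registered when the JobHost starts. A minimal sketch of the WebJob’s entry point, assuming the standard WebJobs SDK setup:

static void Main()
{
    var config = new JobHostConfiguration();
    config.UseTimers(); //enables [TimerTrigger] from Microsoft.Azure.WebJobs.Extensions

    var host = new JobHost(config);
    host.RunAndBlock();
}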

Step 3: Test everything works and is pointing to the new subscription

The last step was simple: update the connection strings in web.config to point to the new subscription, update the CNAME and A records to point to the new location, and smile as everything works!
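For reference, the storage connection strings take the usual form (same placeholders as before):

DefaultEndpointsProtocol=https;AccountName=<deststoragename>;AccountKey=<destKey>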