How to Migrate a Web Application to Windows Azure and SQL Azure

Step-by-step instructions for getting your web app and database up and running on Azure

Robin Shahan

April 26, 2011


In my role as director of engineering for GoldMail, I was fortunate enough to be given an opportunity to do a major migration to Windows Azure and SQL Azure last year. As I tend to do, I took a long running jump and leaped off the cliff with no bungee cord. (My mother says, "No brains, no headaches," but my father says, "No guts, no glory." I've always liked my father better.)

There is a lot of information out there about Azure and a lot of theory, snippets of code, blogs full of code, and so on, but it seemed like I could never find exactly what I was looking for. So in this article, I will share the knowledge I've gained by discussing the bits that I use over and over again in my Azure-related projects:

  • Turn a web application into an Azure web role

  • Migrate a SQL Server database to SQL Azure

  • Set up and use trace logging

  • Handle SQL Azure connectivity issues

Tips Before You Get Started

To migrate a web application, you first need to confirm that your application does not require any software installed on the web server, and check whether you have any special Microsoft IIS configurations. If either of those conditions applies, you will need to figure out a way to live without it or find a way to replace it. For example, I was using some third-party software that was installed on the web server. I could not install that software on my instance in the cloud, so I had to change my application to take up the slack and handle it.

I had completely forgotten about that software until I published my web application. The Azure instance would start up and then fail, start up and then fail, and so on, not unlike my first lesson driving a car with a standard transmission. When I replaced the component, it published the first time without fail.

I also used URL rewrites, but in this case, I found I could install the URL Rewrite Module and add the rewrite configuration to my web.config file.
With Azure, you have to publish your entire web application every single time, no matter how small the modification. Change a word, change a page, or add 20 pages, it doesn't matter—you have to publish the whole thing.

If your web application is completely static (all HTML and images, no .aspx pages), you can actually publish it to blob storage and host it there. If you do that, then you can just replace single pages when you need to. Of course, none of us is lucky enough to have that situation, and if we were, our jobs would probably be less interesting.
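If you're curious what that looks like, here's a minimal sketch using the StorageClient library from the SDK; the "site" container, file path, and account values are made up for illustration, and the container is assumed to already exist with public read access:

using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// Connect to the storage account that hosts the static site.
CloudStorageAccount account = CloudStorageAccount.Parse(
    "DefaultEndpointsProtocol=https;AccountName=YourAccountName;AccountKey=YourKey");
CloudBlobClient blobClient = account.CreateCloudBlobClient();

// Overwrite just the one page; the rest of the site is untouched.
CloudBlob blob = blobClient.GetBlobReference("site/about.html");
blob.Properties.ContentType = "text/html";
blob.UploadFile(@"C:\MySite\about.html");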

Creating and Publishing an Application to Azure

Let's create a new web application, then convert it to a cloud application that can be published to Azure. After you do this, you can proceed to convert your own web applications.

  1. Run Visual Studio as an administrator.

  2. In the menu, select File, New, Project.

  3. When the New Project dialog comes up, be sure .NET Framework 4 is selected at the top, then select ASP.NET Web Application, fill in CvtWeb for the name, select a location, and click OK.

  4. At this point, you can run the app, and it will show a blank page with a fancy header. If you'd like to do some customization to make it more attractive, go ahead. I'll be waiting here when you're finished.

  5. Now let's make it into an Azure-able application. Right-click on the solution and select Add, New Project.

  6. On the Add New Project dialog, under the templates, select Cloud. Make sure you have .NET Framework 4 selected at the top, and select Windows Azure Project. Change the name to AzureCvtWeb and click OK.

  7. You will now be prompted with the New Windows Azure Project dialog. This is important: Do not add any roles to the solution. Leave the right side of the dialog blank, and click OK.

  8. Now your Solution Explorer should look like Figure 1. You should have a cloud project and a regular web project with no roles.

  9. Now what we want to do is convert our existing web application into a web role. To do this, right-click Roles and select Add, Web Role Project in Solution. It should show you a dialog called Associate with Role Project and a list of all the projects in the application. Of course, we only have one: CvtWeb. Select that one and click OK. You should be able to see that CvtWeb is now listed as a role in your cloud project, as displayed in Figure 2.

  10. Set the AzureCvtWeb project as the startup project, and hit F5 to run the cloud project. You should see the icon for Windows Azure in your system tray, and if you hover over it, it will tell you that the Compute Emulator and Storage Emulator have started. You should also see our fancy web application running in a browser window with an address similar to http://127.0.0.1:81. Close the web browser window when you have finished admiring your hard work.

  11. To publish your awesome web application to Azure, change the value for the setting Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString in the ServiceConfiguration.cscfg file to point to your storage account on Azure rather than the development fabric. The value should look something like this, with the pertinent information replaced with your own values:

DefaultEndpointsProtocol=https;AccountName=YourAccountName;AccountKey=ReallyLongRandomLookingStringOfCharacters
  12. Right-click the cloud project and publish it.

  13. Leave the house and go down to the local coffee shop and get a cup of coffee. If you're lucky, the app will have finished publishing by the time you get back. I'm not complaining, just pointing it out so you don't think there's something wrong if it takes more than a couple of minutes. In the "old days" (a few months ago in tech time), you could have gone to lunch, so Microsoft is improving this over time, and I expect they will continue to improve it, which will impact the business of coffee shops across the globe.

  14. When you publish, if you're watching the Windows Azure Activity Log in Visual Studio and the instance repeatedly starts, fails, and starts over, you have a problem. In the MSDN Windows Azure Forum, there is a very helpful thread pinned to the top of the forum with troubleshooting advice.

  15. After the app has finished publishing successfully, try it out by clicking the website URL, which is displayed in the Windows Azure Activity Log; it will be something like yourservicename.cloudapp.net.

When you use this method of converting your website into a cloud application rather than creating it from scratch, the conversion process does not add WebRole.cs to the project. The WebRole.cs that is added when you start from scratch inherits from RoleEntryPoint, and it's where you would put any code that you want run when the role starts or stops, such as diagnostics configuration. Just add a class and have it inherit from RoleEntryPoint, and then you can add overrides for the events, such as OnStart and OnStop.
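As a quick sketch, the shell of such a class looks like the following; we'll flesh out a real one for our sample project in the diagnostics section later.

using Microsoft.WindowsAzure.ServiceRuntime;

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Code here runs when the role instance starts,
        //   e.g., diagnostics configuration.
        return base.OnStart();
    }

    public override void OnStop()
    {
        // Code here runs when the role instance is shut down.
        base.OnStop();
    }
}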

If your own website doesn't work the first time you try to publish it, you'll have to do some investigation and see whether you're using some kind of customized IIS configuration, or extra software, or are including some assemblies that won't work on Windows Server 2008. Refer to the aforementioned thread in the MSDN Forums for troubleshooting help.

So that's how you can take your current web application and publish it to Azure: Simply add a cloud project, then designate the current web project as the web role, and publish it. After you get that to work, you can work on the fine details, such as number of instances, diagnostics, and instance size. By the way, this method also works for Silverlight applications.

SQL Server Database Migration

How do you migrate your data from SQL Server to SQL Azure? If your data doesn't have to be relational, and you have time, you can re-architect your application to use Windows Azure table storage. If you really want it in SQL Azure, there is a SQL Azure Migration Wizard on CodePlex that you can use, which you can download at sqlazuremw.codeplex.com. The main screen is displayed in Figure 3.

Not every feature in SQL Server is available in SQL Azure. Anything that accesses a folder or file outside of SQL Server will not work, which means that CLR routines will not work. The MSDN article "SQL Server Feature Limitations (SQL Azure Database)" discusses some of the limitations. Another thing to know is that all your tables must have a clustered index; the migration wizard will add one for you if you don't already have one.

The Migration Wizard will analyze and/or migrate your database. You can just do an analysis, or you can analyze and migrate. I don't recommend migrating without analyzing unless you're psychic and know it's going to work. That would be like jumping out of a plane after letting your ex-spouse check your parachute rather than checking it yourself, even though you know he/she is your primary beneficiary.

You can select the whole database or specific tables, stored procedures, and so on. You can run the analysis and see whether you have any conflicts that would keep the database from running in SQL Azure—without actually trying to do the migration. This is the easiest way to figure out whether your database will migrate. After you've fixed any problems you find, you can then use the Migration Wizard to finally migrate your database to SQL Azure.

Another feature that is missing in SQL Azure is the ability to back up and restore your database. Microsoft gets a lot of feedback about this, so I am certain they will add this capability in the future, but in the meantime, you have to be inventive. In SQL Azure, your database is replicated, which means Microsoft retains three copies of your database at all times. This means that if you accidentally delete the Customer table from the database, you now have three databases without a Customer table. That doesn't give me the warm fuzzies; how about you?

You can use the SQL Azure Migration Wizard in reverse to back up your database to a local instance, but it won't do incremental changes. Plus, remember that you have to pay the cost of transferring the data between Azure and your local desktop, which could add up if you have a lot of data.

In this situation, I use Red Gate Software's SQL Compare Pro and SQL Data Compare Pro products. SQL Compare Pro compares the schema of two databases (local and SQL Azure), while SQL Data Compare Pro compares the data; the differences are then migrated back to my local copy of the database. After the databases are synced, I back up my local copy. This means if I'm really fast, I can restore the Customer table before anybody realizes there's something wrong.

Adding Diagnostics

What if you have a problem and you can't debug it in the development fabric, or you just want to log some information for tracking that things are (or aren't) working? You can configure your Azure role to write the trace diagnostics to a Windows Azure storage table. Then you can view the logs in Visual Studio or by using a third-party tool such as Cerebrata Software's Azure tools. I'll show you how to set up your role to write diagnostics data, including trace logging, IIS logs, and performance counters and will provide some information about viewing the data.

To add diagnostics, you must have SQL Server or SQL Server Express installed on your machine. This is used as a repository when testing in the development fabric. Windows Azure looks for a default SQL Server Express instance to create the local storage the first time. If you do not have this, you will need to use the DSInit command to initialize the development storage. This command can be found under the Windows Azure SDK folder in C:\Program Files and should be run from a command prompt.

For example, let's say you have the SQL Server 2008 Developer Edition installed but not SQL Server Express. If you installed a default instance of SQL Server, you would use the following command to initialize the development storage. (Yes, that's a colon and a dot after sqlinstance.)

C:\Program Files\Windows Azure SDK\v1.3\bin\devstore\dsinit /sqlinstance:.

If your database server is named, you will need to use something like this:

  C:\Program Files\Windows Azure SDK\v1.3\bin\devstore\dsinit /sqlinstance:mysqlservername

Now that that's set up, let's go look at some code. Let's add Diagnostics configuration to the web application we created earlier and then have it do some trace logging.

The service needs to know where to write the diagnostics. It's more helpful if it writes it to your diagnostics account rather than someone else's, unless you know them and they don't mind exporting it and sending it to you upon request. If you didn't do so earlier, add the diagnostics connection string to the Service Configuration file. If it's already there, you may need to modify it. It should look like Figure 4 for a production deployment, with the obvious substitutions made.

Figure 5 shows what it will be if you are testing in the development fabric.

In the Service Definition file, rather than defining a specific variable, check for an import for Diagnostics as displayed in Figure 6, and if it isn't found, put it right before the closing tag for the role, either </WebRole> or </WorkerRole>.

<Imports>
  <Import moduleName="Diagnostics" />
</Imports>

If it is a web role, make sure you have tracing defined in your web.config file. It goes in the configuration section of the web.config and should look like Figure 7.

<system.diagnostics>
  <trace>
    <listeners>
      <add type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener,
                 Microsoft.WindowsAzure.Diagnostics, Version=1.0.0.0, Culture=neutral,
                 PublicKeyToken=31bf3856ad364e35"
           name="AzureDiagnostics">
        <filter type="" />
      </add>
    </listeners>
  </trace>
</system.diagnostics>

Now let's add diagnostics to our web role, CvtWeb. Diagnostics are configured in the class that is the role entry point. As I noted previously, our converted application doesn't have one, because one wasn't needed just to migrate the web application, so I'm going to add one.

Add references to the project for these Windows Azure assemblies:

  • Microsoft.WindowsAzure.Diagnostics

  • Microsoft.WindowsAzure.ServiceRuntime

  • Microsoft.WindowsAzure.StorageClient

Right-click the web role project (CvtWeb) and add a class called WebRole.cs; inherit from RoleEntryPoint, as displayed in Figure 8.

public class WebRole : RoleEntryPoint
{
}

In the top of your new WebRole class, add the using statements displayed in Figure 9 for the Windows Azure assemblies that you're going to use.

using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.Diagnostics.Management;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure;

Next, add an OnStart method as displayed in Figure 10; this will run when the role starts up. This is where we will put the diagnostics configuration information.

public override bool OnStart()
{
    return base.OnStart();
}

Figure 11 shows the code required to configure the Diagnostics. This goes in the OnStart method of the web role before calling base.OnStart(). I've put explanatory comments inline.

// Obtain a reference to the initial default configuration.
String wadConnectionString =
    "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString";
CloudStorageAccount storageAccount =
    CloudStorageAccount.Parse(
        RoleEnvironment.GetConfigurationSettingValue(wadConnectionString));
RoleInstanceDiagnosticManager roleInstanceDiagnosticManager =
    storageAccount.CreateRoleInstanceDiagnosticManager(
        RoleEnvironment.DeploymentId,
        RoleEnvironment.CurrentRoleInstance.Role.Name,
        RoleEnvironment.CurrentRoleInstance.Id);

// Get a configuration instance that you can modify.
// For a worker role, you can get the current one; for a web role,
//   you need to get a default instance. This is because of a bug
//   in Windows Azure with the IIS logs.
DiagnosticMonitorConfiguration config =
    DiagnosticMonitor.GetDefaultInitialConfiguration();
// For worker roles, use this instead:
//DiagnosticMonitorConfiguration config =
//    roleInstanceDiagnosticManager.GetCurrentConfiguration();

// Change the polling interval; this controls how often the instance
//   checks for diagnostic configuration changes.
config.ConfigurationChangePollInterval = TimeSpan.FromSeconds(30.0);

// Set the transfer interval for all logs to 1 minute.
// This means the logging will accumulate for a minute and then be
//   transferred to table storage.
config.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1.0);

// Set the transfer interval for the IIS logs to be copied to blob storage.
config.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(1.0);

// Configure monitoring of a Windows performance counter
//   and add it to the configuration.
// This will poll the %CPU every 60 seconds
//   and write it to the diagnostics every 120 seconds.
PerformanceCounterConfiguration perfConfig =
    new PerformanceCounterConfiguration();
perfConfig.CounterSpecifier = @"\Processor(*)\% Processor Time";
perfConfig.SampleRate = TimeSpan.FromSeconds(60.0);
config.PerformanceCounters.DataSources.Add(perfConfig);
config.PerformanceCounters.ScheduledTransferPeriod =
    TimeSpan.FromSeconds(120.0);

// Set the configuration to be used.
roleInstanceDiagnosticManager.SetCurrentConfiguration(config);

In your production version, you should consider putting the timespan values used to set up the performance monitoring logging in your Service Configuration file and change this method to retrieve them from there. Then if you think it is impacting the performance, you can change it to sample less frequently without completely republishing the service.
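As a hedged sketch, assuming you add a hypothetical setting named PerfSampleRateSeconds to the Service Definition and Service Configuration files, the hard-coded SampleRate line in Figure 11 could become:

// PerfSampleRateSeconds is a made-up setting name; define it in the
//   .csdef file and set its value in the .cscfg file.
double sampleRateSeconds = double.Parse(
    RoleEnvironment.GetConfigurationSettingValue("PerfSampleRateSeconds"));
perfConfig.SampleRate = TimeSpan.FromSeconds(sampleRateSeconds);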

After you've set this up and published your application, if the tables are not defined in Table Storage, they will be magically created. With the code in Figure 11, you will get a table for trace logging (WADLogsTable) and a table for the performance monitoring (WADPerformanceCountersTable).

With a little more code, you can also collect information on Windows events and diagnostics infrastructure logs in table storage.
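For example, adding something like the following to the Figure 11 configuration code (before the call to SetCurrentConfiguration) picks up the Windows System event log and the diagnostics infrastructure logs; "System!*" is just one choice of event log channel. These end up in their own tables, WADWindowsEventLogsTable and WADDiagnosticInfrastructureLogsTable.

// Capture entries from the Windows System event log.
config.WindowsEventLog.DataSources.Add("System!*");
config.WindowsEventLog.ScheduledTransferPeriod = TimeSpan.FromMinutes(1.0);

// Capture the diagnostics infrastructure logs as well.
config.DiagnosticInfrastructureLogs.ScheduledTransferPeriod =
    TimeSpan.FromMinutes(1.0);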

In this code, IIS logs are automatically transferred to blob storage so that you can easily view them. IIS Failed Request logs can also be automatically transferred to blob storage, but Microsoft introduced a bug in version 1.3 of the Tools/SDK that caused this to stop working for both sets of logs. The IIS logs were fixed in SDK 1.4, but the IIS Failed Request logs were not. If you are using version 1.3 or 1.4 of the SDK, the IIS Failed Request logs won't even be created because of a permission problem with the directories, so you can't even use Remote Desktop to log into the instance and view those logs. For a workaround that enables you to view these logs in blob storage with Azure SDK 1.4, check out the article "Azure SDK 1.4 and IIS Logging".

To actually write something to the diagnostics, just use Trace statements. Put some Trace statements like those in Figure 12 in the Page_Load event of the Default.aspx page in our web application.

System.Diagnostics.Trace.TraceInformation(
    "[Page_Load] It's about time you checked me out!");
System.Diagnostics.Trace.TraceInformation(
    "[Page_Load] What time is it here? It's {0:G}",
    DateTime.Now);

Publish the web application to the cloud again, then open the website and load the page a few times. Wait a couple of minutes while the diagnostics are transferred to Windows Azure storage, and then you can view the diagnostics.

There are several ways to view the diagnostics data. You can view them in Visual Studio if you add the storage account to Windows Azure Storage in the Server Explorer. The problem with this is that it shows you the first thousand records, and if you ask to display more, it shows you all the rest; there is no middle ground. Also, there's no way to remove records.

A free tool that you can use to view the information in your storage account is the Azure Storage Explorer on CodePlex. I use the Azure Diagnostics Manager and the Cloud Storage Studio from Cerebrata. The ADM has a feature where you can connect to your hosted service and look at the diagnostics for each service you have running. If you have multiple services, this is invaluable; it makes the task much easier because you can see just the diagnostics for that one service. The Cloud Storage Studio lets you look at the diagnostics as a whole, filter them by any of the table fields, and remove records by a number of different selection criteria.

Using the ADM to look at the trace logging from just my web application, I see the data displayed in Figure 13.

And remember the performance counter we added for %CPU each minute? I can view that in chart format as displayed in Figure 14.

This is obviously not a high-load web application! You can also write your own code to retrieve the data using the Windows Azure Storage APIs.
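If you go the do-it-yourself route, here's a minimal sketch using the StorageClient library; LogEntry and DumpRecentEntries are hypothetical names, with LogEntry mapping a couple of the WADLogsTable columns.

using System;
using System.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// A simple entity matching a few of the WADLogsTable columns.
public class LogEntry : TableServiceEntity
{
    public string Role { get; set; }
    public string Message { get; set; }
}

public static class LogReader
{
    public static void DumpRecentEntries()
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=YourAccountName;AccountKey=YourKey");
        TableServiceContext context =
            account.CreateCloudTableClient().GetDataServiceContext();

        // WADLogsTable is the table the trace listener writes to.
        var entries = context.CreateQuery<LogEntry>("WADLogsTable").Take(100);
        foreach (LogEntry entry in entries)
            Console.WriteLine("{0}: {1}", entry.Role, entry.Message);
    }
}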

SQL Azure Connection Problems

What you will soon find (if you haven't seen it already) is that connecting to SQL Azure can fail more often than you would expect. There are four major reasons for this.

  • Active connections time out after 30 minutes if they are not used, but they are not removed from the connection pool. So you could call to open a connection and get what is essentially a dead connection from the connection pool. Then, when you execute your stored procedure, you get a transport-level error and your method crashes.

  • Microsoft retains three instances of your database. One is the active one that you are executing against; the other two are replicas. When an update is applied to the server by Microsoft, the instance that is active switches around as updates are applied to the non-active instances; this process is called reconfiguration. This happens 15 to 17 times during the update process. While the switching is going on, you can have a few seconds where you cannot connect to the database.

  • Microsoft uses a multi-tenant model: They have databases from multiple customers on the same node. If one of the nodes goes down, you can have connectivity issues while it switches to one of your other instances on a different node. Microsoft will also throttle a database's performance if it hits one of the threshold markers. If a database on your node is throttled, it can impact your performance, even if it is not your database being throttled.

  • Load balancing is performed hourly. This is not caused by anything the customer does—it is done to maintain high availability. Microsoft monitors the different nodes, and can move partitions from a very busy node to a less busy node. During the move, you can have connectivity problems.

You must handle the problem of connection failures if you are going to use SQL Azure. You have to add "exponential retry code" to every section of code that accesses the database. This basically means if the first try fails, try again. If the second try fails, wait some amount of time and try again. If the third try fails, wait some longer amount of time and try again. You get to pick how much time you wait between tries, and how many tries you make. You also need to be sure you handle the case where it won't succeed no matter how many times you retry. (I like to call this case "total failure.") For example, if your customer is authenticating, you might return a message asking him to try again. If you are processing entries from a queue, you might want to requeue the entry that is failing so it eventually gets processed.
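Here's a minimal sketch of the pattern; ExecuteMyQuery is a hypothetical method standing in for whatever opens the connection and executes your stored procedure, and the retry count and delays are yours to tune.

int maxRetries = 3;
TimeSpan delay = TimeSpan.FromSeconds(2);
for (int attempt = 1; attempt <= maxRetries; attempt++)
{
    try
    {
        ExecuteMyQuery();   // hypothetical: open connection, run the sproc
        break;              // success, so stop retrying
    }
    catch (System.Data.SqlClient.SqlException ex)
    {
        if (attempt == maxRetries)
        {
            // Total failure: log it and let the caller decide what to do,
            //   e.g., ask the user to try again or requeue the work item.
            System.Diagnostics.Trace.TraceError(
                "[Retry] Failed after {0} tries: {1}", attempt, ex.Message);
            throw;
        }
        System.Threading.Thread.Sleep(delay);
        delay = TimeSpan.FromSeconds(delay.TotalSeconds * 2);   // back off exponentially
    }
}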

The AppFabric team has published best practices recommendations for SQL Azure that you might find helpful: here and here. I actually discovered this problem in my trace logging and wrote my own retry code before I found these articles. If you use the AppFabric team's best practices, I recommend that you add two things to that framework:

  • Add the ability to pass in a string to be written with the diagnostics information, and be sure you're writing the try number. You might want to track which method was running and some kind of identification of the record you were trying to access. This enabled me to see a pattern where I was having this problem and add extra handling, and also to see a pattern in cases where it was totally failing.

  • Add the ability to pass in a Boolean indicating whether or not to write the SqlContext to the trace logging. The SqlContext is a GUID that identifies the connection; this allows Microsoft to look on their side and see all activity for that connection. This information is very helpful to Microsoft if you have to call them for help with your access to SQL Azure. To get the SqlContext, you have to make a call to the database right after you open the connection. For this reason, I didn't want to add it to every database call in case it impacted performance. Making this change to the AppFabric team's framework will give you the flexibility to decide when you want to retrieve and log this information.

Figure 15 shows how to get the SqlContext; I put this in right after opening the SqlConnection (cnx) and before executing my stored procedure. For more information about connection management in SQL Azure, see the TechNet article "SQL Azure: Connection Management in SQL Azure".

// Grab sessionId from new connection
using (SqlCommand cmd = cnx.CreateCommand())
{
    cmd.CommandText =
        "SELECT CONVERT(NVARCHAR(36), CONTEXT_INFO())";
    sqlContextID = cmd.ExecuteScalar().ToString();
}

I am certain that Microsoft's SQL Azure team is taking a closer look at the SQL Azure connection failures problem to see how they can alleviate it. SQL Azure is still a pretty young technology, and I'm sure Microsoft will continue to make it better with each successive release.

Go Forth into Azure

Now you have enough information to convert your web application, put diagnostics in your Windows Communication Foundation (WCF) services and .aspx pages, convert your database, and add code to manage the SQL Azure connections. Why are you still reading this? Go, Azure is waiting!

Robin Shahan has over 20 years of experience developing complex, business-critical applications for Fortune 100 companies such as Chevron and AT&T. She is currently the director of engineering for GoldMail, where she recently migrated their entire infrastructure to Microsoft Azure. Robin regularly speaks at various .NET User Group events on Microsoft Azure and her company's migration experience. Robin has bachelor's degrees in both Chemical Engineering and Computer Science from Texas A&M University. She can be found on Twitter as @RobinDotNet, and you can read exciting and riveting articles about ClickOnce deployment and Microsoft Azure on her blog at http://robindotnet.wordpress.com.
