Moving OdeToCode to Azure - The Database

Thursday, January 3, 2013

OdeToCode now runs on Windows Azure.

Once the decision was made to move, the first effort was investigating how to move the OdeToCode database from SQL Server 2008 to an Azure SQL Database. At the moment, Azure SQL databases have most of the same features as regular SQL Server databases, with one notable exception: there is no ability to run BACKUP or RESTORE commands. However, the Microsoft documentation lists at least five different options for moving a database to Azure, and there are a few third-party tools as well.

The first recommended option in the Microsoft documentation is to use the "Generate Scripts" option of SQL Server Management Studio. Since this option only migrates the database schema (no data), I didn't give it a try.

The second option is to use the Microsoft Sync Framework. Since I wasn't interested in setting up a long-term synchronization capability, I didn't give this option a try, either.

The third option is to use SQL Management Studio to "Export a Data-tier Application". This option looked promising because it is similar to a backup and restore operation. The Export operation creates a BACPAC file, which you can place into Azure blob storage to import into a new Azure SQL database. Setup is simple and just requires right-clicking on a database in SQL Server Management Studio and finding the export option under the Tasks menu.
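The same export and import can also be scripted with the SqlPackage utility that ships with the SQL Server Data-tier Application Framework (DacFx). A rough sketch, where the server, database, file, and credential names are all placeholders:

```shell
:: Export the local database to a BACPAC file
SqlPackage.exe /Action:Export ^
  /SourceServerName:localhost ^
  /SourceDatabaseName:OdeToCode ^
  /TargetFile:C:\backups\OdeToCode.bacpac

:: Import the BACPAC into a new Azure SQL database
:: (Azure SQL requires SQL authentication)
SqlPackage.exe /Action:Import ^
  /TargetServerName:myserver.database.windows.net ^
  /TargetDatabaseName:OdeToCode ^
  /TargetUser:myuser /TargetPassword:mypassword ^
  /SourceFile:C:\backups\OdeToCode.bacpac
```

The command-line route hits the same export validation as the SSMS dialog, so it is a convenience rather than a workaround.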

Creating a BACPAC

Unfortunately, I was never able to export my local database because of the following error: "statements in user dbo's definition cannot be deployed". Since dbo is a special user in SQL Server, there are no options to change or recreate the dbo user, or at least none I could discover. The OdeToCode database was originally a SQL Server 2000 database, and perhaps there are some legacy pieces the export routine cannot cope with. I'll also mention that once the database made it to Azure, it was simple to export the Azure database and import it locally.

After failing with BACPAC I tried the fourth option of using SQL Server Integration Services. I should mention that I have a history with SSIS dating back to 2005.  Most of the history is filled with tears and gnashing of teeth, but I was willing to give SSIS another try in 2012. I gave up after 3 failed attempts. SSIS continues to be on my list of things to avoid unless I'm under extreme duress (it goes on the list after the vegetable celery, but before Philadelphia International Airport).


Just moments away from writing a one-time custom import/export, I tried the fifth and final option of downloading the "SQL Database Migration Wizard". Be careful with the big, purple "Download" link on CodePlex, because the default download is (currently) v3.9, which requires SQL Server 2008 bits. I had to hunt around in the downloads area to find v4, which works when you only have SQL Server 2012 bits installed.

The SQL Database Migration Wizard isn't the prettiest application in the world, but it has three endearing qualities:

1) It works

2) It's simple

3) It works

SQL Database Migration Wizard

The tool generates all the SQL scripts needed to recreate a database, and uses the SQL Server bulk copy (bcp) utility to export and import data (creating one .bcp file per table). The end result was a working Azure SQL database ready to use.
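The per-table work the wizard automates resembles plain bcp invocations like these. A sketch only, with placeholder table, file, server, and credential names:

```shell
:: Export one table from the local database in native format (-n)
bcp OdeToCode.dbo.Posts out Posts.bcp -n -S localhost -T

:: Import the same table into the Azure SQL database
:: (Azure SQL requires SQL authentication, so -U and -P replace -T)
bcp OdeToCode.dbo.Posts in Posts.bcp -n -S myserver.database.windows.net -U myuser -P mypassword
```

The native format (-n) avoids the character-encoding and delimiter headaches of plain text exports, which is one reason the wizard's approach is so dependable.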


BACPAC might be the path for the future, but BACPAC exports only seem to work with fresh, new databases. The SQL Database Migration Wizard on CodePlex is a solid tool that works with new databases as well as legacy, quirky databases.

Three cheers for community tools!

Robert Royall Thursday, January 3, 2013
If you're using SSMS 2008 or 2008 R2, there should be an option you can pick in Generate Scripts that allows you to export your data. When you get to the Set Scripting Options screen, there's a button marked Advanced - click on that and about halfway down the list in the dialog that pops up is an option called "types of data to script" that lets you pick to generate a script with just schema, just data, or both combined. I use this at work to backport databases created in 2008 R2 to one of our servers that still has 2005 installed on it.
Mike Lawton Thursday, January 3, 2013
The first option to Generate Scripts actually can do data, but it's set to schema by default and Microsoft has not-so-cleverly hidden the option in the advanced area. Click the advanced button and the very last option under "General" allows you to switch from "Schema only" to "Data only" or what you would have needed, "Schema and Data". Such beautiful functionality hidden off the side of the map where there be dragons.
erwinroots Thursday, January 3, 2013
I share your pain with Integration Services.
James Barrow Thursday, January 3, 2013
Another issue with BACPAC is that you need clustered indexes on all tables to be able to export. While it's a good thing to have, you might inherit a legacy database, and it could be painful to have to add them. Slightly off topic, I also recommend trying out SQL Server Data Tools (SSDT) for development of the schema, since you can target Azure and pick up issues at build time, which is useful. Hopefully the tooling will support static data and deployment of said static data automatically, instead of only offering a single post-deploy script that you need to maintain (we import different MERGE scripts for static data tables).
friism Thursday, January 3, 2013
AppHarbor maintains a console tool that uses the SQL Server bulk copy API. You have to move the schema manually, though.
Craig Friday, January 4, 2013
Something you could try (I know you have already done it, but for other readers' sake): import the database into a Database Project in VS2012. It then has an option to synchronise with Azure. This is how I manage changes in my database, which runs on Azure in production.