Best Practices for Database Migrations

Data migration, even when dealing with a single database, is rarely as simple as it sounds, especially if the database being migrated supports one or more production applications. In this blog, we will discuss best practices for migrating a database in the following two scenarios:

  • Between different versions of the same database with the same schema.
  • Between different versions of the same database or from one database to another with a different schema.

 

In a later blog, we will discuss the issues around more complex migrations, such as on-premises-to-cloud migrations and data center migrations, but let us start with the more common scenario of migrating a single database.

Steps to a Successful Database Migration

Migrating a database with minimum risk and downtime involves four steps: schema build, initial data load, change data capture, and finally, data validation and repair.

1 – Schema Build

The first step in a database migration project is to create the database tables and define integrity constraints and indexes, as well as any necessary programming code such as stored procedures and database triggers.

For a homegrown application, it may be necessary to perform the schema migration manually. Any programming code the application depends on, such as stored procedures or database triggers, may have to be reviewed when changing database products. During a database upgrade, the programming code will most likely continue to work on the newer database version, but there may be opportunities for code optimization.

If a packaged application is upgraded, then the application vendor should be able to provide the schema definition and any programming code optimized for the new database.
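As a minimal illustration of the schema build step, the following Python sketch uses SQLite as a stand-in for the actual source or target product (the table, index, and trigger names here are hypothetical, not from any real migration): it creates tables with integrity constraints, an index, and a trigger, then verifies the objects exist before data is loaded.

```python
import sqlite3

# Hypothetical target schema: a customers table, a dependent orders table,
# an index, and a trigger -- the kinds of objects a schema build must
# recreate on the new database.
DDL = """
CREATE TABLE customers (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL
);

CREATE TABLE orders (
    id          INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers(id),
    amount      REAL NOT NULL CHECK (amount >= 0),
    updated_at  TEXT
);

CREATE INDEX idx_orders_customer ON orders(customer_id);

-- Trigger code like this often needs review when changing database
-- products, because trigger syntax is vendor specific.
CREATE TRIGGER trg_orders_touch AFTER UPDATE ON orders
BEGIN
    UPDATE orders SET updated_at = datetime('now') WHERE id = NEW.id;
END;
"""

target = sqlite3.connect(":memory:")
target.executescript(DDL)

# Verify the objects exist before moving on to the data load.
objects = {row[1] for row in target.execute("SELECT type, name FROM sqlite_master")}
print(sorted(objects))
```

A real project would of course run vendor-specific DDL, but the shape of the step is the same: create objects first, then confirm the catalog matches expectations.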

2 – Initial Data Load

The second step in the database migration is the initial data load. During the load, disable referential integrity constraints and any database triggers on the target so that tables can be loaded in any order and triggers do not fire against migrated rows. To re-enable integrity constraints afterward, it may be necessary to take an outage on the source system in order to obtain a consistent image of all tables. Alternatively, the initial load may be performed from a consistent backup of the source system, given sufficient hardware resources to host the additional copy.
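The pattern above can be sketched in a few lines of Python, again using SQLite in place of the real products (all table names and data are hypothetical): disable integrity enforcement, bulk-copy tables in arbitrary order, then re-enable enforcement and check for violations.

```python
import sqlite3

# Hypothetical source with existing data.
source = sqlite3.connect(":memory:")
source.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE orders (id INTEGER PRIMARY KEY,
                     customer_id INTEGER NOT NULL REFERENCES customers(id));
INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
INSERT INTO orders VALUES (10, 1), (11, 2);
""")

# Target with the same schema (built in the previous step).
target = sqlite3.connect(":memory:")
target.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE orders (id INTEGER PRIMARY KEY,
                     customer_id INTEGER NOT NULL REFERENCES customers(id));
""")

# 1. Disable referential integrity for the duration of the load, so tables
#    can be copied in any order. (SQLite uses a pragma; other databases
#    have their own mechanisms for disabling constraints.)
target.execute("PRAGMA foreign_keys = OFF")

# 2. Bulk-copy each table -- deliberately out of foreign-key order here.
for table in ("orders", "customers"):
    rows = source.execute(f"SELECT * FROM {table}").fetchall()
    placeholders = ",".join("?" * len(rows[0]))
    target.executemany(f"INSERT INTO {table} VALUES ({placeholders})", rows)
target.commit()

# 3. Re-enable enforcement and confirm the loaded data is consistent.
target.execute("PRAGMA foreign_keys = ON")
violations = target.execute("PRAGMA foreign_key_check").fetchall()
print("FK violations:", violations)   # -> FK violations: []
</n```

The final check matters: if re-enabling constraints surfaces violations, the load was not taken from a consistent image and must be repeated or repaired.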

3 – Change Data Capture

Depending on the database size, available system resources, and, most importantly, the quality of planning, the initial data load may take days to complete. However, when mission-critical application data is being migrated, it may not be possible to take the applications offline for more than a few minutes at a time. To bridge this gap, it may be necessary to perform ongoing change data capture on the source and apply the incremental changes to the target database system.

Ongoing change data capture, with incremental changes applied to the target database, is also a very important way to mitigate application migration risk. Prior to switching the application from the old to the new database, users can perform read-only/reporting tests on the destination system to validate performance and to carry out final application testing. Post-migration, the data flow can be reversed to keep the old system updated until the organization is entirely happy with the new environment.
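One common way to implement change data capture is trigger-based: every change on a source table is recorded in a change table and later replayed on the target. The sketch below illustrates that idea with SQLite (the table names, trigger names, and `apply_changes` helper are all hypothetical; log-based capture, as products like HVR use, reads the database transaction log instead and avoids the trigger overhead).

```python
import sqlite3

# Source database with a capture trigger per operation type.
source = sqlite3.connect(":memory:")
source.executescript("""
CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL);
CREATE TABLE changes (seq INTEGER PRIMARY KEY AUTOINCREMENT,
                      op TEXT, id INTEGER, balance REAL);
CREATE TRIGGER cap_ins AFTER INSERT ON accounts BEGIN
    INSERT INTO changes(op, id, balance) VALUES ('I', NEW.id, NEW.balance);
END;
CREATE TRIGGER cap_upd AFTER UPDATE ON accounts BEGIN
    INSERT INTO changes(op, id, balance) VALUES ('U', NEW.id, NEW.balance);
END;
CREATE TRIGGER cap_del AFTER DELETE ON accounts BEGIN
    INSERT INTO changes(op, id, balance) VALUES ('D', OLD.id, NULL);
END;
""")

target = sqlite3.connect(":memory:")
target.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")

def apply_changes(last_seq):
    """Replay captured changes beyond last_seq; return the new high-water mark."""
    for seq, op, id_, balance in source.execute(
            "SELECT seq, op, id, balance FROM changes WHERE seq > ? ORDER BY seq",
            (last_seq,)):
        if op == 'I':
            target.execute("INSERT INTO accounts VALUES (?, ?)", (id_, balance))
        elif op == 'U':
            target.execute("UPDATE accounts SET balance = ? WHERE id = ?",
                           (balance, id_))
        else:
            target.execute("DELETE FROM accounts WHERE id = ?", (id_,))
        last_seq = seq
    target.commit()
    return last_seq

# The application keeps writing while the migration is under way...
source.execute("INSERT INTO accounts VALUES (1, 100.0)")
source.execute("UPDATE accounts SET balance = 150.0 WHERE id = 1")
hwm = apply_changes(0)   # incremental apply catches the target up
print(target.execute("SELECT * FROM accounts").fetchall())  # -> [(1, 150.0)]
```

The high-water mark (`hwm`) is what makes the apply incremental: each cycle resumes where the previous one stopped, so the target continuously converges on the source until cutover.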

4 – Data Validation and Repair

User-driven random tests may be a good way to ensure the destination system is in sync with the source database prior to the cutover, but nothing beats a computer-driven, exact analysis of data accuracy. Any data anomalies found must be addressed before switching databases.
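A simple form of such an analysis is to compare row counts and content digests per table between source and target, then repair any divergent rows and re-check. The following sketch (SQLite stand-in; the `table_fingerprint` helper and sample data are hypothetical) shows the compare-repair-recheck loop.

```python
import sqlite3
import hashlib

def table_fingerprint(conn, table, key):
    """Row count plus a digest of a table's contents in key order."""
    rows = conn.execute(f"SELECT * FROM {table} ORDER BY {key}").fetchall()
    digest = hashlib.sha256(repr(rows).encode()).hexdigest()
    return len(rows), digest

source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for conn in (source, target):
    conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
    conn.executemany("INSERT INTO items VALUES (?, ?)",
                     [(1, 'bolt'), (2, 'nut')])

# Introduce an anomaly on the target to show the check catching it.
target.execute("UPDATE items SET name = 'n?t' WHERE id = 2")

src_count, src_digest = table_fingerprint(source, "items", "id")
tgt_count, tgt_digest = table_fingerprint(target, "items", "id")
print("counts match:", src_count == tgt_count)      # -> counts match: True
print("contents match:", src_digest == tgt_digest)  # -> contents match: False

# Repair: re-copy the divergent row from the source, then re-check.
target.execute("UPDATE items SET name = 'nut' WHERE id = 2")
_, tgt_digest = table_fingerprint(target, "items", "id")
print("after repair:", src_digest == tgt_digest)    # -> after repair: True
```

Note that matching row counts alone prove nothing about content, which is why the digest comparison is the part that actually catches the anomaly here. For large tables a real tool would compare per-block or per-row checksums rather than hashing whole tables in memory.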

In today’s data-focused business environment, almost every organization faces data migration challenges, and we hope this best-practices blog helps you navigate them successfully.


If you are interested in learning more about how to execute a successful database migration, the following additional resources may interest you:


Database Migration Case Study: Teach for America

How Teach for America used HVR for Database Migration with Minimal Downtime

About: Teach for America is a not-for-profit organization dedicated to ensuring that all children have access to an excellent education. Teach for America relies heavily on its website to recruit prospective teachers, who use the site as a portal to obtain information about the program and to apply to it.

So, when the time came to migrate the Oracle database that supported the website to Microsoft SQL Server, the organization needed to minimize downtime. The IT team relied on HVR to simplify and speed this database migration process.

Challenge:

  • Migrate production database from Oracle to SQL Server with minimal downtime

Databases: 

  • Oracle
  • SQL Server

Best Practice Guide to Database Migrations

Most organizations plan for outages like upgrades and migrations when they can afford downtime. But as downtime windows become shorter, organizations face enormous pressure not only to complete the task quickly but also to ensure the new system is fit for purpose and thoroughly tested.

This guide will help you migrate successfully.

You will learn:

  • How to extend your timeframe for migrations
  • How to reduce or eliminate downtime
  • How to mitigate data loss

Interested in the guide? You may access it via this page.

About Mark

Mark Van de Wiel is the CTO for HVR. He has a strong background in data replication as well as real-time Business Intelligence and analytics.


© 2018 HVR
