What is the best data migration strategy?

The most basic definition of data migration is moving data from one location to another. Data migration can take many forms, but it generally occurs during the upgrade or update of a system.

On the surface, data migration appears quite simple, but in practice it is a challenging process with plenty of potential pitfalls for those involved. To be successful and avoid those pitfalls, a migration must be handled carefully and with adequate planning. The right data migration strategy must also be matched to the right project, to give it the best possible chance of success.

There are two main data migration strategies, the big bang approach and the trickle approach. In this blog, we are going to discuss each of these approaches in more detail and examine which one is the “best” for particular projects and teams.

Let’s start by defining each one in simple terms.

The two data migration strategies 

The big bang approach 

The first strategy is the big bang. In the big bang approach, all necessary data is migrated from the source to the target destination in a single, short timeframe. During this window, both the existing and the new systems are unavailable to users, a period known as “downtime”. It is a “big bang” in the sense that everything happens at once, in one all-encompassing operation.

There are four stages to the big bang approach:

The design stage: The plan for the migration project is created and designed, including the scope, budget, and proposed timeframe. Risks should be evaluated here too, along with proposed solutions for managing them.

The development and testing stage: No data is migrated yet, but the tools for carrying out the migration are tested and the architecture is implemented. This helps ensure that the next stage goes as smoothly as possible, without loss, error, or corruption.

The “big bang”: The third stage is when migration actually begins. In the big bang approach, this means shutting down the involved systems and moving all of the data at once, however long that takes.

User acceptance testing: Testing is essential in data migration. Plenty can go wrong during the process, such as semantic issues where a field means different things in the two systems. It is imperative that data is tested thoroughly before the new system is rolled out to end users.
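The “big bang” stage above can be sketched in a few lines of code. This is a minimal, hypothetical illustration using in-memory dictionaries as stand-ins for the source and target systems; the function and field names are assumptions, not a real migration tool.

```python
# A minimal sketch of a big-bang migration (hypothetical example).
# Both "systems" are plain dicts, standing in for real data stores.

def migrate_big_bang(source, target, transform):
    """Move every record from source to target in one pass.

    Both systems are assumed to be offline (the "downtime" window),
    so no new writes can arrive mid-migration.
    """
    migrated = {}
    for key, record in source.items():
        migrated[key] = transform(record)  # e.g. rename fields, fix types
    # Commit only once every record has transformed successfully, so a
    # failure part-way through leaves the target untouched.
    target.update(migrated)
    return len(migrated)

# Usage: migrate legacy records whose field names changed in the new schema.
legacy = {1: {"full_name": "Ada"}, 2: {"full_name": "Grace"}}
new_system = {}
count = migrate_big_bang(legacy, new_system,
                         lambda r: {"name": r["full_name"]})
```

The all-or-nothing commit is the defining trait: it keeps things simple, but it is also why one error mid-run can put the whole migration at risk.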

The big bang approach described above has both positives and negatives. On the plus side, it is generally considered more cost-effective, as only one system is dealt with at a time and all of the migration work happens within a single block of time. It is also a less complicated approach to migration, so more members of staff are able to work on it. Finally, it is time-limited, so it is easier to give customers and clients a proposed completion date.

On the flip side, the big bang approach also has its cons. Downtime is not ideal for end users, and it can run up costs through lost revenue. Also, because everything happens at once in one connected process, a single error can cascade into a much bigger failure. If data were lost in a big bang operation, the loss could be catastrophic.

The trickle approach 

The second data migration strategy we are going to discuss is that of the trickle approach. 

The trickle approach to data migration is also known as phased or iterative migration. The trickle approach involves running both new and old systems concurrently instead of moving data in one continuous step. Due to the slow but steady movement of data, both systems remain up and running without any interruptions to customers.

This approach to data migration is often compared to an Agile approach to software development, as it is structured around smaller goals and deadlines. Rather than one big push, this kind of migration can take a long time as these small goals are met one by one.

The downsides to this approach centre on its complexity. The migration takes longer, you need to run multiple live environments and systems at once, and there can be syncing and porting issues when lining everything up. It requires an expert team with the time and budget to complete something of this scope.

On the other hand, the positives lie in the fact that no downtime is required for the involved systems. Everything can stay live as the data is gradually migrated from one system to the other. The incremental approach also means that small bugs and errors can be dealt with without the rest of the data suffering; if there is a problem, it is cordoned off in its own section. This makes big errors and losses far less likely.
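The incremental movement described above can be sketched as a batch loop. This is a hypothetical illustration, not a real tool: records move in small batches while both systems stay live, and a set of already-migrated IDs tracks progress so each phase picks up where the last one stopped.

```python
# A minimal sketch of a trickle (phased) migration (hypothetical example).
# Both "systems" are plain dicts; migrated_ids records what is already synced.

def migrate_batch(source, target, migrated_ids, batch_size=2):
    """Copy up to batch_size not-yet-migrated records, then return."""
    moved = 0
    for key in sorted(source):
        if key in migrated_ids:
            continue                      # already synced in an earlier phase
        target[key] = dict(source[key])   # copy, so the old system stays live
        migrated_ids.add(key)
        moved += 1
        if moved == batch_size:
            break
    return moved

old_system = {i: {"value": i * 10} for i in range(5)}
new_system, done = {}, set()
phases = 0
while migrate_batch(old_system, new_system, done):
    phases += 1                           # each loop iteration is one "phase"
```

Because each phase only touches a small slice of the data, a failure in one batch can be retried or rolled back without disturbing what has already been migrated, which is exactly the error-containment benefit described above. The cost is the bookkeeping: in a real system, that `done` set becomes sync state that must itself be kept consistent.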

The bottom line? 

Each approach to data migration has its strengths and weaknesses. The most important thing will be picking the strategy that best suits the scope, purpose, and structure of your own project, as each one works better for certain scenarios. There is really no “best” strategy, only the strategy that works best for you.

When in doubt, ask tech experts for their opinion.

Ready to accelerate your technology project?

Chat to our team of experts and let's see how we can help you.