Tuesday, March 12, 2013

Could External Quality Assurance Hamper Your Chance of Data Migration Success?


I’m not sure of the exact rate for failures in data migration projects. Along the way I’ve seen Gartner report that somewhere around 83% of migrations either fail or overrun their budgets and schedules, and if memory serves, Forrester has put the success rate at around 16%. The exact number probably depends upon who is doing the reporting, whom they survey and how candid the responses are. Whatever the case, the number is big, scarily big.

To my way of thinking, any area of a major project where the weight of historical evidence suggests that somewhere between 8 and 9 of every 10 attempts will be significantly challenged should be subject to two things:

  •  External quality assurance processes, in an attempt to make sure that the chance of success isn’t derailed by things not being done as they should be. Another voice of experience, or another set of eyes, if you will; and

  •  Some form of contribution to the wider data migration community of practice, to help us understand where things go wrong and, over time (as a collective drawing on the positive and negative experience of many projects), to evolve the methodologies used to undertake data migrations and lift the success rate.


Unfortunately, at least in my experience, the two items often work at cross-purposes. All too often I’ve seen the first endeavour block or even derail the second. Quality assurance efforts are often established as a risk mitigation exercise, and that same aversion to risk often results in a lack of comfort and confidence in anything which can’t be shown to have been done many times before. An established methodology is preferred over anything that might be construed as cutting or bleeding edge. That’s all well and good but, chances are, if you are following an established approach then that approach has been followed by a fair number of those 83% of projects that failed (to some degree) before yours.

This resistance to any attempt to stray from the well-worn path hinders the adoption and evolution of new concepts and, in so doing, prevents them from gaining wider acceptance, development and enhancement over time by the wider crowd of data migration practitioners.

So we, as those practitioners, have two choices. We can accept that we can do little to change accepted practice, keep our heads down, collect our pay cheques and hope that luck or our best efforts place us in the lucky 17%, or we can look for ways not only to increase the chances of success for our own projects but also to lift the longer-term success rate of data migration projects in general. If we do want change then we must also recognise that radical shifts in methodology just won’t be possible; governance and quality assurance processes simply won’t allow that. Instead I think we must look for chances to use new techniques to build upon more accepted methodologies, filling the gaps or shoring up the areas that cause the biggest problems in our current projects. This could take any number of forms, from using lead indicators alongside lag indicators to gradually build confidence across a project, to the gradual introduction of new and improved approaches to the techniques and timing of reconciliation.
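To make the lead indicator idea a little more concrete, here is a minimal sketch, in Python, of the kind of reconciliation check that could be run after each migration batch rather than only at the end. The table name, columns and use of sqlite3 are purely illustrative assumptions on my part, not a prescription for how any particular project or toolset should do it.

```python
# Illustrative sketch only: a "lead indicator" reconciliation check run early
# and often during a migration, rather than a single "lag" check at the end.
# Table and column names are hypothetical; sqlite3 in-memory databases stand in
# for the real source and target systems so the example is self-contained.
import sqlite3


def row_count(conn, table):
    """Count rows in a table: the simplest reconciliation measure."""
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]


def column_total(conn, table, column):
    """Sum a numeric column so value-level drift is visible, not just row loss."""
    return conn.execute(f"SELECT COALESCE(SUM({column}), 0) FROM {table}").fetchone()[0]


def reconcile(source, target, table, numeric_columns):
    """Compare source and target and return a list of human-readable discrepancies."""
    issues = []
    src_rows, tgt_rows = row_count(source, table), row_count(target, table)
    if src_rows != tgt_rows:
        issues.append(f"{table}: row count {src_rows} (source) vs {tgt_rows} (target)")
    for col in numeric_columns:
        src_sum, tgt_sum = column_total(source, table, col), column_total(target, table, col)
        if src_sum != tgt_sum:
            issues.append(f"{table}.{col}: sum {src_sum} (source) vs {tgt_sum} (target)")
    return issues


if __name__ == "__main__":
    # Hypothetical demo data standing in for the legacy and target systems.
    source = sqlite3.connect(":memory:")
    target = sqlite3.connect(":memory:")
    for conn in (source, target):
        conn.execute("CREATE TABLE accounts (id INTEGER, balance REAL)")
    source.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100.0), (2, 250.5)])
    target.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100.0)])  # one row "lost"

    for issue in reconcile(source, target, "accounts", ["balance"]):
        print("DISCREPANCY:", issue)
```

The point of a check like this isn’t the code itself, it’s the timing: run after every batch it gives an early signal that something is drifting, instead of a single pass/fail verdict delivered once the migration is already over.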

Whatever, and however, we may go about this, I hope that as a community of practitioners we can slowly build acceptance for new techniques, new methodologies and new measurement paradigms and, over time, shift what is deemed to be acceptable and common practice. Who knows, maybe sometime before the end of my career we may actually see a failure rate that doesn’t send cold shivers down the collective spines of project managers everywhere.
