Industry Expertise: How to Achieve Zero-Downtime Data Migration

Every telecommunications company will, at some point, need to consolidate inventory systems or replace a legacy inventory management solution. One common reason is to address new technology challenges such as 5G rollouts or hybrid, VNF-based networks. Virtual network functions, 5G, FTTx, and flex-grid optical networks all have requirements that many current management systems cannot handle and demand more advanced methods than most older systems can provide.

A strategic goal of many operators is to accommodate new, comprehensive planning and network automation use cases, which may not be possible with existing tools. Manual documentation, such as Excel sheets, AutoCAD drawings, and cable plans, is still in use but is incompatible with the increasing dynamics of modern networks and the end goal of network automation. Eliminating fragmented system landscapes that have accumulated over time is another frequently cited reason. Such fragmented solutions often work in silos, combining proprietary databases with dedicated inventory management tools for the data center, the IP network, and the DWDM network. Running multiple parallel systems like this is a drag on efficiency.

Whatever the trigger, the data within the inventory management system is an asset that must be handled with care. This data is the single source of truth for the network and supports critical business functions such as asset management, capacity management, equipment and service planning and rollouts, impact analysis, alarm enrichment, and many others. Important decisions are made based on this data, so the availability of the system and the accuracy of the data are of the utmost importance. A smooth, zero-downtime data migration and an efficient go-live of the new system are essential to prevent operational damage and business impacts.

The Delta Migration Methodology

The preferred process uses a continuous delta migration methodology. In this approach, the new system runs concurrently with the existing system, and only the delta between the existing and new databases is migrated in each migration run. This eliminates the need for downtime, as the two databases are kept synchronized.

The process involves comparing the full data set between the source and destination platforms but creating and changing only the data that is new or different. Because the read-and-compare operation is much faster than the create operation, data transfers and migration cycles become smaller and faster, which enables the continual adaptation of migration rules.
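As a rough illustration, the core of such a migration pass is a read-compare-write loop. The following minimal Java sketch shows the idea; the Entity record, the snapshot maps, and the TargetWriter interface are illustrative assumptions rather than any specific product's API.

```java
import java.util.Map;
import java.util.Objects;

// A minimal sketch of one delta-migration pass, assuming hypothetical
// snapshots of both systems keyed by a stable entity identifier.
public final class DeltaMigrationPass {

    /** A simplified network entity: a stable key plus its attributes. */
    public record Entity(String key, Map<String, String> attributes) {}

    /**
     * Reads full snapshots from both systems, then creates or updates
     * only the entities that are missing or different in the target.
     */
    public static void run(Map<String, Entity> source, Map<String, Entity> target,
                           TargetWriter writer) {
        for (Entity src : source.values()) {
            Entity dst = target.get(src.key());
            if (dst == null) {
                writer.create(src);                 // missing in target: create
            } else if (!Objects.equals(src.attributes(), dst.attributes())) {
                writer.update(src);                 // present but different: update
            }                                       // identical: skip (the fast path)
        }
    }

    /** Hypothetical sink for the (slow) write operations on the new system. */
    public interface TargetWriter {
        void create(Entity entity);
        void update(Entity entity);
    }
}
```

The fast path is the skip: identical entities cost only a comparison, which is why repeated runs stay cheap compared to the initial load.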

Since both systems run in parallel and the data is continuously aligned, the ongoing delta migration process does not require any downtime. When the migration quality reaches the required level, the old system can simply be switched off, and users can continue working with the data in the new system. A zero-downtime migration will have been achieved.

Keep in mind that this can only be realized with a framework that uploads data to the new application and aligns it. The framework should encompass interfaces to data sources such as NMS, EMS, managers, BSS/OSS, or any other database. This framework is a software concept that governs the entire migration process, encompassing the upload, transformation, and alignment of any kind of entity, attribute, or relation. The alignment process can be run on a predefined schedule or on demand. The framework should also log the results of the process and inform users of successful data uploads, data clashes, or any other errors that may need to be handled by a planner or operator.
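As a sketch of how the scheduling and reporting shell around such an alignment run could look, assuming a hypothetical AlignmentJob that reports what it created, updated, and flagged as clashing:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.logging.Logger;

// Sketch of the scheduling and logging shell around an alignment run.
// AlignmentJob and its Result record are hypothetical placeholders.
public final class AlignmentScheduler {

    private static final Logger LOG = Logger.getLogger(AlignmentScheduler.class.getName());
    private final ScheduledExecutorService executor = Executors.newSingleThreadScheduledExecutor();
    private final AlignmentJob job;

    public AlignmentScheduler(AlignmentJob job) {
        this.job = job;
    }

    /** Runs the alignment on a predefined schedule, e.g. every few hours. */
    public void scheduleEvery(long interval, TimeUnit unit) {
        executor.scheduleAtFixedRate(this::runOnce, 0, interval, unit);
    }

    /** Runs the alignment once, on demand, and logs the outcome. */
    public void runOnce() {
        AlignmentJob.Result result = job.align();
        LOG.info(() -> String.format("Alignment finished: %d created, %d updated, %d clashes",
                result.created(), result.updated(), result.clashes()));
        if (result.clashes() > 0) {
            // Data clashes need a decision by a planner or operator.
            LOG.warning("Data clashes detected; manual review required.");
        }
    }

    /** Hypothetical alignment job reporting what it did. */
    public interface AlignmentJob {
        record Result(int created, int updated, int clashes) {}
        Result align();
    }
}
```

Running the job through a single-threaded scheduler keeps alignment runs from overlapping, while the on-demand runOnce() entry point covers ad hoc runs between scheduled ones.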

Besides alignment rules, which can be defined in a graphical ETL tool or written as Java code, the framework should provide configuration options, such as mapping tables to map source data to the new system, or blacklists and whitelists to include or exclude entities from the migration.
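A minimal Java sketch of these configuration options might look as follows; the mapping entries, entity types, and list contents are invented purely for illustration:

```java
import java.util.Map;
import java.util.Set;

// Sketch of the configuration side: a mapping table that translates
// source vocabulary to the target model, plus white- and blacklists
// that decide which entities take part in the migration. All names
// are illustrative, not part of any real product API.
public final class MigrationConfig {

    /** Maps source vocabulary (e.g. card types) to the target model. */
    private final Map<String, String> mappingTable = Map.of(
            "CARD_10GE", "Card-10G-Ethernet",
            "CARD_OTU4", "Card-OTU4-Line");

    /** Only entity types on the whitelist are migrated ... */
    private final Set<String> whitelist = Set.of("Device", "Card", "Port");

    /** ... unless the individual entity is explicitly blacklisted. */
    private final Set<String> blacklist = Set.of("LAB-TEST-NODE-01");

    public String mapType(String sourceType) {
        // Fall back to the source value when no mapping is defined,
        // so unmapped data surfaces in the migration report.
        return mappingTable.getOrDefault(sourceType, sourceType);
    }

    public boolean shouldMigrate(String entityType, String entityName) {
        return whitelist.contains(entityType) && !blacklist.contains(entityName);
    }
}
```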

The most important feature of such a framework is the calculation of deltas between the source systems and the new target system. This delta calculation speeds up the alignment process dramatically, as only missing entities must be created in the target system. The accelerated procedure makes it possible to run the alignment process several times a day. For each run, migration rules and mapping tables can be adjusted according to the latest results. The process can be repeated until the required migration quality has been achieved.
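Put together, this yields an iterate-until-converged workflow, sketched below in Java. The quality metric (the share of entities aligned without clashes or errors) and the threshold used here are illustrative assumptions, not prescribed values.

```java
// Sketch of the convergence loop: keep running the fast delta alignment
// and tuning the rules until the data quality target is met. The metric
// and the 99.5% threshold are illustrative assumptions.
public final class MigrationLoop {

    public interface AlignmentRun {
        /** Performs one delta alignment and returns the achieved quality in [0, 1]. */
        double execute();
    }

    public static void runUntilConverged(AlignmentRun run, double requiredQuality) {
        double quality;
        do {
            quality = run.execute();
            System.out.printf("Alignment run finished, quality = %.4f%n", quality);
            // Between runs, planners adjust migration rules and mapping
            // tables based on the logged clashes and errors.
        } while (quality < requiredQuality);
    }

    public static void main(String[] args) {
        // Stub run whose quality improves each iteration (purely illustrative).
        double[] q = {0.90};
        runUntilConverged(() -> q[0] = Math.min(1.0, q[0] + 0.04), 0.995);
    }
}
```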

Bernd Pruessing, Senior Solutions Consultant at FNT Software, discussed how to successfully execute a zero-downtime data migration in Pipeline Magazine. Read his full article here.


We are always looking for ways to improve. If you need more information or would like to get in touch with our experts, contact us here.