A Path To Efficient Data Migration in Core Banking
Figure 1: High-level migration process flow — Project Initiation, Requirement Analysis, Source & Target Profiling, Data Cleansing, Configure Migration Tool, Trial Data Load, and Load in Target System, governed by a Change Control Board and supported by training.
• Identify stakeholders/teams: The next step is to distinguish key partners and team members. It is important that the relevant stakeholder, functional and technical teams are included at an early stage to eliminate the risk of gaps in the target system. Also, there are numerous aspects that require business sign-offs and commitments. A RACI matrix can help delegate the roles and responsibilities at an early stage of the project, thus setting expectations for all the stakeholders.

• Strategy planning: Early-stage planning helps to identify the potential issues/risks that may occur later in the project, enabling banks to plan for risk mitigation. One of the important aspects that needs to be addressed and planned up-front is the rollout strategy. As Figure 2 shows, there are three basic strategies to consider for a rollout plan:

>> Big Bang: Migration is done in one single operation, usually undertaken over a weekend. This is preferred for low data volumes.

>> Phased: Data is moved to the target system in a phased manner. For new customers, records are created directly in the target system.

>> Parallel run: Transactions are posted on both the source and target systems until the migration is executed fully. Reconciliation is done at the end of each day until all the data is migrated.

Environmental factors influence the choice of migration. The source and target systems determine the conversion methodology and the data migration tools to be used. The extract, transform, load (ETL) tool tends to be preferred over other technologies for its ability to handle large and complex volumes of data.

Release planning is critical to determine when a given functionality will be delivered, so that a sample ETL iteration can be planned before a major release to ascertain any potential impact of the migration.
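The extract-transform-load flow can be sketched as a minimal pipeline. This is an illustrative sketch only: the legacy field names, the DDMMYYYY date format, and the transformations are hypothetical, not those of any particular core banking product.

```python
# Minimal ETL sketch for a migration run (hypothetical schema).
# Extract rows from the legacy source, transform field formats to
# match the target layout, then load them into the target store.

from datetime import datetime

def extract(source_rows):
    """Pull raw customer records from the legacy system."""
    for row in source_rows:
        yield row

def transform(row):
    """Map legacy fields to the target layout and normalize types."""
    return {
        "customer_id": int(row["CUST_NO"]),
        "name": row["CUST_NAME"].strip().title(),
        # Legacy stores dates as DDMMYYYY strings; target expects ISO.
        "opened": datetime.strptime(row["OPEN_DT"], "%d%m%Y").date().isoformat(),
    }

def load(rows, target):
    """Append transformed rows to the target store."""
    target.extend(rows)

target_table = []
legacy = [{"CUST_NO": "1001", "CUST_NAME": "  jane doe ", "OPEN_DT": "05011999"}]
load((transform(r) for r in extract(legacy)), target_table)
# target_table now holds one cleaned, retyped record
```

In a real engagement each stage would read from and write to the bank's actual systems; the value of the ETL structure is that each transformation rule is isolated and testable before the trial data load.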
BIG BANG: The entire data is migrated in one go. Suitable when the volume of data is low, such as in the case of small banks.

PHASED APPROACH: The data is moved in a phased manner. Suitable when the volume of data is large, such as in the case of large banks.

PARALLEL RUN: The data is migrated in parallel along with business operations, module wise. Suitable when the volume of data is large, such as in the case of large banks.

Figure 2
Rollback strategies should be planned to regain the original state of the system in situations where migration has been inadequate. The decision to roll back should be taken before the target system goes live. A detailed rollback plan consists of criteria for rollback, steps to roll back the target and source systems to their prior state, and testing of the system in its rolled-back state.

• Change management board: The change management board is responsible for initiatives to establish an effective change management process. The board must vet all the changes required and schedule them for implementation based on the analysis.

• Project schedule and initiation: Once the size of the task is understood, a proper project governance structure needs to be implemented. It has been observed that 85% of the migrations fail or experience delays.2 An optimal project delivery structure helps plan for sufficient contingency. The structure must include project timelines, deliverables and milestones.

Detailed Analysis
The analysis phase is probably the trickiest part of the process. The data can be analyzed well only if it is understood well. Migration should be driven based on the target system and not the source system. This phase is a good opportunity to get rid of redundant/unwanted data. The following activities should be performed to ensure the sanctity of the data:

• Data profiling (source & target): Data profiling helps automate the identification of data and metadata while enabling the correction of inconsistencies, redundancies and inaccuracies. Source and target data are profiled to discover data structure and relationships.

• Product listing: Product listing is a process in which each corresponding product in the new system is mapped to the existing product in the legacy system. In some cases, the products are rationalized to fit the supported products in the new target system.

• Data mapping: The data conversion process begins with data mapping, which essentially entails mapping the legacy system data elements to those of the target system. This process should ensure a comprehensive mapping between the source and the target systems. All the data fields that are going to be migrated must be examined in terms of the data types, field length, system-specific rules and integrity checks. Data mapping is an iterative process, and for every change in the design or rule of the system, changes should be incorporated in the mapping specification.
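A mapping specification of the kind described above can be captured as a simple table of source field, target field, expected type and maximum length, which a validation pass then checks record by record. This is a minimal sketch; all field names and limits are hypothetical.

```python
# Sketch of a data-mapping specification (all field names hypothetical).
# Each entry records the legacy field, its target counterpart, the
# expected type and the maximum length, so mismatches surface before load.

MAPPING_SPEC = [
    # (source_field, target_field, type, max_length)
    ("CUST_NO",   "customer_id", int, None),
    ("CUST_NAME", "full_name",   str, 60),
    ("BR_CODE",   "branch_code", str, 4),
]

def validate(record):
    """Return a list of mapping violations for one legacy record."""
    errors = []
    for src, tgt, typ, max_len in MAPPING_SPEC:
        value = record.get(src)
        if value is None:
            errors.append(f"{src}: missing")
            continue
        if not isinstance(value, typ):
            errors.append(f"{src}: expected {typ.__name__}")
        elif max_len is not None and len(value) > max_len:
            errors.append(f"{src}: exceeds {max_len} chars")
    return errors

print(validate({"CUST_NO": 1001, "CUST_NAME": "Jane Doe", "BR_CODE": "LONDON"}))
# "LONDON" is six characters, over the 4-char branch-code limit
```

Keeping the specification as data rather than code makes the iterative updates the text describes cheap: a design or rule change in the target system becomes one edited row in the table.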
For effective migration of different portfolios, balances should be netted: Figure 3 shows Savings Account Type 1 and Savings Account Type 2 combined into a single netted savings balance.

Figure 3
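The netting shown in Figure 3 amounts to grouping accounts by customer and summing their balances; the customer identifiers and amounts below are illustrative.

```python
# Netting balances across savings account types before migration,
# as in Figure 3 (account identifiers and amounts are illustrative).

from collections import defaultdict

def net_balances(accounts):
    """Sum balances per customer across all savings account types."""
    netted = defaultdict(float)
    for acct in accounts:
        netted[acct["customer"]] += acct["balance"]
    return dict(netted)

portfolio = [
    {"customer": "C1", "type": "Savings Type 1", "balance": 1500.0},
    {"customer": "C1", "type": "Savings Type 2", "balance": -200.0},
]
print(net_balances(portfolio))  # {'C1': 1300.0}
```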
Figure 4
>> Unit testing: This includes verifying the scope, data mapping, target system requirements and migration tool used. Also, each unit will be tested as part of the functional end-to-end strategy.

>> Post-migration testing: This is done once the migration is executed. It includes testing the throughput of the migration process and reconciliation.

>> UAT: The functional test on migrated data in the target system is validated as per the requirement specifications.

Migrate
This is the final stage of migration into the target system. A cutover period is defined for the entire migration process to be executed. This is usually the period between the shutting down of the legacy system and the new banking system going live. Data in the legacy system is frozen and then extracted during this period. The process outlined below is followed for successful migration.

Reconcile & Go-Live
This is the post-migration stage where the target system is ready for use.

During the cutover period, the source system has to be brought to a logical accounting stage. During the migration process, no new transactions must be entered into the legacy system, and data must be frozen. Once the data migration is executed, reconciliation checks need to be performed to ensure there are no mismatches before going live. Once the target system goes live, it has to be monitored to gauge success and note any improvements required.

Migration is usually executed over a weekend, and all the stakeholders are informed of the planned downtime. The channels interfaced to the system will not be functional during this period. If the bank is required to support any immediate transactions, they can be carried out in an offline mode by creating records that are fed into the system later.
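The pre-go-live reconciliation checks can be sketched as a comparison of record counts and summed balances between the source extract and the target load; the records and rounding tolerance here are illustrative.

```python
# Post-migration reconciliation sketch: compare record counts and
# summed balances between source and target before go-live.
# The data sets and tolerance are illustrative.

def reconcile(source, target):
    """Report count and balance mismatches between the two systems."""
    issues = []
    if len(source) != len(target):
        issues.append(f"count mismatch: {len(source)} vs {len(target)}")
    src_total = sum(r["balance"] for r in source)
    tgt_total = sum(r["balance"] for r in target)
    if abs(src_total - tgt_total) > 0.005:  # tolerate rounding to the cent
        issues.append(f"balance mismatch: {src_total} vs {tgt_total}")
    return issues

src = [{"id": 1, "balance": 100.0}, {"id": 2, "balance": 250.5}]
tgt = [{"id": 1, "balance": 100.0}, {"id": 2, "balance": 250.0}]
print(reconcile(src, tgt))  # the 0.5 shortfall in the target is flagged
```

In practice such checks run at the end of each day of a parallel run and once more at cutover, and an empty issue list is a go-live precondition.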
Footnotes
1 assets1.csc.com/big_data/downloads/DMI_For_SAP_Banking_Flyer.pdf
2 www.dmnews.com/dataanalytics/data-migrations-challenges/article/412805/
Ravishankar Natarajan is a Product Consultant within Cognizant’s Banking and Financial Services’ Product
Solutions Practice. He has more than 12 years of experience as a consultant implementing various core
banking products. Ravi’s areas of expertise include retail banking, private banking, credit risk and data
migration. He can be reached at [email protected].
Senil Abraham is a Business Analyst within Cognizant’s Banking and Financial Services’ Product Solutions
Practice. He has more than three years of experience in IT engagements, mainly in the retail banking and
insurance domains. Senil can be reached at [email protected].
Rajdeep Bhaduri is a Lead Product Consultant within Cognizant’s Banking and Financial Services’
Product Solutions Practice. He has more than 17 years of experience in leading business and IT
engagements, mainly in the private banking and capital markets domains. He can be reached at
[email protected].
About Cognizant
Cognizant (NASDAQ: CTSH) is a leading provider of information technology, consulting, and business
process services, dedicated to helping the world’s leading companies build stronger businesses.
Headquartered in Teaneck, New Jersey (U.S.), Cognizant combines a passion for client satisfaction,
technology innovation, deep industry and business process expertise, and a global, collaborative work-
force that embodies the future of work. With over 100 development and delivery centers worldwide and
approximately 255,800 employees as of September 30, 2016, Cognizant is a member of the
NASDAQ-100, the S&P 500, the Forbes Global 2000, and the Fortune 500 and is ranked among the top
performing and fastest growing companies in the world. Visit us online at www.cognizant.com or follow
us on Twitter: Cognizant.
© Copyright 2016, Cognizant. All rights reserved. No part of this document may be reproduced, stored in a retrieval system, transmitted in any form or by any
means, electronic, mechanical, photocopying, recording, or otherwise, without the express written permission from Cognizant. The information contained herein is
subject to change without notice. All other trademarks mentioned herein are the property of their respective owners.