A data warehouse that has grown historically into a complex architecture slows down the implementation of new applications. Clients demand new features; the backlog grows. Refactoring projects or completely new developments are based on an analysis of the existing architecture, and a concept for a new solution has to be developed within the overall IT context. In the following project, we accompanied the analysis of an extensive existing data warehouse and conducted its integration from an enterprise architecture perspective.
- Data warehouse was developed over 20 years ago
- More than 100 inbound and more than 80 outbound interfaces
- Around 1000 end users, at both database and report level
- Incomplete and outdated documentation
Incorrect data = incorrect core business?
Our client, active in the telecommunications sector, aimed to standardize the existing system landscape. In the future, various products, such as telephones, internet and television, will be managed via a single system. Different systems (order management, billing, logistics, address management) had to be transferred into a uniform solution. The plan was to transition the existing system landscape into the new operative CRM with interfaces to the network infrastructure.
If the data had been migrated incorrectly, this would have led to complex problems with possible media repercussions, e.g. incorrectly disconnecting active customers, claims for damages, or faulty invoice printing.
The Ventum Migration Approach - a combination of classic and agile methods.
In the context of a migration project, numerous questions had to be clarified prior to the start of development:
- Which data had to be transferred or could be used by the target system?
- If only parts were taken over, which parts?
- Which criteria were used to determine these subsets?
- Was information filtered specifically?
- How was data from multiple source systems synchronized and consolidated?
- How was the old data cleaned up?
- How were existing orders migrated?
Additionally, compliance regulations and audit compliance had to be observed; complete traceability of the migration was indispensable (reconciliation).
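The reconciliation requirement can be sketched as follows. This is a minimal, hypothetical example that compares record counts and per-record checksums between a source extract and the migrated target data; the field names and checksum scheme are assumptions for illustration, not the project's actual implementation:

```python
import hashlib

def row_checksum(row: dict) -> str:
    """Deterministic checksum over the migrated fields of one record."""
    canonical = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def reconcile(source_rows, target_rows, key="customer_id"):
    """Report records missing in the target and records whose content differs."""
    src = {r[key]: row_checksum(r) for r in source_rows}
    tgt = {r[key]: row_checksum(r) for r in target_rows}
    missing = sorted(k for k in src if k not in tgt)
    mismatched = sorted(k for k in src if k in tgt and src[k] != tgt[k])
    return {"source_count": len(src), "target_count": len(tgt),
            "missing": missing, "mismatched": mismatched}

# Invented sample data: customer 2 never arrives in the target system.
source = [{"customer_id": 1, "name": "Meier"}, {"customer_id": 2, "name": "Huber"}]
target = [{"customer_id": 1, "name": "Meier"}]
report = reconcile(source, target)
print(report["missing"])  # prints [2]
```

In a real migration the same comparison would run against database extracts, and the resulting report would serve as the audit trail demanded by compliance.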
Various tools were used in the project, namely SAP Data Services and Talend Open Studio, as well as Oracle PL/SQL packages for customer data consolidation.
During the analysis and conceptualization phase, a set of decisions were made concerning the migration path, data dependencies within the target system, the data cleansing approach and many other aspects. Based on previous knowledge of the existing systems, familiarization with the data models was indispensable.
In addition, measurements of the expected workload and run times were carried out. A realistic timeframe had to be defined for the final migration and integrated into the overall project plan.
Where customer data was consolidated from different systems, complex business rules had to be defined, e.g. to correct different spellings in name fields and to transfer only the most current and correct data.
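A consolidation rule of this kind can be illustrated with a short sketch, loosely modelled on the PL/SQL-based customer consolidation mentioned above. The normalization steps and field names are assumptions for the example:

```python
from datetime import date

def normalize_name(name: str) -> str:
    """Unify common German spelling variants so duplicates can be matched."""
    return (name.strip().lower()
            .replace("ä", "ae").replace("ö", "oe")
            .replace("ü", "ue").replace("ß", "ss"))

def consolidate(records):
    """Keep only the most recently updated record per normalized name."""
    best = {}
    for rec in records:
        key = normalize_name(rec["name"])
        if key not in best or rec["updated"] > best[key]["updated"]:
            best[key] = rec
    return list(best.values())

# Invented duplicates from two source systems: same customer, two spellings.
records = [
    {"name": "Müller", "updated": date(2019, 3, 1), "system": "billing"},
    {"name": "Mueller", "updated": date(2020, 6, 15), "system": "crm"},
]
result = consolidate(records)  # only the newer CRM record survives
```

The real rules were considerably richer (addresses, contract state, manual overrides), but the pattern of "match on a normalized key, then pick the most current record" is the core idea.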
The aim of the project was to achieve the highest possible degree of automation and to build up the migration in an iterative and robust manner.
The advantages of an agile approach became obvious during development. The basic object was continually extended into the final deliverable: a data package for migrating data into the target system. This procedure allowed regression tests and qualitative statements about the upcoming cut-over weekend.
The chosen approach started with customer data, followed by address and device data and further relevant data sources. Finally, pending orders (open work orders) were considered.
After each test migration, the results were evaluated to determine the migration rate. The goal was to reduce the manual rework rate of the call center to a minimum.
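The evaluation after each test run boils down to a simple calculation; the figures below are invented for illustration:

```python
def migration_rate(migrated: int, total: int) -> float:
    """Share of records migrated fully automatically, in percent."""
    return migrated / total * 100

# Hypothetical test-migration result: rejected records need manual rework.
total_accounts = 1_200_000
failed = 360  # records rejected by the target system
rate = migration_rate(total_accounts - failed, total_accounts)
print(f"{rate:.2f}%")  # prints 99.97%
```

Tracking this rate per test migration makes the trend visible and shows whether data cleansing in the departments is actually reducing the manual rework left for the call center.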
Furthermore, data quality problems were transferred to the relevant departments for direct data cleansing and sustainability was measured through further test migrations.
The main factors to be observed during the cut-over are those that affect the migration run time:
- Were the source systems available with sufficient performance?
- Were all relevant contact persons within reach (war rooms)?
- Were all users removed from the source systems in order to guarantee a consistent database state?
During the execution, the performance of individual migration sections was measured and compared with target values from the test migrations in order to adhere to the critical timeframe of the overall project.
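This comparison against target values can be sketched as follows; the section names and target durations are invented, and a real cut-over would read timestamps from the migration log rather than taking them as parameters:

```python
# Hypothetical target run times per migration section, in hours,
# derived from earlier test migrations.
targets = {"customers": 4.0, "addresses": 2.5, "orders": 6.0}

def check_section(name: str, started: float, finished: float) -> bool:
    """Compare a section's elapsed time (epoch seconds) with its target."""
    elapsed_h = (finished - started) / 3600
    within = elapsed_h <= targets[name]
    print(f"{name}: {elapsed_h:.2f} h (target {targets[name]} h) "
          f"{'OK' if within else 'OVER TARGET'}")
    return within

# Simulated timestamps: the customer section finishes after 3.2 hours.
check_section("customers", 0.0, 3.2 * 3600)
```

A section running over target is the signal to escalate in the war room before the overall cut-over window is at risk.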
Following the go-live phase, the migration team received questions from the departments and call centers on individual data records. They required the information quickly, since the inquiries were mainly about customer contact data.
Within the scope of the project, a reporting system was developed for this purpose in order to provide departments with information on data origin and transformation (self-service). Complex analyses were performed by the migration team.
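A self-service lookup of this kind presupposes that the migration recorded, per target record, its source system and the transformations applied. The data structure and field names below are illustrative assumptions:

```python
# Hypothetical lineage store written during the migration runs.
lineage = {
    "CUST-1001": {"source_system": "legacy_billing",
                  "source_key": "B-778",
                  "transformations": ["name_normalized", "address_merged"]},
}

def trace(target_id: str) -> str:
    """Return a one-line provenance summary for a migrated record."""
    info = lineage.get(target_id)
    if info is None:
        return f"{target_id}: no lineage recorded"
    return (f"{target_id} <- {info['source_system']}:{info['source_key']} "
            f"via {', '.join(info['transformations'])}")

print(trace("CUST-1001"))
```

With such a store, the departments can answer most "where does this value come from?" questions themselves, leaving only the genuinely complex analyses to the migration team.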
99.97% migration rate - minimal manual post-processing and faster focus on core business.
Thanks to the high migration rate, manual post-processing was reduced significantly. Following the changeover, the customer's employees could focus on the core business: sales and customer service.
Ventum provided expert knowledge in a variety of enterprise applications and host developments to this project, in order to shorten the analysis phase. By focusing on data quality and the chosen step-by-step migration approach, this subproject was able to make a significant contribution to the success of the project.
- Fully automated data migration of 99.97% of customer accounts within the scheduled timeframe
- Amortization of the entire project within 2 years, simply by saving postage costs
- Improvement of service quality through comprehensive data cleansing
- Approval of the documentation by internal audit