A historically grown data warehouse with a complex architecture slows down the implementation of new applications. Clients demand new features, and the backlog keeps growing. Refactoring projects or completely new developments must be based on an analysis of the existing architecture, and a concept for a new solution must be developed within the overall IT context. In the following project, we accompanied the analysis of an extensive existing data warehouse and supported its integration from an enterprise architecture perspective.
- Data warehouse was developed over 20 years ago
- More than 100 inbound and more than 80 outbound interfaces
- Around 1,000 end users at both database and report level
- Incomplete and outdated documentation
Incorrect data = incorrect core business?
Our customer in the telecommunications sector wants to standardize its existing system landscape. In the future, various products such as telephony, Internet, and television will be managed in one system. Other systems (order management, billing, logistics, address management) have to be transferred into a standardized solution. The plan is to convert the existing system landscape to the new operative CRM with interfaces to the network infrastructure.
If data is migrated incorrectly, this can lead to complex problems with serious consequences, for example: erroneous deactivation of active customers, claims for damages, or incorrect invoice printing.
The Ventum migration approach - a combination of classic and agile methods.
As part of a migration project, a number of questions need to be clarified in advance before development can begin:
- Which data should be transferred to the target system?
- If only a subset of the data is transferred, which one?
- What criteria are used to define these subsets?
- Is any information filtered out?
- How are data from multiple source systems synchronized and consolidated?
- How should the legacy data be cleansed?
- How should existing orders be migrated?
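The subset questions above can be made concrete by expressing selection criteria as named filter predicates. The following sketch is purely illustrative; the field names and the criteria themselves are assumptions, not the project's actual rules:

```python
# Illustrative sketch: subset selection criteria as named filter predicates.
# The record fields ("status", "end_date") and the criteria are assumptions.
from datetime import date

CRITERIA = {
    # Hypothetical rule: migrate only active customers ...
    "active_customer": lambda r: r["status"] == "active",
    # ... and contracts still open or ended within the retention period.
    "recent_contract": lambda r: r["end_date"] is None
                                 or r["end_date"].year >= date.today().year - 10,
}

def select_for_migration(records):
    """Keep a record only if every criterion accepts it."""
    return [r for r in records if all(c(r) for c in CRITERIA.values())]

records = [
    {"id": 1, "status": "active", "end_date": None},
    {"id": 2, "status": "terminated", "end_date": date(1999, 5, 1)},
]
print([r["id"] for r in select_for_migration(records)])  # -> [1]
```

Keeping the criteria in one named structure makes the selection logic reviewable by the business departments and easy to adjust between test migrations.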
Additionally, compliance regulations and audit requirements must be taken into account. Complete traceability of the migration is absolutely essential (reconciliation).
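At its core, reconciliation means proving that nothing was silently lost or duplicated. A minimal sketch (illustrative, not the project's actual tooling) compares per-entity record counts between source and target and reports every discrepancy:

```python
# Minimal reconciliation sketch: compare per-entity record counts between
# source and target and surface every mismatch for audit traceability.
# The entity names and counts below are invented example data.

def reconcile(source_counts, target_counts):
    """Return (entity, source_count, target_count) tuples that do not match."""
    mismatches = []
    for entity in sorted(set(source_counts) | set(target_counts)):
        s = source_counts.get(entity, 0)
        t = target_counts.get(entity, 0)
        if s != t:
            mismatches.append((entity, s, t))
    return mismatches

source = {"customers": 120_000, "contracts": 340_000, "addresses": 118_500}
target = {"customers": 120_000, "contracts": 339_950, "addresses": 118_500}
print(reconcile(source, target))  # -> [('contracts', 340000, 339950)]
```

In practice such checks go beyond counts (checksums, key-level comparisons), but count deltas per entity are the first signal an auditor will ask for.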
Various tools were used in the project, in particular SAP Data Services and Talend Open Studio, as well as Oracle PL/SQL packages for customer data consolidation.
In the analysis and conception phase, many factors must be defined, for example: the migration path, data dependencies in the target system, and the data cleansing approach. Depending on prior knowledge of the existing systems, familiarization with the data models may be necessary.
The next step is to measure the expected data volumes and run times. A realistic time window must be defined for the final migration and integrated into the project plan.
If customer data is consolidated from different systems, complex business rules have to be defined, e.g. to correct different spellings in name fields and to transfer only the most current and best data. The aim of the project was to achieve the highest possible degree of automation in order to make the migration repeatable and robust.
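Such a consolidation rule can be sketched as follows. This is a simplified illustration under assumed field names, not the project's actual PL/SQL logic: different spellings are folded onto a normalized key, and among duplicates only the most recently updated record survives:

```python
# Illustrative consolidation rule (field names "name", "updated", "system"
# are assumptions): normalize spellings, then keep the newest record per key.
import unicodedata
from datetime import date

def normalize_name(name):
    """Fold case, strip accents, and collapse whitespace for matching."""
    name = unicodedata.normalize("NFKD", name)
    name = "".join(c for c in name if not unicodedata.combining(c))
    return " ".join(name.lower().split())

def consolidate(records):
    """Keep only the most recently updated record per normalized name."""
    best = {}
    for r in records:
        key = normalize_name(r["name"])
        if key not in best or r["updated"] > best[key]["updated"]:
            best[key] = r
    return list(best.values())

records = [
    {"name": "Jörg  Müller", "updated": date(2019, 3, 1), "system": "billing"},
    {"name": "Jorg Muller",  "updated": date(2021, 7, 9), "system": "crm"},
]
print(consolidate(records))  # the newer CRM record wins
```

Real matching rules are considerably richer (phonetic matching, address cross-checks, source-system priorities), but the pattern of normalize-then-pick-best is the same.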
The benefits of an agile approach become evident during development. Starting with the basic objects, the data package loaded into the target system grows continuously. This approach enables regression tests and qualitative statements about the upcoming cut-over weekend.
The chosen approach started with customer data, was extended in subsequent steps with address and device data, and then incorporated additional relevant data sources. Finally, pending orders (open work orders) were taken into account.
After each test migration, the results were evaluated to determine the migration rate. The goal was to reduce manual rework by the call center to a minimum.
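The migration rate itself is a simple ratio, tracked per test run. A small sketch with invented numbers (not project figures):

```python
# Sketch: migration rate as the share of fully automatically migrated
# records, rounded to two decimals. The figures below are illustrative.

def migration_rate(migrated, total):
    """Percentage of records migrated without manual rework."""
    return round(100.0 * migrated / total, 2)

print(migration_rate(119_964, 120_000))  # -> 99.97
```

Tracking this number after every test migration makes the trend visible and shows whether data cleansing in the departments is actually taking effect.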
In addition, data quality problems were forwarded to the relevant departments for direct data cleansing, and the sustainability of the fixes was verified in further test migrations.
The main factors to observe during the cut-over are those that affect the migration run time:
- Are the source systems available with satisfactory performance?
- Are all relevant contact persons reachable (war-room)?
- Are all users locked out of the source systems in order to guarantee a consistent data basis?
During execution, the performance of individual migration phases is measured and compared with values from the test migrations in order not to jeopardize the critical time window of the project.
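This run-time monitoring can be sketched as a simple baseline comparison. The phase names, durations, and the 10% tolerance below are assumptions for illustration, not project parameters:

```python
# Sketch: compare each cut-over phase's measured run time against its
# test-migration baseline and flag phases that endanger the time window.
# Phase names, minutes, and the tolerance are invented example values.

TOLERANCE = 1.10  # allow 10% over the test baseline before escalating

def check_phases(baseline_minutes, measured_minutes):
    """Return the phases whose run time exceeds baseline * TOLERANCE."""
    return [phase for phase, actual in measured_minutes.items()
            if actual > baseline_minutes[phase] * TOLERANCE]

baseline = {"extract": 90, "transform": 240, "load": 180}
measured = {"extract": 95, "transform": 290, "load": 170}
print(check_phases(baseline, measured))  # -> ['transform']
```

A flagged phase is exactly the trigger for the war-room: the team can escalate while there is still slack in the window, instead of discovering the overrun at the end.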
After the go-live, the migration team anticipates questions from the departments and call centers regarding specific data sets. It is necessary to provide a prompt response, as the inquiries are usually about customer contact data.
Within the scope of the project, a reporting system was developed for this purpose in order to provide the departments with information about data origin and transformations (self-service). More complex analyses were handled by the migration team.
99.97% migration rate - minimal manual post-processing and faster focus on core business
Due to the high migration rate, post-processing was reduced to a minimum. From day one after the migration, the client was able to focus on its core business: sales and customer service.
Ventum provided expertise in a variety of enterprise applications and host development, which shortened the analysis phase. By focusing on data quality and following the chosen step-by-step migration approach, this subproject made a significant contribution to the success of the overall project.
- Fully automated migration of 99.97% of customer accounts within the planned time window
- Merging of customer accounts for consolidated invoicing
- Amortization of the entire project within 2 years through postage savings alone
- Improvement of service quality through comprehensive data cleansing
- Release of documentation by internal audit