Old Data vs. Big Data: Data Management and Legacy Systems

02/12/2021 Rob Anderson

Data is moving closer to the people on the front lines, with one major obstacle: legacy systems. Will big data melt big iron?

The use cases for big data generally apply to large, Fortune 1000-type companies. However, the operational and transactional data needed to support big data architectures and reporting is often trapped in legacy databases that don't integrate with SQL Server, Oracle Database, Db2, or PostgreSQL.

Legacy Data Management: What's In There?

Pre-relational databases, from VSAM to Adabas or IDMS, can be a treasure trove for data scientists. Details of how fast user and supply chain interactions occur, how often, when, where, and with whom: it's all there. Some refer to this as dark data, underutilised information assets that were collected for a single purpose and then archived. But given the right circumstances, that data can be mined for other reasons. Infinity Property & Casualty Corp., for example, realised it had years of adjusters' reports that could be analysed and correlated with instances of fraud. It built an algorithm out of that project and used the data to reap $12 million in subrogation recoveries.

The Immovable Object vs. The Irresistible Force

Around 70% of Fortune 500 companies still rely on the mainframe, and most, if not all, of these entities support pre-relational databases. Migrating that data can be risky, expensive, and disruptive. At the same time, the granular insight it offers is what will keep these companies alive, enabling efficiencies, new product development, and closer customer relationships. As old data meets big data, something has to give.

It seems the insurance and financial verticals will help us find out what. These companies have the most data available, as well as the biggest challenges around new product development, customer churn, and integration. Some, like Progressive Insurance, choose to modernise legacy applications while keeping them on the mainframe. Legacy data management was a key aspect of Desjardins General Insurance Group's strategy for growth, spurring a $45M project that analysts call the largest data modernisation project in insurance history. Many others, however, shy away from large modernisation projects, fearful of disruption or failure. Companies cannot simply switch off those systems or import the data into modern platforms.

So What To Do?

The good news is that technology has caught up, and there are ways to get legacy data out of the mainframe without a migration. Companies House used this approach to legacy data management, syncing pre-relational IDMS databases to Oracle data warehouses for better visibility and reporting. The big change now is not that everyone is an IT manager (there are still plenty of ways companies will control devices, access to computers, and data) but that everyone is a consumer of a lot of data. Making that easy on them will most likely be a winning strategy for data management moving forward.
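To make the sync-without-migration idea concrete, here is a minimal sketch of the pattern commercial replication tools implement: legacy change records are decoded and upserted into a relational target so reporting tools can query them in place. The `policies` table, its columns, and the use of Python's `sqlite3` as a stand-in for an Oracle warehouse are all illustrative assumptions, not details of the Companies House project.

```python
import sqlite3

def apply_changes(conn, changes):
    """Upsert decoded legacy change records into a relational target.

    `changes` is a list of dicts standing in for records decoded from a
    pre-relational source (IDMS, VSAM, etc.). Re-applying a record for an
    existing key updates it rather than duplicating it, which is what lets
    the target stay in sync with the source system over repeated batches.
    """
    conn.execute(
        "CREATE TABLE IF NOT EXISTS policies ("
        "policy_id TEXT PRIMARY KEY, holder TEXT, premium REAL)"
    )
    for rec in changes:
        conn.execute(
            "INSERT INTO policies (policy_id, holder, premium) "
            "VALUES (:policy_id, :holder, :premium) "
            "ON CONFLICT(policy_id) DO UPDATE SET "
            "holder = excluded.holder, premium = excluded.premium",
            rec,
        )
    conn.commit()

conn = sqlite3.connect(":memory:")

# First batch: two new legacy records land in the relational target.
apply_changes(conn, [
    {"policy_id": "P-100", "holder": "A. Smith", "premium": 420.0},
    {"policy_id": "P-101", "holder": "B. Jones", "premium": 515.5},
])

# Later batch: an update to an existing record overwrites it in place.
apply_changes(conn, [
    {"policy_id": "P-100", "holder": "A. Smith", "premium": 450.0},
])

rows = conn.execute(
    "SELECT policy_id, premium FROM policies ORDER BY policy_id"
).fetchall()
```

A real deployment replaces the dicts with a change-data-capture feed from the mainframe and the SQLite connection with the warehouse, but the upsert loop is the core of keeping the two sides consistent without ever migrating the source.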

Further Resources