5 Myths About Legacy Modernization


Myth 1: Legacy applications are dinosaurs that just need to be replaced

There seems to be a cult of modernity surrounding software, but in other areas of life we like to say "They don't build 'em like they used to." The truth is probably somewhere in between. Old houses and old furniture are often of superior quality not because the skills of craftsmen have since declined, but because it is the good things that we keep. An inferior piece of furniture became firewood, but something really good survived to become an antique. The same is true of legacy software. Unimportant or poorly designed applications were likely discarded years ago. The applications that survived long enough to become legacy applications are central to operations and contain decades' worth of vital business logic. These are heirlooms left by a previous generation, but if we want them to be functional rather than metaphorical museum pieces, we may need to replace the knobs and oil the hinges.

Myth 2: Modern programming languages can’t handle batch processing

Just as many believe legacy applications aren't good for anything, there are also a lot of legacy programmers who don't believe it is possible for modern languages to handle batch processing as well as the legacy applications that were purpose-built for such functions. The truth is, until quite recently, they were right. But starting in 2009 with the release of Spring Batch, a number of lightweight frameworks became available that brought the reliability, robustness, and functionality of legacy batch processing to Java environments. At Blu Age, we have developed an open-source solution that brings this functionality to .NET environments as well.
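To make this concrete, here is a minimal sketch (in plain Java, not the Spring Batch API itself) of the chunk-oriented read-process-write loop that frameworks like Spring Batch formalize; all names here are illustrative, not part of any framework.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

// Illustrative sketch: the chunk-oriented batch pattern. Records are
// processed in fixed-size chunks so each chunk can be committed (or
// retried) as a unit, echoing mainframe batch checkpointing.
public class ChunkJob {
    static <I, O> List<O> runChunked(List<I> input, int chunkSize,
                                     Function<I, O> processor) {
        List<O> written = new ArrayList<>();
        for (int start = 0; start < input.size(); start += chunkSize) {
            int end = Math.min(start + chunkSize, input.size());
            List<O> chunk = new ArrayList<>();
            for (I item : input.subList(start, end)) {
                chunk.add(processor.apply(item));   // "process" step
            }
            written.addAll(chunk);                  // "write"/commit step
        }
        return written;
    }

    public static void main(String[] args) {
        List<Integer> records = List.of(1, 2, 3, 4, 5);
        // Chunks of 2: {1,2}, {3,4}, {5} are each processed and written together.
        System.out.println(runChunked(records, 2, n -> n * 10));
    }
}
```

In a real framework the "write" step would be a transactional commit to a database or file, with restart metadata recorded per chunk; the loop shape, however, is the same.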

Myth 3: Legacy modernization takes years before you see results

Okay. Okay. It's not fast, but it can be made faster, and it can be structured to allow for incremental results. Modern tooling automates the modernization process, increasing speed. Furthermore, at Blu Age the first step in any modernization process (perhaps after the initial proof-of-concept) is to analyze the overall code environment to plan an incremental approach. With an incremental approach, supported by modern tooling, you get results in weeks or months, not years. Your budget can be broken down into discrete projects instead of having to allocate for multiyear expenditures.

Myth 4: Legacy modernization is too expensive

Well, it's true: it's not exactly cheap, and if it is, the result will probably be something like JOBOL, i.e., modern code such as Java that still retains the procedural flavor of the legacy COBOL it replaced. But the real question is: what is not modernizing costing you? Each time you kick the can down the road, your firm's technical debt increases. You may be able to wrap your COBOL and get it to the cloud, but will it be ready to integrate with the next step in your company's digital transformation?
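For readers unfamiliar with the term, here is a hypothetical illustration of what "JOBOL" looks like in practice: both methods below implement the same business rule, but one preserves COBOL's procedural shape while the other is idiomatic Java. All names are invented for this sketch.

```java
import java.util.List;

// Hypothetical contrast: a line-by-line COBOL translation versus an
// idiomatic rewrite of the same totaling rule.
public class JobolExample {
    // "JOBOL" style: PERFORM-like indexed loop, working-storage-style
    // ws* variable names, manual accumulation.
    static double totalJobolStyle(double[] wsAmounts) {
        double wsTotal = 0;
        for (int i = 0; i < wsAmounts.length; i = i + 1) {
            wsTotal = wsTotal + wsAmounts[i];
        }
        return wsTotal;
    }

    // Idiomatic Java for the same rule.
    static double totalModern(List<Double> amounts) {
        return amounts.stream().mapToDouble(Double::doubleValue).sum();
    }

    public static void main(String[] args) {
        System.out.println(totalJobolStyle(new double[]{1.5, 2.5}));
        System.out.println(totalModern(List.of(1.5, 2.5)));
    }
}
```

Both compile and both are "Java," but only the second gives future maintainers the language's actual idioms to work with, which is the point of modernizing in the first place.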

It is also important to remember that modern toolchain approaches not only increase the speed of the process but also reduce cost.

Myth 5: Legacy modernization projects always fail

So this is the big one, and, unfortunately, there are a lot of statistics to back it up. The key here is to plan your project to minimize the risk of failure. Long multiyear projects often fail because financial priorities change in the meantime, and big line items with no results to show for them often get axed during lean budget periods. Moreover, big multiyear projects are an integration nightmare. An incremental approach breaks large projects down into smaller units, and breaks these units down into discrete iterations that are integrated and tested continuously to ensure functional equivalence throughout the process, reducing the risk of failure.
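The continuous functional-equivalence checks described above can be sketched as a simple comparison harness: run the same inputs through the legacy rule and its modernized replacement and fail loudly on any divergence. The discount rule and all names below are invented stand-ins, not any real system's logic.

```java
// Illustrative sketch of a functional-equivalence check between a
// legacy business rule and its modernized rewrite.
public class EquivalenceCheck {
    // Stand-in for the legacy rule (e.g., logic lifted from a COBOL paragraph):
    // orders over 100.00 (in cents) get a 10% discount.
    static long legacyDiscountCents(long orderCents) {
        if (orderCents > 10000L) {
            return orderCents / 10;
        }
        return 0L;
    }

    // Stand-in for the modernized implementation of the same rule.
    static long modernDiscountCents(long orderCents) {
        return orderCents > 10_000L ? orderCents / 10 : 0L;
    }

    public static void main(String[] args) {
        // Sample inputs, including the boundary values where bugs hide.
        long[] samples = {0L, 9_999L, 10_000L, 10_001L, 250_000L};
        for (long cents : samples) {
            long legacy = legacyDiscountCents(cents);
            long modern = modernDiscountCents(cents);
            if (legacy != modern) {
                throw new AssertionError("Divergence at input " + cents
                        + ": legacy=" + legacy + " modern=" + modern);
            }
        }
        System.out.println("All " + samples.length + " samples equivalent");
    }
}
```

In a real migration, the sample inputs would come from recorded production data for each iteration's perimeter, so every increment ships with evidence of equivalence rather than a promise of it.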

Within a pre-defined and limited perimeter in terms of budget, time, source code, datasets and KPIs, we can demonstrate the capabilities of our solutions. Contact us for a Proof-of-Concept.
