Prior to doing just about anything with your COBOL code, you must first understand exactly what's there - how much code, what it does and who in your ecosystem it affects.
When modernisation hits the technical roadmap, it's best to start with an automated assessment.
What's so hard about the assessment stage?
A challenge for many companies is that little documentation of the mainframe infrastructure exists. The core business functionality built into these legacy systems is highly complex and extremely difficult to understand without specially designed tools, or without the institutional knowledge of engineers who may have left the building decades ago.
Since figuring out what's going on in there seems so tough, many organisations opt to skip the assessment altogether. The theory behind leapfrogging this crucial stage rests on the assumption that "we know the languages we're using, we know the rough size of the footprint, and our mainframe has powered core operations successfully for so long, what else is there to know?" These assumptions are common, and as a result, so are the abysmal statistics that come out of most modernisation efforts.
A recently published report on the application modernisation industry claimed that 93% of IT leaders found their organisation's modernisation effort challenging. Given these numbers, it makes sense to second-guess the idea of skipping an assessment. But beyond the basics you already know, what real value could an assessment possibly provide?
A COBOL code assessment is so much more than LOC count
Lines of Code (LOC) is an important number. It gives a 30,000-foot view of what you're up against, but it doesn't tell you much else. Assessing a legacy system goes much deeper, because a line count isn't going to provide a great deal of value when the rubber meets the road. When planning a project (setting timelines, wrangling resources, and communicating cross-functionally), you need to understand where there might be roadblocks, what downstream systems might be affected, how difficult your code is to interpret, and what the business does and doesn't use; the list goes on. Without a complete picture of the beast you're tasked with slaying, your modernisation journey might not be successful.
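Part of why a raw LOC count tells you so little is that it is trivially cheap to produce. As a minimal sketch (assuming classic fixed-format COBOL, where an asterisk or slash in column 7 marks a comment line), a basic counter looks like this; the sample source lines are invented for illustration:

```python
def count_cobol_loc(lines):
    """Classify fixed-format COBOL source lines.

    Assumes the traditional 80-column layout: a '*' or '/' in
    column 7 (index 6) marks a comment line. Blank lines are
    counted separately; everything else is treated as code.
    """
    total = comments = blank = 0
    for line in lines:
        total += 1
        if not line.strip():
            blank += 1
        elif len(line) >= 7 and line[6] in ("*", "/"):
            comments += 1
    return {"total": total, "comments": comments,
            "blank": blank, "code": total - comments - blank}

# Hypothetical snippet of fixed-format source:
source = [
    "      * EMPLOYEE PAYROLL DRIVER",
    "       IDENTIFICATION DIVISION.",
    "",
    "       PROGRAM-ID. PAYROLL.",
]
print(count_cobol_loc(source))
```

Ten minutes of scripting gets you the headline number; everything the rest of this article describes is what that number leaves out.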
What exactly goes into technical inventory?
You might be surprised to find how difficult it is to ensure you aren't missing something when you start digging into your mainframe portfolio. A good assessment service will not only validate that an inventory is complete, but will also traverse the application logic and confirm that all external resources that could be referenced from the codebase are accounted for. The inventory and the review of relationships between assets should cover:
- A COBOL code analysis, which catalogues the application's content including copybooks, JCL source, and procedures
- Details of the missing resources and their relationship to the in-scope inventory
- Details of additional system artifacts such as DDLs, BMS screens, etc.
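The "missing resources" check above boils down to cross-referencing what the code mentions against what the inventory contains. A minimal sketch of the idea in Python, assuming fixed-format sources have already been read into strings (the file names, program names, and regexes here are simplified illustrations, and real COBOL parsing handles far more cases, such as continuation lines and REPLACING clauses):

```python
import re

# Match "COPY COPYBOOK-NAME" and static "CALL 'PROGRAM-NAME'" statements.
COPY_RE = re.compile(r"\bCOPY\s+([A-Z0-9-]+)", re.IGNORECASE)
CALL_RE = re.compile(r"\bCALL\s+'([A-Z0-9-]+)'", re.IGNORECASE)

def find_missing(sources, known_assets):
    """Return referenced copybooks/programs absent from the inventory."""
    referenced = set()
    for text in sources.values():
        referenced.update(m.group(1).upper() for m in COPY_RE.finditer(text))
        referenced.update(m.group(1).upper() for m in CALL_RE.finditer(text))
    return sorted(referenced - {a.upper() for a in known_assets})

# Invented example inventory:
sources = {
    "PAYROLL.cbl": "COPY EMPREC.\n       CALL 'TAXCALC' USING WS-REC.",
    "TAXCALC.cbl": "COPY TAXTBL.",
}
known = {"PAYROLL", "TAXCALC", "EMPREC"}
print(find_missing(sources, known))  # TAXTBL is referenced but not inventoried
```

Note that dynamic calls (`CALL WS-PROGRAM-NAME`, where the target lives in a variable) cannot be resolved this way, which is one reason a purpose-built assessment tool earns its keep.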
Hidden factors to consider
Upon gathering and cataloguing a complete technical inventory, the assessment you choose should be able to map the relationships between the programs, procedures, and code you're targeting. It is not uncommon to uncover ancillary systems or pockets of code that may be out of scope with respect to your assessment, but which might be adversely affected by the modernisation effort itself. The most powerful methods for mapping these relationships in digestible ways are:
- Diagrams - A top-notch COBOL code assessment will not only provide a theoretical picture of the legacy environment, but a visual one. A collection of diagrams that outline internal and external program interactions, logical flows, database interactions, and an understanding of how all of those wonderful green screens contribute to the madness is a must. These visualisations are particularly important if you want to extract business rules as you go.
- Resource usage details - It pays to see how programs, procedures, and the data tier interact with one another, but without the ability to drill into specific resource usage (called variables, dependencies, etc.) you aren't going to get the best bang for your buck. It pays to understand not only the high-level, but also the details of how the pieces of your legacy environment work with (and against) each other.
- Inventory complexity details - If an LOC count is the 30,000-foot view, a set of complexity reports is boots on the ground. Not only will it illustrate potential "tough spots" in the code, but cross-referenced with diagrams and resource usage, inventory complexity understanding is the most effective weapon against scope creep in a modernisation effort.
- Flags for dead and unused code - You would be amazed at the amount of COBOL code sitting on mainframes that hasn't been traversed in decades. On average, our customers find that 40% of their codebase is no longer used or completely unreachable. Pruning this tree will pay big dividends in the project scope department.
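The dead-code flagging in the last bullet is, at its core, a reachability problem: start from the entry points (for example, programs invoked by JCL job steps) and walk the call graph; anything never visited is a candidate for pruning. A minimal sketch under that assumption, with an invented example graph (real analyses must also account for dynamic calls and other entry mechanisms before declaring anything truly dead):

```python
from collections import deque

def reachable(call_graph, entry_points):
    """Breadth-first walk of a program call graph.

    Returns the set of programs reachable from the entry points;
    anything outside that set is a dead-code candidate.
    """
    seen = set()
    queue = deque(entry_points)
    while queue:
        prog = queue.popleft()
        if prog in seen:
            continue
        seen.add(prog)
        queue.extend(call_graph.get(prog, ()))
    return seen

# Hypothetical call graph; only BILLING is launched from JCL.
calls = {
    "BILLING": ["TAXCALC", "FORMAT"],
    "TAXCALC": ["RATETBL"],
    "OLDRPT":  ["FORMAT"],   # nothing invokes OLDRPT any more
}
all_programs = set(calls) | {p for callees in calls.values() for p in callees}
live = reachable(calls, {"BILLING"})
dead = sorted(all_programs - live)
print(dead)  # OLDRPT never appears on any path from an entry point
```

Note that `FORMAT` stays live even though its only other caller is dead, while `OLDRPT` itself is flagged; this is exactly the kind of distinction a line count can never surface.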
What to look for in a vendor-provided COBOL code assessment
Many assessments, such as our Automated Assessment service, provide the opportunity for business rules extraction along with the must-haves highlighted above. Business rules extraction opens the door to many other opportunities, such as re-engineering or simply the accurate documentation of the legacy environment as-is. To learn more about how a COBOL code assessment can reduce the risk of a project gone wrong, download our Assessment whitepaper or drop us a line.