MOVING FROM YOUR LEGACY SYSTEM - COBOL CONVERSION, COST, AND CONSEQUENCES

Introduction

Common Business-Oriented Language (COBOL), one of the first programming languages, was developed in 1960. Fifty-five years later, more than an estimated 200 billion lines of COBOL code are still in use. COBOL is used primarily for business, administrative, and large-scale systems, and its applications appear everywhere: in banking, health care, and government agencies such as the Social Security Administration and the Internal Revenue Service (IRS). Since COBOL's debut, however, hundreds of other programming languages have been created, most notably Java, C++, and Python. As young developers find newer, more dynamic languages more lucrative, COBOL-run systems are headed toward what could easily be the biggest skills gap the software industry has ever faced, and organizations continue to lose the employees who develop and maintain their COBOL systems. Maintaining a legacy COBOL system remains feasible, but organizations need to reevaluate their IT strategy to ensure their approach going forward is both proactive and sustainable.

The Rise and Fall of COBOL

At the time of COBOL's development, computer usage fell into three broad categories, each with its own languages: supercomputing, artificial intelligence development, and business operations. COBOL, used for business operations, was designed specifically for the non-expert programmer: every action performed in COBOL must be laid out explicitly, making it more approachable, yet more verbose, than its 1960s counterparts. Industries such as banking and health care embraced computer automation, writing their growing programs in COBOL. Even today, COBOL performs backend transaction processing and batch processing very well; its processing is faster than many modern languages, and it runs with very low error rates.

Although the latest version of COBOL was released in 2014, over the past twenty years COBOL has become widely regarded as an outdated legacy language. Most US universities have dropped COBOL from their computer programming curricula because young developers focus on contemporary languages such as Java and C++, which account for the majority of jobs in the programming market, as shown in the figure above. This lack of education and interest, coupled with the ever-increasing age of current COBOL developers (the average COBOL programmer is between 45 and 55 years old, as shown in the figure below), has created an enormous gap between the skills offered and employer needs, and thus a favorable labor market for current COBOL developers. As the years progress, the number of COBOL programmers continues to shrink while the cost of maintaining COBOL systems continues to rise, driving development and maintenance labor costs ever higher for organizations with often-shrinking IT budgets.

 

Stuck between a Rock and a Hard Place

Many argue that COBOL is here for the long run because the organizations that rely on it play such an integral part in society. It is hard to imagine what would happen if services like health care, security validation, point-of-sale transactions, stock trading, and even traffic lights suddenly stopped working. All of these systems rely on COBOL precisely because it is exactly that: reliable. Mainframe processing and COBOL programming have been historically dependable, so the desire to move away from them is minimal, especially when such systems contain sensitive information. The saying "if it isn't broken, don't fix it" is often applied to the COBOL predicament: although old, COBOL is not a broken, bug-ridden language, and consequently upper-level managers concerned with the reliability of their programs are hesitant, and often resistant, to migrate away from an aging language. Moving into the future, companies often find themselves choosing between two situations, which we call a rock and a hard place.

The Rock

Option #1: Stay with COBOL as your programming language and operate it on a mainframe. This is the low-risk option. COBOL, though criticized for its age and verbosity, still offers capabilities that some consider unparalleled in the programming world. Additionally, companies like IBM and Micro Focus continue to develop products that support and enhance COBOL programs, and even help some colleges offer COBOL courses.

Under this option, however, companies face an impending shortage of developers to enhance and maintain their systems, and rising labor costs driven by that smaller supply of developers. Mainframe costs also increase annually with growing complexity and maintenance needs. This option, while costly, is predictable: companies must plan for a growing budget, but they know their system will continue to perform well.

The Hard Place

Option #2: Replace COBOL with a more dynamic coding language. This is the high-risk alternative. In legacy conversion, risks compound at every step: analyzing the business rules embedded in the code, choosing the appropriate target language, implementing the migration, integrating new code with the existing mainframe or switching from a mainframe to a new platform, and testing for bugs and performance. Accurately translating the meaning of the source code into the syntax of the target language is very difficult. Although some code conversion is advertised as automated, extensive manual effort is required to verify that the converted system behaves the same as the original. A truly successful conversion process would be custom-tailored to the specific software being translated, which is extremely expensive. COBOL may contain capabilities the target language does not support, and vice versa, either of which can result in lost functionality and an inefficient translation. Additionally, contemporary languages are built to run on other platforms such as servers, so migrating from COBOL to a modern language will most likely require a systemic architectural change. This process, although lengthy, can succeed: NYSE Euronext, for example, migrated off COBOL, citing the changing environment of its systems and the obsolescence of mainframes in the trading industry as the drivers for adopting a new processing platform and server system.

This alternative has clear cost drivers: extensive labor for code conversion, code analysts, system integrators, and system testers. Those figures do not even account for the risk of replacing virtually every part of a system's code. Together, these components and their associated costs make legacy conversion a high-cost, high-risk process with only conditionally successful results.

What Now?

Looking at these options, both choices lead to huge expenditures, and neither provides an obviously attractive solution. Given the prominence of the organizations that depend so heavily on COBOL, and their risk-averse attitudes, the obsolescence of COBOL is not yet in sight. However, the average COBOL developer will be looking to retire within the next twenty to twenty-five years. This raises the question: how can these organizations, while maintaining COBOL, heed that warning and act proactively? The highest cost driver is clearly the cost of waiting. Failing to plan an acquisition strategy for the aging of COBOL and its developers leads to escalating costs. Every year, seasoned COBOL developers retire, and the knowledge of the systems they helped create retires with them. Organizations running COBOL need to be proactive about this diminishing institutional knowledge, so that when the time comes to migrate away from the fading language they will be prepared, with a half century of coding knowledge transferred to younger developers and a plan to guide them through those paramount decisions.

 

References

Anthes, Gary. "Cobol Coders: Going, Going, Gone?" Computerworld. Computerworld Inc, 9 Oct. 2006. Web. 9 July 2015. <http://www.computerworld.com/article/2554071/it-careers/cobol-coders--going--going--gone-.html>.

Asay, Matt. "All The Rich Kids Are Into COBOL- But Why?" ReadWrite. Wearable World Inc, 14 Sept. 2014. Web. 9 July 2015. <http://readwrite.com/2014/09/17/cobol-programming-language-hot-or-not>.

Mitchell, Robert L. "Cobol on the Mainframe: Does It Have a Future?" Techworld. ComputerWorld US, 15 Mar. 2012. Web. 9 July 2015. <http://www.techworld.com/apps/cobol-on-mainframe-does-it-have-future-3344704/>.

Trikha, Ritika. "The Inevitable Return of COBOL." HackerRank. InterviewStreet, Inc., 6 July 2015. Web. 9 July 2015. <http://blog.hackerrank.com/the-inevitable-return-of-cobol/>.

US COURTS SHARED SERVICES PROJECT

A process for identifying, analyzing, and communicating opportunities to share services

Introduction

The US Courts, in their mission to deliver justice to our citizenry, are compelled to provide the best service possible with ever-dwindling resources. Tightening budgets across the United States Government have forced agencies to do more with less, and in some cases less with less. The US Courts, unfortunately, do not have the luxury of doing the latter. Therefore, smart operational steps are required to ensure that resources are not wasted through redundancy or inefficiency.

Figure 1, High Level Process


In this paper we introduce a process that could help the US Courts focus their analysis of potential shared services and foster two-way communication with courts at all levels of the federal judiciary. The paper is laid out in the three broad sections shown in Figure 1: we tackle the shared services problem by (1) researching lessons learned; (2) analyzing information both currently on hand and within reach at the courts; and (3) communicating these findings and encouraging collaboration with and between court units.

 

Lessons Learned Research

The goal of our suggested research is to gain a deep understanding of the service-sharing information that already exists and to draw specific lessons from it. By exploring the processes that individual court units, or districts, have developed, the Administrative Office (AO) will be able to implement proven ideas while avoiding known pitfalls. The AO should build business cases around this information and pointedly communicate both the benefits and the risks experienced by court units that have already consolidated services.

Our preliminary information survey has uncovered a great deal of material on the subject of service consolidation, including within the US Courts. Our recommended first step, therefore, is to aggregate service consolidation documents from within the US Courts and then collect Federal Government and non-profit sector information. These documents can be obtained from multiple sources, such as US Courts data repositories, US Courts data calls, GAO reports, industry publications, and agency websites. Logapps also maintains an extensive library of past projects, some of which are service consolidation business cases.

Analysis Approach

Our approach to analyzing opportunities for shared services at the US Courts begins with the identification of candidates through data analysis and continues with cost-benefit and business case analysis.

Cost Accounting Database.  The AO will need a flexible cost analysis system that enables it to understand shared service usage, quantify the overhead implications of shared services, identify high-efficiency courts, and determine centers of excellence. However, the large number of court units and cost service categories produces a tremendous volume of financial records that is difficult to handle in basic spreadsheets. Our proposed tool for this large record set is a cost accounting database that can analyze service costs across the US Courts. The current database tool developed for the Accounting and Financial Systems Division (AFSD) is a solid platform for cost and statistical analysis.[1]  The database should include a court Work Breakdown Structure (WBS) that defines each generic service or process a court employs in its operations. A general example of such a WBS tree is shown in Figure 2 below. The WBS is vital in that it gives the Courts a common understanding of each service's sub-activities and scope. As a tool, it further allows analysts to identify early on which service elements are off the table and which are the best candidates.
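To make the WBS roll-up idea concrete, here is a minimal sketch in Python; the WBS codes, court names, and cost figures are all invented for illustration. Costs recorded against leaf-level WBS elements are aggregated up to every ancestor element of the tree:

```python
from collections import defaultdict

# Hypothetical cost records: (court unit, WBS code, annual cost in dollars).
# The WBS codes are invented; "1.1" is the parent of "1.1.1", and so on.
records = [
    ("District A", "1.1.1", 120_000),  # e.g., payroll processing
    ("District A", "1.2.1", 80_000),   # e.g., help desk
    ("District B", "1.1.1", 95_000),
    ("District B", "1.2.2", 40_000),   # e.g., network support
]

def rollup(records):
    """Aggregate costs at every level of the WBS tree."""
    totals = defaultdict(float)
    for _unit, wbs, cost in records:
        parts = wbs.split(".")
        # Credit the cost to the leaf element and every ancestor above it.
        for i in range(1, len(parts) + 1):
            totals[".".join(parts[:i])] += cost
    return dict(totals)

totals = rollup(records)
print(totals["1.1"])  # 215000.0 (all costs under WBS element 1.1)
print(totals["1.2"])  # 120000.0 (all costs under WBS element 1.2)
```

The same records could instead be grouped by court unit, which is the view the per-court comparisons in this section would draw on.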

 “The initiative needs to be based on a clearly presented business-case or cost-benefit analysis and grounded in accurate and reliable data, both of which can show stakeholders why a particular initiative is being considered and the range of alternatives considered.” - GAO, Streamlining Government, May 2012

Financial output ratios, such as dollars per full-time equivalent (FTE) employee or dollars per square foot, would be produced. These measures will allow the AO to spot opportunities across the federal courts by identifying which courts are performing exceptionally well and which are faring poorly. Other outputs may include:

Figure 2, WBS Tree


  • Administrative costs as a percentage of total court unit expenditures
  • Cost benchmarking that compares court units to an aggregate of like-size courts (overhead, administrative, direct, and indirect costs)
  • Detailed cost summaries by district, with drill-down to court unit costs
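As an illustration of the benchmarking output, a minimal sketch, with hypothetical court units and cost figures, of computing cost per FTE and comparing each unit against the average of its like-size peers:

```python
# Hypothetical benchmarking sketch: cost per full-time equivalent (FTE)
# for each court unit, compared against the average of like-size peers.
# All unit names and figures below are invented for illustration.
courts = {
    "Unit A": {"admin_cost": 1_800_000, "fte": 40},
    "Unit B": {"admin_cost": 2_600_000, "fte": 45},
    "Unit C": {"admin_cost": 1_900_000, "fte": 42},
}

def cost_per_fte(unit):
    return unit["admin_cost"] / unit["fte"]

# Peer average across the group of like-size courts.
avg = sum(cost_per_fte(u) for u in courts.values()) / len(courts)

for name, unit in sorted(courts.items(), key=lambda kv: cost_per_fte(kv[1])):
    ratio = cost_per_fte(unit)
    flag = "above peer average" if ratio > avg else "at or below peer average"
    print(f"{name}: ${ratio:,.0f} per FTE ({flag})")
```

A real comparison would control for court size and unit type, as the existing AFSD database already categorizes them.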

Cost-Benefit Analysis (CBA).  A cost-benefit analysis can help the US Courts ensure that they are making a true resource-saving commitment when considering shared services. Individual cost-benefit analyses should be pursued to determine not only the cost but also the benefit and risk of a change in process. As with most process change initiatives, some investment of time, money, or both must be made before the change can be implemented; a CBA shows whether that investment is truly worth it.

Following a structured approach, the analysts would examine data collected through the existing financial system, stakeholder interviews, and data calls. The analysis should begin by defining the scope of the service and listing the basic operational assumptions. From there it would assess the cost of the upfront investment and the resulting change in recurring operational costs. Risk must also be considered from all conceivable angles: implementation, operational, organizational, and political, to name a few. Finally, cost savings would be calculated, along with the metrics of interest such as Return on Investment (ROI), payback period, and Net Present Value (NPV).
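The arithmetic behind those three metrics can be sketched with invented cash flows; this is an illustration of the calculations, not a prescribed model, and every figure below is hypothetical:

```python
# Invented cash flows: an upfront investment in year 0 followed by
# recurring annual savings from a shared service.
investment = 500_000       # one-time cost of consolidating the service
annual_savings = 150_000   # recurring annual savings after the change
years = 5
discount_rate = 0.05       # assumed discount rate

cash_flows = [-investment] + [annual_savings] * years

# Net Present Value: discount each year's cash flow back to year 0.
npv = sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows))

# ROI over the analysis period (undiscounted).
roi = (annual_savings * years - investment) / investment

# Payback period: years of savings needed to recoup the investment.
payback_years = investment / annual_savings

print(f"NPV: ${npv:,.0f}")
print(f"ROI: {roi:.0%}")                      # 50% over five years
print(f"Payback: {payback_years:.1f} years")  # 3.3 years
```

A positive NPV at the chosen discount rate is the usual signal that the consolidation is worth pursuing; here the hypothetical project pays back in a little over three years.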

Taken as a whole, the combination of data analysis and the more qualitative cost-benefit analysis offers a balanced look at the opportunities afforded by consolidation. As we have found on past projects, service cost data is often inconclusive or missing altogether, which makes a combined qualitative and cost accounting approach necessary. As the GAO noted in a recent report:

“A lack of accurate data should not, however, necessarily preclude agencies from considering the costs and benefits of consolidation. Agencies can work to analyze the information they have at hand on likely costs and benefits, as an analysis of this information can reasonably indicate the likelihood that a consolidation will offer more benefits than costs.”[2]

Collaboration and Communication

A Shared Services Portal, or Gateway, would allow two-way communication between the AO and key court unit stakeholders, and between court units themselves.  The goal of the site is to provide stakeholders with a centralized access point to Shared Services information and tools.  The entry point would be available to all stakeholders over the US Courts intranet.

The site would be designed around the needs of key stakeholders with the goal of helping them understand the objectives of shared services and the potential benefits that can be realized.  Information collected through the Lessons Learned Research and Cost-Benefit Analysis steps would be made available here. 

While the details would need time and collective effort to develop, some potential elements of the site may include the following:

  • An on-line cost model that would allow court users to visualize their savings potential
  • A venue for Courts' feedback, with a dialogue built on that feedback
  • An environment for collaborative discussion through social media outlets
  • Strategic sourcing groups: tools that allow Court stakeholders to form and organize buying groups for the same contracted services or products
  • Lessons learned and case studies on shared services from the community
  • The latest news on shared services from government and industry (both good and bad)
  • Feedback solicited through other mechanisms such as surveys and interviews
  • Expert opinions from the community (industry, government, academia, and non-profit)
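At its simplest, the on-line cost model mentioned above could reduce to a savings-potential calculation. The sketch below is purely illustrative: the function name, the per-FTE rate structure, and all figures are invented, not an actual US Courts pricing model:

```python
# A hypothetical savings-potential calculation: compare a unit's current
# annual cost for a service against an assumed shared-service rate.
def savings_potential(current_annual_cost, shared_rate_per_fte, fte):
    """Estimated annual savings from moving a service to a shared provider."""
    shared_cost = shared_rate_per_fte * fte
    return current_annual_cost - shared_cost

# Example: a unit spending $300,000/year on a service, quoted a shared
# rate of $5,500 per FTE for its 40 FTEs.
print(savings_potential(300_000, 5_500, 40))  # 80000 (positive = savings)
```

Behind the portal, such a model would draw its rates and baselines from the cost accounting database described earlier.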

The AO should act as the ultimate facilitator, giving individual courts the freedom to post material whether in opposition or in cooperation.

 


[1] The current database has the capability to categorize data by court size, court unit type, district, and other categories.

[2] United States Government Accountability Office (GAO), Questions to Consider When Evaluating Proposals to Consolidate Physical Infrastructure and Management Functions, May 2012.