Imagine you run a public sector agency that provides a comprehensive range of social security payments to millions of beneficiaries nationwide, including subsidies for housing, family and childcare support, education and more. The problem? Every application you use to intake, process and pay claims, along with the tools used to track and report these transactions—plus every other daily internal business function—is tied to antiquated technology.
What’s more, supporting this dated technology is costing your agency millions—funds you must account for to the taxpayer, and which you could be redirecting to other mission-critical uses. These are just some of the challenges confronting organizations whose databases and strategic applications are stuck on mainframes running outdated languages like COBOL and Assembler.
Staff attrition sends vital organizational knowledge down the drain
An even greater hazard: the staffers these organizations rely on to support their mainframes are approaching retirement. New hires lack the business rules and tribal knowledge needed to run these systems, and most aren’t trained in COBOL or other decades-old programming languages, since today’s developers learn modern languages such as C++ and Java.
So how can you modernize and transform your mainframe to ensure you have the agility, flexibility and scalability you need now and into the future—while potentially adding millions in savings to your bottom line?
Set your system’s vital functions free
Strategic applications underpin the day-to-day operational activities of any organization. They’re also the building blocks of your organization’s IT ecosystem, with virtually all business functions depending on them working reliably.
In other words, strategic applications are what make your organization’s business and IT operations possible. This also means applications that are outdated can dramatically slow operations, especially when they’re running enterprise-level workloads.
Yet mainframe systems remain the backbone of many private and public sector organizations. Why? Until recently, they were the only option for executing transactional workloads fast enough to keep pace with business. But business has changed; today it demands far greater agility and scalability—and mainframes simply lack the flexibility to adapt to such rapid change.
The most daunting challenge of modernizing your mainframe
Migrating applications built on monolithic mainframe architecture, which are incompatible with today’s cloud-based microservices, can be a time-consuming, costly and error-prone proposition. What’s more, with millions of lines of critical code, relying solely on manual translation to perform such a herculean task significantly increases your risk of cost overruns and operational hang-ups.
An agency with a wide array of mandates—and zero margin for error
In this case, the client provides services to millions of users nationwide, including social security payments and vital subsidies for housing, family and childcare support, education and more. This meant they faced several non-negotiable mandates.
First, they had to migrate to a platform that gave them the flexibility to innovate and better serve their users, while ensuring the transition entailed no downtime or lapses in service. Moreover, they had to be certain that the highly confidential personal and financial data of their millions of clients remained secure throughout the entire transition process.
The roadblocks to modernization
The agency was experiencing serious ongoing cost overruns due to outdated applications built on Bull mainframes and IBM technologies. They needed to upgrade these applications, but couldn’t because they weren’t compatible with the modern platforms to which they planned to migrate.
Their goal in modernizing was to achieve greater business agility, cost efficiencies and faster time-to-market. The plan: to execute a technical transformation that would let the client integrate their strategic applications within the new modernized platform, dramatically speeding up internal and external business processes.
However, like many businesses these days, they faced a critical shortage of COBOL-trained staff who could execute the herculean task of re-coding involved in this project. Faced with what seemed like an impassable roadblock, the agency approached us as their last hope to successfully execute this much-needed modernization.
The solution: Automated migration supported by expert service
Our initial assessment of the customer’s strategic applications confirmed that this migration would require extensive re-coding that would take many months to complete and test—potentially doubling the cost of the project.
This was largely because relying on in-house IT staff for such translations is fraught with pitfalls. The larger the organization, the greater the hazards—including human error and poor communication across teams, which frequently lead to duplicated effort and failures to detect and convey translation errors and fixes.
Fortunately, mLogica’s exclusive LIBER*M suite allowed the project team to recompile the agency’s legacy code accurately and securely, all while staying within agreed-upon project timelines and budget.
The game plan
With a comprehensive understanding of both the agency’s needs and its unique challenges, as well as the experience needed to recognize potential pitfalls, our project team deployed LIBER*M to re-code each of the client’s applications. This ensured they were ready to hit the ground running on their new RHEL and x86 platforms.
LIBER*M speeds up the replatforming of legacy applications and also allowed the team to refactor the COBOL code into C++ or Java, making the migration process more time- and cost-efficient for enterprises with large mainframe workloads.
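To make the COBOL-to-Java refactoring concrete, here is an illustrative sketch—not mLogica's actual tool output—of what a hypothetical COBOL paragraph computing a monthly housing subsidy might look like after refactoring into idiomatic Java. The calculation, rates and names are invented for the example; the point is the idiom: COBOL's fixed-point `PIC 9(7)V99` fields map naturally onto `BigDecimal`, preserving the exact decimal arithmetic the legacy business rules depend on.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

// Hypothetical example: a COBOL COMPUTE paragraph refactored into Java.
// All field names and rates are invented for illustration.
public class SubsidyCalculator {

    // COBOL PIC 9(7)V99 fields become BigDecimal, keeping the
    // fixed-point decimal semantics of the original program.
    static BigDecimal monthlySubsidy(BigDecimal income, BigDecimal rent) {
        BigDecimal cap = rent.multiply(new BigDecimal("0.30"));      // 30% of rent
        BigDecimal offset = income.multiply(new BigDecimal("0.05")); // 5% income offset
        BigDecimal subsidy = cap.subtract(offset).max(BigDecimal.ZERO);
        return subsidy.setScale(2, RoundingMode.HALF_UP);            // COBOL ROUNDED clause
    }

    public static void main(String[] args) {
        System.out.println(monthlySubsidy(new BigDecimal("2400.00"),
                                          new BigDecimal("900.00")));
    }
}
```

The design choice worth noting: using `double` here would silently change rounding behavior versus the mainframe, so decimal types are the safe default when replatforming financial logic.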
Additionally, the migration needed to preserve the system’s decades of accumulated business logic. The project team leveraged LIBER*Z, mLogica’s CICS-like transaction monitor, and LIBER*Batch, mLogica’s JCL-like batch manager, to quickly and accurately move the agency’s workloads off the source platform.
Selecting new platforms based on the client’s unique requirements
The client, a vital government organization, was storing the personal data of millions of users. The team needed to select a powerful, resilient system that could securely support such massive and dynamic workloads. After comprehensive analysis, Red Hat Enterprise Linux 7 was chosen to host the customer’s hyperscale data and workloads.
The new system was provisioned with 600 CPUs, replacing 55,000 MIPS of mainframe capacity, and eighty virtual machines (VMs) that replaced the eighty existing mainframe partitions—more than enough capacity for the customer to securely support its large and growing workloads while keeping access for the client’s millions of users in a completely secure environment.
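The headline sizing figures can be sanity-checked with simple arithmetic. The sketch below only derives ratios from the numbers in this case study (55,000 MIPS, 600 CPUs, 80 VMs); it is not a general mainframe-to-x86 sizing formula, and real capacity planning depends on workload profiles.

```java
// Back-of-the-envelope check of the replatforming figures in the case
// study: 55,000 MIPS retired, 600 x86 CPUs and 80 VMs provisioned.
public class CapacityCheck {
    public static void main(String[] args) {
        int mipsRetired = 55_000;  // mainframe capacity replaced
        int cpus = 600;            // x86 CPUs on the RHEL 7 platform
        int vms = 80;              // one VM per former mainframe partition

        double mipsPerCpu = (double) mipsRetired / cpus; // retired MIPS per CPU
        double cpusPerVm = (double) cpus / vms;          // average CPUs per VM

        System.out.printf("~%.1f MIPS of retired capacity per CPU%n", mipsPerCpu);
        System.out.printf("%.1f CPUs per VM on average%n", cpusPerVm);
    }
}
```

In other words, each former mainframe partition was mapped to its own VM with roughly 7.5 CPUs on average.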
The result? Millions in ROI—in just one year!
While it’s well known that migrating off an outdated mainframe can dramatically boost an organization’s bottom line—offering massive savings on infrastructure, staffing and support costs—the time to value for this client was impressive by any standard. For this vital public sector agency, selecting the right partner, one who could offer the best tools for the job, paid off not only in a great leap forward in technology but in tens of millions in savings in just one year.
In addition to critical gains in technical flexibility and scalability, the agency immediately began to realize substantial cost reductions—approximately $24.1 million in savings within the first year—a compelling argument for modernizing sooner rather than later!