Last week, a bipartisan group of House lawmakers introduced the Legacy IT Reduction Act of 2026 (H.R. 8408), a bill that would require agency CIOs to inventory every legacy system in use and task agency heads with developing five-year strategic plans to update, modernize, retire, or dispose of them. The bill, introduced in the House on April 21, 2026, is a response to a July 2025 Government Accountability Office audit that found only three of the ten most critical federal legacy systems flagged for modernization in 2019 had actually crossed the finish line. Six years on, the other seven were still running, some nearly six decades old.

Whether H.R. 8408 passes in its current form, is amended, or stalls altogether, the underlying problem is not going away. The federal government's dependence on aging, brittle, and, in some cases, cybersecurity-compromised technology is one of the defining operational risks of this decade. And it is arriving at precisely the moment when agencies are being asked to do more with less: to sustain critical missions, improve citizen services, and absorb new mandates under some of the tightest fiscal conditions in recent memory.

This is the paradox of federal IT in 2026. The cost of not modernizing is now demonstrably higher than the cost of actually modernizing. Yet the traditional playbook for modernization, multi-year waterfall-style rewrites funded through dedicated appropriations, is exactly the approach that produced the GAO's disappointing scorecard. If we want a different result, we need a different approach.

The bottom line: the legacy IT problem will not be solved by another mandate, another inventory, or another multi-year appropriation. It will be solved by agencies and partners who combine AI-accelerated portfolio rationalization and code modernization with the integration discipline federal missions demand, executed in smaller, faster, lower-risk increments that compound into enterprise-scale outcomes.

Done well, this approach does more than reduce risk. It reallocates sustainment spending to new mission capability for the American taxpayer, in the same budget years the savings are realized.

The scope of the problem is worse than the headlines suggest 

The GAO's July 2025 findings are sobering, but I would argue they understate the challenge by an order of magnitude. And those are just the ten systems that made the list. Across the federal enterprise, thousands of systems of record, the backbones of tax collection, benefits administration, health care delivery, border security, and defense logistics, sit in similar or worse condition. Recent agency consolidations and reorganizations have only compounded the problem. Overall, the GAO has estimated that the federal government spends roughly 80 percent of its IT budget on operations and maintenance, much of it on systems that would have been retired a decade ago in any comparable private-sector environment.

This is not a technology problem in the narrow sense. It is a compounding risk. Every year of delayed modernization raises the cost of the eventual transition, narrows the pool of people who can safely touch the code, and expands the attack surface.

Why legacy modernization has been so hard

Before we talk about what works, it is worth being honest about why so much has not. In my conversations with federal CIOs and program leaders, the same structural obstacles surface again and again:

Institutional knowledge has walked out the door

COBOL, Assembly, MUMPS, and legacy environments such as IDMS underpin tax processing, benefits adjudication, and health care support, and they are maintained by a workforce retiring faster than it is being replaced. In many cases, the original system documentation is incomplete, outdated, or simply missing.

The systems are load-bearing

These are not peripheral tools. They are the rails on which entitlement payments, revenue collection, case management, and national security workflows run. "Turn it off and rewrite it" is not a viable option when millions of Americans depend on continuity of service.

Interdependencies are poorly understood

Decades of point-to-point integrations, undocumented batch jobs, and shadow data flows mean that touching one system can cascade into a dozen others. The risk of breakage suppresses the appetite for change.

Business rules are buried in code

Policy that Congress passed in the 1970s or 1980s is often encoded directly into procedural logic, with no authoritative specification outside the source itself. Modernization is not just an engineering problem; it is an archaeology problem.

Funding models reward sustainment over transformation

O&M dollars flow reliably. Modernization dollars require justification, political capital, and multi-year commitments that rarely survive administration transitions and continuing resolutions intact.

Risk asymmetry paralyzes decision-making

For a career civil servant, the downside of a failed modernization is far more visible than the downside of another year of sustainment. The incentives favor deferral.

The Legacy IT Reduction Act is the latest reasonable attempt to force the inventory-and-plan discipline that has been missing, and that discipline is genuinely useful. But discipline alone does not modernize a system. Execution does. And execution is where the methodology needs to change… AI has entered the chat.

From inventory to action: The portfolio rationalization step that H.R. 8408 implies but does not name

The bill's inventory requirement is a necessary first step, but an inventory is not a plan. A list of legacy systems, even a well-documented one, does not tell an agency head which systems to modernize, which to migrate, which to consolidate, which to re-platform, and which to decommission outright. That judgment requires a discipline the commercial sector has practiced for years and that federal agencies must now adopt at scale: application portfolio rationalization.

Done properly, portfolio rationalization produces a dynamic, continuously updated view of every application in the estate, mapped against its business value, its true cost of ownership, its interdependencies, its risk profile, and the skills required to sustain it. It is the analytical layer that turns an inventory into a strategy. For each application, it produces a defensible recommendation: modernize in place, migrate to a new platform, re-platform onto modern infrastructure, consolidate with overlapping capabilities, decommission, or replace with a commercial or shared service.

The practical consequence for federal agencies is significant. An H.R. 8408 inventory answers "What do we have?" Portfolio rationalization answers "What should we do about it?", and it does so with the cost estimates, capacity assessments, and downstream impact analyses that the bill explicitly requires in each agency's information resource management strategic plan. An agency that treats the inventory as a compliance deliverable will produce a compliance deliverable. An agency that treats it as the starting point for portfolio rationalization will produce a modernization roadmap.
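
To make that concrete, here is a minimal sketch of the scoring layer a rationalization exercise implies. Everything in it, the field names, the scales, the thresholds, the dispositions, is an illustrative assumption rather than a prescribed methodology; a real engagement would calibrate these against agency cost data and mission priorities.

```python
from dataclasses import dataclass

# Illustrative only: fields, scales, and thresholds are assumptions,
# not a prescribed federal methodology.

@dataclass
class Application:
    name: str
    business_value: int      # 1 (marginal) .. 5 (mission-critical)
    annual_cost_musd: float  # true cost of ownership, $M/year
    tech_risk: int           # 1 (healthy) .. 5 (EOL platform, no support)
    dependents: int          # downstream systems that consume its data
    scarce_skills: bool      # sustained only by a shrinking specialist pool

def recommend(app: Application) -> str:
    """Map an assessed application to a candidate disposition."""
    if app.business_value <= 1 and app.dependents == 0:
        return "decommission"
    if app.business_value <= 2:
        return "consolidate or replace with a shared/commercial service"
    if app.tech_risk >= 4 or app.scarce_skills:
        # High value, high risk: the classic modernization candidate.
        return "modernize (incremental re-platform with equivalence testing)"
    if app.annual_cost_musd > 5.0:
        return "migrate to lower-cost infrastructure"
    return "sustain and reassess next cycle"

portfolio = [
    Application("benefits-adjudication-core", 5, 42.0, 5, 17, True),
    Application("legacy-reporting-tool", 1, 0.3, 3, 0, False),
]
for app in portfolio:
    print(f"{app.name}: {recommend(app)}")
```

The point of the sketch is the shape of the decision, not the thresholds: every application gets an explicit, defensible disposition instead of a row in a spreadsheet.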

What AI actually changes about this problem

For the last two years, there has been no shortage of breathless claims about what generative AI will do for government. I want to be precise about what it actually does, and, just as importantly, what it does not do, in the context of legacy modernization.

AI dramatically compresses the discovery phase

Large language models, properly orchestrated and grounded, can read legacy dialects and produce human-readable explanations of what the code does. They can trace interrelationships across millions of lines, identify dead code, surface undocumented business rules, flag vulnerabilities, and generate candidate specifications that subject matter experts can validate rather than author from scratch. Work that once took a team of contractors eighteen months of interviews and code walkthroughs can now be accomplished in hours and validated in weeks.
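
As a rough sketch of what that orchestration can look like: the function below walks a directory of COBOL sources and asks a model for a draft explanation of each program. The prompt, the file glob, and the ask_model callable are all assumptions standing in for whatever governed LLM service an agency has actually authorized; nothing here is a specific vendor API.

```python
from pathlib import Path
from typing import Callable

# Sketch of an AI-assisted discovery pass. `ask_model` stands in for a
# call to a governed, access-controlled LLM service; the prompt and the
# artifact shapes are illustrative assumptions.

DISCOVERY_PROMPT = (
    "You are analyzing a legacy COBOL program. Summarize, in plain English: "
    "1) what the program does, 2) every business rule it enforces, citing "
    "paragraph names, 3) external files and systems it reads or writes, "
    "4) any code that appears unreachable.\n\nSOURCE:\n{source}"
)

def discover(src_dir: Path, ask_model: Callable[[str], str]) -> dict[str, str]:
    """Produce a draft explanation per program for SME validation.

    The output is a *candidate* specification: every entry must be reviewed
    and signed off by a subject matter expert before it drives translation.
    """
    drafts = {}
    for program in sorted(src_dir.glob("*.cbl")):
        source = program.read_text(errors="replace")
        drafts[program.name] = ask_model(DISCOVERY_PROMPT.format(source=source))
    return drafts
```

The design choice that matters is in the docstring: the model produces candidate specifications, and nothing becomes authoritative until a subject matter expert signs off.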

Discovery is also the step where AI earns its keep, even before a single line of legacy code is translated. AI-augmented portfolio analysis can ingest codebases, operational telemetry, incident histories, and cost data across hundreds or thousands of applications and produce the kind of portfolio-wide view that historically took an army of consultants a year or more to assemble, and that was out of date by the time it was delivered.

AI accelerates, but does not replace, the translation phase

Machine-assisted refactoring from COBOL to Java, or from monolithic mainframe workloads to cloud-native microservices, is now a practical reality when paired with rigorous test generation and equivalence validation. The key word is assisted. Unsupervised code translation produces working software that encodes the wrong policy with alarming confidence. Human engineers, armed with AI-generated scaffolding and test harnesses, produce software that is both correct and maintainable.
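
A minimal sketch of what equivalence validation can mean in practice, assuming both the legacy job and its replacement can be driven as batch executables over the same input records (the two command names are placeholders):

```python
import subprocess

# Golden-record equivalence check: replay identical inputs through the
# legacy job and its modernized replacement, and fail loudly on divergence.
# In real systems, nondeterministic fields (timestamps, sequence numbers)
# must be normalized out of both outputs before comparison.

LEGACY_CMD = ["run-legacy-batch"]          # placeholder command
MODERNIZED_CMD = ["run-modernized-batch"]  # placeholder command

def run(cmd: list[str], record: str) -> str:
    """Feed one input record to a batch job and capture its output."""
    result = subprocess.run(cmd, input=record, capture_output=True,
                            text=True, check=True)
    return result.stdout

def equivalence_report(records: list[str]) -> list[tuple[int, str, str]]:
    """Return (index, legacy output, new output) for every disagreement."""
    mismatches = []
    for i, record in enumerate(records):
        old, new = run(LEGACY_CMD, record), run(MODERNIZED_CMD, record)
        if old != new:
            mismatches.append((i, old, new))
    return mismatches
```

Scaled across millions of historical records, a harness like this is what turns "the translation looks right" into evidence an authorizing official can accept.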

AI enables continuous documentation

One reason legacy systems became "legacy" is that their documentation decayed faster than the code, if it existed at all. Modern AI-assisted engineering practices, built on robust digital twins, can keep architectural diagrams, data dictionaries, and business rule catalogs synchronized with the codebase as a matter of course: not a deliverable at the end of a contract, but a living artifact of the development process.
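
One hedged sketch of how "living artifact" can be enforced rather than hoped for: a pipeline gate that regenerates the business-rule catalog from source on every change and blocks the merge if the committed catalog has drifted. The extraction step below is a placeholder for the governed, SME-validated AI pass described above, and the file paths are assumptions.

```python
import sys
from pathlib import Path

def extract_rules(src_dir: Path) -> str:
    # Placeholder for the governed, SME-validated AI extraction pass
    # described above; what matters here is the pipeline gate around it.
    raise NotImplementedError

def check_catalog(src_dir: Path, catalog: Path) -> int:
    """Fail the pipeline if the committed catalog no longer matches source."""
    fresh = extract_rules(src_dir)
    if fresh != catalog.read_text():
        print("Business-rule catalog is stale relative to source; "
              "regenerate and re-review before merging.")
        return 1
    return 0

if __name__ == "__main__":
    sys.exit(check_catalog(Path("src"), Path("docs/rule_catalog.md")))
```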

AI changes the economics of the workforce problem

A COBOL engineer supported by a well-grounded AI assistant is measurably more productive. So is a younger engineer who has never touched a mainframe but can now reason about one through an AI interface. This is how we bridge the skills gap without waiting a generation for universities to produce specialists in languages they stopped teaching thirty years ago.

AI reduces the sustainment tax even before modernization completes

Digital workers can offload routine operational tasks such as ticket triage, log analysis, data reconciliation, and document processing, freeing scarce human expertise to focus on the transition itself rather than on keeping the lights on.

What AI does not do is absolve agencies of the need for governance, security engineering, systems integration discipline, and mission expertise. An AI that hallucinates a business rule in a tax system is not a productivity tool; it is a factory whose product is liability at speed and scale. This is why the governance wrapper around AI, and the agentic AI orchestration methodology, actually matter more than the model itself. Provenance tracking, human-in-the-loop review, auditability, role-based access, and continuous evaluation are not optional features. They are the precondition for using AI in a mission environment.
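
What does "auditable provenance" look like at the artifact level? A minimal sketch follows; the field names are illustrative assumptions, but the intent is literal: for every AI-generated artifact, an Inspector General should be able to reconstruct which inputs and which pinned model produced it, and which human approved it.

```python
import getpass, hashlib, json
from datetime import datetime, timezone

# Sketch of the provenance record a governed pipeline might attach to
# every AI-generated artifact. Field names are illustrative assumptions.

def provenance_record(artifact: str, source_files: list[str],
                      model_id: str, reviewer: str | None = None) -> dict:
    return {
        "artifact_sha256": hashlib.sha256(artifact.encode()).hexdigest(),
        "inputs": sorted(source_files),
        "model_id": model_id,                 # pinned model and version
        "generated_at": datetime.now(timezone.utc).isoformat(),
        "generated_by": getpass.getuser(),
        "human_reviewer": reviewer,           # None = not approved for use
        "status": "approved" if reviewer else "pending_review",
    }

record = provenance_record("IF BALANCE < 0 THEN ...", ["PAYCALC.cbl"],
                           model_id="example-model-2026-01")
print(json.dumps(record, indent=2))
```

Note that the record defaults to pending_review: in a governed pipeline, an unreviewed artifact is not a usable artifact.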

The role of a systems integration partner

I lead AI research and development at CGI Federal, so I have an obvious point of view here. But I want to make a point that holds regardless of which partner an agency chooses to work with. Legacy modernization is not a product purchase. It is a multi-year, mission-critical transformation that requires a partner who understands the policy, the data, the people, and the consequences of getting it wrong.

At CGI, we have spent four decades inside the federal systems that the GAO is now flagging. We have written, maintained, and in many cases modernized the very classes of systems under discussion, at Treasury, at civilian benefits agencies, at DoD components, across the federal footprint. That institutional memory, combined with modern AI tooling, is what allows our teams, and teams like ours, to approach a legacy modernization engagement with a realistic sense of the risks and a disciplined method for retiring them.

For as long as I have been in this industry (admittedly, most of that time on the government side), the value proposition of a federal systems integrator has rested on three pillars: domain knowledge, engineering scale, and risk absorption. Those pillars still matter. But a fourth has become non-negotiable: the ability to responsibly operationalize AI inside mission environments that have limited tolerance for error.

That means partners who bring pre-built, secure AI platforms rather than ad hoc tooling. It means partners with established governance frameworks… not slide decks about governance, but actual pipelines that enforce it. It means partners who understand that a model is a liability until it is tested, and an asset only once it is evaluated against mission-specific benchmarks and paired with the right human oversight and orchestration.

Leveraging a governed AI digital workforce

Our emerging-technologies practice has invested heavily in what we call a governed AI digital workforce: an environment in which AI agents perform discovery, code analysis, test generation, documentation, and integration tasks under human supervision, with auditable provenance for every artifact they produce. The governance is not an afterthought. It is the product. Federal agencies cannot accept AI outputs that cannot be explained, traced, or defended to an Inspector General. Neither can we. Our proprietary AI Digital Workforce platform is one expression of that thesis, but the thesis itself is bigger than any single tool: emerging technology only creates mission value when it is wrapped in the integration discipline, security posture, and accountability that federal missions demand.

The practical effect of combining deep domain knowledge with governed AI is that modernization projects previously quoted in years and hundreds of millions of dollars can, in many cases, be reframed as a sequence of smaller, faster, lower-risk increments. That reframing is not a marketing promise. It is a direct consequence of the fact that the most expensive and most human-constrained phases of modernization (discovery, requirements reconstruction, test generation, and translation) are precisely the phases that AI now accelerates most.

The fiscal case: reallocating from sustainment to mission

The most important argument for acting now is not technical. It is fiscal.

Every dollar spent sustaining a forty-year-old system is a dollar that cannot be spent on the next generation of the mission. In a budget environment where agencies are being asked to absorb real reductions while delivering more, the sustainment tax on legacy IT is no longer tolerable. It is, in a very literal sense, eating the future at the expense of the American taxpayer.

Modernization done well, accelerated by AI, disciplined by governance, executed by integrators who have done it before, can reverse that flow. The out-year savings from retiring high-cost legacy systems do not have to disappear into the general fund. They can be reallocated to the mission capabilities that taxpayers actually want: better services for citizens, faster benefits adjudication, more resilient defense logistics, smarter border operations, more effective fraud prevention, and the new missions that have not yet been funded because the old systems are consuming the budget.

That is the opportunity hidden inside the legacy IT problem. It is not just about avoiding risk. It is about freeing the capital (financial, human, and technical) that has been trapped in sustainment for a generation.

What I'd tell a CIO reading H.R. 8408 this week

If I were advising an agency CIO preparing for the possibility of this legislation, or simply preparing for the reality that the legacy problem will only grow more urgent regardless of legislation, I would offer four things:

Don't wait for the mandate to start (or update) the inventory

The agencies that are ready when the clock starts will outperform those that treat the inventory as a compliance exercise. AI-assisted discovery can accelerate this dramatically.

Pick a lighthouse, not a moonshot

The agencies that have modernized successfully did it incrementally, with a clear first win. Scope for a 6–12 month visible outcome, then compound from there.

Consider AI governance alongside AI capability

The first public AI failure in a mission system will set the program back years. Invest in the guardrails as you invest in the horsepower.

Choose partners who will be accountable for outcomes, not hours

Modernization is not a staff augmentation exercise. It is an outcome that requires shared risk with a partner you can trust.

The legacy IT problem has outlasted administrations, committee chairs, and multiple generations of technology. It will not be solved by another report, another plan, or another mandate alone. It will be solved by agencies and partners who combine the hard-won disciplines of federal systems integration with the genuine, measurable acceleration that well-governed AI now makes possible… and who do it with a clear-eyed commitment to the taxpayer who is footing the bill.

The tools have finally caught up to the ambition. The question is whether we'll use them.