I spent most of my career building software for U.S. government clients as a contractor, and one thing I noticed is just how bad the government is at running software projects. Take every bad thing every commercial software project ever did in the 1990s, and it was built into the government software development process. It’s like modeling every high school after Bayside High.

Eventually, everyone else caught on with the infamous rollout of HealthCare.gov, but even then pundits and politicians only sought to examine that debacle through the prism of the politics of the Affordable Care Act. It frustrated me that people used HealthCare.gov as a proxy for their views on the law rather than a cautionary tale on how not to run IT projects.

Politics and ideology aside, the government has a technology problem.

We’ve seen this most recently with the OPM hack, but HealthCare.gov gave us what some would call a teachable moment. The problem is most people learned the wrong lesson. Even alleged technology experts who took the time to examine the way HealthCare.gov was executed often got their analyses spectacularly wrong.

It was in fact because of my desire to help improve government IT that I began writing for Government Computer News around that time.

To its credit, the Obama administration has taken the first step—recognizing the problem. The Government Accountability Office produced a report on applying agile software development practices in the federal government, and I was one of the “experienced users” interviewed for it. Meanwhile, headquartered in Washington, DC, 18F was formed within the General Services Administration to serve as a consultancy staffed with expert designers, developers, and product specialists. 18F is the driving force behind the U.S. Digital Services Playbook, a guide for government professionals on implementing the best IT, and the TechFAR, an update to the Federal Acquisition Regulation (FAR). I wrote about the Playbook for GCN.

While there are a lot of things that need to be fixed, I have always felt the root of the federal government’s IT problems is the acquisition process—long before any code is written or you even know who will be writing it. The FAR remains the standard, but it hasn’t adapted to modern best practices for software development. 18F apparently agrees, and it sponsored a white paper challenge on Challenge.gov seeking a blueprint for a course for federal acquisition professionals on what they need to know in order to procure the best software. I joined a team as a co-author of one of the white papers. I know you’re dying to know what came of that. You will be the first to know when I do.

It was during this process that I first met government acquisition expert Frank McNally. It turns out he is just as passionate as I am about reforming the federal IT acquisition process and producing great software as a result.

18F recently put out a blanket purchase agreement (BPA) seeking vendors for a new kind of acquisition process—one that I’ve long called for. For the first time, vendors had to actually build something using real datasets, real open-source libraries, real source code, real version control, and even real unit tests and containers. That’s cool. While this approach delivers working software—the Holy Grail of agile software development—to the government for free, it also signals to everyone else that lean software projects deliver fast. You don’t need to wait months and years to see results.

Frank posted a nice explainer on the 18F Agile BPA complete with video snippets. He was kind enough to invite me to add my own commentary in the videos. I don’t actually appear on screen, so you have every reason to watch the videos.

Commercial businesses have used lean and agile concepts to build the best software so they can reap efficiencies, value, and ultimately profits for their shareholders. I am proud to play a small part in helping the federal government reap efficiencies and value for the benefit of the American people.