Why the Navy Needs to Open Source Its Future

The Navy put great energy into virtualizing its servers with one contractor. Now it should open source its way into the next era. By Gunnar Hellekson

In just two memos, Navy Chief Information Officer Terry Halvorsen defined computing for the Navy and Marine Corps into the foreseeable future. Every server will be virtualized by 2017, and 7,500 desktops will no longer live on anyone's desk; their data will reside in Navy data centers instead.

The goals are ambitious, but they obscure a fundamental change for which virtualization is just a first step.

In one respect, nothing could be less controversial than virtualization. Over the last five years, virtualization has become the standard method of deployment across government. As servers grow in capacity, it makes less sense to continue with the one-workload-per-server habit that has developed over the last 20 years of client-server computing. Halvorsen is simply formalizing, and maybe hastening, a transition that has already started.

The immediate goal, of course, is cost reduction. His measure of success: the number of servers converted. I'm sure this is informed by the Navy's positive experience so far. In July 2012, Halvorsen announced that the Navy had eliminated 2,000 of its 6,000 shore-side servers through virtualization, with a commensurate reduction in capital expenses. But in exchange for that reduction, virtualization has non-obvious effects on operational expenses. Software is often more expensive than hardware, including some of the software that runs on these virtualized machines. Some vendors force you to pay to license the maximum possible capacity, rather than what you are actually using. This is particularly true for desktops, where the licensing for office productivity software can cost many times the machine itself. Virtualization also reduces the friction of creating new servers, which can balloon operational costs: those servers still need to be fed and clothed, even if they don't reside on physical machines.
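To make the capacity-versus-usage dynamic concrete, here is a back-of-envelope sketch. Every figure in it is hypothetical, chosen only to show the shape of the problem, not to reflect actual Navy or vendor pricing:

```python
# Back-of-envelope comparison of capacity-based vs. usage-based licensing.
# All figures are hypothetical, for illustration only.

hosts = 100                  # physical virtualization hosts
sockets_per_host = 2         # CPU sockets per host
license_per_socket = 3_500   # annual cost per licensed socket (hypothetical)
avg_utilization = 0.40       # fraction of licensed capacity actually in use

# Capacity-based terms: pay for every socket, whether it runs workloads or not.
capacity_based = hosts * sockets_per_host * license_per_socket

# Usage-based terms: pay only for the capacity the workloads consume.
usage_based = capacity_based * avg_utilization

print(f"Licensed at maximum capacity: ${capacity_based:,}")        # $700,000
print(f"Licensed at actual usage:     ${usage_based:,.0f}")        # $280,000
print(f"Premium for capacity terms:   ${capacity_based - usage_based:,.0f}")
```

At 40 percent utilization, this hypothetical shop pays for more than twice the capacity it uses, and that recurring premium is exactly the kind of operational expense that can quietly eat the capital savings from consolidation.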

Halvorsen no doubt understands all this and decided to virtualize anyway. He’s playing a long game and it has little to do with up-front reductions in capital expenses. By virtualizing, he’s building for the future.

Halvorsen has every reason to expect continued budget cuts over the next few years, and he has to prepare for that. Virtualization is an easy choice: he gets the quick reductions in capital expenses he needs and moves the burden to operational expenses, which are more easily managed and controlled. More importantly, his data centers and desktop deployments can respond much more easily to changing demands and budgets. They can be moved to new hardware platforms without downtime. They can be easily duplicated. They can be allocated more or less hardware as missions change. They can be better managed and accounted for; a server cannot hide under a desk or in a closet when it is virtualized. Virtualization is also a precondition for moving to clouds, which everyone expects to be the norm in the very near future. When Halvorsen is confronted with unmanageable uncertainty in budget and mission, these benefits of virtualization are not pleasant side effects. They're mandatory for the data center he will need in 10 years.
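To make the "moved without downtime" claim concrete, here is a minimal sketch using the open source libvirt Python bindings. This is an assumption for illustration, not the Navy's actual tooling, and the host and guest names are hypothetical placeholders:

```python
# Minimal live-migration sketch with the open source libvirt Python bindings
# (pip install libvirt-python). Host URIs and guest name are hypothetical.
import libvirt

SRC_URI = "qemu+ssh://host-a.example.mil/system"  # hypothetical source host
DST_URI = "qemu+ssh://host-b.example.mil/system"  # hypothetical destination
GUEST = "logistics-app-01"                        # hypothetical guest name

src = libvirt.open(SRC_URI)
dst = libvirt.open(DST_URI)
dom = src.lookupByName(GUEST)

# VIR_MIGRATE_LIVE copies memory while the guest keeps running;
# VIR_MIGRATE_PERSIST_DEST registers the guest permanently on the new host.
flags = libvirt.VIR_MIGRATE_LIVE | libvirt.VIR_MIGRATE_PERSIST_DEST
dom.migrate(dst, flags, None, None, 0)

print(f"{GUEST} is now running on {dst.getHostname()}")
src.close()
dst.close()
```

The point of the sketch is that moving a running workload between physical hosts becomes a scripted operation rather than a maintenance window, which is what makes a virtualized fleet responsive to changing budgets and missions.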

This move, though, has a risk. At the moment, a single company does the lion's share of enterprise virtualization: VMware. There's good reason for this: it has an excellent product and practically invented virtualization as we know it. But that doesn't mean one company should have effective control over an entire layer of the Navy's architecture. That kind of control gives the vendor strong incentives to extract as much rent as possible from its Navy customers, undoing Halvorsen's heroic cost-cutting measures. Worse, the Navy may find that the flexibility and agility of virtualization is undone by a vendor that has no reason whatsoever to ease migrations to other, cheaper platforms.

This is not hypothetical. Pat Gelsinger, VMware’s CEO, said at a partner conference earlier this year, “We want to own corporate workload…we all lose if they end up in these commodity public clouds. We want to extend our franchise from the private cloud into the public cloud and uniquely enable our customers with the benefits of both. Own the corporate workload now and forever.” Gelsinger’s statements are something to keep in mind as the Navy moves much of its unclassified work to public clouds.

We have seen this movie before, with mainframes, proprietary UNIX systems, and relational databases: when you have one supplier for a piece of critical infrastructure, things get expensive quickly and it is very difficult to leave.

Fortunately, alternatives exist. The trick for Halvorsen is ensuring that these alternatives are practical. That means ensuring a procurement environment that encourages competition among virtualization vendors. It means actively training staff on multiple platforms. It means creating a level playing field by employing open standards wherever possible. But most importantly, it means promulgating the policies and procedures the Navy needs for an orderly exit from any virtualization vendor it chooses.
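DMTF's Open Virtualization Format (OVF) is one example of such an open standard for packaging virtual machines. As a sketch of what orderly-exit plumbing might look like, the snippet below again assumes libvirt-managed hosts, an assumption rather than anything the memos prescribe, and archives each guest's hardware definition in libvirt's open, documented XML format so workloads can be reconstructed on another platform:

```python
# Exit-strategy sketch: inventory every guest on a host and archive its
# hardware definition (CPUs, memory, disks, NICs) as open, documented XML.
# The connection URI is a standard local example.
import libvirt
from pathlib import Path

conn = libvirt.open("qemu:///system")
outdir = Path("vm-definitions")
outdir.mkdir(exist_ok=True)

for dom in conn.listAllDomains():
    # XMLDesc() returns the complete machine definition as XML.
    (outdir / f"{dom.name()}.xml").write_text(dom.XMLDesc(0))
    print(f"archived definition for {dom.name()}")

conn.close()
```

None of this replaces the policy work, but keeping each workload's definition in an open format keeps the cost of leaving any one vendor visible and bounded, which is precisely the leverage a competitive procurement environment needs.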

If Halvorsen can apply the same courage and resourcefulness that went into these memos on acquiring a technology to the exit from that technology, he will be extraordinarily well positioned for whatever the future brings. That is what leadership looks like.

Gunnar Hellekson is chief technology strategist of Red Hat U.S. Public Sector.
