By Patrick Tucker
January 9, 2017
The U.S. military needs an entirely new system for storing and managing data if front-line troops are to find and act on information as easily as any of us can search Google, according to Eric Schmidt, executive chairman of Alphabet and former Google CEO.
Schmidt also chairs the Pentagon's Defense Innovation Advisory Board, a panel of technology and science leaders that includes Jeff Bezos and Neil deGrasse Tyson. At the board's meeting on Monday, Schmidt discussed the creation of a data storage and delivery system that sounds uncannily Google-esque.
The pitch came in the form of a new interim recommendation. (The board has voted to approve and forward its previous 11 recommendations.)
Though no individual board member contributes specific recommendations, Schmidt was clearly personally invested in this one. He explained that it arose from the group's internal discussions about future artificial-intelligence capabilities and from conversations with commanders across the U.S. military.
“In our meetings with the senior leadership, they talk about this thing called ‘data fusion.’ The fantasy goes something like: we’re going to have all these different signals; the signals will be automatically detected; the immediacy…will enable the warfighter to make a better decision,” he said.
It would work somewhat the way Google does: crawling the Web for new information, ranking it for relevance, and presenting it when it matches a user's request. The proposal would require a single network that lets any operator in the world reach any and all Defense Department data with a quick query, subject to appropriate permission levels. Need to pull up drone footage over Kenya from two days ago? Hunting for the design specs of a particular IED? If the DOD has it, it should be findable and mineable at scale. But that requires consolidating the data in fewer places so it can be indexed.
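The crawl-index-rank model Schmidt is describing can be sketched in miniature as an inverted index: each term maps to the documents that contain it, and a query returns matches ranked by how many query terms they hit. This is a toy illustration only; all document names and contents below are invented, not real DOD data.

```python
from collections import defaultdict

# Invented toy corpus, standing in for a heterogeneous document store.
docs = {
    "drone_report": "drone footage kenya surveillance",
    "ied_specs": "ied design specs triggering mechanism",
    "logistics": "fuel costs kenya supply routes",
}

# Build an inverted index: term -> set of documents containing it.
index = defaultdict(set)
for name, text in docs.items():
    for term in text.split():
        index[term].add(name)

def search(query):
    # Score each document by the number of query terms it contains,
    # then return document names, best match first.
    scores = defaultdict(int)
    for term in query.split():
        for name in index.get(term, set()):
            scores[name] += 1
    return sorted(scores, key=lambda n: (-scores[n], n))

print(search("kenya drone"))  # ['drone_report', 'logistics']
```

The point of the sketch is the one Schmidt makes: ranking and retrieval are the easy part. The hard part for the Pentagon is that the documents are scattered across systems that no single index can reach.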
Of course, Google relies on data that people around the world contribute to the open Web. Google itself doesn’t have to worry about hosting the information, just indexing it.
The Defense Department can’t just send its information to the open Web. But, says Schmidt, if the Pentagon could figure out a more centralized storage scheme, its leaders and commanders could take advantage of search capability at a variety of levels.
“There’s no place in the military where the data is centrally aggregated and a lot of organizations either hide the data, don’t know they have the data, lose the data or don’t care about the data,” said Schmidt. The problem, he said, "is that the signals aren’t available and they aren’t minable. So, [data fusion is] a great strategy but you have no way of implementing it. The reason we wanted to bring this idea up and then work it through the bureaucracy or whatever else you call it is that without some kind of data repository, set of data repositories…you are not going to be able to achieve that vision. It’s a clear bug in the strategy.”
Centralizing data would allow future machine-learning and AI programs to mine the information and, at least in theory, discover new correlations and patterns, the sort of analysis that today takes analysts years. In theory, if streaming data on, say, fuel costs, weapons production, mission milestones, and casualties were all in one place, leaders would have a far more detailed, accurate, and timely understanding of the global conflict environment and of how far ahead or behind they were.
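The kind of cross-dataset correlation hunting described above is only possible once the series live side by side. A minimal sketch, using a plain Pearson correlation over two invented monthly series (the figures are made up purely for illustration):

```python
import math

def pearson(xs, ys):
    # Pearson correlation coefficient between two equal-length series:
    # covariance divided by the product of the standard deviations.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented monthly figures, standing in for two feeds that today
# sit in separate systems and so can never be compared this easily.
fuel_costs = [1.0, 1.2, 1.1, 1.5, 1.7, 1.6]
missions = [10, 12, 11, 15, 17, 16]
print(round(pearson(fuel_costs, missions), 3))  # 1.0 (perfectly correlated toy data)
```

Trivial on two lists, but the prerequisite is exactly Schmidt's point: both series have to be findable and mineable from one place before any algorithm can compare them.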
Schmidt, however, cautioned that the centralization process should be gradual.
“In practice, you would never do such a broad release to the whole military, for security reasons," he said. "You would not have one big database. But the principle is the same."
Schmidt acknowledged that centralizing Defense Department data would also create new information targets whose compromise might put the nation at unfathomable risk.
“Now, before we get too excited about databases here: The databases have to be secure," he said. "These are secret, secret information, secured by all the computer scientists that we hire.”
But security is more a matter of will and implementation than miracle work, said Schmidt.
“Having worked with and done this for a long time, the algorithms to provide absolute security exist. They just simply have not been implemented. This is a computer science problem. Basically, if you use 2048-bit encryption" (which would take a standard desktop machine well over a million years to break) "you use two-factor authentication, your information is not going to be leaked except by illegal activity by humans.”
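Taken literally, the arithmetic behind the key-length claim is easy to check: a 2048-bit key has 2^2048 possible values, so even an attacker testing a trillion keys per second (a generous assumption) could not exhaust the keyspace in anything close to a million years. (Real attacks on 2048-bit RSA target the factoring problem rather than brute force, but the brute-force bound alone already dwarfs Schmidt's figure.)

```python
# Back-of-the-envelope check on the "million years" claim.
keyspace = 2 ** 2048                   # possible 2048-bit keys
guesses_per_second = 10 ** 12          # assumed (generous) attacker speed
seconds_per_year = 365 * 24 * 3600
years_to_exhaust = keyspace // (guesses_per_second * seconds_per_year)
print(years_to_exhaust > 10 ** 6)      # True, by an enormous margin
```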
By Patrick Tucker // Patrick Tucker is technology editor for Defense One. He’s also the author of The Naked Future: What Happens in a World That Anticipates Your Every Move? (Current, 2014). Previously, Tucker was deputy editor for The Futurist for nine years. Tucker has written about emerging technology in Slate, The Sun, MIT Technology Review, Wilson Quarterly, The American Legion Magazine, BBC News Magazine, Utne Reader, and elsewhere.