
This Is How America's Spies Could Find the Next National Security Threat

A recent breakthrough in online prediction markets promises a better glimpse of the future – paid for by U.S. intelligence. By Patrick Tucker

There’s a new website in town that looks to crowdsource predictions about everything from drone delivery to the future price of Bitcoin. The SciCast site, which researchers at George Mason University launched in December, allows users from around the world to make predictions and pose questions in order to forecast possible future events and technological breakthroughs. And it’s funded by the Director of National Intelligence’s Intelligence Advanced Research Projects Activity, or IARPA.

“We’re forecasting science and technology events that might be of national security interest, but we’re also creating technologies that can be applied to a wide range of national security questions,” said George Mason University economics professor Robin Hanson, one of the project principals.

The site is technically a prediction market, a sort of virtual casino where participants can wager on real events in much the way gamblers bet on the outcome of big football games. Prediction markets aren’t new. GE, Google, Motorola, Microsoft and various other companies have all used online prediction markets to get a consensus snapshot of the future. Sites like Inkling Markets allow regular individuals to bet on future events and to pose questions to the crowd.

Online betting pools have proven useful in answering a wide range of national security questions. In early 2003, users of a prediction market called TradeSports.com correctly forecast a very low probability that weapons of mass destruction would be found in Iraq, at a time when the CIA and the Bush White House continued to assert that WMDs were present in that country. Prediction markets can also gauge how the odds of a technological breakthrough, like the development of a new weapon, might shift under varying conditions of funding, political support or the availability of technical expertise.

“If you have a particular opponent or party of interest, you could ask what they might do, or how they would respond to a particular policy, an outside event, like a change of the price of oil; or how they might respond to a neighbor,” Hanson said.

The SciCast project isn’t the only crowdsourced intelligence effort currently receiving funding from IARPA; another is the Good Judgment Project, which has been under development for years. SciCast, however, represents a key innovation over the way prediction markets are usually run. It’s a so-called combinatorial market, meaning that the algorithm combines different answers and other bits of information in real time to provide a more accurate, moving picture of how all of these questions and areas interact with one another. For instance, predictions about the future electric car market could influence probability scores for breakthroughs in batteries or energy storage. The site weights every respondent’s answers by how many previous answers they got right and how much they wagered, which together determine their influence. Unlike a simple running survey, participants can go back and refine their predictions in light of new information. The site tells the user whether the probability score in a given area is moving up or down.
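The article doesn’t spell out SciCast’s exact scoring algorithm, but Hanson is the originator of the logarithmic market scoring rule (LMSR), the pricing mechanism behind many prediction markets, including combinatorial ones. A minimal sketch of how such a rule turns wagers into a moving probability (the liquidity parameter b and the trade size here are illustrative assumptions, not SciCast’s actual settings):

```python
import math

def lmsr_prices(q, b=100.0):
    """Implied probabilities for each outcome under the logarithmic
    market scoring rule, given outstanding share quantities q."""
    exps = [math.exp(x / b) for x in q]
    total = sum(exps)
    return [e / total for e in exps]

def lmsr_cost(q, b=100.0):
    """Cost function: the total traders have collectively paid in."""
    return b * math.log(sum(math.exp(x / b) for x in q))

# A two-outcome question ("Will X happen by 2015?") starts at 50/50.
q = [0.0, 0.0]
assert lmsr_prices(q) == [0.5, 0.5]

# A trader buys 50 "yes" shares. What they pay is the change in the
# cost function, and the implied probability of "yes" rises.
new_q = [q[0] + 50.0, q[1]]
trade_cost = lmsr_cost(new_q) - lmsr_cost(q)
p_yes = lmsr_prices(new_q)[0]   # roughly 0.62
```

The liquidity parameter b controls how far a given wager moves the price: the larger the bet relative to b, the bigger the swing, which is one way a market makes each user’s influence proportional to what they stake.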

“The whole point of the combinatorial space is that you can ask about these relationships between things. You can see how it’s a pretty broad range of applications,” Hanson said. SciCast is the first project to employ this unusual methodology at such a scale. There are more than 3,000 forecasts on the site and about 1,000 registered users, with many more expected in the coming months.

Does it work? In a multi-year study that used this methodology on a previous website (called DAGGRE, also funded by IARPA), the combinatorial approach performed 40 percent better than simply averaging a bunch of forecasts, Hanson said. If his numbers are correct, the combinatorial method is a far better window into the future than simplistic crowdsourcing techniques.
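The 40 percent figure refers to DAGGRE’s full combinatorial engine, but the core intuition, that weighting forecasters by track record beats treating every answer equally, can be sketched in a few lines. All names, forecasts and accuracy scores below are hypothetical:

```python
# Hypothetical forecasters predicting the probability of the same event.
forecasts = {"alice": 0.9, "bob": 0.4, "carol": 0.7}
# Hypothetical track records: fraction of past questions each got right.
accuracy = {"alice": 0.8, "bob": 0.3, "carol": 0.6}

# Baseline: simple averaging treats everyone equally.
simple_average = sum(forecasts.values()) / len(forecasts)

# Weight each forecast by the forecaster's track record, in the spirit
# of SciCast's weighting (the real site also factors in wager size).
weighted_average = (
    sum(forecasts[u] * accuracy[u] for u in forecasts)
    / sum(accuracy.values())
)

print(round(simple_average, 3))    # 0.667
print(round(weighted_average, 3))  # 0.741
```

With these made-up numbers, the weighted estimate leans toward the historically reliable forecaster and away from the unreliable one, which is the effect a track-record-weighted market exploits.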

The use of prediction markets in the context of national defense isn’t without historical controversy. Following the Sept. 11, 2001, terror attacks, retired Rear Adm. John Poindexter, director of the Office of Information Awareness at the Defense Advanced Research Projects Agency, or DARPA, proposed the creation of a prediction market for the broader intelligence community. The objective was to allow spies, informants or anyone who might have inside information on a geopolitical event of any significance to offer insights into how these events might play out. Rather than rely on peoples’ sense of civic duty to volunteer this information, the program sought to pay people for contributing those bits of intelligence they might not give up so easily. The program was called FutureMAP and one of the projects associated with it was called the Policy Analysis Market, or PAM.

As writer Donald Thompson outlines in his book Oracles: How Prediction Markets Turn Employees Into Visionaries, the PAM website very quickly attracted powerful enemies in Congress after a rather faulty public unveiling. “The first PAM website offered ninety-five sample questions. Three of the more colorful examples related to an assassination of Palestinian leader Yasser Arafat by the first quarter of 2004, a North Korean missile attack on any other country by the fourth quarter of 2003, and the overthrow of King Hussein of Jordan by the fourth quarter of 2003. When DARPA reported on the progress of the project to Congress, it used as one example the question, ‘Will terrorists attack Israel with bio-weapons in the next year?’ The hypothetical website question on assassination plus the biological war reference sealed PAM’s fate.”

Sens. Byron Dorgan, D-N.D., and Ron Wyden, D-Ore., held a press conference in July 2003 to attack the PAM project. Wyden called it “ridiculous,” “grotesque” and a “federal betting parlor on atrocities.” Dorgan called the PAM site “stupid” and suggested that it might actually work to cause exactly the sorts of terrible acts that it was designed to predict. The PAM project was canceled the next day.

Hanson, in addition to being one of the pioneers in the development of the prediction market concept, was also deeply involved in PAM, an involvement he has documented in great detail. More than ten years on, the fact that prediction markets have proven so effective compared to other means of information gathering is something of a bittersweet victory for him. Lawmakers, he argues, simply didn’t understand how prediction markets work and never truly grasped how PAM could be useful.

“We were developing a combinatorial project [at DARPA]. Killing that project put the combinatorial approach on hold…You would have seen something a lot like [this approach] 10 years ago,” he said.
