
'Information Disorder' Is Biggest Social Danger, Commission Warns

"When bad information becomes as prevalent, persuasive, and persistent as good information, it creates a chain reaction of harm,” an Aspen Institute report begins.

The dangerous conundrum at the heart of our interconnected society is how to contain disinformation and misinformation.

“Information disorder is a crisis that exacerbates all other crises. When bad information becomes as prevalent, persuasive, and persistent as good information, it creates a chain reaction of harm.” Those are the opening lines of the report released yesterday by the Aspen Institute’s Commission on Information Disorder.

The three co-chairs of the commission—Katie Couric, a TV journalist with decades of experience; Chris Krebs, the first director of the Cybersecurity and Infrastructure Security Agency; and Rashad Robinson, president of the advocacy group Color of Change—introduced the report and discussed its recommendations in a webcast Nov. 15.

“If we want to reduce information disorder, there are structural changes that we can and must make to our information ecosystem, and there are rules that we can and must implement to better govern the decisions and behavior of information platforms and propagators,” they said in their letter introducing the report.

The commission made recommendations in three broad areas to reduce the toxic online stew:

  • Increasing the transparency of platforms, including facilitating public interest research, requiring platform providers to disclose their content moderation policies and practices, and requiring the release of information about paid digital ads and paid posts;
  • Building users’ trust in the accuracy and reliability of information, including supporting local journalism outlets and promoting new norms for professional organizations that include “personal and professional consequences” for those who violate the public trust; and
  • Finding ways to reduce the harms caused by mis- and disinformation, including a comprehensive strategic approach by the federal government, holding “super spreaders” of bad information responsible, and amending Section 230 of the Communications Decency Act of 1996 to hold platform providers to the same standard as other media outlets for paid advertising and post promotion.

“It’s all of our problem because it affects all of us,” Robinson said. “It’s a tactic that’s used to take advantage of things that are already broken or [are] being broken.”

Couric noted that strengthening local journalism is important because of its role in creating a “well-informed electorate … If people don’t know what’s going on in their community, they’re less inclined” to participate in community life, including voting. She said that one in five people in the United States currently lives in a local news “desert.” 

Krebs described the issue as a “whole-of-society problem” that is rapidly becoming a national security issue. He said “of course” the federal government has a role to play.

“What we need is clarity of mission, clarity of purpose. It has to be holistic, not a ‘Ministry of Truth,’” he said. “Part of what we really need is a stock-taking of the roles and responsibilities of the various agencies in the executive branch, and to identify the authority gaps.”

All three agreed that the First Amendment should not be seen as an impediment to trying to minimize lies and half-truths.

“The question of free speech is a straw man in some ways,” Robinson said. “It confuses and directs us away from” addressing the problem. “Freedom of speech doesn’t mean we have to live in a society without rules, [or] where lies outnumber the truth … There have always been regulations to protect the public from harm” and hold accountable the people trying to cause harm.

Krebs suggested this is a subject where the political parties can find common ground. “If there is any single issue that Republicans and Democrats alike have expressed concern about, it’s social media platforms’ role in dis- and misinformation,” he said. “There are attainable, achievable objectives. At a minimum the transparency recommendations are actionable.”

Neither the co-chairs nor the report itself says the problem can be eliminated. “To be clear, information disorder is a problem that cannot be completely solved. Its eradication is not the end goal,” the report stated. “Instead, the Commission’s goal is to mitigate misinformation’s worst harms with prioritization for the most vulnerable segments of our society.”