A review of Lee McIntyre, “On Disinformation: How to Fight for Truth and Protect Democracy” (MIT 2023)
In his 1962 book, “The Image: A Guide to Pseudo-Events in America,” future Librarian of Congress Daniel J. Boorstin predicted a time in which our ability to technologically shape reality would become so sophisticated that our creations would surpass reality itself. “We risk being the first people in history,” he wrote, “to have been able to make their illusions so vivid, so persuasive, so ‘realistic’ that they can live in them.” These illusions had already started to become “the very house in which we live; they are our news, our heroes, our adventure, our forms of art, our very existence.” And yet, Boorstin suggested, tearing down our illusions—disillusionment—was perhaps even more dangerous.
We increasingly find ourselves living in the world of Boorstin’s prediction: The social media landscape is splintering as specific communities migrate to distinct platforms and are influenced by different media universes, stories, and creators. Generative AI affords anyone both the power to create unreality and the license to deny the real. This compounds, and is compounded by, rising affective polarization and growing distrust of institutions.
In this moment, people who seek to deliberately manipulate the public have more powerful tools at their disposal than ever before. Therefore, understanding disinformation is an important endeavor for scholars and policymakers alike. Indeed, as we collectively experience multiple wars, an upcoming series of crucial elections, and life-altering challenges such as pandemics and climate change, it has become an imperative.
Enter Lee McIntyre, philosopher and author of “How to Talk to a Science Denier,” “Post-Truth,” and other writings focused on epistemic crises. McIntyre’s prior works have engaged with conspiracy theories, theorists, and pseudoscience, explaining why they are compelling to millions while offering actionable suggestions to address them. He takes a similar approach in his new work, “On Disinformation: How to Fight for Truth and Protect Democracy.”
As readers of Lawfare know, many books have been written on the topic of disinformation these past few years, many of which discuss the role of influence operations and disinformation in shaping our illusions. To name a few, there are Thomas Rid’s “Active Measures,” Jeff Horwitz’s “Broken Code,” P.W. Singer and Emerson T. Brooking’s “LikeWar,” and Richard L. Hasen’s “Cheap Speech: How Disinformation Poisons Our Politics.” The list goes on.
Lee McIntyre’s 2023 entry into the field—“On Disinformation”—is physically a very small book, immediately evocative of a pocket constitution (or, for readers of books about the political zeitgeist, of Timothy Snyder’s “On Tyranny”). Its form conveys its intent to be a field manual of sorts, a step-by-step guide to enable the reader to accomplish the mission declared in the subtitle: to fight for truth and protect democracy. The reader can tuck it into a back pocket while walking into battle … or stick it into the Christmas stocking of their crazy uncle.
The book has roughly three parts. McIntyre first explains why the project of responding to disinformation matters—what risks it poses, what impact it has. Then he describes the dynamics of modern disinformation, covering three types of participants in its supply chain—creators, amplifiers, and believers. Finally, he offers a series of suggestions for how to fight back.
The goal of the book is an important one: Disinformation is having significant impacts on society. Explaining how ordinary people can make a difference is empowering; too often, people concerned about disinformation assume that government or Big Tech will step in to fix things. Particularly since the problem is not strictly technological, that hope is mistaken. Emphasizing the role that ordinary people can play in their communities is important.
But the book also feels a bit dated, for two reasons. First, it frames the problem in terms of Trump and Russia. It is a convenient example, but too much emphasis is placed on Russia’s role. Ironically, this framing also potentially undermines the book’s goal. It is certainly an entertaining read, full of spicy prose and battle cries for those who already believe that disinformation is a problem. But those who most need to be convinced will likely be turned off. Second, many of McIntyre’s suggested solutions, particularly for regulators, have now been discussed for the better part of a decade. I found myself wishing that McIntyre had incorporated more up-to-date findings on trust, polarization, and social cohesion.
The first third of the book reads like a manifesto. McIntyre very concisely lays out why fighting disinformation matters, and in far more of a call-to-arms style than other books discussing the problem. He emphasizes the stakes, linking disinformation to totalitarianism and fascism within the first three pages: “Totalitarianism happens when people no longer have a strong sense of reality,” McIntyre writes, “when the division between fact and fiction, and true and false no longer exist.” His call will resonate with anyone who feels anxiety about global political trajectories, or who has spent time on social media marveling at the posts of some of their relatives. It will also hit home with the extremely-online, who have a front-row seat to fracturing publics and easily fabricated unreality.
Many of McIntyre’s arguments lean on American political examples. He quickly connects disinformation to election denialism, the Big Lie, and deliberate efforts by disinformation artists such as Steve Bannon to “flood the zone with shit.” However, the book ascribes too much credit to Russia; McIntyre describes Trump as having been the first to apply Russian-inspired disinformation tactics to U.S. politics and devotes a discussion to Putin and his role. This minimizes the extent to which the Russian trolls themselves mimicked homegrown American troll behavior and recycled homegrown American hyperpartisan propaganda, and gives the Russians too much credit for the largely self-inflicted devolution of American social cohesion and splintering of consensus reality. Disinformation’s most potent manifestation is often not the output of foreign trolls. They certainly throw gasoline on the fire and can achieve significant engagement while sowing discord and casting doubt on reality. But it is hyperpartisan influencers, media, and politicians who have perhaps the greatest potential for impact; they have personal incentives, significant reach, and the trust of their audiences. (Indeed, Russian trolls expended effort getting just such people to retweet their posts.) This is a nuanced conversation, however, and a manifesto-style field guide is perhaps not the ideal place for it.
The book does address the critical component of incentives. Denialism, McIntyre points out, is part of a coordinated campaign run by people who want to spread disinformation to the masses for their own benefit. Lies about voter fraud and Joe Biden stealing the 2020 presidential election, for example, benefit “the creators”—the first of the three groups of actors McIntyre focuses on—not the people who believe them. The creators of disinformation campaigns, McIntyre explains, are running the same playbook that Naomi Oreskes and Erik M. Conway described in their book “Merchants of Doubt”—except that instead of casting doubt on the harmful effects of smoking tobacco, or on the dangerous changes in the world’s climate brought about by hydrocarbon emissions, or on the integrity of scientists writ large, the denialist game has expanded to “reality itself.”
I very much appreciated McIntyre’s reference to “Merchants of Doubt,” a book that clearly lays out not only the strategy but also the tactics and goals of manipulators who act in order to preserve or extend their power. “Merchants of Doubt” has no Russians, nor bots, but describes partisan machine attacks and a playbook nearly identical to that deployed at this very moment by the most powerful domestic denialists of the 2020 election. This includes, of course, the hyperpartisan lawmakers and fake think tanks that churn out whatever chum will provide enough pretext for vicious smears and ginned-up investigations. In referencing this playbook, McIntyre pulls “disinformation” (and his book) out of the abstractions of academia and reframes it as the high-stakes power game that it is.
After covering the highly deliberate creators, McIntyre turns to the other two sets of actors in his disinformation supply chain: amplifiers and believers. Disinformation, after all, must be not only created but also amplified and believed in order to be effective. The amplifiers he focuses on are not just social media platform algorithms but broadcast media outlets as well, and it is good to see commentary focusing on a broad media spectrum. However, McIntyre pays little attention to perhaps the most important amplifiers of all: influencers and the digital crowds that surround them. McIntyre treats the public primarily as “believers,” but the amplification of disinformation today is increasingly recognized as a very participatory process; the distinction between believer and amplifier is far less clear-cut than it was in prior media eras. Indeed, it is precisely the agency and power of the believer-as-amplifier that makes online disinformation so potent today. Things do not simply “go viral” because of platform algorithms. We participate, making content go viral through individual acts of clicking and sharing that, in the aggregate, are a form of collective behavior that propels the content across communities, networks, and platforms.
The remainder of the book offers suggestions for how to address the problem of disinformation. McIntyre’s guidance about how to speak to believers is compelling. He recognizes that belief is a function not of facts but of values and identity—how we see ourselves and our communities. Disinformation aims to be polarizing, he notes, in order to exploit partisan enmity, not just doubt. It often relies on a particular frame: “They are lying to you.” And so, he offers ways to resist polarization and break down barriers, particularly through in-person outreach and patient, face-to-face conversations that build up trust—guidance he has offered in his prior work on science denial and post-truth as well.
Some of McIntyre’s other recommendations are related to regulation and accountability for tech platforms and the creators. These are a mixed bag. They include reintroducing the fairness doctrine, revising Section 230, having the platforms get more aggressive about moderating the influential individuals most active in creating and amplifying disinformation, and several others. Some of them are solid. For example, his call for regulators to require more transparency from platforms in giving outside investigators and the public access to social media data (yes, please!), a call many others have made, merits a rapid response from Congress and the relevant agencies. However, McIntyre’s call to reform Section 230, like many others, suffers from lack of specificity, and there is little popular support or momentum for bringing back the fairness doctrine. I would have preferred to see a more specific and nuanced focus on a handful of what McIntyre sees as the most impactful possible reforms, rather than a few dozen suggestions simply listed in what is a very short book.
Ultimately, many of the book’s suggestions treat disinformation primarily as a supply-side problem. We need models of how disinformation works that more tightly integrate amplifiers and believers because it is on the demand side that interventions may have the greatest impact. Indeed, given McIntyre’s prior work, it would have been interesting to see him delve more deeply into possible demand-side strategies to reduce disinformation’s corrosive effects. Overall, however, his call to build a nonpartisan countermovement committed to truth is critically important because, as he writes, “The truth does not die when liars take power; it dies when truth tellers stop defending it.”
– Renee DiResta. Published courtesy of Lawfare.