The very idea of truth and science, Jonathan Rauch argues, is now under threat from many quarters. In his latest book, The Constitution of Knowledge, he gives a novel account of the principles of science, and explains why democracies must strive to preserve the truths that bind us together.
In this week's conversation, Yascha Mounk and Jonathan Rauch discuss the dangers of disinformation, the limits on robust debate, and why truth is fundamental to preserving democracies around the world.
This transcript has been condensed and lightly edited for clarity.
Yascha Mounk: Why should we think that truth is so important and under attack at the moment?
Jonathan Rauch: The subtitle of my book is “A Defense of Truth,” but it's really a defense of objective knowledge, which is as close as we ever get to truth—meaning we're always adjusting it, but it is cumulative and it is progressive and it did just put the vaccination in my left arm that's protecting me from COVID right now. So it's important for that reason, but it's also important because every society, whether a small tribe or a mighty nation, needs a way to come to some kind of agreement about what's real and what's not, at least on big public questions. And if you can't do it, societies become unmoored from reality. They break into sects with conspiratorial leaders. They become vulnerable to demagogues. Very frequently, they get involved in complete breakdowns and civil wars. Symptoms of what's been called an “epistemic crisis” include things like extreme polarization, conspiracy theories, “chilling”—where a lot of people are just afraid to speak out. And we're seeing all of those things now in the United States. The thesis of my book is that this isn't “just happening.” This isn't like a natural disaster. This is partly resulting from a very deliberate attack on what I call “the Constitution of Knowledge” by nameable actors who know what they're doing, and are acting in very sophisticated ways to undermine our epistemic constitution.
Mounk: So who are these actors? What is the nature of this attack on truth?
Rauch: The core notion here is “information warfare.” I define that as organizing and manipulating the social and media environments for political advantage, specifically to dominate, divide, disorient, and demoralize a political opponent. And there are lots of ways to do this. This goes back hundreds of years, but the two big ones are disinformation and canceling—and of the two, I worry more about disinformation. Here, the nameable actor would be Donald Trump and his enablers and friends in conservative media, cable news, talk radio, the Republican Party. Some foreign actors are involved in this as well. But I argue that we should not just view Trump as some kind of idiot savant or bumbler who does a lot of silly, stupid things and lies sociopathically. In fact, I argue he is the most brilliant and innovative and effective propagandist since the 1930s. He's up there with Putin. I think he's better than Putin. And the reason is that he figured out something no one else ever tried before, ever thought of trying in an American context, which is adapting and refining Russian-style disinformation tactics for the US political environment, and turning those tactics against the American people from the office of the presidency. The culmination of that campaign is, of course, “Stop the Steal,” the massive disinformation campaign against the election. It has convinced 75% of the Republican Party that this is no longer a democracy, that the person in the White House did not win the election. We've never seen anything like that in America, at least not since the 1850s. It's frightening.
Mounk: You say that this is a set of tactics which in certain ways were first pioneered by Putin, but do you think that Trump actually used them in a more effective or more subtle way? What are those tactics?
Rauch: The first and most important is what researchers at the RAND Corporation and elsewhere call the “firehose of falsehood” tactic. You just put out masses of lies at a rate which no one can keep up with. You can't possibly debunk it all: It would be silly even to try, because it doesn't matter if the lies are mutually contradictory. But what you do is you swamp the information system with so many lies and conspiracy theories, people become disoriented, they become cynical, they don't know who to trust, you dumbfound and disorient mainstream media that try to check it [but] can't possibly keep up. You do this over every channel simultaneously. We saw that with “Stop the Steal”: They were using the office of the presidency and his Twitter account, but they were also using conservative media, Republican politicians, and the courts. Dozens of lawsuits filed, all of them meritless, [were] another form of spreading disinformation. This just disorients and drowns people. It's very effective. They don't know which way to turn, and that opens the door to demagogues. Putin is very good at this. A good example is Sergei Skripal—the poisoning of him and his daughter. Some [Russian] agents went to Britain and used Novichok, a nerve agent, to poison two people, and when they were nailed for it, [they] said, “We have an explanation for this. In fact, we have dozens of them.” And they just poured out: “Well, it was a suicide. It was a lover's quarrel. It was an accident. It wasn't Novichok. It wasn't a nerve agent. It was a nerve agent, but another nerve agent.” And so forth and so on.
Mounk: It's not saying, “Here is one countertheory, which is as plausible as the main one.” It is trying to muddy the waters by coming up with lots of different theories.
Rauch: To coin a phrase, this is not about persuasion: This is about disorientation. [...] As Steve Bannon famously put it, “Flood the zone with shit.” This is a structured attack, and it is coordinated. Trump was coordinating it. That's why he started the campaign against mail-in voting. He was signaling to a network what the message was going to be, and he was setting up the network and testing it so that after election day it would be ready to go.
Mounk: How do you defend the Constitution of Knowledge against disinformation? Because I recognize the danger you speak about, and I take it very seriously, but I also worry that there's been a rise of the “misinformation expert” over the last few years, and that these misinformation experts both legislate ex cathedra about what is misinformation—often without themselves being particularly expert in the things they're pronouncing on—and tend always to have the same remedy for misinformation, which is to censor.
We've already had at least two important contexts in which something that was widely proclaimed by The New York Times, The Washington Post, and CNN to be misinformation or disinformation later turned out to be at least partially true, and is now being treated as serious by mainstream media outlets. And so I worry about the way in which well-intentioned people can use these very real concerns about misinformation or disinformation to erect an effective censorship regime, which will often misfire. That can have bad substantive consequences if it means that certain truths actually get banned in the name of fighting misinformation. And, of course, it only serves to delegitimize mainstream institutions even more.
Rauch: I would argue that [the debate about the origin of Covid, for example] is a success story for the mainstream media, not a failure. It starts with a failure, which is that they got it wrong. But I look back at the original fact-checks, and they were reasonable, because they were reflecting what scientists were saying at the time. It turned out to be an error. And Trump complicated things by weaponizing it. But it was also the mainstream media at The Washington Post, The New York Times, and The Wall Street Journal—it doesn't get more mainstream than that—that kept on this story, dug it out, re-elevated it. And now you can't pick up the paper without reading a post-mortem from the mainstream media asking, “Where did we go wrong?” So the errors are inevitable. And the biases are inevitable, because that's how we're wired as humans.
Mounk: But the extent of groupthink is not inevitable. I don't think there was ever scientific consensus about this. There was a media simulacrum of a scientific consensus about this, and anybody who diverged from that false consensus was canceled for 15 months. And then, suddenly, as one, within a week, all these mainstream media outlets talk about the theory. I don't think that's a story of media success. I think that’s a story of media failure.
Rauch: Well, I don't quite agree with your characterization of how sudden and herd-like and cancel-like it was, and maybe that's a separate conversation. I agree that it was a mistake. I also think it's been corrected, largely because of the fact that major media went after it and stayed on it.
Mounk: Or was it because the politics have changed, and suddenly the White House has instituted a commission to study this and so the media follows the White House? This seems like an incredibly partisan story to me.
Rauch: That too, of course. Then you get to the question of what to do in a heavily manipulated media environment. Granted, the media blew that story. A lot of people blew that story. Science blew that story. But one of the reasons they blew it is that they had a guy in the White House who was spreading conspiracies every day, and they were looking with an extremely jaundiced eye—as they should—at anything he said. And that's part of the problem with being in one of these distorted media environments: You can't get it right. If you think it's a conspiracy, and you're wrong, they say you got it wrong. If you think it's a conspiracy, and you're right, and you rebut it, you're giving it airtime. This is why you don't want to get into one of these traps to begin with: They create a situation in which it's impossible for people to function. I would shift the blame there, to some extent, away from the mainstream media and toward the people who created the environment in which they couldn't function.
You asked a bigger question there, which is: what do we do about all this? What works is a kind of all-of-society, multi-layered response, which is how we've done it in the past. We've had these problems before, with the printing press and with 19th-century journalism. You begin to create some standards and guardrails that hopefully can guide you through it and bring you to a better place. I think Facebook is doing that now with its oversight board. People are very cynical about that. I don't think they should be, because the way we got out of the trap of massive fake news and extreme partisan media in the 19th century was to start creating journalism schools, ethics codes, and the American Society of Newspaper Editors, which eventually gave us a system that allowed us to get our bearings as reporters. So some of it is institutional, and a lot of it is going to be product design at a very granular level.
Censorship and heavy-handed bans will have to happen sometimes, because people are violating Terms of Service. But what's really going to work is figuring out how to de-amplify and how to amplify. I believe in freedom of speech, but not freedom of reach. These algorithms were tuned, in fact, to put outrage in front of us, and often to put falsehood in front of us, because people click on it. So you're going to have algorithmic changes. A big change is in the news media, [which is] getting a lot more sophisticated about understanding disinformation. I would argue that in the 2020 election, the news media did a much better job of not just lying prone before falsehoods and conspiracy theories of the kind that we saw in 2016. And now they have reporters covering disinformation. They're better about providing context. There's going to be a public education component. It looks like folks in Europe are way ahead of us on this, but the evidence suggests that media literacy instruction has some positive effect, helping people understand how to sort through stuff online. Getting Donald Trump out of office is a huge help. People have compared this to an immune reaction: You've got to develop multiple layers of immune responses in society. And it's touch and go, I tell people all the time—you as a European will know this very well—the tactics we're talking about here are tried and true. Lenin used them, Hitler used them. They're very effective and sophisticated and powerful. They have not been deployed in the American context this way before. We've got to take them very seriously and understand them. There is no guarantee that we just sort of get out of it on our own, without thinking it through.
Mounk: Let's talk about the second half of your account, which is not disinformation but cancellation. You had a really wonderful article early in the days of Persuasion about how you distinguish cancellation from straightforward criticism. What is the nature of cancellation?
Rauch: Cancellation is not like criticism. Criticism is about the rational exchange of ideas in hopes of finding truth; canceling is about manipulating the environment for political gain. And it's also important to understand that the goal here is to get people to self-chill, to self-censor. So you're not just going after the particular idea that Yascha Mounk may have. You don't want safe harbors. You want to make people afraid that anything they say could get them into trouble. If it comes anywhere near a topic—it's usually going to be race or gender or sexuality, but it could be a lot of other things—you intimidate them, and they're silent, and their point of view doesn't get represented. The other part [of cancellation] is really sophisticated because it's cognitive: humans look to each other to figure out what's true. If we're the only one in a room of eight people who says x, and everyone else says y, a lot of the time we will decide that y must be right, even if x is demonstrably true, because all those other people can't be wrong. So when cancelers mess with the environment to suppress one point of view, they bring in a lot of other people who say, well, maybe this point of view that seems so strange and wrong and excessive and illiberal must be right. What people are trying to do here is create an environment that's just studded with landmines. You don't know where they are, you don't know where you can walk—and, remember, the ultimate goal here is to demoralize the other side, because demoralization is demobilization. There's the famous case of David Shor a year ago. A left-leaning, Democratic political analyst, he tweets out an accurate account of a solid piece of academic work and loses his job after people gang up on him. That's not a conversation about a viewpoint. It's not even really about David Shor. It's about a demonstration that at any given moment, no one is safe: We can come after you, we can go after your friends and your jobs. So we're in charge around here. That's the real agenda.
Mounk: Why has cancellation risen as a threat to the constitution of knowledge? Once you have someone like Donald Trump in the White House, or even as a major presidential candidate, it's easy to see why disinformation suddenly takes a more central role. I think with the rise of cancellation campaigns, it's a little less clear. Is the source just technological? Is it the existence of Twitter with its algorithm that favors the most controversial tweet, or is there a broader reconfiguration of our moral, political or intellectual landscape?
Rauch: I think that the technique itself is ancient. Tocqueville cited it as the biggest threat to freedom in America in the 1830s. John Stuart Mill cited it as the biggest threat to freedom in Britain in 1859. There's nothing new about using social coercion to create “spirals of silence,” as they're called. So what turbocharges it now is, for one, the technology. [In the past] you'd have to do it by mail, or take out an ad in the newspaper. Now it's trivially easy to dogpile—literally, you can do it with a few clicks of a button. A second is the emergence of “emotional safetyism” as a doctrine, which is the notion that if you're saying something I disagree with, you're actually committing an act of violence against me, a human rights violation. This turns out to be a powerful tool for intimidating people. It's what was used at The New York Times to fire James Bennet: A lot of staffers said, “Running an editorial we disagreed with was the equivalent of violence. It made us unsafe.” A third is generational change. A fourth is something people discovered recently: employers are a very vulnerable target. They are wired to avoid controversy. They're not there to promote free speech by their employees. So if I go after Yascha, and Yascha has an employer, the easiest way for the employer to solve that problem is to fire Yascha. That's a major vulnerability.
What's being appealed to here are deeply good things. For instance, anti-racism. I've learned a lot from anti-racist ideas. I've changed as a result, and I'm grateful for that. The problem comes when people don't just argue for the ideas, but use coercion—illiberal means to regulate how we talk about those ideas. No one wants to be accused of being a racist: We want to be on the right side of this issue. That makes it hard for us to distinguish between the ideas themselves, which may be good or may have good in them, and the tactics that are being used to promote the ideas, which can be illiberal and authoritarian.
Mounk: What is it about science, the pursuit of knowledge, that is so important and so noble, and allows us to make such great progress? Tell us a little bit more about the thing that is actually worth defending.
Rauch: We forget that because [the Constitution of Knowledge] worked so well for so long, we just assume a marketplace of ideas, free speech, and everything takes care of itself. That's not how it works at all. If you just leave people alone to have open exchange without any structure, any rules, any institutions, they associate with people who agree with them, they engage in bias confirmation, they divide into sects, they go to war, and that's 200,000 years of human history.
You're going to have to go out and engage a global network of people—most of whom are complete strangers, most of whom have very different views—and persuade them and talk to them and interact with them in structured ways, with things like peer review and fact-checking, through journals and organizations. You're going to have to learn whole vocabularies that allow you to engage these other people productively. But once you do that, you will have a global network of millions and millions of minds, in places around the world, working in multiple languages, capable of taking a hypothesis and checking it within hours, and doing so in a cooperative way that doesn't require anybody to be in charge. I mean, when you think about this, it's absolutely fantastic. I argue that it is a species-transforming technology. Every human could die, [but] our knowledge, our objective knowledge, would still be there, and an alien civilization could come to our planet, decode our books and our databases, and reconstitute and use all of that. This has, as Jonathan Haidt puts it, elevated our performance far above our design capacity.
Mounk: How do we defend the Constitution of Knowledge against attacks?
Rauch: The Constitution of Knowledge only works when you have lots of viewpoint diversity. We never see our own biases: We believe we’re unbiased and everyone else is biased. You can only find errors if you have lots of different points of view. Unfortunately, there are a lot of newsrooms and a lot of academic departments where there just aren't enough conservative voices, right and center-right. I'm talking about classical liberal perspectives. And when you get an environment that is tilted so heavily to one side, you make errors, you make mistakes, you don't see problems, you fail to ask important questions. And there's now worrisome evidence of that happening in a systematic way in academia. It's certainly happening in journalism, in some important newsrooms. And so I think something we've got to do to strengthen the system is take viewpoint diversity every bit as seriously as we do ethnic identity and demographic diversity. We've got to start looking around and saying, “What are the obstacles to recruiting and attracting conservative voices in sociology departments?” We're not used to thinking that way.
Mounk: Let me ask you a final question, on the pessimistic ledger, about the future of a Constitution of Knowledge. Historically speaking, the time in human history when we've been able to sustain something like a Constitution of Knowledge is very short and geographically limited. On the optimistic side of the ledger, I suppose I would put the fact that there is some commitment to it, at least in liberal societies; the fact that it allows us to do very, very important things; and the fact that perhaps societies that are better at maintaining a Constitution of Knowledge are likely to outcompete societies that don't, so there is a kind of quasi-evolutionary pressure to maintain and adopt the Constitution of Knowledge. But how do those balance out?
Rauch: There are structures and systems, institutions, norms, nameable people: places like the American Association for the Advancement of Science; agencies like the National Oceanic and Atmospheric Administration; nameable individuals like Liz Cheney, who stood up for truth at great personal cost. We need to understand that those people and institutions and norms are there, and we can't take them for granted. We need to defend them and not just assume everything works out. If we just stumble ahead blindly and let all these attacks proceed, there are going to be serious problems. If that happens, then we're looking at Russian-style disinformation being a feature of American politics for a very long time. If we do wake up and become more conscious that we have a Constitution of Knowledge, that we need to defend it, then we can defend it. I think if we do that, then we squash the other side like a bug—maybe not right away, but we have the advantage of enormous institutional depth and we have the advantage of reality. Trolls and disinformation artists and cancelers, they lose touch with reality. They can't put that vaccine in my arm. They're parasitic. They're nihilistic. They're negative. All they can do is restrict. So if we get our act together, I think we've seen off significantly worse challenges in the past. I think we will again. But that's a conditional statement.
Please do listen and spread the word about The Good Fight.
If you have not yet signed up for our podcast, please do so now by following this link on your phone.
Email: podcast@persuasion.community
Website: http://www.persuasion.community
Podcast production by John T. Williams and Rebecca Rashid
Learn more about your ad choices. Visit megaphone.fm/adchoices
Connect with us!
Twitter: @Yascha_Mounk & @joinpersuasion
Youtube: Yascha Mounk
LinkedIn: Persuasion Community