Persuasion
The Good Fight
The Good Fight Club: Who’s a Hypocrite About Free Speech?

Yascha Mounk is joined by Renée DiResta, Jacob Mchangama, and Jonathan Rauch at the Global Free Speech Summit in Nashville!

In this episode of The Good Fight Club, Yascha Mounk, Renée DiResta, Jacob Mchangama, and Jonathan Rauch discuss threats to free speech under Joe Biden vs Donald Trump, how to protect free speech, and the administration’s new compact for universities.

Renée DiResta is an Associate Research Professor at the McCourt School of Public Policy at Georgetown and author of Invisible Rulers: The People Who Turn Lies Into Reality.

Jacob Mchangama is the Executive Director of The Future of Free Speech and a research professor at Vanderbilt University. He is also a Senior Fellow at The Foundation for Individual Rights and Expression (FIRE) and the author of Free Speech: A History From Socrates to Social Media.

Jonathan Rauch is a senior fellow in the Governance Studies program at the Brookings Institution, and a member of the Persuasion Board of Advisors.


Yascha Mounk: Welcome to this live recording of The Good Fight Club coming to you directly from the Global Free Speech Summit 2025 at Vanderbilt University. We have with us—please give them a big round of applause—Jacob Mchangama, who is the executive director of The Future of Free Speech and the organizer of this conference. We have Renée DiResta, a professor at the McCourt School of Public Policy at Georgetown and author most recently of Invisible Rulers. We have Jonathan Rauch, a senior fellow at Brookings and a contributing writer at The Atlantic.

This summit is about free speech, and we’ve had really great speeches and conversations about free speech over the last 24 hours. There is, I think, a question about free speech which is hard to escape at the moment, which is that virtually everybody who claims to care about free speech seems to be a bit of a hypocrite. When you look at some of the loudest defenders of free speech on the right of American politics, whether that’s somebody like Elon Musk or somebody like Donald Trump, suddenly they are demanding the firing of people who offend them. They talk about the importance of what they call “consequence culture.” They say of course people should be free to say whatever they want, but then there should be all these terrible consequences for when they express themselves in unpopular ways.

The left is just as hypocritical. The left has spent, or these parts of the left have spent, years arguing for consequence culture and talking about why actually free speech is a reactionary conservative value. Now suddenly, when Jimmy Kimmel is fired, they are mounting rousing defenses of free speech. So the slightly provocative question I’m going to ask each of you is: should we give up on free speech if everybody who talks about free speech is actually a hypocrite? If not, then why should we care about this value?

Jacob Mchangama: We are all hypocrites about free speech. It is human nature. Free speech is a deeply difficult concept and in some ways counterintuitive to human nature, I think. The good thing is that we have made progress. The circle of accepted speech, at least in mature democracies, has grown a lot. I do not think we will ever escape hypocrisy. It is what, in my book on the history of free speech, I call “Milton’s Curse” after John Milton, the famous English poet who wrote the Areopagitica in 1644, a very famous defense of free speech. But when you read it more carefully, it becomes clear that Milton is not in favor of free speech for Catholics. He is not in favor of blasphemy. In fact, he is mostly in favor of free speech for mainline Protestant sects. So it is definitely not a new concept, but it is something to be on guard against.

I think this is where, in the United States, the value of the First Amendment becomes really clear because it provides a very strong, robust protection against government overreach. It is a necessary but not a sufficient protection for free speech because unfortunately, as we see, cultural institutions, law firms, and media outlets voluntarily cave to the pressure of the Trump administration, even though they could rely on the strongest free speech protection in the world, which I think is an affront to dissidents in Iran and Russia who cannot rely on that protection. But I think it is a big difference, particularly compared to the European model, where a lot of the things that the Trump administration is threatening to do, but which would violate the First Amendment, are perfectly feasible and actually being carried out in Europe right now.

Ultimately, we need that civil libertarian impulse to win the day. That is extremely difficult at a time of intense polarization, because polarization just overrides the civil libertarian impulse.

Mounk: Jonathan, if we’re all hypocrites, why insist on the value?

Jonathan Rauch: We are all hypocrites some of the time on some issues, and that is okay with me. It is just human nature to be more reluctant to defend the speech of people we loathe than the speech of people we love. That is quite natural. It is, however, why we develop institutions and rely on them to give us guardrails, and those are what are being crashed through right now.

Renée DiResta: When I saw the prompt, one of the things that occurred to me was the extent to which hypocrisy is maybe not the best frame, in part because I think a lot of people do not actually know the specifics. They do not understand the nuance. They do not really know the details. One of the things that I think a lot about in the context of my work, at the intersection of speech and the internet and content moderation, is the sheer lack of understanding of the facts of the matter: the extent to which that allows polemicists to spin things in particular ways, and the extent to which that spin, when it comes to you from a very trusted voice in a media ecosystem, particularly a niche media ecosystem, means you are not necessarily going to see the alternate perspectives.

The way that social media curates and presents information to you means that if you trust Jim Jordan, you are going to see this particular drum that he has been beating for three years, and you are going to think that the Biden censorship regime is real, and that it manifested in very particular ways. You are not going to see the facts presented that came out of particular court cases and other things that undermine those claims. It takes an extensive amount of work to try to break through with—actually, here are the facts, here is the information, here is what actually happened, now maybe form an opinion on it. I think that it is not even so much a hypocrisy issue. It is that many people are not actually seeing the facts of the matter. That creates an additional layer of complexity on top of that.

Mounk: I think this is one thing that I really want to get into today, which is how we should think about some of those attempts to moderate social media. Is that a threat to free speech, or is that in fact an important way of facilitating free speech? Perhaps that is going to be one of the areas in which we have some disagreements on this panel. To stick with this idea of hypocrisy for a moment, there is one defense that I could make of the importance of free speech despite the hypocrisy, which is that, in lots of other areas, we also recognize the importance of the value even if everybody flips on it.


It is always the case that when Democrats are in the White House, they suddenly discover the importance of having extensive presidential powers. The moment they are in the opposition, they say we need really strict limits on them. Republicans talk about the importance of very strict limits on presidential power when they are in the opposition; once they are in the White House, they do not think about those limits for a moment. There are all kinds of areas where we see this kind of switching, for example whether you should be able to suspend the requirement of 60 Senate votes for a certain kind of judicial appointment, and it again flips back and forth depending on which party is in power. None of us would say, therefore, that the idea of limited government or the idea of checks and balances is unimportant. We recognize that precisely the temptation people have to use all the power they have when they are in government is one of the reasons why we need that value all the more.

I do think that it also poses a deeper cultural challenge. If we are in a moment where our politics is so polarized that there are fewer and fewer principled defenders of free speech, and if, as a matter of practice, the majority of people who make these free speech arguments are making them in purely instrumental ways to serve their own cause, how helpful is that free speech talk in our politics, even if we continue to be committed to the First Amendment and to the underlying importance of those values?

Rauch: Can I break the order and challenge the premise? I think I heard you say something that requires pushback, which is that there is some sense of equivalence between Biden-era free speech hypocrisy and Trump-era free speech hypocrisy. Whatever the Biden administration did by way of jawboning Twitter and other social media is ant-size, microscopic compared with what the Trump administration is doing—seizing people off the street because they wrote op-ed pieces, weaponizing the FCC, threatening to pull licenses if Disney does not take Kimmel off the air. I could go on. This is a different kind of thing. This is not just turnabout hypocrisy.

Mchangama: Yeah, I agree that the Trump administration has definitely taken assaults on the First Amendment to the next level. On his very first day in office, he signed this executive order on restoring free speech, which has not been honored. I would say that in the Biden years, the vibes coming out of the Biden administration, also on the cultural aspects of free speech, were very much in tune with the 2020 vibe, when people could get canceled for speech. When I heard Democrats talking about regulating the internet, it seemed like they were very envious of developments in Europe and saw the First Amendment as an impediment.

To a certain extent, they respected it. Whatever they did, they did behind the scenes, not in the obvious ways that the Trump administration has done. I think, though, it is important to note that not that long ago, Attorney General Pam Bondi went out and said, hate speech is no longer free speech, which is not in line with First Amendment practice and is something that people on the left in the United States have long said—the First Amendment should not protect hate speech.

There was actually quite a bit of pushback from conservatives, even MAGA conservatives, who said, we do not want to go down this road. So it was not as if the entire conservative movement was on board with suddenly introducing hate speech restrictions, and she had to walk it back, though not very elegantly. It seemed to me that even parts of the hardcore MAGA movement were saying this is a red line that we do not want to cross, and I think that is important.

Mounk: I love it when we have an actual disagreement on The Good Fight Club, and I sense that there is a genuine disagreement here. So, Renée, the argument is that yes, what the Trump administration is doing on free speech now is terrible, and I agree with you, Jonathan, that it is worse than what the Biden administration did. But people will then say, look, under Biden, the government did exert considerable pressure, as we now know from admissions by executives at Google and other social media platforms, to take certain content about COVID off the internet, for example. And we know that for a while the entire account of the New York Post was blocked on Twitter because of stories about Hunter Biden’s laptop, which turned out, according to The New York Times and other outlets, to be truthful.

There was certainly a very broad cultural defense of the idea that people should be canceled for various things they say and that we should have this kind of consequence culture. There were progressives who were on record as saying that the idea of free speech is a conservative ideal and not one that progressives should embrace. Is that part of the prehistory that explains this moment, and should we be concerned about some of the things that happened during the Biden administration, or do you agree with Jonathan on this?

DiResta: I’m very glad you brought that up. Who was president when the New York Post’s account was locked?

Mounk: Sure, that was still under Donald Trump during the 2020 campaign.

DiResta: This is one of the amnesia moments that I think it is actually important to call out. The president of the United States in 2020 was Donald Trump. Many of the content moderation policies that people object to today are ones that, ironically, Ted Cruz wrote an entire report about two days ago, calling them the Biden censorship regime. He alleged that the Biden censorship regime began in 2018. Who ran the government in 2018? Can anybody answer that question? We all know.

Yet here we are, re-litigating how long the New York Post’s account was blocked on Twitter. I said to Bari Weiss on that day, in a public exchange—since we’re friendly—that it was a very bad call. How long was that account blocked? Approximately 48 hours. What happened when it was blocked on Twitter, when that URL was banned on Twitter? It was shareable on Facebook. It was shared 500,000 times on Facebook.

Two of the most notorious stories that the right harps on as the pinnacles of egregious content moderation tyranny, and I say this as an independent: one of them happened during the Trump administration. It was blocked for 24 to 48 hours, and it was a bad call, but it was not what people have turned it into.

Now let us talk about COVID. The COVID lab leak theory, which is the other example that everybody points to, was blocked by one platform for three months, not at the behest of Biden; it was at the behest of the World Health Organization. Meta lays this out in its policies—that from February 2021 to May 2021, it blocked it for three months at the behest of the World Health Organization. By the way, I said that too was a bad call. I thought that was a bad call because I did not think it was in line with any of the other COVID policies that ostensibly were trying to protect people’s health or protect community health. Whatever you think about them, that was the dominant motivation behind the content moderation policies that the platforms put in place.

For those who do not know, Jim Jordan secured a letter from Google this week under subpoena, from a company that has been under investigation for the last two years. I know this because I was investigated for the last two years. The letter says: the Biden administration requested us to moderate; we stuck with our policies; we did not change them; we refused the pressure. One of the things that Jim Jordan says is, but I secured a commitment from them that they are not going to fact-check. The irony of that letter was several layers deep.

The other thing that Jim Jordan had was two years of interview transcripts from about 20 interviews that he conducted with YouTube executives, which he dropped in a 12,000-page document dump about a week before Christmas in December 2024. If you go through those, every single executive says, we did not feel pressure, we did not feel coerced.

What Jon is saying is 100% correct. Every administration has done this. I went and sat in the oral arguments at the Supreme Court for Murthy v. Missouri, and Justice Kavanaugh and Justice Kagan were sitting up there on the dais saying, when we worked in the White House, we applied pressure to the media too. There is a difference between pressure and coercion. What you are seeing now is something very different. People really need to understand that the facts of the case, and what happened in this so-called Biden censorship regime, are absolutely not on par in any way with what has happened since.

Mounk: Jacob, what do you think? Renée’s argument is that a lot of the “censorship regime” actually started while Trump was in office and therefore was not directed by the federal government, and that some of the most famous instances of censorship were short-lived or were partial—only on some social media platforms. Do you think, therefore, that more moderate political parties, whether that is the Democrats in the United States or some of the governments and government entities in Europe, do not have a problematic relationship to free speech—that the threat only comes from the populist right—or do you think that there is a genuine problem with the attitude toward free speech in the political mainstream?

Mchangama: No, even if it was short-lived, it is still an egregious example. And those are just the most prominent examples that have been brought to light; what about the others that have not? So I would not downplay it. Again, though, I think it is quite clear that the current administration is engaged in a multi-pronged attack on free speech.

It is not related solely to social media. I would say that what the Trump administration is doing quite cleverly is going after elite institutions—traditional media, universities, law firms—all of those that were seen as being complicit in the cultural repression of free speech that went along with the 2020 vibe. That resonates with people who felt that their voice was limited, not necessarily by the government, but by elite institutions.

That is why, when Jane Fonda launches an initiative to defend the First Amendment and a lot of Hollywood stars sign on to it, I think it feeds into the culture war narratives, where people will say, yes, now you are interested because Hollywood stars and prominent journalists are on the receiving end, but you did not care when it was the ordinary social media user. That is the problem of polarization, because you want both elite institutions and the ordinary user to be able to exercise free speech.

When you have that dichotomy—where it is the masses versus the elite narrative surrounding it—you pit two constituencies against each other, and that just feeds the underlying culture war.

Rauch: Jacob, could I see if I can locate whether or not you are disagreeing with Renée and me? I have been in the free speech business since Kindly Inquisitors in 1993. That book was about non-governmental repression of speech and thought at universities, but also elsewhere. I think that you and I, and probably all of us, agree that free speech is not just about law and government; it is also about culture. You need to have cultural institutions that follow principles of free speech, especially universities for whom the pursuit of inquiry and knowledge is fundamental.

So I think we all agree that culture matters, but I am going to guess that we may also agree—but I will try you on this—that government repression of speech is more concerning than cultural repression of speech. I would argue it is an order of magnitude more concerning because the government can yank your license, investigate you, try you, or put you in jail.

That is where I bridle at an equivalence between what students are doing on campus by way of canceling other students who are right-wing versus what the FCC just did or what ICE did. I am going to insist that these are different kinds of things and that one is much more serious than the other.

Mchangama: I just said that I thought the Trump administration is worse on free speech than the Biden administration. But when you are talking about culture, go back to John Stuart Mill. He talks about the tyranny of the majority. So I would say it depends on the context of a specific country whether cultural intolerance is worse than the government doing it.

Rauch: We do disagree. I think the problem with Europe, which you have rightly identified with what is going on in Britain, is that it is government-mandated. They are arresting people, and I am always going to worry about that much more.

Mchangama: If the government puts you in jail, that is definitely worse than being thrown off Facebook for a comment. I would agree with you on that. But again, you have been a very prominent voice on cancel culture, and I think there is a danger if you say, well, we should not worry about that anymore because what the Trump administration is doing is worse, so now we have to downplay what came before it. I think we need to be able to hold both things in our head and recognize that even if there is a hierarchy, this is worse. Getting thrown off Facebook is not the same as being jailed, but both matter.

Rauch: We do have to walk and chew gum. But cancel culture is cancer and what the Trump administration is doing is a heart attack.

Mounk: I want to go to the normative question. I think this is a really interesting attempt to understand what did and did not happen over the last five years, but I think there is also a question about what the ultimate settlement should be on things like social media. Obviously, there are many kinds of content on social media that are bad in all kinds of ways, whether they are straightforwardly hateful or spreading information that is false—lies about particular people that, in principle, might be libelous but are probably difficult to prosecute for all kinds of reasons.

How should we think about the moderation of social media? How much power should social media companies have to censor political viewpoints they dislike on their platforms? For example, do we say these are private companies so they can do whatever they want, or do we think that some of the liberties and privileges they have—for example, by not being legally responsible for their content—should be subject to some form of viewpoint neutrality that is maintained on those platforms?

What should the government be allowed to do? To what extent should the government be allowed to go to Twitter and Facebook and other social media platforms and say, we are really worried about misinformation about COVID or the election or whatever else, and therefore we want you to play ball in this or that way? At what point do we think that becomes a form of government interference in free speech?

Let us take it out of the debate about what did or did not happen during the Biden administration. What actually is the principled position, from a set of people who are not hypocrites about free speech and who very much care about this value, for how we should regulate this space?

DiResta: I have looked at the regulatory conversation for a little over a decade now and watched many arguments come up about CDA 230. Many of them came from the right, arguing that CDA 230 protection should be contingent upon must-carry obligations, meaning that platforms would not be able to take down certain types of content.

Those laws have largely been found to be unconstitutional because platforms have editorial and curatorial rights, and under First Amendment law in the United States, a platform has the right to decide, as a private company and a private entity, what it will or will not carry. That is simply the law.

Mounk: Can I ask you a question about that? My understanding is that The New York Times, The Boston Globe, and The Washington Post have a right to publish whatever they want, but they also have forms of liability and responsibility for what they publish. What normally comes hand in hand with having some form of editorial oversight over your product is the fact that you are then responsible for what it is you publish.

You know much more about this than I do, so I may be getting details wrong—but my understanding is that Section 230 says you are now free from those obligations. If somebody on your platform says something libelous, you, as the provider of this platform, are not legally responsible for it.

So I think, from my understanding of First Amendment jurisprudence, it would be perfectly acceptable for the federal government to say that if you want to keep Section 230 protection against being liable for the content you publish on your platform, then you have to be a platform rather than a publisher. The moment you start censoring some things but not others, you are more similar to The Boston Globe, The New York Times, and The Washington Post. Like those publications, you should then be liable for what people say on your platform. If you get into the business of being a publisher, we treat you like a publisher.

It is not obvious to me that that goes against either the letter or the spirit of the First Amendment, because it would be a way to make sure that those platforms, which have tremendous public reach and where much of our political conversation now happens, are not politically tendentious in different ways, first in favor of progressives and now in favor of the MAGA crowd.

DiResta: I think that what you are saying is a couple of things. First, platforms are intermediaries, which means they are carrying the speech of other people; they are not writing it themselves. There are some very interesting court cases that are starting to emerge about generative AI and the speech that comes out of AI, because that is much more clearly the speech of the company that has produced the AI. That is where some interesting First Amendment jurisprudence is being formed right now—whether those things are going to have First Amendment protection or whether they will be treated as products under product liability terms. There are a lot of interesting things happening there.

But social media platforms are curating and amplifying content created by users. One thing that is very difficult is the question of how you ascertain whether they have censored or taken a non-neutral stance toward a particular viewpoint. This is one of the reasons I was a big supporter of the Platform Accountability and Transparency Act, a piece of legislation that had fairly bipartisan support. It argued that there should be disclosure mandates imposed on platforms, requiring them to release certain types of transparency information, and that there should be data access provisions whereby researchers and others could request certain types of data. That would allow investigations into whether certain kinds of curation, moderation, or recommendation decisions privileged one point of view over another.

I remember making this appeal to people like Senator Blackburn, saying that rather than going on the vibes of conservative censorship, if you would actually like to know, you should create a regulatory requirement for platform transparency. Ironically, this is what the Digital Services Act does. Many of the things that people in the United States want the government to regulate into existence—regulating the speech of centralized social media platforms—are what the DSA does. Ironically, you have Jim Jordan taking forays over to Europe to complain about the DSA as a censorship law. So nobody is actually content with any of the things that come out of the regulatory regime because, as I mentioned in the talk I gave here yesterday about reframing, everybody sees an opportunity to create an advantage for their side.

That is where I see the greatest hypocrisy—the flipping back and forth between the right and the left on what a centralized platform should do. My personal hope is that you actually have a proliferation of social platforms such that concentration is solved to some extent by market forces, which would dissolve the power of the concentrated platform by creating an environment where there are many more speech platforms. That is why I support interoperability laws and things that allow users to take their data and move to other places. I recognize that, too, is something that people do not find popular because they consider it to be an interference in the market, but this is where we are.

Mounk: So the Digital Services Act is the major attempt by the European Union to impose regulation on this space. I am sure you have certain things you like and certain things you do not like. I am sure you would write it slightly differently, but on the whole, do you think it is a good thing—yes or no?

DiResta: I never want to be in the position of defending European regulation. I will say I think it is unfairly maligned in American media because of the lack of nuance. The transparency provisions that it grants are highly useful. It also gives users a right to appeal when they are taken down, which is something they do not have under U.S. law. It gives them a right to demand an explanation for why they are taken down. There are a lot of user protections built into the DSA that are fantastic.

The parts that many people do not like are that it uses vague terminology around things like systemic risk, and that it has requirements stating that speech that is illegal in the European Union can be subject to government takedown demands. There is a mechanism called “notice and takedown,” where the government issues a notice and the platform then carries out the takedown. These things, again, per my interest in transparency, should be made public. I actually support FIRE and others who have proposed legislation saying that should be done in the United States too. The combination of transparency and access is critically important.

Mounk: Jacob, I’m guessing you disagree.

Mchangama: I think overall the DSA is actually a problem, especially if you look at the current context of Europe. The typical defense of the DSA is that it does not regulate any content and does not provide specific categories of content that should be removed; it is just that what is illegal offline should be illegal online. But the underlying reality of Europe, including the European Commission, is that it is regulating more and more speech.

With a notice-and-takedown system for illegal content, you provide this provision that when you get notified, you have to remove illegal content, and then you consistently expand the categories of illegal content. That becomes, to be a bit polemical, a censorship machine. Take, for instance, right now, the European Union—both the European Commission and the European Parliament—are pushing through a proposal to create a harmonized hate speech law, to make hate speech an EU crime. The idea is that hate speech should not be limited to any specific protected category; every viable identity should be protected, and it should be understood through an intersectional lens.

You can imagine how that might be interpreted. The DSA would mean that anything interpreted as illegal should be removed. Or take an example from Denmark, which has a recent law about desecrating sacred texts. That is now punishable. The Turkish ambassador just went out and thanked Denmark for its very wise law. That is illegal content that could be demanded to be removed under the Digital Services Act.

When you look at the speeches of European commissioners, they talk about free speech, but it is always followed by “but.” The consistent narrative is that disinformation and foreign information manipulation—a very Orwellian term—is undermining democracy. It is a constant move toward framing free speech as dangerous and emphasizing the need for top-down control. You cannot view the DSA in isolation from those dynamics. I think the net effect of the DSA is that it limits free speech much more than it supports it.

Rauch: Are you saying that the real problem, or the big problem, in Europe is not the Digital Services Act but the hate speech codes?

Mchangama: No. If you create a notice-and-takedown system where you say you want to ensure there is no illegal content online, and then you keep ramping up the underlying categories of illegal speech, you are basically outsourcing online censorship to private companies in ways that create huge incentives for over-removal. We have done reports at The Future of Free Speech looking at deleted comments on Facebook and YouTube in Sweden, Germany, and France. What we found was that between 88% and 97% of the deleted comments were perfectly lawful. Half of those lawful but deleted comments were not even particularly controversial.

There is a tendency, especially among policymakers, to look at particular content and then say that social media is drowning in illegal content or drowning in misinformation. But the data does not support that. A good example came prior to the European parliamentary elections, when Věra Jourová, a European commissioner, said that disinformation and AI-generated content were like an atomic bomb. What happened? There was no significant impact on the parliamentary elections. It was the same narrative in 2019, when a European commissioner also said that the European elections risked being drowned in disinformation. The autopsy afterward showed there was no problem.

When you have that narrative and a political reality in which European politicians are moving to ban more and more speech, the DSA must be seen in that context. It becomes a tool to limit more and more speech. I just cannot see how it can function in any other way.

Mounk: Jonathan, take us back to the broader principles here. You have written about the dangers of kindly inquisitors. You have talked about what it takes to have a constitution of knowledge—a set of social institutions and norms that actually allow us to pursue truth in meaningful ways. What does that mean in the age of social media? How do we balance, on the one hand, the fact that social media can be a fire hose of falsehoods—clearly one of the forces deeply polarizing our society—and, on the other hand, the fact that, precisely for the reasons beautifully expressed in your work, we should be very worried about governments having a say over what can be said on social media? Sometimes we may have enlightened governments that are on the side of the good and the true, but often, as in the United States at the moment, we do not.

How are you thinking, on the whole, about the principles that should guide us as we try to figure out these difficult questions of social media regulation?

Rauch: The core idea of The Constitution of Knowledge is that free speech is necessary but not sufficient if you want to live in a world that is knowledgeable, peaceful, and free. You also need a fairly elaborate structure of institutions, rules, and norms that shape how we talk to each other in the search for truth and that allow us to build a global network of researchers, journalists, lawyers, government officials, and others who are looking for each other’s mistakes. This is a species-transforming technology. It is why humanity now produces more new knowledge every morning, literally, than it did in its first 200,000 years. Free speech is part of that.

But you need two other elements. You need the discipline of fact—you cannot just make things up or talk your way through nonsense—and you need viewpoint diversity. You need enough different ideas in the room to create generative conflict and expose mistakes. Everyone says “social media,” but social media itself is not necessarily part of the constitution of knowledge. It is an entertainment product, an advertising vehicle. It is not Facebook’s job to advance the goals of research.

Nonetheless, I would like to hope that digital platforms try to be good citizens of the constitution of knowledge and that they understand that when people look up important information—like whether the measles vaccine is safe—the answer they get back should be more likely to be true than false. I think that is good for society, and I also think it is good for the platforms. I do not think they are required to do that, but as a moral matter, I wish they would.

Four years ago, when I wrote The Constitution of Knowledge, I thought content moderation was necessary. You cannot do without it. These are communities, and they have to set community standards. Since then, I have come to think that the public is just not going to accept anything that appears heavy-handed. Now I think we should be looking at the kinds of things Renée DiResta is talking about—middleware, which gives individuals more control over their algorithms, and transparency, which lets us see and evaluate what is going on.

The government should be able to talk to these platforms—it talks to The New York Times all the time—but this should be formalized and logged. It should not be people at the White House calling up someone and yelling at them. So now I am focused on subsidiary elements that are not as heavy-handed as content moderation. Did I answer the question?

Mounk: You did, I think, better than I expected anybody could.

In the rest of this conversation, Yascha, Renée, Jacob, and Jonathan discuss how the Trump administration is attacking academic freedom, how universities should respond, and to what extent the political climate in the United States resembles Hungary. This part of the conversation is reserved for paying subscribers…