How to Reclaim Social Media from Big Tech
“Middleware” is an idea whose time has come.
This article is brought to you by American Purpose, the magazine and community founded by Francis Fukuyama in 2020, which is proudly part of the Persuasion family.
Social media platforms have long influenced global politics, but today their entanglement with power is deeper and more fraught than ever. Major tech CEOs, who once endeavored to appear apolitical, have taken increasingly partisan stances; Elon Musk, for example, served as a campaign surrogate in the 2024 U.S. presidential election and spoke out in favor of specific political parties in the German election. Immediately following Trump’s re-election, Meta made radical shifts to align its content moderation policies with the changing political winds, and TikTok’s CEO issued public statements flattering Trump and praising him for deferring enforcement of the law that would ban the app. Both Meta and X chose to settle lawsuits that had been widely seen as easy wins for them in court, with settlement money directed to Trump’s presidential library, in presumptive apology for their fights over his post-January 6 deplatforming. Outside the United States, tension is growing between platforms and EU regulatory bodies, which Vice President JD Vance has opportunistically framed as a concern about “free speech” amid increased European calls for “digital sovereignty.”
While companies have always sought to maintain favorable relationships with those in power—and while those in power have always sought to “work the referees”—the current dynamics are much more pronounced and consequential. Users’ feeds have long been at the mercy of opaque corporate whims (as underlined when Musk bought Twitter), but now it is clearer than ever that the pendulum of content moderation and curation can swing hard in response to political pressures.
It is users, regardless of where they live or their political leanings, who bear the brunt of such volatility. Exiting a platform comes at a high cost: we use social media for entertainment, community, and connection, and abandoning an app often means severing ties with online friends or seeing less of our favorite creators. Yet when users try to push back against policies they don’t like—if they attempt to “work the referees” themselves—they are hindered both by their lack of relative power and by the opacity of platform algorithms. Without collective action significant enough to inflict economic consequences, user concerns rarely outweigh the expediencies of CEOs or governments. Unaccountable private platforms continue to wield disproportionate control over public attention and social norms.
We need to shift this paradigm and find alternatives that empower users to take more control over their social media experience. But what would that look like?
As Francis Fukuyama and others at Stanford University argued in 2020—and as we expanded upon in a recent report coauthored with Fukuyama and others—one promising solution is middleware: independent software-enabled services that sit between the user and the platform, managing the information that flows between them. For example, a user might choose a middleware service that filters out spammy clickbait headlines from their feed, or one that highlights posts from trusted sources in a specific domain, like public health or local news. Middleware can help rebalance the scales, empowering users while limiting platforms’ ability to dictate the terms of online discourse.
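To make the idea concrete, here is a minimal illustrative sketch, in Python, of what one such middleware service might do: sit between the platform’s raw feed and the user’s client and strip out clickbait before the user ever sees it. The Post structure and the clickbait heuristic are assumptions invented for demonstration; no real platform API is being described.

```python
# Illustrative sketch only: a hypothetical middleware filter that sits between
# a platform's raw feed and the user's client. The Post structure and the
# clickbait heuristic are invented for demonstration, not any real API.

from dataclasses import dataclass


@dataclass
class Post:
    author: str
    headline: str
    engagement_score: float


# Crude stand-in for whatever signal a real service would use.
CLICKBAIT_MARKERS = ("you won't believe", "will shock you", "doctors hate")


def filter_clickbait(raw_feed: list[Post]) -> list[Post]:
    """Drop posts whose headlines match simple clickbait patterns."""
    return [
        post for post in raw_feed
        if not any(marker in post.headline.lower() for marker in CLICKBAIT_MARKERS)
    ]


if __name__ == "__main__":
    feed = [
        Post("local_news", "City council approves new transit budget", 0.4),
        Post("spam_farm", "You Won't Believe What This Celebrity Did", 0.9),
    ]
    for post in filter_clickbait(feed):
        print(post.headline)  # only the transit story survives
```

The point of the sketch is the position of the filter, not its sophistication: the platform still hosts the content, but an independent service the user chose decides how that content is presented.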
Putting users in control of their attention
Middleware has the potential to transform two of the most contentious functions of social media: curation and moderation. Curation determines how content is ranked in users’ feeds, shaping which voices are amplified. Moderation governs what is allowed, labeled, demoted, or removed. Both functions have become politicized battlegrounds, with critics on all sides accusing platforms of bias, censorship, or failure to address harms.
Middleware cuts through this dynamic of overly centralized control by offering users and communities control that is more direct and context-specific. An open market (think “app store”) of middleware software and services would allow users to choose freely from a variety of algorithmic or human-in-the-loop moderation services to compose their feeds. For instance, one user might subscribe to a feed optimized for civil discourse, another might choose one that highlights breaking news, and a third might simply want cat pictures. On the moderation front, some users may want to see profanity and nudity; others may subscribe to a tool that hides or labels such posts in their feed. Flexibility allows people to tailor their online environment to their needs (which shift depending on task, mood, or context), or to their political orientation or membership in different communities. This supports a greater diversity of online experience in terms of politics, values, and norms, enabling users and communities to select for their desired “vibe”—not one imposed by platform overlords or a tyranny of some majority.
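What makes such a market workable is that curation services can be interchangeable behind a common interface, so a user (or their client app) can swap one for another at will. The sketch below is a hypothetical illustration of that idea; the curator names and scoring rules are assumptions, not a description of any existing service.

```python
# Illustrative sketch only: interchangeable curation services behind a common
# interface, so users can swap them freely. Curator names and scoring rules
# are invented for demonstration.

from typing import Callable

Post = dict  # e.g. {"text": ..., "topic": ..., "civility": ..., "age_minutes": ...}
Curator = Callable[[list[Post]], list[Post]]


def civil_discourse_feed(posts: list[Post]) -> list[Post]:
    # Rank by a (hypothetical) civility score supplied by a labeling service.
    return sorted(posts, key=lambda p: p.get("civility", 0), reverse=True)


def breaking_news_feed(posts: list[Post]) -> list[Post]:
    # Newest first.
    return sorted(posts, key=lambda p: p.get("age_minutes", float("inf")))


def cat_pictures_feed(posts: list[Post]) -> list[Post]:
    # Only posts tagged as cat content.
    return [p for p in posts if p.get("topic") == "cats"]


# The "app store": a registry of curators the user can subscribe to.
CURATOR_MARKET: dict[str, Curator] = {
    "civil-discourse": civil_discourse_feed,
    "breaking-news": breaking_news_feed,
    "cats": cat_pictures_feed,
}


def compose_feed(posts: list[Post], choice: str) -> list[Post]:
    """Apply whichever curator the user subscribed to."""
    return CURATOR_MARKET[choice](posts)


if __name__ == "__main__":
    posts = [
        {"text": "Cute kitten nap", "topic": "cats", "civility": 0.9, "age_minutes": 300},
        {"text": "Storm warning issued", "topic": "weather", "civility": 0.7, "age_minutes": 5},
    ]
    print(compose_feed(posts, "breaking-news")[0]["text"])  # "Storm warning issued"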
Middleware can also reduce the risk of political capture, making it more difficult for incumbent platforms, or governments, to exert undue pressure or outright manipulation over online discourse. It fosters competition and innovation by enabling a robust market of providers, which improves both transparency and responsiveness to user and community needs. Importantly, middleware replaces the binary choice between centralized control and total anarchy with an adaptive middle ground that empowers individuals, communities, and institutions to shape their own social experiences.
Retaking control
So how does increased user choice become a reality? Where Facebook, X, and the other incumbent giants are concerned, middleware’s success depends on their cooperation. Third-party tools need the ability to interoperate through open protocols or interfaces. So far, platforms have shown very limited interest in enabling this. However, as moderation becomes more politically fraught, they may decide that devolving more control to users—selectively opening their “walled gardens”—really is a smart choice. Meta’s Threads app is experimenting with a limited degree of such openness.
Whatever the centralized providers do, an alternative path is already emerging. Decentralized platforms based on open protocols, such as Mastodon and Bluesky, have been designed from the ground up to prioritize user choice and agency—without needing permission from a corporate gatekeeper. This is most apparent on Bluesky, which now serves well over 30 million users, some of whom already subscribe to alternative curation feeds and to independent content-labeling services that flag porn or hate speech. Newly formed non-profit foundations that serve as custodians for the Bluesky and Mastodon protocols (one using the very apt #FreeOurFeeds hashtag) promise to ensure that these infrastructures can remain “billionaire-proof” and open to competition, as public goods.
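The division of labor behind third-party labeling on an open protocol can be sketched in a few lines: an independent labeler emits labels, and the user’s client decides what to do with them according to the user’s own settings. The following is a conceptual illustration only; the label names and moderation-policy structure are assumptions, not the AT Protocol’s actual schema.

```python
# Conceptual sketch only: an independent labeler emits labels; the user's
# client applies them according to the user's own preferences. Label names
# and the settings structure are illustrative assumptions.

from dataclasses import dataclass, field


@dataclass
class Label:
    post_id: str
    value: str    # e.g. "nudity", "hate-speech"
    labeler: str  # which independent service issued it


@dataclass
class UserSettings:
    # Per-label preference chosen by the user: "show", "warn", or "hide".
    preferences: dict = field(default_factory=lambda: {
        "nudity": "hide",
        "hate-speech": "warn",
    })


def apply_labels(post_ids: list[str], labels: list[Label],
                 settings: UserSettings) -> dict[str, str]:
    """Return the client-side action for each post given subscribed labels."""
    actions = {pid: "show" for pid in post_ids}
    for label in labels:
        action = settings.preferences.get(label.value, "show")
        if action != "show":
            actions[label.post_id] = action
    return actions


if __name__ == "__main__":
    labels = [Label(post_id="post-2", value="nudity", labeler="example-labeler")]
    print(apply_labels(["post-1", "post-2"], labels, UserSettings()))
    # {'post-1': 'show', 'post-2': 'hide'}
```

The key design point is that the labeler never deletes anything; it only publishes metadata, and enforcement happens at the edge, in the client the user controls.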
This open infrastructure model is not anti-commercial. On the contrary, it opens space for innovation, extensibility, and entrepreneurship. Just as Apple’s App Store created a flourishing ecosystem of third-party tools, middleware could spur new markets for feed curation, trust labeling, moderation filters, and more. News outlets might create branded options: the “Fox News Feed,” or the “New York Times Feed.” Trusted intermediaries—civil society groups, perhaps—might offer labels grounded in shared community values. Interoperable services can compete and cooperate across an ecosystem of distinct but connected communities. The goal is not to overwhelm users with technical choices, but to create options—much as users can now easily choose an email service or a browser extension.
Policy support
Policymakers can help promote user choice by removing barriers that entrench the status quo. On the regulatory front, lawmakers should reimagine outdated statutes like the Digital Millennium Copyright Act (DMCA) and the Computer Fraud and Abuse Act (CFAA)—laws that, while originally designed to protect creators and secure computer systems, have too often become tools for corporate suppression of competition. Reforming these laws would dismantle barriers that favor entrenched monopolies, promote a more open internet, and ensure that the interests of users, communities, and innovators come before exploitative profit. There are also worthwhile legislative efforts like the proposed Senate ACCESS Act, which would require that “the largest companies make user data portable – and their services interoperable – with other platforms, and to allow users to designate a trusted third-party service to manage their privacy and account settings.”
Middleware empowers communities to decide how they wish to balance competing democratic values—free speech, protection from harm, pluralism—even in a time of high polarization. It offers a path toward a more democratic and resilient information ecosystem, where users have more agency over their attention. The question is no longer whether such alternatives are necessary or feasible—it’s whether they can be scaled, enhanced, and sustained to meet the moment.
Renée DiResta is an Associate Research Professor at the McCourt School of Public Policy at Georgetown and author of Invisible Rulers: The People Who Turn Lies Into Reality.
Richard Reisman is Nonresident Senior Fellow at the Foundation for American Innovation.