Humans Suck at Seeing Into the Future
Breathless predictions about nuclear power put today’s AI “revolution” into perspective.
In 1945, humanity first released the massive power of nuclear fission, on the plains of the Jornada del Muerto in New Mexico. This was Trinity, the first nuclear explosion in history. The result was far from a controlled harnessing of energy, but it wasn’t meant to be; the Manhattan Project’s mandate was to produce bombs of unprecedented destruction, and in that it succeeded wildly. But thanks to further investigations into fission, largely undertaken to better understand the nature of the atom, six short years later nuclear power was harnessed to produce electricity for the first time. Three years after that, the American submarine USS Nautilus launched, and for the first time nuclear power was used for a sustained and practical purpose. In 1957, the Virginia-based power plant known as SM-1 began supplying power to the electrical grid, and the age of nuclear power was upon us.
Imagine if a particularly prescient person observed what was occurring. First explained theoretically by Lise Meitner and Otto Frisch in 1938, nuclear fission had gone from theory to commercial application in less than two decades. Nuclear had gone from tearing apart Hiroshima and Nagasaki to providing clean energy for the citizenry in a dozen years. Even without expert knowledge of nuclear power, our observer would be totally justified in believing that the technology would irrevocably change the world. Such a person would not have a sophisticated modern understanding of the importance of carbon release in the production of energy and its attendant effect on global temperatures, but they would surely still understand the value of dramatically reducing the amount of air pollution being released into the world.
We now know, of course, that the burning of fossil fuels releases carbon into the atmosphere, which then traps heat, gradually raising the world’s temperature, particularly at the poles. This change in climate presents many potential problems for humanity and the environment, both of which exist in an exquisite balance. Even the more restrained projections of the destructive potential of climate change demonstrate that its effects could be exceedingly costly for civilization. If given an education in the modern understanding of climate, our prescient observer might have noted that, widely and responsibly deployed, nuclear power could have helped prevent one of the biggest global challenges in human history.
If they grasped nuclear power’s capacity for severing the relationship between global supply lines and energy production, the consequences for their current era might have seemed even more vast. The story of the 20th century, to a remarkable extent, was the story of the fight for fossil fuels, the immense strategic, diplomatic, and espionage resources spent in the pursuit of convenient forms of portable energy. Wars were fought for oil, governments toppled, regions destabilized. All of these came at the expense of blood, treasure, and the serial violation of human rights by great powers. Quite literally, since the end of World War II there has been no question of greater international strategic importance than the question of who has access to fossil fuels—and yes, that holds true even considering the existence of nuclear weapons. Today, nothing changes a country’s international standing more considerably or more quickly than the discovery of oil. It turns ghost towns into boom towns, it props up dictators, it shifts the balance of world power. It’s the stuff that drives the world.
To pick one particular region, in 1953 the effort to secure British access to oil compelled the United States to help depose the established government of Iran and re-install the Shah, leading directly to the conditions that enabled a conservative Muslim revolutionary movement that still holds power in the country today; Iran would go on to engage in one of the bloodiest wars of the past half-century against fellow regional power Iraq, with the United States hedging its bets by sending weapons to both sides; saddled by war debt, Iraq would go on to throw its weight around by invading oil-rich Kuwait; this action was seen as imperiling Saudi Arabia, whose own theocratic government controls the second largest proven oil reserves in the world, and who invited American troops into the country for their defense; the presence of those troops in Islam’s holiest places inflamed a rich zealot named Osama bin Laden, who orchestrated an audacious terrorist attack against the United States; hungry for vengeance, the Americans invaded Afghanistan and, later, Iraq…. You get the idea. In a world where nuclear energy had become a dominant form of energy production, the path of recent history would have been radically different. It’s true that fissile materials are themselves natural resources that require extraction, and could spark conflict, but it’s also true that they are not nearly as rare as the popular imagination believes, and far less of them is required to produce an amount of energy comparable to oil.
There was every reason for a nuclear revolution to happen. Nuclear energy is far safer than many people think, with only a small handful of adverse incidents in over 70 years of using fission to generate electricity. Indeed, the overall reliability of nuclear power plants is believed to exceed that of most other forms of energy generation. Initial costs for establishing plants can be steep, but not unreasonably so relative to building a conventional fossil fuel plant, and such costs can be recouped through nuclear power’s efficient energy generation. The land footprint of nuclear power plants is small, typically requiring something like one square mile, while wind and solar farms take up vast stretches of geography. The actual electricity-generation function of nuclear fission is zero-carbon, and while the plants themselves generate some carbon impacts, they’re a rounding error compared to burning gas or coal. Widespread nuclear use could have hastened the overall electrification of our society, which in addition to combating climate change tends to reduce maintenance costs; replacing gas-powered engines with electric motors, in cars and in most other contexts, means fewer moving parts to service. Nuclear power would not have come at zero cost—nothing good does—but the potential benefits to society were immense, and in the latter half of the 20th century, we had every ability to grab them. And the truth is we have no idea just how different the world would look now, had the adoption of nuclear power taken what seemed to be its natural course.
Then the nuclear revolution just… didn’t happen.
A graph of the development of nuclear power, since its inception, looks like a hopeful thing at first. From its beginnings in the late 1950s, the number of plants and the kilowatt-hours of electricity being generated rise steadily. Europe, in particular, seemed poised to adopt nuclear at scale, with France being particularly aggressive in opening plants. In 1973, The New York Times declared a “building boom for nuclear power plants.” But, at around the 1990 mark, the growth in nuclear energy plateaus, and at some point in the mid-2000s, begins to fall. While some new plants are still being built, a projection of future closures and the paucity of new construction suggests a rapid decline in the use of the technology. There have been green shoots in the recent past, enough that, in 2008, the Times suggested we were entering an era of revival. But such optimism has mostly not panned out, and now renewables like solar and wind have more public support, more buzz, and higher government subsidies. As if to put a bow on things, Germany recently shut down its last nuclear power plant—during an energy crisis that has underlined the country’s historical dependence on Russian natural gas.
An informed observer of these developments would have been perfectly justified in assuming that, even under conservative estimates, the technology would result in massive changes in the lived experience of human life. And yet today nuclear power is largely a curio, and one with a bad reputation. Very misguided (but consistently effective) protestors have led a lot of people to reflexively oppose nuclear without knowing why, and that’s just in the abstract—in specific locations, even many people who are ostensibly in favor of nuclear power vociferously oppose new plants. Now people only want to talk about “renewables.” Today, only 10% of worldwide electricity generation comes from nuclear power. That isn’t nothing. But the technology remains a revolution that could have happened and didn’t. And I think it’s an object lesson in epistemological humility: we don’t really know what’s going to happen, today or in the future, about technology or anything else.
Now, when I have made this point privately in the past, I’ve gotten some version of this response: “but not developing widespread nuclear power was stupid! The people who oppose nuclear power are idiots!” To which I say, yeah, mostly. Nuclear energy is a no-brainer in a world of climate change. And we can still change course on expanding its use. But my point here is not to say anything in particular about nuclear power, other than this: sometimes things don’t happen when you have every reason to think they will. Sure, opposition by stupid people sank nuclear energy, not its safety or efficiency. But stupid people are the most powerful force in the world, and they will follow us into the future no matter what we do. They’re one of a myriad of chaos agents that bend the course of human history, unpredictably and in defiance of all of our extrapolation. Betting on nuclear power’s large-scale adoption would have been a very smart thing to do, back in 1957. It just also would have turned out to be a losing bet, against all sense. Sometimes history rolls snake eyes.
I’m saying all of this as a prologue to an argument about artificial intelligence. Recent developments in machine learning and “neural networks” have left our news media and our intellectual culture, let’s say, a little excited. The media is absolutely festooned with what I would call “AI maximalism.” The term is meant to denote both people who think artificial intelligence is going to save humanity from all of its problems and those who think it’s going to kill us all, and both relatively soon. A term that refers to both is necessary because both stem from the same impulses—the desire for an entirely new world, the common human yearning for revolutionary change, the tendency of our culture to reward hype. Every day, you can read a new article in a stuffy publication expounding on all the ways that AI is going to completely change the world; every day, you can read thousands of tweets by people certain that the AI-dominated future is not just coming but coming sooner than you think. There’s a weird self-denying aspect to all of this; we have a media conversation about AI that is as alarmist and sensationalistic as I can imagine, and yet many on social media routinely complain that we’re not taking AI seriously enough. This has the advantage of making them seem like wise Cassandras, but it defies all evidence. It’s genuinely unclear to me how the AI hype could grow more heated than it stands now.
You can read a lot of intricate arguments about what AI will and will not be capable of in the near future. Unfortunately, there’s about 1% careful engagement to 99% insane hype. (Actual headline from a professional publication: “AI Can Now Make You Immortal—But Should It?”) The trouble, in part, is that there’s just so little professional advantage in telling people to slow down and take a breath. The nature of contemporary journalism and social media ensures that “AI IS IN YOUR HOUSE WITH A KNIFE RIGHT NOW” will always, always outdraw “AI Might Have These Interesting Consequences, But Maybe Not.” What I can’t understand is why so few are adjusting to this reality by deciding to be more consciously cautious. Look at the current landscape. Which is more likely? That the hype now is too little, or too much?
The problem is that people mistake rapid development in a given technology for a rapid onset of predictable consequences. As nuclear power shows us, those aren’t nearly the same thing. What so many people who talk about AI seem not to understand is that the argument against AI maximalism is not an argument about technology. It’s an argument about the contingency of history. Human beings have a terrible track record when it comes to futurism, and there is no reason to believe that we’ve gotten better at it in the recent past. I could come up with a whole argument about the mathematical reality of chaos and the tendency of minor changes to make massive impacts on human events, but it’s easier simply to observe that we try and fail to predict the future, again and again, and we’re always wrong, and then we go on confidently making predictions again.
When I got to Purdue University in 2011, in pursuit of my Ph.D., I met a couple of very earnest grad students working with 3D printing, which was having something of a moment back in those days. And these guys—extremely intelligent, well-credentialed, brilliant in the technical capacities of their field—told me that, in ten or twenty years’ time, stores like Walmart would have to radically retool or close their doors, because everyone would have a large-scale 3D printer in their garage that would produce most of their physical goods. They very sincerely believed that to be true, and they were deeply versed in the reality of the technology. They weren’t alone. A dozen years or so later, and adoption of the small-scale hobbyist 3D printers that are commercially available remains anemic, owing to the limitations and frustrations associated with their use. Self-driving car hype was so intense seven or eight years ago that writers at serious publications could solemnly predict that it would soon be illegal to drive your own car. Today self-driving cars remain a niche of a niche, employed in only a few select locales because even mildly inclement weather can paralyze their systems. Just a few years ago, 5G cellular networks were predicted to have massive effects; today, even in those places where the required infrastructure has been deployed, very few people can point to any material change in their lives at all.
You can say that these are all minor examples, and maybe they are. But they point, again, to the simple reality that we’re bad at predicting the future, and that the incentives of hype in our information economy are now so intense that it’s effectively impossible to be wrong by telling people to calm down. It appears today that not even The New York Times or The New Yorker can resist the siren song of sensationalism, so I guess it falls to me to say: calm down. AI is not coming to save you from your ordinary life, not with utopia nor with apocalypse. You have to make peace with this life, here, in a world that will go on being more or less the world you know. I’m sorry to disappoint you.
Will AI change some things? Probably. I’m sure it will disrupt some industries, to use that tedious phrase. It already does some cool things. But the basic technological situation is this: though many people believe that we live in some technologically revolutionary period, the fact of the matter is that human scientific advancement peaked from around 1860 to around 1960 and has slowed considerably since; the only field where tremendous progress has been made is information science and communications; and what architects call the built environment (that is, the corporeal reality in which we actually live) has changed remarkably little in a half-century. And I think that dynamic persists with AI: it will likely transform computing but have far more limited influence on the real world. As the Silicon Valley people say, bits are easy, atoms are hard. Meanwhile, we will adapt to the social and cultural changes and preserve the disappointing grind that is modern life. Normal always adapts.
I can anticipate an objection to my analogy here: nuclear power requires a tremendous amount of physical infrastructure to deploy, while AI requires far less; nuclear power (thankfully) involves a great deal of regulation and permitting, while AI development at present requires none. Therefore, an AI takeover is more likely than a nuclear-powered future was in the 20th century.
I would respond by saying that, first, I think this line of reasoning underestimates just how vast the infrastructure powering AI really is, as these systems require massive amounts of computing power and thus large networks of server farms, which are expensive to build and run. But more to the point, I think this mindset misunderstands the point: the argument is not “nuclear power was likely and didn’t happen, so AI dominance is less likely.” The argument is “history is full of arbitrary turns and random occurrences, causality chokepoints, and chaos injections, and thus we should be far more humble about our predictions about the future.” AI looks inevitable right now. So have many things that haven’t happened.
Someday, people are going to look back at us and laugh, just as we look back and mock those 1950s futurists who predicted a nuclear power plant in every home by the end of the century. There is absolutely no prediction about AI that can be made with the same confidence as the prediction that people in the future—the near future—are going to look back in astonishment at how wrong our futurism was. The most reliable human prediction is that we will go on making bad predictions. The only reason we refuse to grapple with that certainty is because, frankly, our tendency to get lost in hubris is extraordinary. And that’s the greatest impediment we have to putting artificial intelligence and its consequences into their proper context, not bad media incentives or technological ignorance or our relentless drive to tell exciting stories, but the weight of our own egos.
Freddie deBoer is a writer and academic. He lives in Brooklyn.
And here I was about to talk about how impressed I was with your history lesson (very nice analysis, btw), and then you had to serve up the stereotypical nuclear-bro pugnacity, wherein nuclear energy is opposed because "people are stupid".
Sorry, I know this isn't the focus of your piece, but this triggers me. Not because I'm against nuclear energy (I'm not), but because this type of attitude doesn't help people trying to make the case for it.
Those who make this argument, that "Very misguided (but consistently effective) protestors have led a lot of people to reflexively oppose nuclear without knowing why" are being intentionally obtuse. People aren't scared of nuclear energy because of protestors. They're scared of it because they've seen what happens when things go wrong.
Nobody needed protestors to be horrified by Chernobyl, Three Mile Island, or Fukushima. Not to mention that we spent almost all of the remainder of the 20th Century worried about death as a result of weaponry based on the same technology – weaponry that we know horrifically and instantly killed hundreds of thousands of innocent Japanese civilians. Advocates of nuclear energy brush these objections away with abstract arguments about the number of deaths per unit of energy generated with nuclear technology, declaring it the "safest form of energy" and acting as if anyone who isn't immediately comforted by this is just an irrational ape holding back society with their defective monkey-brain.
My educational background is in mathematics, so I take comfort in statistics more than most people, but even I understand why people are uncomfortable with nuclear energy. It is entirely human to fixate on worst-case scenarios, regardless of how improbable. You can say that it's an irrational degree of concern about a rational possibility, but it's how our brains work, and even in the most certain of cases, trying to convince someone of your argument by calling them stupid for having reservations based on situations that *we've seen come to pass* is arguably just as stupid.
Not to mention, the nuclear argument may not even be the most certain of cases. I'm no expert on this, but I know that there is considerable ambiguity to estimates of deaths due to nuclear accidents. They are ridiculously low if you just count immediate deaths that are incontrovertibly caused by the accidents per se; they're potentially much higher if you count eventual deaths due to radiation exposure, but those are highly speculative. Furthermore, calculations based on the fact that there have been only a handful of large scale nuclear incidents are relying on that same proportionality going forward, but that sort of calculation is tricky when you're talking about small numbers of freak accidents. Each such accident has a large potential for damage and loss of life; advocates of nuclear power can dismiss them by pointing out what mistakes were made in hindsight and saying that those are easily correctible, but that's the problem with freak accidents – there is always the potential for something to go wrong because people didn't think to pay enough attention to it until it caused a problem. But when the potential damage caused by an accident is large enough, forcing us to rely on a consistent low rate of them, that is reason for concern.
Also, there are other things that can go wrong outside of full scale meltdowns, such as subtle leaks of radioactive material into the atmosphere which can go unnoticed for significant periods of time. And this can happen at places other than nuclear plants, such as fuel reprocessing and fabrication centers. The more one relies on nuclear energy, the more need there will be to produce fissile materials, and the more risk we run of having such accidents – and that's not even considering the risk of *intentional* (mis)uses of such materials by terrorists or adversaries.
It should also be pointed out that nuclear energy production has undergone advances in recent years such that *modern* nuclear reactors address many of the concerns of traditional nuclear power plants. They're much safer, more environmentally friendly, and avoid the enormous cost of plant production. And that's a *huge* part of the argument going forward for justifying renewed investment in them. But by the same token, assessments of their dangers should not be projected backward onto older technology to imply that people were "stupid" for having serious concerns about their safety, or that they just opposed nuclear technology because a bunch of radicals convinced them it was bad without them understanding why. People know why they opposed nuclear energy, even if you want to argue that their rationale doesn't withstand statistical scrutiny.
Anyway, nice piece otherwise, and I think your overall point is well taken. The hype from recent advances has been absurd at times, generated largely by the same people who stand to profit from them, and has a lot to do with people being bedazzled by LLMs and their linguistic fluency while disregarding their serious limitations.
Sorry to dump all over you because of one paragraph, and I hope you see my criticisms as at least being in good faith.
I believe I have been blessed with, or else have learned, the ability to visualize a future state. During my corporate career it helped me outperform my peers, who at best could barely characterize the present correctly and tended to be more reactionary. The downside is that I have often taken risks to achieve something, only to have it brought down by unforeseeable events, generally the result of the majority being unable to see what I saw and thus reacting with criticism.
That is the challenge for all profound progress. It isn't the accuracy of the vision; it is the painful and laborious job of convincing the other 90% who cannot grasp what it would be like.
In my small liberal college town of about 90,000 residents when school is in session, the debate over growth rages on, with the NIMBYs generally prevailing. At the time, the debate concerned densification of the core downtown... specifically an ordinance to restrict buildings to three stories or less. In one standing-room-only city planning commission session, I came prepared with a number of foam-board-mounted photos, and I asked for a show of hands in the room: "How many people have traveled to cities in old Europe and like the way their cities are designed?" Almost everyone raised their hands. Then I showed the photos of European cities with four- to six-story buildings lining the narrow streets. There was a lot of silence in the crowd. It made an impact.
These people were incapable of seeing a future state that met their aesthetic expectations and were thus afraid of making any changes and opposed to them. It was only when presented with specific images that their brains could process the "what can be" opportunity.
That is the challenge with nuclear power. People are afraid, and yet not nearly enough effort is going into presenting what has been developed and what will be developed, and how it will effectively eliminate the risk of any radioactive-emissions disaster. Plentiful, safe, and (it should be) cheaper nuclear power replacing oil, natural gas, and coal power generation would effectively eliminate enough carbon emissions to meet all reasonable goals to combat global warming (even though I still think that project is a WEF globalist scam).
There is copious programming that feeds the fear of climate change, but a pittance of programming on what is being worked on to solve the problems. It is all fear and a demand for scarcity instead of hope and a vision of abundance.
The lack of such help in moving the vision-less population toward a better understanding of what the future can be tells me that there are powerful people in control of the influence machinery with a vested interest in preventing it.
Or perhaps we are simply failing to do the hard work of presenting and explaining, in graphic form, what the future can be.