Last Saturday I posted a conversation with the military historian Phillips O’Brien, much of which was devoted to the war in Ukraine and what has passed for U.S. diplomacy the past few weeks. But we also talked about his new book War and Power, and I was struck by one of his points: The importance of having good allies.
As he noted, Germany lost both world wars in part because it was confronted by powerful alliances while its own allies were “terrible” — Austria-Hungary in World War I, Italy in World War II. He went on to say:
The key for the United States has been that it has maintained arguably the most successful alliance system in history since 1945. What the U.S. maintained with NATO, an alliance which kept Europe very much in the American orbit, both economically and militarily, and also with Japan, South Korea, Taiwan and countries in Asia, is they constructed this alliance system which hugely amplified both America's economic possibilities but also its strategic possibilities.
And Trump is throwing all that away.
I try not to say too much in these interviews — my one weird trick for good discussion is, as far as possible, to shut up and let the interviewee talk. But I couldn’t resist a follow-up here, based on my own observations:
[W]hat always struck me is that the U.S. had a specialty of creating international organizations that were formally equal, where we were all partners together. Now, everybody understood that the United States was actually in charge, but we went to great lengths to make sure that the World Trade Organization or NATO were alliances of equals, at least on paper. And it was a very effective trick.
O’Brien agreed: “The United States was getting the substance of power but giving up the style.”
For today’s post I thought I would enlarge on this point — and on what we’ve lost, possibly irretrievably, thanks to just a few months of Trumpism.
The Pax Americana that emerged after World War II — and basically ended on January 20, 2025 — was, in many ways, an American Empire. Even after Europe recovered from wartime devastation, the United States retained a dominant economic and military position among non-Communist nations. And we built international economic and military alliances to support a world order in effect designed to U.S. specifications.
But for Europe and Japan the American Empire was a subtle thing, with the United States avoiding crude displays of power and bending over backwards to avoid being explicit about its imperial status.
OK, I’m very aware that the picture I’m painting only applies to U.S. relations with wealthy democracies. U.S. power didn’t look so benign to Mohammed Mossadegh in Iran or Salvador Allende in Chile. Yet in the history of world empires, the Pax Americana nonetheless stands out for its subtlety, restraint — and effectiveness.
We set up the postwar international monetary system at a famous 1944 conference in Bretton Woods, New Hampshire. It was a U.S.-centric system, although Britain also helped shape the rules. (Some guy named John Maynard Keynes played an important role.) But while the initial system did give a special role to the dollar (a role that ended in 1971), the international institutions it established, the International Monetary Fund and the World Bank, are, at least on paper, country-blind. Obviously they have always given special deference to U.S. concerns, but they have never been explicit instruments of U.S. power.
In 1947 a conference in Geneva established the General Agreement on Tariffs and Trade, which set the ground rules for, um, tariffs and trade. The GATT in turn became the foundation for the World Trade Organization, established in 1995.
The GATT very much set up a world trading system in America’s image — to a large extent it was a globalized version of America’s 1934 Reciprocal Trade Agreements Act. But the text and the rules it sets don’t single out the U.S. for any kind of special treatment.
Then there were the military alliances. More O’Brien’s specialty than mine, but NATO, despite U.S. military dominance, has always formally been an alliance of equals. The military commander has always been American, but the Secretary General has always been European.
Are there historical precedents for an empire run as an alliance? I’m not a historian, and I’m sure there are examples I don’t know about, but the closest parallel I can think of is the Delian League Athens established to confront Persia in the 5th century BC. Athens eventually gave in to temptation and began treating its allies as subjects to be exploited; America never did. Remember, the Soviet Union repeatedly had to send in the tanks to keep puppet governments in power in Eastern Europe. Nothing like that ever happened, or even came close to happening, in NATO.
You could, I guess, say that formally treating our allies as if they were our equals was hypocritical. But I see it more as a way of showing respect and declaring that we would not abuse our national power.
Now, we squandered a lot of credibility by invading Iraq under false pretenses. And the credibility we lost in Iraq has made it difficult to act against atrocities elsewhere, from the use of chemical weapons in Syria to the terrible things Israel is doing in Gaza.
But in 2024 America was still in a real sense the leader of the free world. And while you can criticize the Biden administration for always delivering too little, too late, it nonetheless did help mobilize a large coalition to help Ukraine defend itself against Russian aggression.
But that was another America.
The current occupant of the White House clearly has no use for subtlety and understatement.
And let me say that I don’t think Trump’s vulgarity is irrelevant to understanding what he’s doing to America and the world. One of the best explanations I’ve read of who Trump is, and implicit predictions of what he would do, was a 2017 discussion of his design tastes titled “Dictator Chic.” Trump’s New York apartment, wrote Peter York,
projects a kind of power that bypasses all the boring checks and balances of collaboration and mutual responsibility and first-among-equals. It is about a single dominant personality.
Remember, this was written in 2017, yet it was a better prediction of Trump’s current behavior than almost any judicious-sounding “news analysis.”
In any case, in just seven months Trump has completely ripped up the foundations of the Pax Americana. Almost all his tariffs are clearly in violation of the GATT, yet Trump has vandalized the world trading system as casually as he has paved over the Rose Garden. We haven’t yet had a test of whether he would honor our obligations under NATO, but he’s said that his willingness to abide by the most central obligation, the guarantee of mutual defense, “depends on your definition.”
Trump’s foreign policy doctrine appears to be Oderint dum metuant — let them hate as long as they fear — supposedly the favorite motto of the Emperor Caligula. America, he seems to believe, is so powerful that it doesn’t need allies; he can bully the world into doing his bidding.
As Phillips O’Brien told me, history shows that such a belief is always wrong. And it’s especially wrong right now, when America is far less dominant than it once was. Whatever Trump may imagine, the world doesn’t fear us. For example, Trump may have imagined that his tariffs would bring India crawling to him, begging for relief; instead, India seems to be moving to closer ties with China.
In fact, not only does the world not fear us. Increasingly, it doesn’t need us. This is even true for nations that used to depend on U.S. military aid. You may remember Trump berating Ukraine’s president Zelenskyy, declaring “you don’t have the cards.” In reality, even in the Ukraine war Trump has far fewer cards than he imagines. At this point Europe is providing far more aid to Ukraine than we are.
And in an ever-evolving war in which drones, not tanks, rule the battlefield, Ukraine (with European help) is producing many of its own weapons.
One of the many problems with the slogan Make America Great Again was that America already was great. Now, not so much. In a world in which America is no longer the dominant economic and military power it once was — measured by purchasing power, China’s economy is already 30 percent larger than ours — our role in world affairs depends, even more than it did in the past, on having willing allies who trust our promises.
We used to be very good at having allies. But Trump has flushed all of that down the golden toilet.
I do not often find myself listening to Tucker Carlson while nodding along with an enthusiasm that puts me at risk of neck injury. But that’s what happened earlier this week, as I listened to Carlson at a conservative conference describing the US housing market for young buyers.
It’s especially bad that young people can’t afford homes. If you want a measure of how your economy is doing ... My measure is really simple: I’ve got a bunch of kids. Can they afford a house with full-time jobs at 27, 28 years old? And the answer is: No way. Thirty-five-year-olds with really good jobs can't afford a house unless they stretch and go deep into debt. And I just think that's a total disaster.
Indeed, until recently, it was common for couples in their mid-20s to buy their first house. Today it’s a rarity. In 1991, the median age of first-time homebuyers was 28. Now it’s 38, according to the National Association of Realtors. That’s an all-time high. In 1981, the median age of all homebuyers was 36. Today, it’s 56—another all-time high. This is the hardest time for young people (defined, generously, up to 40!) to buy their first home in modern history.
Beyond his statistical accuracy, it is Carlson’s anger—A TOTAL DISASTER!—that rings most true. Housing is so much more than a statistic. Our homes are our lives. Where we live shapes who we meet, what we do, and who we become. It is no coincidence that, as homeownership among young people has declined to all-time lows, marriage rates and fertility have also continued to plummet. Young Americans are the most depressed generation when it comes to the state of the economy. One might feel tempted to write this off as an extension of youth mood disorders, and persnickety economists can point out that young people today are not poorer than previous generations. But there is no debating the fact that today’s 20- and 30-somethings are significantly more blocked from owning a home than any other generation.
In the 1990s and 2000s, one’s 30s were a decade of ownership. Now, our 30s are a decade of mostly trying and failing to own. It is hard to capture in official government statistics what that kind of disappointment can do to the psyche of a generation.
A 50-Year History of the Housing Crisis
The deterioration of housing affordability in America, especially for young people, isn’t a simple story, but rather a nested history of troubles. There’s the 50-year tragedy, the 20-year tragedy, and the 5-year tragedy.
The 50-year story is significantly about rules. As Ezra Klein and I wrote in Abundance, the 1970s marked a turning point in the use of zoning restrictions. In prior decades, zoning had mostly been used to cut up cities and towns into business and residential zones, which often had the effect of keeping non-white and low-income people away from high-income neighborhoods. But in the 1970s, city planners and neighbors aggressively sought to restrict overall housing growth, through the expansion of exclusionary zoning, finicky rules like minimum lot sizes and parking mandates, and the normalization of NIMBY movements and lawsuits designed to block new development. "The 1950s and 1960s were a golden age of new construction," the economists Edward Glaeser and Joseph Gyourko wrote in a recent paper on the modern history of housing. But in the 1970s, construction rates plummeted. “In the 1980s and 1990s, the growth rate of housing was barely half the rates seen in the 1950s and 1960s.”
The 20-year story adds in the economics of homebuilding. The Great Recession of 2007 and 2008 obliterated the construction industry. Bankruptcies bloomed, and national construction employment fell by roughly 40 percent by 2010. Homebuilding was stuck in a depression for years, and the U.S. built fewer homes per capita in the 2010s than any decade on record. By 2019, home construction had slowly normalized, but since the large Millennial generation was rounding into their 30s, the prime home-buying years, a groundswell of demand pressure pushed home prices to new highs. Even as young people were earning more than their Gen X peers, many of them felt further from the promise of adulthood, as housing prices outran their raises.
The color-coded maps of Harvard’s Joint Center for Housing Studies make the point starkly. In 2000, most markets had a price-to-income ratio under 3. That means the typical price for a single-family home cost less than 3 years’ income. By 2022, the price-to-income ratio across the country had doubled. For all the economic and scientific advances of the 21st century, the last quarter century has been the period when the typical American had to work twice as hard to buy a house where they lived.
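To make that metric concrete, here is a minimal arithmetic sketch; the dollar figures are hypothetical stand-ins chosen for illustration, not numbers from the Harvard data.

```python
# Price-to-income ratio: the price of the typical single-family home
# divided by the typical household's annual income.
# The dollar figures below are hypothetical, chosen only for illustration.

def price_to_income(home_price: float, annual_income: float) -> float:
    return home_price / annual_income

print(price_to_income(170_000, 60_000))  # ~2.8: under 3, like most markets in 2000
print(price_to_income(340_000, 60_000))  # ~5.7: the doubled ratio of 2022
```

A doubled ratio means the same paycheck now buys half the house, which is what “working twice as hard” cashes out to.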
The 5-year story might be the most important in explaining why, when you look back at the first graph in this piece, you can see the median age of first-time homebuyers lurching upward in the last few years. If the 50-year story is mostly about the law, and the 20-year story pulls in economics, the 5-year story tops it off with virology.
The COVID pandemic shut down much of the service sector. But the homebuying market was supercharged. Shut inside their homes, wallowing in cabin fever, and having nothing to do all day but stare blankly at their phones and daydream on Zillow, many couples sprang for bigger homes, often just outside major cities. Suburban home prices launched into the exosphere. Meanwhile, the construction market on the supply side was significantly slowed by shutdowns and supply-chain hardships. When the unstoppable force of pandemic housing demand smashed into the immovable object of supply-constrained homebuilding, prices went bonkers, if I may employ a technical term. Between March 2020 and the summer of 2022, the Case-Shiller U.S. National Home Price Index increased by 42 percent—equal to the entire home price index increase between 2003 and the beginning of the pandemic. In other words, the U.S. warp-speeded through two decades’ worth of housing inflation in about two years.
The worst part for young homebuyers had yet to come. Overall inflation surged in 2021 and 2022. The Federal Reserve raised interest rates to cool the hot economy. The typical 30-year mortgage rate jumped from about 2.5 percent in early 2021 to 7 percent by 2023, where it's hovered for the last two years. As a result, the typical monthly mortgage payment on new loans has more than doubled. (Meanwhile, insurance costs and tax payments on homes also rose, due to a variety of factors, including an uptick in natural disasters.) Higher rates don't just push housing costs out of reach for young middle-class couples. They also discourage homeowners with old mortgages from moving. After all, who would trade a 2.5% 30-year mortgage for a 7% 30-year mortgage if they didn't have to? As a result, existing home sales plummeted 20 percent between 2022 and 2025, falling to a 30-year low.
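To see the arithmetic behind that “more than doubled” claim, here is a minimal sketch using the standard fixed-rate amortization formula. The loan amounts are hypothetical, chosen only for illustration; the 42 percent price increase is the Case-Shiller figure cited above, and the rates are the rough 2021 and 2023 figures from this paragraph.

```python
# A back-of-the-envelope sketch of why monthly payments more than doubled.
# Assumptions: a hypothetical $300k loan in early 2021; the same house
# costing ~42% more by 2023 (the Case-Shiller rise cited above),
# financed at 7% instead of 2.5%.

def monthly_payment(principal: float, annual_rate: float, years: int = 30) -> float:
    """Fixed-rate mortgage payment: P * r(1+r)^n / ((1+r)^n - 1)."""
    r = annual_rate / 12   # monthly interest rate
    n = years * 12         # number of monthly payments
    return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

payment_2021 = monthly_payment(300_000, 0.025)        # ~$1,185/month
payment_2023 = monthly_payment(300_000 * 1.42, 0.07)  # ~$2,835/month

print(f"{payment_2023 / payment_2021:.2f}x")  # ~2.39x: more than doubled
```

Higher rates alone raise the payment by about two-thirds; stack the pandemic price surge on top and the same house costs roughly 2.4 times as much per month.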
For young homebuyers in many places, it’s the worst of all worlds. Home prices are too high; and mortgages are too expensive; and insurance costs are too high; and existing homeowners don’t want to move; and all of this happened after decades of construction struggles and half a century of onerous rules and legal norms that blocked development in major cities. The 5-year, 20-year, and 50-year histories of the US housing market fit inside each other like a cursed nesting doll.
Housing has forever been integral to American life. It has more recently become core to American politics. I have seen private Democratic polling showing that young voters, who once considered housing the 20th or 25th most important issue facing the country, have recently put it in the Top Five. Housing affordability is now as essential to voters under 40 as protecting Social Security is to voters over 60. And politicians who recognize this fact will have a major advantage. In New York City, Zohran Mamdani shocked the establishment by running a campaign that was laser-focused on addressing cost-of-living issues (even if I disagree with some of his approach). After the 2024 election, the pollster David Shor found that the "single most effective" Kamala Harris ad had been her promise to lower housing costs by building more homes.
Donald Trump won a cost-of-living election. He had an opportunity to be the cost-of-living president. Instead, he has responded to America’s housing shortage with his own scarcity agenda. Rather than make it easier to build homes, he has announced tariffs on critical inputs, like Canadian lumber and Mexican drywall. His immigration crackdown tightens the labor pipeline for homebuilding, which relies on foreign-born workers more than almost every other industry. Rather than calm bond markets to help reduce long-term interest rates, he has signed off on a debt bomb of a bill and made noises about firing the chair of the Federal Reserve.
An abundance agenda for housing at the national level is possible. But it must begin with an appreciation for the fact that housing policy is exquisitely local. American cities and suburbs will need more wins like California’s recent CEQA reform, which will hopefully accelerate the construction of urban “infill” units. But Washington can urge this winning streak along. The federal government could experiment with a “carrots, not sticks” approach by creating a national rewards program that sent significant infrastructure and development funds to the states and cities that increased their housing permits the most. This YIMBY Carrot strategy wouldn’t just reward good behavior; it could also scramble the political psychology of local housing advocacy. Pro-housing groups would have a new counter for their NIMBY neighbors: “Hey, your opposition to new development is directly costing us money we could spend on parks and schools!” Or, “When you say no to new housing, you’re not just saying no to future residents, you’re taking money away from the people who are here now!” Meanwhile, reducing long-term mortgage rates is not a simple task. It will require that the US combat inflation in the short run and constrain our debt in the long run. Unfortunately, both are hard and neither is happening under this administration.
This essay has been full of catastrophic statistics. But an upside of catastrophe is that it gets people’s attention. America’s housing crisis has until recently roamed just outside the main radar screen of national politics. Today, housing is firmly at its center. Young people are voting on it. The news media is talking about it. Politicians are piecing together policies to address it, and even once-sclerotic blue states, such as California and New York, are finally acting on it. If Democrats want to remodel themselves as protectors of the middle class, defenders of the American Dream, and stewards of a future that young people can believe in and afford, they have to come home to housing.
When we started Hearing Things last year, one of the many questions we asked ourselves was: Should we even be on Spotify at all? As longtime music journalists, we had all covered the streaming wars in one way or another, and we knew Spotify was often singled out as the company with the worst track record. According to countless artists and reports, their per-stream rates are pitifully low compared to their competitors. Their playlist-centric strategy takes music out of context and relegates it to the background of people’s lives. Their sound quality is butt. They engage in practices that seem a lot like a modern version of payola. And there was that one time Joanna Newsom called the company a “cynical and musician-hating system” and compared it to a rotting banana. “It just gives off a fume,” she said. “You can just smell that something’s wrong with it.”
Our values as a publication—pro-worker, pro-artist, pro-active listening, anti-villainous corporations—did not align with many of Spotify’s actions and policies. At the same time, more people listen to music on Spotify than any other platform, and we wanted to make any playlists we put together as accessible as possible. So we reluctantly started paying for a Hearing Things Spotify account. But a couple of days ago, we canceled that paid subscription, and we will no longer be making playlists on Spotify or linking to the platform. It feels great.
So what finally pushed us over the edge? Well, a lot of things. For me, chief among them was music journalist Liz Pelly’s incredibly damning—and incredibly well-reported—recent book Mood Machine: The Rise of Spotify and the Costs of the Perfect Playlist. It details all the ways Spotify has devalued music through the years, helping to turn the most powerful art form we’ve got into another frictionless commodity controlled by tech oligarchs. Like how Spotify created an entire program—ominously dubbed Perfect Fit Content—in which they pushed more and more faceless muzak onto their popular in-house playlists because it was licensed by the company under cheaper terms, taking money and placements away from genuine artists. Or how its hyper-personalized algorithmic playlists forced listeners to burrow deeper and deeper into their own musical comfort zones, dulling the opportunity for personal exploration. Or how their Discovery Mode introduced a shadowy pay-for-play scheme that all but required many independent artists and labels to lower their own royalty rates in order to surface songs on the platform. Every chapter—practically every page—of Mood Machine offers revelations on how Spotify purposely undercut music makers in order to bolster their bottom line. I don’t know how any ardent music fan could read this book and not be moved to cancel their subscription.
Unfortunately, there’s more. In April 2024, Spotify enacted a new policy that denied royalties to songs that collected fewer than 1,000 streams, causing artists to wonder what would stop the company from arbitrarily increasing that number in the future. The following month, Billboard estimated that Spotify was expected to pay songwriters $150 million less in the ensuing year, even as the company raised their subscription rates, and their market cap hit new highs. This January, no less an authority than Björk declared, “Spotify is probably the worst thing that has happened to musicians.”
Just when you think there couldn’t possibly be any more reasons to quit Spotify, the brutal news keeps coming. Last month, it was reported that Spotify CEO Daniel Ek’s investment firm led a nearly $700 million round of funding for the European defense technology startup Helsing. Which means that when people now pay for Spotify, they are indirectly paying for the manufacturing of A.I. war drones. (What’s more, according to Bloomberg, Helsing’s technology is allegedly overpriced and glitchy, leading me to ask myself: Is it better or worse that the defense startup Daniel Ek invested in is purportedly kind of shitty?! 2025 is so bad, dude.) The disclosure caused veteran experimental rock band Deerhoof to announce plans to remove their music from the platform. “We don’t want our music killing people,” they wrote in a statement. “We don’t want our success being tied to AI battle tech.”
Speaking of A.I. (sorry), the last few weeks saw the explosion of a 100-percent A.I.-generated band called the Velvet Sundown. As of this writing, the classic-rock entity (which somehow does not sound like a mix between Velvet Revolver and Sunset Rubdown) has more than 1.2 million monthly listeners on Spotify, and it doesn’t look like the company is in any rush to completely shut down their page. In fact, this latest abomination feels like both a culmination of Spotify’s anti-artist policies and a precursor of hells yet to come.
So, for those reasons, we’re out. We understand no tech company is perfect, and that the problem is bigger than Spotify. But at the very least, Apple Music and Tidal—where we will be hosting our playlists moving forward—care about sound quality, have higher per-stream rates, and don’t seem to be daring music lovers to hate them every single week. As a liner-notes nerd, I for one really appreciate Tidal’s granular song and album credits. And when I hook up my wired headphones to my digital-to-analog converter and listen to lossless audio on Apple Music, it really is more enveloping. Of course, as Andy recently wrote, the best way to support artists you love is to actually purchase their albums and merch and concert tickets. Streaming isn’t going anywhere, but it should only be a part of any true fan’s music-listening habits. And if you care enough about the culture of music to read this entire story, Spotify doesn’t deserve your money, either.
Do readers remember the debt panic of the early Obama years? For a while scare stories about national debt dominated discussion in the media and inside the Beltway.
I got a lot of grief at the time for bucking that consensus, urging people to relax about government debt. The United States, I argued, had lots of “fiscal space” — ability to run up debt without losing investor confidence — so it should focus instead on the importance of restoring full employment, which required running substantial deficits.
These days, however, many though not all of the people who were screaming about debt back then have gone quiet. Funny how that happens when there’s a Republican in the White House.
Yet there is much more reason to be worried about debt now than there was then. On one side, there’s no longer any good economic reason to be running large deficits. On the other, America has changed in ways that have greatly reduced our fiscal space, our ability to get away with a high level of debt.
And the One Big Beautiful Bill Act, which just passed the Senate and will probably pass the House, will make things even worse.
Why was I relatively relaxed about debt back in the day? Largely because history tells us that advanced nations can normally run up large debts without experiencing crises of confidence that send interest rates soaring.
Look, for example, at the debt history of the UK, which ran up huge debts relative to GDP during the Napoleonic Wars and the two world wars without losing investor confidence.
Why are advanced countries normally able to pull this off?
First, they’re normally run by serious people, who don’t try to govern on the basis of crackpot economic doctrines and will take responsible action if necessary to stabilize their nations’ debt.
Second, they’re competent: They have strong administrative states that can collect a lot of tax revenue if necessary. The United States collects 25 percent of GDP in taxes, but could collect much more if it chose. Some European nations collect more than 40 percent.
These factors normally lead investors to give advanced countries the benefit of the doubt, even when they run big deficits. That is, investors assume that the people running these countries will take action to rein in debt once the emergency justifying deficits ends, and that they will be able to take effective action because they have effective governance.
And that’s why I was a deficit dove in, say, 2011. America needed to run substantial deficits to recover from the 2008 financial crisis. But I didn’t think this would cause trouble down the road, because we were a serious country run by serious people, easily able to do what was necessary to stabilize the debt once the economic emergency was past.
But that, as I said, was then.
Right now we are running big budget deficits even though we aren’t fighting a war, facing high unemployment, or dealing with a pandemic. We should be taking action to bring those deficits down. Instead, Republicans have rammed through the One Big Beautiful Bill Act, which will add trillions to the deficit even as it causes mass misery. Money aside, the way Congress was bullied into passing that bill and the lies used to sell it show that we are no longer a serious country run by serious people.
Republicans are using transparently dishonest accounting to hide just how much they’re adding to debt — hey, we aren’t really cutting taxes, just extending tax cuts that were scheduled to expire. And they’re also claiming that the OBBBA’s tax cuts (the ones that they say aren’t really happening) will generate a miraculous surge in economic growth. I’ve had my differences with the Committee for a Responsible Federal Budget, but it’s an honest, highly competent think tank, and its (appropriately) incredulous analysis of Trump officials’ economic projections is titled “CEA’s fantastical economic assumptions.”
Add in Trump’s bizarro claims about what his tariffs will achieve. Again, do we look like a serious country run by serious people?
Moreover, mass deportation and incarceration of immigrants, aside from being a civil liberties nightmare, will inflict severe economic damage and significantly worsen our debt position.
Finally, how long will we have an effective government that can collect taxes when necessary? Elon Musk’s DOGE failed to find significant amounts of waste, fraud and abuse, but it did degrade the functioning of the federal government and demoralize hundreds of thousands of civil servants. Republicans have done all they can to eviscerate the IRS and make tax evasion great again. Even if control of the government is eventually returned to people who want to govern the country rather than pillage it, it will take years to recover competence in tax collection and other mundane government functions, which we used to take for granted.
The falling dollar is an indication that foreign investors are losing faith in America. But I don’t think they fully realize, even now, that the risk of a U.S. debt crisis is vastly higher now than it was when Republicans were yelling about Obama’s deficits.
On August 5, 1945—the day before the world ended—Frank Sinatra was at a yacht club in San Pedro, California. There, he is reported to have rescued a 3-year-old boy from drowning.
On the other side of the country, Albert Einstein—the father of relativity—was staying in Cabin No. 6 at the Knollwood Club on Lower Saranac Lake, in the Adirondacks. Einstein couldn’t swim a stroke, and (in a reverse Sinatra) was once saved from drowning by a 10-year-old boy.
What neither of them realized when they woke up on the morning of August 6 was that at 8:15 a.m. Japan Standard Time, the first atomic bomb, nicknamed “Little Boy,” had been dropped on the city of Hiroshima, obliterating standing structures and killing close to 80,000 people.
“The day the world ended” is how Kurt Vonnegut described it in his novel Cat’s Cradle, published in 1963. Vonnegut had served in the U.S. Army during World War II, and was one of a handful of survivors of a different American attack: the firebombing of the German city of Dresden, which killed as many as 35,000 people and leveled the town once described as “Florence on the Elbe.”
“The sky was black with smoke,” Vonnegut later wrote in Slaughterhouse-Five, the novel that fictionalized his experience. “The sun was an angry little pinhead. Dresden was like the moon now, nothing but minerals. The stones were hot. Everybody else in the neighborhood was dead.”
The atomic bomb dropped on Hiroshima is believed, by some estimates, to have killed as many as 146,000 people, once injuries, burns, and long-term radiation poisoning were factored in—approximately the population of Gainesville, Florida, today.
Here is a photograph of the children who dropped it:
[Photo: U.S. Department of Defense]
I say “children” because the mission commander, Colonel Paul Tibbets, was 30. Robert A. Lewis, the co-pilot, was 27. Thomas Ferebee, the bombardier, was 26. The navigator, Theodore “Dutch” Van Kirk, was 24. Here is a picture of what happened to the children down below:
[Photo: Keystone / Getty]
President Harry Truman was on the USS Augusta at the time, returning from a conference in Potsdam, Germany, following that country’s surrender. The ship’s captain interrupted Truman’s lunch to give him a message announcing the attack.
That afternoon, Truman attended a program of entertainment and boxing held on the well deck. The ship’s orchestra played. The boxing ended abruptly when the ring posts collapsed, slightly injuring a spectator. Such was the nature of human suffering that day.
Cat’s Cradle was Vonnegut’s fourth novel. He had started it nearly a decade earlier, in 1954, when he was just 31 years old. It is the story of Jonah, a journalist who has set out to write a book about what famous people were doing the day of the Hiroshima bombing. In the book, Jonah tracks down the three living descendants of Dr. Felix Hoenikker, one of the so-called fathers of the atomic bomb. Hoenikker is an eccentric scientist who once left a tip for his wife by his coffee cup and would go on to create a substance called ice-nine, which could freeze all water on Earth at room temperature—thus ending the world.
Cat’s Cradle made about as much impact on popular culture when it came out as Vonnegut’s previous books had, which is to say not much. His first novel, Player Piano, had been published more than 10 years prior, to little acclaim, and Vonnegut was scrambling to make ends meet for his growing family. After the war he had made a pretty good living writing short stories, until that market softened. Since then he had worked as an English teacher at a school for wayward boys and as a publicist for General Electric; in a fit of optimism, he had even started a doomed Saab dealership on Cape Cod. An apt word to describe Vonnegut’s state of mind in those years would be desperate. Little did he know that Slaughterhouse-Five, published in 1969, would make him one of the most famous writers in the world.
Vonnegut was similarly unaware that World War II would be the last war of what historians call the Industrial Age. In the 19th century, steam-powered machines had revolutionized human enterprise. Then, following the development of electricity, came a wave of innovation never before seen—the telegraph, telephone, automobile, airplane—as physicists such as Einstein and his successors illuminated the very fabric of the universe. Many of those same physicists would later join the Manhattan Project, harnessing the power of the atom and creating the first atomic weapon.
In some ways, Little Boy was the ultimate invention of the Industrial Age, which ended a few years later. What replaced it? The Atomic Age, of course, followed in the 1970s by the Information Age. Were Vonnegut alive today, he might say that whatever they call the age you live in is actually the name of the weapon they’re using to try to kill you.
In 1943, two years before the bombing of Hiroshima, Kurt Vonnegut dropped out of Cornell University and enlisted in the Army. He was 20 years old. Here is a photo of him:
[Photo: PJF Military Collection / Alamy]
The Army taught him to fire howitzers, then sent him to Europe as a scout. Before he left, Vonnegut surprised his mother, Edith, by going home for Mother’s Day 1944. In return, Edith surprised Vonnegut by killing herself. That Saturday night, she took sleeping pills while he lay unaware in another room. Seven months later, Private First Class Vonnegut was crossing the beach at Le Havre with the 423rd Infantry Regiment of the 106th Infantry Division.
They marched to Belgium, taking up position in the Ardennes Forest near the town of St. Vith. It was one of the coldest winters on record, and death was all around them. On December 16, the Germans attacked. Inexperienced American troops holding the front buckled, creating a bulge in the line, thus giving the ensuing battle its name. When it was over, about 80,000 American soldiers had been killed or wounded. But Vonnegut didn’t make it to the end. He barely made it three days. Cut off and outnumbered, his regiment was forced to surrender; Vonnegut and more than 6,000 other soldiers were captured. As the Germans advanced, his buddy Bernard O’Hare shouted “Nein scheissen!” at them. This did not mean “Don’t shoot!,” as he thought. What he yelled instead was “Don’t shit!”
After a long forced march, Vonnegut and thousands of other American POWs were packed into boxcars. The dark cars smelled of cow shit, and the soldiers were crammed so tightly, they were forced to stand. It took two days to load them. Vonnegut later recounted how, 18 hours after their departure, the unmarked German train was attacked by the Royal Air Force. It was Christmas Eve. Strafed by RAF fighters, bombs dropping all around them, dozens of American prisoners were killed by Allied planes. Against all odds, Vonnegut was still alive.
The name Little Boy was chosen by Robert Serber, a Los Alamos physicist who worked on the bomb’s design. It seems only fitting for a weapon dropped by children from a plane named after the pilot’s mother, Enola Gay. Ten feet long and weighing close to 10,000 pounds, “the gadget”—as the scientists called it—was a plug-ugly sumbitch, made of riveted steel and wires. Nothing like the sleek, gleaming technology of today. See for yourself:
[Photo: Keystone-France / Gamma-Keystone / Getty]
Little Boy was a gun-type bomb, its explosive power triggered by firing a “bullet” of uranium into a target of uranium. When the projectile and target combined, they formed a supercritical mass capable of sustaining a rapid nuclear chain reaction. That’s a fancy scientist way of saying “massive explosion,” and boy howdy, was it.
Fission reactions occur so fast that it’s hard to describe them using our human sense of time. Within one-millionth of a second of the uranium bullet hitting its target, a fireball of several million degrees formed, spawning a shock wave, with a blast equivalent to 15 kilotons of TNT, that pushed the atmosphere outward at supersonic speed, traveling two miles per second from the hypocenter. The fiery shock wave flattened everything in its path, igniting birds in midair. About a third of the bomb’s energy was released as thermal radiation: gamma and infrared rays that flashed through clothing, burning textile patterns into victims’ skin and causing severe burns up to a mile away. In the time it takes to say “boom,” roughly 80,000 people were reduced to ash, and 4.4 square miles of city were obliterated.
Wilfred Burchett was the first Western reporter to reach Hiroshima after the bombing. On September 2, sitting on a piece of rubble, he wrote, “Hiroshima does not look like a bombed city. It looks as if a monster steamroller had passed over it and squashed it out of existence.”
For clarity, a steamroller was an Industrial Age machine used for compacting dirt and gravel in order to create smooth surfaces upon which vehicles could drive.
And so the world ended, if not in fact then in theory.
When they arrived in Dresden, Vonnegut and his fellow POWs were put to work in a malted-syrup factory, making food for Germans that the POWs were not themselves allowed to eat. The guards were cruel, the work exhausting. Vonnegut was singled out and badly beaten. One night, as air-raid sirens roared, Vonnegut and the other POWs were herded into the basement of a slaughterhouse, huddling among the sides of beef as the city above them was bombed.
All told, British and American bombers dropped more than 3,900 tons of high-explosive and incendiary bombs on Dresden that night.
Vonnegut described it this way in a letter to his family: “On about February 14th the Americans came over, followed by the R.A.F.” The combined forces “destroyed all of Dresden—possibly the world’s most beautiful city. But not me.”
Here is a photo of the city before the bombing:
[Photo: Ullstein Bild / Getty]
And here is what it looked like when the Allies were done with it:
[Photo: Ullstein Bild / Getty]
To destroy the city of Dresden took hundreds of bombs dropped over multiple hours. To destroy the city of Hiroshima, all it took was one. This, a cynical man might say, is what progress looks like.
In his 1967 book The Ghost in the Machine, Arthur Koestler, a Hungarian British author and journalist, wrote, “The crisis of our time can be summed up in a single sentence. From the dawn of consciousness until the middle of our century man had to live with the prospect of his death as an individual; since Hiroshima, mankind as a whole has to live with the prospect of its extinction as a biological species.”
Throughout human history, children have adopted a rule of engagement called “not in the face.” Think of it as the first Geneva Convention. Violating the not-in-the-face rule opens the offender up to serious retribution. It is an act of war. Now I get to hit you in the face, or worse. In fact, maybe I should kick you in the balls to teach you a lesson and restore the balance of power. Maybe I need to make the cost of hitting me in the face so high, you never take another swing. If Pearl Harbor was an unprovoked face punch, then Hiroshima was the kick in the balls to end all future wars. Scientists of the Industrial Age made that kick possible.
Vonnegut’s relationship with his own children after the war was mixed at best. There would be seven in total, three biological and four of his sister’s boys, who had come to live with him and his wife, Jane, in 1958, when Vonnegut’s brother-in-law, Jim, died in a train derailment, his commuter train launching itself from the Newark Bay Bridge into Newark Bay. Two days later, Vonnegut’s sister, Alice, died of breast cancer. So it goes. It was Alice who had shaken Vonnegut awake on Mother’s Day 1944 to tell him their mother was dead. Vonnegut considered Alice his muse, and later wrote in Slapstick: “I had never told her so, but she was the person I had always written for. She was the secret of whatever artistic unity I had ever achieved.”
Suddenly the house was overstuffed with children between the ages of 2 and 14. For the next five years, Vonnegut tried (and mostly failed) to write Cat’s Cradle. The stress of supporting that large a family as a writer, while still processing trauma from the war, made him irritable. Never a hands-on dad, he left most of the actual parenting to Jane, and as the chaos of family life filled the house, he would hole up in his study all day, chain-smoking. The slightest noise from the children could propel him from the room, ranting.
Vonnegut himself had been raised in a house of math and science. His father was an architect. His brother, a scientist, would pioneer the field of cloud seeding. But Vonnegut had a complicated relationship with the word progress. His experience in the war had soured him on the idea that science was exclusively a force for good. Too often, he believed, scientists and engineers focused on the question Can we do something? rather than Should we? He saw this when he looked at the Manhattan Project. Though scientists at Los Alamos knew that the bomb they were designing was meant to be dropped on people, they rarely thought about the consequences of dropping it.
After the war, the physicist Victor Weisskopf, who’d worked on the bomb at Los Alamos, admitted that he was “ashamed to say that few of us even thought of quitting. It was the attraction of the task. It was impossible to quit at that time.” The task, he said, was “technically sweet.”
J. Robert Oppenheimer himself used this phrase during testimony at his security-clearance hearing after the war. “It is my judgment in these things that when you see something that is technically sweet, you go ahead and do it and you argue about what to do about it only after you have had your technical success. That is the way it was with the atomic bomb. I do not think anybody opposed making it.”
“Nice, nice, very nice,” as Bokonon wrote in his “53rd Calypso.” “So many different people in the same device.” Bokonon was the fictional founder of a religion that Vonnegut invented for Cat’s Cradle, a novel as much about the hypocrisy of organized religion as it was about war. Bokonon’s first dictum is this: “All of the true things I am about to tell you are shameless lies.”
Here’s another shameless lie: The atomic bomb was dropped to save lives. This is an ancillary thing that war does; it inverts language. See, the lives that mattered to scientists at Los Alamos were American. So they chose to focus on the lives they would spare—the GIs who would theoretically die in a conventional invasion—instead of the Japanese citizens who would actually die when the bomb was dropped. This made the morality of their actions easier to justify. In this way, they kept things sweet.
And yet, to quote a survivor, those scientists who invented the atomic bomb—“what did they think would happen if they dropped it?”
Here are some things that happened. Day turned to night. In a flash, the bomb destroyed 60,000 of the 90,000 structures in a 10-mile radius. Of the 2,370 doctors and nurses in Hiroshima, 2,168 were killed or injured too badly to work.
This is what the atomic bomb did to survivors: “They had no hair because their hair was burned, and at a glance you couldn’t tell whether you were looking at them from in front or in back,” a survivor told The New York Times in 1981. “Their skin—not only on their hands, but on their faces and bodies too—hung down.” In this way they stumbled down the road, going nowhere, “like walking ghosts.”
Only a few of the survivors were children, as most school-age kids near ground zero were killed on impact. This is because at 8:15 a.m. on August 6, they had gathered outside their schools to help create firebreaks to slow the spread of flames in the event of firebomb raids like the ones that had destroyed Tokyo and so many other Japanese cities. Did they hear the distant roar of the B-29, I wonder, flying overhead? An air-raid siren had gone off an hour earlier, but no planes had come, so now, when the Enola Gay approached, many didn’t even look up.
Picture the children of Hiroshima on that sunny morning, thousands of little haircuts, thousands of gap-toothed smiles. Thousands of children trying to be good citizens, wondering what the morning snack would be. This is whom the child pilots flying overhead dropped the bomb on: schoolchildren and their parents. What else are we to think? The city of Hiroshima had no real military or technological value. It was a population center, chosen to send a message to the emperor.
So it goes—or, as the survivors of Hiroshima used to say, “Shikata ga nai,” which loosely translates to “It can’t be helped.” This sentiment was born from the Japanese practice of Zen Buddhism—an even older made-up religion than Bokononism, Vonnegut might say. And yet, what else can one say about a world in which children drop bombs on other children?
In Slaughterhouse-Five, Vonnegut writes of an argument he had with his old Army buddy Bernard’s wife, Mary. Vonnegut has gone to their house to drink and trade war stories, and when he tells them he is writing a novel about the war, Mary erupts:
“You were just babies then!” she said.
“What?” I said.
“You were just babies in the war—like the ones upstairs! … But you’re not going to write it that way, are you … You’ll pretend you were men instead of babies, and you’ll be played in the movies by Frank Sinatra and John Wayne or some of those other glamorous, war-loving, dirty old men. And war will look just wonderful, so we’ll have a lot more of them. And they’ll be fought by babies like the babies upstairs.”
Later, thinking back on Cat’s Cradle’s amoral physicist, Dr. Felix Hoenikker, Vonnegut said, “What I feel about him now is that he was allowed to concentrate on one part of life more than any human being should be. He was overspecialized and became amoral on that account … If a scientist does this, he can inadvertently become a very destructive person.”
This overspecialization is a feature, not a bug, of our Information Age.
What are our phones and tablets, our social-media platforms, if not technically sweet? They are so sleek and sophisticated technologically, with their invisible code and awesome computing power, that they have become, as Arthur C. Clarke once wrote, indistinguishable from magic. And this may, in the end, prove to be the biggest danger.
Because so little thought has been given to the Should we? of the Information Age (what will happen if we give human beings an entertainment device they can fit in their pocket, one that connects them instantly to every truth and every lie ever conceived?), we have, as a society, been caught unprepared. If the atomic bomb, riveted from steel plates and visible wires, was irrefutable proof of the power of science, how is it possible that even more sophisticated modern devices have decreased our faith in science and given rise to the wholesale rejection of expertise?
Talk about a shameless lie! And yet how else to explain the fact that misinformation spread through our magic gadgets seems to be undermining people’s belief in the very science that powers them?
To put it simply, if the bomb was a machine through which we looked into the future, our phones have become a looking glass through which we are pulled back into the past.
Shikata ga nai.
After the war, Vonnegut wrestled with what he saw as hereditary depression, made worse by his mother’s suicide, his sister’s death, and the trauma of war. Unable to justify why he had survived when so many around him had died, and unwilling to ascribe his good fortune to God, Vonnegut settled instead on the absurd. I live, you die. So it goes.
If it had been cloudy in Hiroshima that morning, the bomb would have fallen somewhere else. If POW Vonnegut had been shoved into a different train car, if he had picked a different foxhole, if the Germans hadn’t herded him into the slaughterhouse basement when the sirens sounded—so many ifs that would have ended in death. Instead, somehow, he danced between the raindrops. Because of this, for Vonnegut, survival became a kind of cosmic joke, with death being the setup and life being the punch line.
On May 11, 1955, the Hiroshima survivor Kiyoshi Tanimoto, a Methodist minister, was featured on the American television program This Is Your Life. He had come to the U.S. to raise money for victims of the atomic bomb known as the Keloid Girls or the Atomic Maidens.
Seated on a sofa beside the host, Ralph Edwards, Tanimoto wears a baggy suit and looks stunned. After an introductory segment, the camera cuts to the silhouette of a man behind a screen. He speaks into a microphone.
“Looking down from thousands of feet over Hiroshima,” he says, “all I could think of was ‘My God, what have we done?’”
The camera cuts back to Edwards and Tanimoto. “Now, you’ve never met him,” the host tells the Hiroshima survivor sitting next to him, “never seen him, but he’s here tonight to clasp your hand in friendship. Captain Robert Lewis, United States Air Force, who along with Paul Tibbets piloted the plane from which the first atomic power was dropped over Hiroshima.”
The camera pans across the stage as the screen retracts and Captain Lewis emerges from shadow. Tanimoto steps into frame and shakes his hand. Both men appear as if they want to throw up:
[Image: Ralph Edwards Productions]
“Captain Lewis,” Edwards says, “come in here close, and would you tell us, sir, of your experience on August 6, 1945?”
There is an uncomfortable beat, in which we wonder if Lewis will be able to continue. The camera cuts to a close-up of Lewis. He is unable to make eye contact with Tanimoto.
“Well, Mr. Edwards, when we left Tinian, in the Mariana Islands, at about eight—at 2:45 in the morning on August the 6th, 1945, our destination was Japan. We had three targets. One was Hiroshima. One was Nagasaki. One was Kokura.
“About an hour before we hit the coastline of Japan, we were notified that Hiroshima was clear. Therefore, Hiroshima became our target.”
The camera cuts to Tanimoto, listening, horrified. The social contract of human behavior freezes him in place.
“Just before 8:15 a.m. Tokyo time,” Lewis continues, “Tom Ferebee, our very able bombardier, carefully aimed at his target, which was the second Imperial Japanese Army Headquarters. At 8:15 promptly, the bomb was dropped.
“We turned fast to get out of the way of the deadly radiation and bomb effects. First was a thick flash that we got, and then the two concussion waves hit the ship. Shortly after, we turned back to see what had happened, and there in front of our eyes, the city of Hiroshima disappeared.”
“And,” Edwards says, “you entered something in your log at that time?”
Lewis’s voice breaks and he rubs his temples, trying to compose himself.
“As I said before, Mr. Edwards, I wrote down later: ‘My God, what have we done?’ ”
After retiring from the Air Force, Captain Lewis went to work in the candy business, where he patented various improvements to candy-manufacturing machinery. Sweet treats for kids. Picture them. All those happy kids.
Picture them putting quarters in the vending machine. Picture them in store-bought costumes holding out their Halloween sacks. They are no more theoretical than the children of Hiroshima, but unlike them, these children would grow up.
They would come of age practicing duck-and-cover drills, diving under their desks at the shriek of a whistle; come of age hiding in the bomb shelters their parents had built, terrified of the theoretical deaths that the A-bomb had made all but inevitable.
Nice, nice, very nice. So many different people in the same device.
This article appears in the August 2025 print edition with the headline “Vonnegut and the Bomb.”
In January 1981, President Ronald Reagan nominated James B. Edwards to lead the Department of Energy. This was an unusual choice. A dentist turned politician, Edwards had expertise in drilling that had more to do with root canals than oil fields. His appointment, however, was part of a coherent strategy: to dismantle the previous administration’s energy agenda. Under President Jimmy Carter, the U.S. had become a world leader in energy R&D. Solar projects took off around the country. Under Reagan, this progress unraveled. Research spending plummeted. Subsidies expired. Solar startups crumbled. As for the oil and gas industries? Reagan offered them new tax breaks.
The cost of this reversal was immense. According to Gregory Nemet, a professor and historian at the University of Wisconsin, Reagan’s election was the most important factor in the sudden halt of US solar development in the 20th century. Having invented the first practical photovoltaic cell in the 1950s and captured the lead in solar in the 1970s, the U.S. ceded the technological frontier to Japan, Germany, and eventually China, which now installs more solar panels than the rest of the world combined. We had the future in our hands, and we gave it away.
I’ve been thinking lately about Reagan and solar—and Edwards and molars. Like Reagan, Donald Trump came to power in 2025 in the middle of a historic effort to increase US renewable energy, such as solar, storage (i.e., batteries), and wind. Like Reagan, he is seeking to dismantle progress made under the previous Democratic president. And, far worse than Reagan, he’s doing this at a moment when renewable energy isn’t just a neat side project, but rather the fastest-growing source of American electricity, in an age when energy demand is rising alongside the emergence of artificial intelligence.
At a hinge point in history, the U.S. needs an energy policy fit for the 2030s. Instead, we’re getting a rerun of the energy whiplash of the 1980s, at the very moment when we can least afford it.
Higher Electricity Prices to Own the Libs
Trump’s signature legislation—the so-called Big Beautiful Bill—doesn’t just dismantle the tax credits for solar, storage, and wind that Joe Biden signed into law as part of the Inflation Reduction Act. It goes further by adding a new tax that clean energy projects can only avoid if they can prove they aren’t using critical parts made in China. Similar to how the BBB includes onerous Medicaid "work requirements" that force low-income patients to trudge through bureaucratic mud to get health insurance, this provision would punish clean energy firms with excess paperwork; in fact, the policy is kind of like a work requirement for decarbonization. It essentially ties clean energy’s legs together and says, “okay, now let’s see how fast you can run.”
You’ll sometimes hear conservatives accuse progressives of caring so much about climate change that they’d force ordinary Americans to bear the cost of higher prices and worse lives just to save the planet. But right now, it’s Republicans who are willing to stymie energy production, at the risk of rising electricity costs, just to own progressives and punish their favorite energy sources. I’m not sure I fully understand what woke means to the far right, but I’ve sometimes gathered that it means “a movement that’s willing to sacrifice economic common sense for unsound cultural ideology.” If that’s right, this GOP energy policy is as woke as it gets.
To appreciate just how bad this law would be for energy policy, you don’t have to listen to environmentalists approaching DEFCON 1 levels of panic. You could instead listen to Republican tech entrepreneurs, red-state experts, and ordinary energy analysts. Many of them think the bill is “utterly insane.”
That’s how Elon Musk characterized the BBB’s punishing attacks on clean energy, which he called "a massive strategic error" that will "leave America extremely vulnerable in the future." Since Musk is the chief executive of an electric car company, you might wonder if this criticism is a bit self-interested. But he’s not alone. The chief policy officer of the pro-business U.S. Chamber of Commerce wrote that "taxing energy production is never good policy" and that these measures "should be removed." Doug Lewin, president of the energy consultancy Stoic Energy and author of the Texas Energy and Power newsletter, predicted that the state is “going to build less [renewable energy], and what we build will be more expensive.”
The Trump administration has a reply to these critiques. Energy Secretary Chris Wright has claimed that solar and wind power are “expensive” sources of electricity that make the energy grid “less reliable” and mostly just make politicians “feel good.” This argument would have worked in, say, 1981. In 2025, however, every claim falls flat. Even including installation costs, solar energy plus storage is in many cases the cheapest source of energy production available. Along with wind power, it’s become so easy to build that almost all of the net electricity generation growth in Texas has come from renewable energy in the last decade. (Do you really think the typical Dallas energy baron in 2025 is building wind and solar because he cares so much about polar bears on ice floes?) Far from unreliable, renewable energy is now critical for stabilizing the Texas grid. As the New York Times reported:
During the scorching summer of 2023, the Texas energy grid wobbled as surging demand for electricity threatened to exceed supply. Several times, officials called on residents to conserve energy to avoid a grid failure.
This year it turned out much better — thanks in large part to more renewable energy.
The electrical grid in Texas has breezed through a summer in which, despite milder temperatures, the state again reached record levels of energy demand. It did so largely thanks to the substantial expansion of new solar farms.
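(A quick aside on the arithmetic behind “cheapest”: the standard yardstick is levelized cost of energy, or LCOE, which spreads a plant’s lifetime costs over its discounted lifetime output. The sketch below shows the shape of that calculation; every input is a hypothetical round number I’ve picked for illustration, not real market data.)

```python
# Back-of-envelope LCOE (levelized cost of energy) in $/MWh.
# Every input below is an illustrative, made-up round number.

def lcoe(capex_per_kw, om_per_kw_yr, fuel_per_mwh,
         capacity_factor, lifetime_yr, discount_rate):
    """Discounted lifetime costs divided by discounted lifetime output, per kW built."""
    annual_mwh = capacity_factor * 8760 / 1000  # MWh generated per kW each year
    # Present-value factor for a constant annual stream over the plant's life.
    pv = sum(1 / (1 + discount_rate) ** t for t in range(1, lifetime_yr + 1))
    pv_costs = capex_per_kw + pv * (om_per_kw_yr + fuel_per_mwh * annual_mwh)
    return pv_costs / (pv * annual_mwh)

# Hypothetical inputs: solar + storage pays more up front but burns no fuel.
solar_storage = lcoe(1400, 25, 0, capacity_factor=0.30, lifetime_yr=30, discount_rate=0.07)
gas_turbine = lcoe(1000, 30, 30, capacity_factor=0.55, lifetime_yr=30, discount_rate=0.07)

print(f"solar+storage: ${solar_storage:.0f}/MWh  |  gas: ${gas_turbine:.0f}/MWh")
```

With these made-up numbers the two come out roughly even, around $50 per MWh, and the point is the shape of the math rather than the totals: solar’s costs are almost entirely up-front capital, which keeps falling, while gas carries a fuel bill for thirty years.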
When you take away an energy source whose cost is falling and whose reliability is rising, the effect is predictable: It becomes more expensive, and less reliable, for folks to heat and cool their homes and offices, run their appliances, and power big data centers. If the BBB passes, electricity prices are expected to surge in the next decade, especially in red states such as Oklahoma, South Carolina, and Texas. Ironically, that’s because Republican-led states in the sun-drenched South and windy Midwest are currently deploying the most solar and wind. Battery manufacturing projects announced in Georgia, Tennessee, and Indiana since the IRA could be endangered as well. South Carolina Gov. Henry McMaster, a Republican, warned Congress that, without IRA tax credits, nuclear power expansion in his state is “dead.”[2] Once again, many of this legislation’s opponents aren’t card-carrying members of the Sierra Club. They’re just conservatives who want cheaper electricity and didn’t lock their energy takes in a 40-year-old time capsule with acid-wash jeans and VHS tapes.
America’s Resource Curse, China’s Future Shock
Three decades ago, economists coined the idea of a “resource curse” to describe the historical irony that countries blessed with commodities (e.g., gold, silver, oil, timber) often remained trapped in the past, while nations without ample resources embraced new technology and grew faster. For example, the resource-poor Netherlands eclipsed the Spanish Empire, despite the latter’s hauls of gold and silver; Switzerland and Japan grew faster than the petro-state of Russia; and South Korea and Taiwan left oil-rich Venezuela in the dust in the second half of the 20th century.
Now consider the U.S. and China. The United States sits on a geological jackpot of easily accessible oil and gas reserves. We are the largest producer of oil in the history of the world and the largest exporter of natural gas. But in our eagerness to maximize this fossil-fuel advantage, we risk smothering the clean-electric economy before it can mature. China, on the other hand, lacks sufficient oil and gas to power a billion-person economy, which is why it has spent decades trying to wean itself off imported fuel by developing alternative energy sources.
China wasn’t blessed with America’s hydrocarbon plenty. But which nation is the cursed one now? China dominates global manufacturing of solar panels, wind turbines, advanced batteries, and electric vehicles, along with the mining and processing of materials critical to the global clean-energy economy. While the Trump administration pines for the energy policies of the past, China is racing toward the future.[3] (For more on this theme, you should check out Noah Smith’s work on energy and China. You can start here.)
One measure of America’s resource curse is a historical ambivalence toward renewable energy. The U.S. had one clean-energy policy under Carter and another under Reagan. Then, we had one clean-energy policy under Biden and now another under Trump. While China invests along multi-decade time horizons, the U.S. is whiplashing between policy regimes every time a new American president puts his hand on the Bible. As the philanthropist and former energy trader John Arnold wrote this week, “reversing which fuels get subsidized and which get penalized every time control of Washington shifts is about the stupidest way to run an energy system that needs long term planning and stable supply chains.”
The implications for artificial intelligence are uncertain but worrying. Training and using AI demand energy abundance; scaling AI while running a normal economy without brownouts requires energy superabundance. I have heard some Republican defenders of the Trump administration say that, while they might not support the stuff on vaccines or science or immigration, they trust the White House to do the right thing on AI data centers and energy. But if there are brownouts and blackouts due to insufficient energy generation in the next few years, “the narrative on every TV and newspaper and meme is going to be that AI data centers did this,” Paul Williams, executive director of the Center for Public Enterprise, wrote. “If you thought data center development was challenging now, wait until bills go up 30% and every single media actor blames data centers.”
What would a more reasonable policy look like? In conversations with folks in the solar and storage industries over the past few days, the issue that came up more than any other wasn’t even tax policy. It was time policy. As we wrote in Abundance, it takes too long to build important stuff in America. It takes too long to build houses where people most want to live. It takes too long to build transit where people most want to move. And it takes too long to build energy at a time when electricity demand is skyrocketing. Rather than hamstringing solar and storage with onerous new rules, the U.S. should be streamlining NEPA and permitting and finding ways to shorten interconnection queues, so that solar and battery developers can build on a reasonable schedule and get more electrons pumping onto the grid and into homes, corporate offices, and data centers.
Getting all this right doesn’t just require policy expertise. It takes a certain disposition toward the future. To build new things, we have to want new things. The Big Beautiful Bill doesn’t want new things. It wants an energy policy that belongs in the 1980s to go along with a tariff regime that belongs in the 1880s. It’s hard to see exactly how we’ll beat China to the future if we’re chugging this hard in the opposite direction.
One often hears that Donald Trump represents a sharp break from the legacy of Ronald Reagan’s GOP. On several issues, such as the president’s weird fondness for Russian autocracy, the claim is clearly true. And yet: The Republican president is currently pushing a policy agenda that will cut taxes, slash spending for low-income Americans, increase national security spending, raise the deficit, and deliver a gut punch to solar power in the midst of an all-out dismantling of the previous administration’s energy policy. The previous sentence would be equally true if it were written in 1981 or 2025.
[2] I am deliberately trying to avoid quoting progressives and typical environmentalists in this article, just to prove how crazy this policy seems to many Republicans and non-partisan energy watchers.
[3] I don’t remember the first place where I read that anti-decarbonization bias in the U.S. was reminiscent of the resource curse motif, but if someone leaves the OG reference in the comments, I can edit it in later.