A few nights ago, I made the increasingly unwise decision to drive into New York City with my wife. Dinner was pleasant enough, but the return to our car was not. The parking deck was dimly lit, deserted, and eerily quiet, until we reached the pay station, where a sleazy person was deliberately sitting at the foot of the machine, waiting for me to approach with wallet in hand. There were no police, no attendants, no other people—just me, my wife, and a deranged stranger between us and the only way out.
I had nothing on me with which I could defend myself. Not that it would have mattered. This was New York, after all, where defending yourself is sometimes more legally perilous than initiating an attack. We’re fast approaching the point where criminals won’t plead self-defense; they’ll plead that they were in the middle of committing a crime when they were victimized by someone defending themselves. In many cities, that’s practically exculpatory.
Anyway, we bolted in the other direction, hoping we would not be followed. None of this is new. Every time we go to New York or Philadelphia or D.C., it seems, we have some experience that requires us to keep our heads down, avoid eye contact, and move at a rapid pace to escape bodily injury, all the while breathing in secondhand marijuana smoke and trying not to step in human excrement or interfere with someone in the process of making it. And that, we’re told, is just part of the experience.
What’s striking is not that this sort of thing happens, but that we’ve been conditioned to accept it, thanks to decades of legal precedent, political cowardice, and cultural decay. This sort of thing used to be unthinkable. Now it’s just part of the experience. Urban ambiance, if you will. And if you want to understand where the unraveling of American cities truly began, you need to look not at the mayors or the governors or even the radicals with clipboards. You need to look at the marble pillars of the United States Supreme Court.
In Shapiro v. Thompson (1969), the Court struck down residency requirements for welfare eligibility. Until then, cities and states could require that new arrivals live within their borders for a set period—typically a year—before becoming eligible for public assistance. It was a reasonable policy, designed to preserve fiscal sanity and discourage opportunistic migration.
But the Warren Court, always eager to discover new rights lurking in the penumbras, declared that such requirements violated the Constitution’s “right to travel,” a right, notably, nowhere mentioned in the Constitution itself, and one not even implicated by the welfare policy: nobody was denying anyone the right to travel, only declining to subsidize it. But that’s not how the Court saw it. Denying welfare benefits to newcomers, the Court reasoned, penalized interstate migration and was thus discriminatory, as though not getting free money from a program you paid nothing into is somehow a penalty.
The ruling was couched in the usual moral vocabulary of fairness and equality. But the consequences were as predictable as they were disastrous.
With that single ruling, welfare was nationalized in its consequences. Cities could no longer guard their social services against sudden and overwhelming demand, nor could they exercise prudence in distributing limited resources. New York City, already a haven of social programs, became an open vault.
The results were immediate and devastating. Roughly one million newcomers, a new underclass drawn by the city’s suddenly unrestricted welfare offerings, flooded New York City. At the same time, another million middle- and working-class residents—taxpayers, business owners, civic-minded families—fled the city for the suburbs. The demographic swap was not neutral. It was a political and cultural reconfiguration of the city itself, one that helped set off a fiscal crisis by shrinking tax revenue while swelling expenditures.
This new class of welfare-dependent residents didn’t just draw on city resources; they reshaped the city’s politics. They voted, understandably, for more benefits, more programs, more leniency. And the political class, eager to preserve its base and expand its power, gave them exactly that. Thus the Supreme Court decision didn’t merely coincide with the rise of the welfare state and the ideological capture of urban governance; it accelerated and entrenched both.
At the same time, crime rose. Rapidly. Predictably. Police departments were overwhelmed not only by the sheer scale of disorder but by the political handcuffs placed on them by the very people elected in the wake of this demographic shift. Our friends on the Left will undoubtedly object to our drawing a correlation between welfare dependency and criminal activity, but it is they who tell us that crime is a function of poverty. So mayors and district attorneys no longer viewed enforcement as a tool of public safety, but as a cudgel of oppression. Not that they had much choice, since they simply did not have the resources to police a population that increasingly did not police itself.
Crime itself was reinterpreted, not as a social ill to be contained, but as a kind of tragic expression of grievance. An expected, even justified, reaction to structural injustice. In this new moral inversion, the criminal became the oppressed, the enforcer became the oppressor, and the city itself became hostage to a therapeutic ideology that viewed punishment as cruelty and disorder as authenticity.
Of course, this wasn’t the only wound, just the first incision. The Great Society cracked open the vault of federal entitlements. Fathers were kicked out of the household as a condition of women and children receiving welfare. Urban planners, in their zeal to build public housing, created vertical warehouses for poverty. Crime surged. The tax base evaporated. New York teetered on the edge of bankruptcy by 1975. President Ford famously refused a federal bailout, prompting the immortal (and wildly misleading) headline: “Ford to City: Drop Dead.”
But the city had already been dropped by its courts, its leaders, and its ideals.
In the years since Shapiro, the governing class has learned nothing and doubled down. Cities disarmed their citizens, de-policed their streets, and adopted a bizarre theological devotion to dysfunction. Loitering became a civil right. Drug use became therapeutic. Mental illness became “unhousedness.” And all the while, the message to law-abiding citizens remained the same: you’re on your own—and you’d better not do anything about it.
The result? A modern feudalism. At the top: the elite, those wealthy enough to rise above the chaos, living in secure buildings with doormen, traveling in chauffeured cars, sending their children to private schools. Below them: the masses, left to navigate subway platforms where someone might push them to their death for sport, or to wait in line behind a man urinating into a trash can.
It goes without saying that we are not allowed to say what we see. That it is not just crime, but intentional political dereliction. That compassion without discipline is not mercy, but madness. That a city that cannot distinguish between the citizen and the vagrant, between the taxpayer and the grifter, between the violent and the vulnerable, is not a city at all—it is an open-air experiment in managed decline.
This is how cities die: not with a bang, but with a grant. The Supreme Court, in trying to sanctify human movement, broke the moral contract that once bound community to responsibility. And when you sever that link, when you tell cities they must provide for all, regardless of contribution, behavior, or allegiance, you don’t get equality. You get entropy.
Shapiro v. Thompson didn’t just open the gates; it dismantled them, and then it criminalized the very idea of putting them back up.
It’s easy to forget how beautiful New York once was, how proud, how aspirational, how civilized. Watch Breakfast at Tiffany’s, and you’ll see it: a city that gleams like Gatsby’s green light, full of promise, elegance, and upward mobility. It was the modern city as cathedral, a place where dreams weren’t just imagined, but pursued.
Then fast-forward just fifteen years to Taxi Driver, Death Wish, The Warriors. The city hasn’t just changed; it’s decayed. The glamour is gone, replaced with grime and neon rot. You’re no longer watching a romance; you’re watching a slow-motion mugging.
That wasn’t just film. That was real. And it happened quickly.
Today, when we walk through these cities, we experience a strange kind of historical vertigo. It’s like walking through a ruined cathedral or an overgrown aqueduct. There are vestiges of a superior civilization: the architecture, the infrastructure, the echoes of ambition. But they are crumbling, or worse, actively being defiled. The marble is spray-painted. The subway smells of human waste. The air carries more menace than motion.
It must have felt like this in the early Dark Ages, after the fall of Rome, when men stumbled through marketplaces built by emperors, now occupied by looters and squatters, unable to remember who had once walked there or what they stood for. The ruins were still standing. But the civilization? Gone. Forgotten.
We are living in the ruins of something that was once noble. And if we don’t say so, if we don’t name the causes, reverse the decisions, and reject the ideologies that made it so, then we are not just permitting the decline. We are complicit in it.
The question is no longer whether American cities can be saved. The question is whether we’re still allowed to remember what they once were.