Thomas Moynihan is a writer and a visiting researcher at the University of Cambridge’s Centre for the Study of Existential Risk.
Earlier this year, on an overcast spring day, Greenpeace celebrated the end of Germany’s nuclear era with a rally in front of Berlin’s Brandenburg Gate. The gathering’s eye-catching centerpiece was a sculpture of a tyrannosaur carcass lying on its back, legs akimbo, the nuclear power symbol plastered across its belly. An activist stood in front of it holding a placard that said “PROGRESS.”
The message of the display, it seemed, was to exult in the demise of a disliked technology by showcasing an earlier extinction. The dinosaur and nuclear power: outmoded, obsolete, good riddance.
On another spring day, this one more than a century ago, a 15-foot-long stegosaurus lumbered down Broadway in New York. Made from papier-mâché and mounted on a cart, he went by the name of Jingo. The year was 1916 and Jingo was conducting a tour of major American cities as a symbol of protest against U.S. military preparations to join the Great War engulfing Europe.
At intervals, Jingo’s chaperones halted him and announced his cautionary lesson to streetside crowds. “Meet Jingo!” they shouted. “This animal believed in huge armament, he is now extinct!” Placards held by the activists bristled around Jingo’s bony plates, hammering the message home: “All armor plate, no brains, now he’s dead!”
In pamphlets, campaigners went further, slandering stegosaurs as “stumbling” and “clumsy,” a “slow-moving creature of low mentality.” Jingo’s spinescence apparently betrayed his “brutal foolery” akin to the bone-headed belligerence that seemed, during World War I, to be “destroying Europe.”
Like Greenpeace’s tyrannosaur, Jingo deserved his fate. Whether it was their superlative stupidity, lumbering lassitude or abundance of brawn that got them killed, everyone assumed dinosaurs were doomed to extinction. To many, the suggestion of beaked and spike-tailed lizards, armored like dreadnoughts, seemed absurdly unnatural, and unnatural things shouldn’t really exist. They deserved to be weeded out of existence, making room for what’s better.
Dinosaurs became the supreme symbol of deserved obsolescence around the same time the bones of most of the iconic species displayed in museums today were first unearthed, in the late 1800s and early 1900s. Soon enough, they came to be treated with shocking spite. In 1893, reviewing a book that was among the first to present lavish illustrations of “extinct monsters,” one Victorian gentleman saw in them nothing but a “waste of good material” for trophy hunting. “What splendid sport it would have been,” he daydreamed. Fifteen years later, another writer dismissed triceratops as one of “nature’s unsuccessful experiments” before proceeding to salivate at the fun a “sportsman” would have had hunting one.
Pillorying these extinct creatures has long been a way of displaying a lesson, a precedent to avoid. Their disappearance, the thinking went, was warranted because they’d done something wrong. But the pernicious assumption that extinction is somehow deserved is, in part, what got us into the predicament we find ourselves in today: ecological collapse, a planet-wide extinction event and the disappearance of creatures and cultures deemed unworthy of survival.
Having reflected on what “splendid sport” it would be to slay a sauropod, the Victorian reviewer quoted above absentmindedly moved from prehistoric to present-day extinctions. He remarked — with admiration, not remorse — that the “big game of modern days disappears before civilization” before pointing to Cecil Rhodes, who was then in the process of colonizing and stretching railroads across Africa. Such “progress,” the reviewer applauded, can only “help further hasten the extinction of the few great remaining beasts.” He regretted such disappearances, but only insofar as they would rob humans of fun game to hunt.
Of course, human activities have been jeopardizing other species for a long time. Roughly 10,000 years ago, mastodons, mammoths, megatheria and other large mammals abruptly disappeared, possibly due to our hunter-gatherer ancestors spreading out across landmasses.
But people only truly recognized that extinctions are possible, and that humans have caused them, within the past 300 years or so. Over millennia, people pondered the origin of fossils — in the 12th century, for example, the Chinese philosopher Zhu Xi speculated that shell-embossed rocks on high mountains were once shellfish living at the bottom of the sea. But understanding that those creatures no longer exist, not just in that place but anywhere, forevermore, is a different notion.
For the longest time, there simply wasn’t sufficient evidence to conclusively answer the question: Why could a species, once lost, not simply appear again later? This was mostly because there was no consensus on how species originated in the first place. For centuries, people had sincerely believed that animals such as mice or snakes were “spontaneously generated” — created from unorganized matter, without parents or ancestors. They believed this was unremarkable, happening often and continuously.
Going back at least to Augustine, one prevalent version of this theory held that species “pre-exist” all their actual manifestations in the form of invisible “seeds” diffused through the atmosphere. These, it was thought, always existed and always will; they were “indestructible,” baked into reality’s cake. All that was needed was welcoming conditions, and the “seeds” would simply blossom into fully adapted creatures.
This theory bypasses all the steps we now know are necessary for a living world to forge complex organisms from unorganized matter. It’s like saying you can manifest a cake without bothering with any preparation or recipe — or that a lifeless world might produce cities and satellites. Within such an outlook, there aren’t any prior states or cumulative steps needed for intricate animals to leap from inexistence to existence, or for a lost species to later return. As one French naturalist argued in the 1720s, though it may appear some species have “vanished from Earth,” it is nonetheless true they all “certainly survive” — “their seeds still occur,” circulating latently, and they “therefore could reappear again one day.”
Not many decades later, another influential scientist ran the thought experiment of annihilating all life on Earth. Following such a “universal death,” he confidently predicted a “replacement of living nature” would briskly and without difficulty reappear, populated by animals of the “same varieties.” No surprise that he also cheered how, as our “species multiplies and improves,” it pushes aside other lifeforms by expanding humanity’s “empire both terrible and absolute.”
Belief in the spontaneous creation of complex things, without the need for any prior genealogies, defangs the consequences of their eradication, because there is no long chain of states the universe must retrace to produce similarly complex things again. Easy go, easy come. Hence, perhaps, why attitudes to extinction remained feckless well into the 1800s, as writers regularly hurrahed the culling of unique animals like Australia’s “feeble” kangaroo, exulting in their destruction by European colonizers.
As late as 1819, a prominent German biologist named Lorenz Oken conjectured that humanity itself originated in just such an effortless way: generated from “sea slime,” with the first babies simply washing ashore, no ancestors necessary. He even speculated, sincerely, as to how this initial generation of children fed and fended for themselves, given there were no parents to show them the ropes.
Now we know what it took life on Earth to meander from prokaryote to primate and from monomer to marsupial: billions upon billions of years. We appreciate that such a tract of time seems to be the shortest possible period required to assemble kangaroos or humans, or any other living thing, from unliving matter. But if you believe that entire ecosystems can painlessly be forged “afresh” each time the world is wiped entirely clean — as many scientists seriously believed back then — it’s much harder to see the weight of what’s lost, should anything die.
In this world, there are two types of things: those that depend upon prior histories for their existence, and those that don’t. Some things can, at any moment, begin to exist, regardless of the universe’s prior states. Their appearance isn’t conditional upon any specific past holding true. They can spawn anywhere, anytime; they require no connection with parents or precedents, which would anchor and limit the region of space and time within which they can emerge. Accordingly, these things tend to be unremarkable. They also tend to be simpler.
But then there are historical things. In order to possibly exist, they require specific events in the past to have happened. You will never encounter them beyond the region within which this history, with all its piecemeal stages, has unfolded. The more complex and sophisticated something is, the more history — that is, the more steps required for its assembly — it tends to require in order to emerge. For this reason, things that depend upon histories tend to be remarkable, even unique. Unlike kangaroos, cities or satellites, hydrogen atoms aren’t the product of long, winding histories. This is why hydrogen is cosmically abundant, but kangaroos aren’t.
It was Charles Darwin who revealed that species are historical things by illuminating the sheer amount of time required to accumulate life’s present grandeur. He saw living species as precious because of this. They were not effortless “creations” but hardy “descendants” from the earliest forms, making them “ennobled.” When a “species” disappears from Earth, Darwin explained, the “same identical form never reappears,” for the “link of generations has been broken.”
This finally explained why species could not be re-created even if similar conditions happened to recur. We are born of time, Darwin revealed, not slime.
Though it took later scientists to illuminate this, Darwin’s theory additionally implied that Earth’s animals are unique to Earth, just as species that originate on islands aren’t found anywhere else unless they migrate. Should any species be wiped out here and now, therefore, it is lost not just from Earth, but from the whole of the rest of the cosmos, for all its remaining moments.
Nonetheless, attitudes to extinction remained callous. In 1868, Darwin’s mentor, Charles Lyell, remarked that entire ecosystems were being obliterated as “the colonies of highly civilized nations spread themselves over” what he called “unoccupied lands.” Heartlessly, albeit presciently, he predicted the trend of accelerating anthropogenic extinction rates would intensify.
Lyell argued that humans shouldn’t recoil at the “havoc committed” in “obtaining possession of the earth by conquest,” nor regret wielding the “sword of extermination.” In eradicating other species, he explained, we “exercise no exclusive prerogative.” That is, even the “most insignificant and diminutive species” has, in the struggle for survival, “slaughtered their thousands.” Lyell positively celebrated this, seeing it as a warrant for human rapacity.
One of the reasons such hard-heartedness to the extinct persisted after Darwin was the wide acceptance of a competing theory known as “orthogenesis” that also attempted to explain how evolution worked. Orthogenesis held that species die not through their interactions with an external world but according to an entirely internal principle, like the aging of an individual’s body. Like clockwork unwinding, species and entire clades were envisioned as being propelled along a set path, arcing from youth to senescent decay. Their mortality was thus assumed to be as deterministic as a missile’s trajectory in Newtonian ballistics.
At the turn of the 20th century, this popular theory’s proponents believed lineages were born with an allotted lifespan. Once their congenital “virility” runs out, they become “spent” before perishing. Many spoke of “racial senility”; some compared extinction to “predestination.”
The theory provoked scientists to search for symptoms of “ebbing vitality in a race,” just as doctors recognize dementia and cataracts as illnesses of age. Numerous signs of “palaeopathology” were offered. Gigantism and dwarfism, luxuriant antlers or baroque horns: There were many identifiers of “decadent” or “geriatric” genera. Lurid, macabre analogies abounded. For example, one paleontologist described the coiling of ammonites as the symptom of a clade “writhing” in “death agony,” contorting like arthritic joints.
“Death comes ultimately from within,” supporters of orthogenesis intoned. What this belief obscured is that species can die by accident — that is, had the lineage not succumbed, it might otherwise have gone on, surviving, thriving, diversifying.
By contrast, Darwin’s theory of natural selection accommodated chance, insofar as individuals survive or die based on random variations in changing environments. Nonetheless, Darwin and his followers insisted that species are predominantly pushed to the grave by their more adaptive descendants or competitors. Accordingly, they thought extinction was primarily a matter of bad design, not bad luck.
For this reason, Darwin downplayed the role of large catastrophes that could decimate indiscriminately, wiping out even the most “successful” lineages. He maintained instead that emerging evidence of mass extinctions — of conspicuous dips in biodiversity etched into the fossil record — must be distortions of our limited perspective arising from incomplete data.
He thus remained convinced that, averaged over time, “natural selection” works to inexorably “progress towards perfection.” In an 1860 letter, Darwin wrote this progress was so inevitable that “if every Vertebrate were destroyed throughout the world,” except the humble reptiles, then, after “millions of ages,” they would inevitably “become highly developed on a scale equal to mammals” — “possibly more intellectual.”
So even for Darwin, extinction wasn’t anything to repine over — a temporary setback at worst. Lost species were likely “inferior.” Better to have been pruned sooner rather than later, making way for what’s fitter.
Such sentiment dovetailed with racism and eugenics. The wholesale genocide of Indigenous peoples was similarly cast as an inexorable fate rather than the result of political decisions or culpable agency. “No motives appear to be able to stay the progress of such movements, humanize them how we may,” one Victorian writer heinously mollified, expressing a general outlook. Some reacted with mawkish sorrow, handwringing over this supposed unavoidability and pledging to make “their passing easier.” Others went further, rebuking such “sentiment” as “unreasonable,” cruelly insisting there is “satisfaction in the replacement of the aborigines.” But both parties alike laundered atrocities overseas as inexorable outcomes of natural selection’s “cosmic process.” Phrases like “improved out of existence” became quintessential fin-de-siècle parlance, used to excuse enormous evil.
Just like the “uncouth dinosaurs,” intoned one author as late as 1930, “so also many human beings, unlovely in character, must be looked on as necessary waste.” “Necessary waste,” that is, at evolution’s altar of selection and rejection. Indeed, during the new century’s opening decades, the spiteful tenor regarding dinosaurs intensified rather than abated. Perhaps it helped Western scientists to have a prehuman exemplar of “unavoidable extinction” — a distant “fact” to point to, tactfully positioned far beyond the realm of the political — to justify barbarities they benefited from in their present.
By the time Jingo rolled down Broadway in 1916, it was common for his kind to be raised as totems of either deserving death or indivertible decrepitude. Aside from being called “superlatively stupid,” many experts believed stegosaurian investment in spinescence to be the “faultless indicator” of a species that had “spent its vital force.” Jingo’s gothic protrusions were read as the “excrescences” of “racial senescence.” Like the writhing ammonites, his very anatomy was interpreted as the “embodiment of the death agony of a race.”
Summing up an overriding sentiment, the humorist Will Cuppy quipped in the 1940s that the “Age of the Reptiles ended because it had gone on long enough and it was all a mistake in the first place.”
This reasoning fed into self-serving assumptions regarding humanity’s own self-appointed position as successor and supplanter. The “moronic” and “slow-moving dunces” were mercifully “improved” out of existence by those “little warm-blooded beings” whose hefty brains would later blossom into evolution’s pinnacle.
Assuming that extinction only ever weeds out the “decrepit” and “unfit” not only absolved humans from wielding the “sword of extermination” in the present. It also had the bonus effect of making “man” — that latter-day inheritor of the Earth — seem anointed as the inevitable result of natural selection’s search for the best.
Such assumptions still lingered in 1959 at the centenary celebration of “On the Origin of Species,” where experts came together for discussion. During one panel, a curious query cropped up: Why did “man” evolve from “primates” and “not, for instance, kangaroos?”
The question revealed the persisting assumption that Homo sapiens was somehow the destination and apogee — a target even marsupials or reptiles might ache to achieve, rather than just one more outcome of life’s sprawling evolutionary explorations. Indeed, scientists have since conjectured that dinosaurs, if they’d stuck around, might have eventually become something like us: bipedal and brainy. Dismissing them as “necessary waste” is one thing, but imagining them “improved out of existence” by essentially becoming us smacks of an egoism yet more profound.
By 1964, the paleontologist Glenn Jepsen referred to dinosaurs as “amiable clodpoll buffoons, not to be exactly ridiculed or censured but not to be taken very seriously, either; amusing, every ton of them.” Friendly derision, but derision nonetheless. The tone, however, had by now softened enough for the sheer suddenness of their disappearance to be acknowledged as an enigma.
Listing the range of suggestions that had been made for this, Jepsen could only conclude: “no one knows.” Revealing his exasperation with the prevailing uncertainty, he facetiously included “flying saucers” in a litany of proposals, alongside the suggestion that the dinosaurs died of “paleo-weltschmerz.” (Weltschmerz is German for “world-pain”: a weary, melancholy disenchantment with existence.)
Over the next couple of decades, however, much would change. New research in the 1970s and 80s concluded that dinosaurs were not cold-blooded slowpokes, but highly active, social and even intelligent. Reflecting this, illustrators updated their depictions, rectifying the ways human preconceptions had warped even the bodies of these beasts.
Tyrannosaurs transmogrified into well-balanced bipeds, in stark opposition to the tail-draggers of prior reconstructions that seemed to amble around like cumbersome kangaroos. It also became clear that the lineage of dinosaurs didn’t even end, as birds were identified as their direct descendants. No longer could their kind be cast as a clade that spent its “vital force” eons ago.
When the father-son team of Luis and Walter Alvarez stumbled upon strata dating to the end of the dinosaurs’ reign that were laced with iridium — scant on Earth but abundant in asteroids — they took a critical step in identifying the cause of the dinosaurs’ disappearance. It suddenly seemed the dinosaurs didn’t die because of bad genes but from sheer bad luck. A streaking comet could hardly be rationalized as the handmaiden of progress, nor the invisible hand of adaptation.
During the 1980s, the giant impact hypothesis showed how adventitious luck, rather than innate superiority, shapes macroevolution’s unfolding story. Extinction is not always the pruning of “aged” or “unfit” lineages; even the most “successful” can succumb. Ultimately, this provided a sobering lesson for Homo sapiens, otherwise so often self-assured of its preeminent inevitability.
First, the new view implied that, if the asteroid had not plummeted into what is now Mexico, we cannot assume anything humanoid would have ever emerged. As the Polish writer Stanisław Lem pointed out in 1983, we are the offspring of cosmic contingency.
Second, it toppled the residual assurance that an animal as “successful” as Homo wouldn’t be succumbing to oblivion anytime soon. Suddenly, our own position seemed more precarious than before. As Stephen Jay Gould observed, research from the early 1980s onward into the fallout of prehistoric asteroid impacts also provided an “important” impetus for the first attempts to “model the climatic consequences of nuclear holocaust.”
Third, by demonstrating how the marks of contingency cast wide legacies through evolutionary history, the impact hypothesis also made more salient the impacts of present-day human activity upon life’s further future. Suggestions that we are altering everything to come, in ways that aren’t inevitable, gained force. Indeed, it was around the same time that prominent scientists, like the paleoanthropologist Richard Leakey, first began asserting that we are living during “the sixth mass extinction.”
For Leakey, it was evidently reckoning with the role of contingency in previous mass extinctions that led him to insist humans take responsibility today. Indeed, an important part of imputing culpability for a misdeed is acknowledging that the perpetrator could have acted otherwise and that, if they had, the misdeed would not have happened. But this was precisely what prior outlooks precluded — with self-serving hastiness — by casting the destructions wrought by human industry as inevitabilities within the “cosmic process.”
Today, human industry presides over mass extinction and climate breakdown. Because of this, it’s tempting to label ourselves as somehow inherently evil. Not only have humans wielded the “sword of extermination,” but prior generations, in their small-mindedness, slandered and libeled the “vanquished” as somehow inferior.
Nonetheless, human societies — like species — are historical things: built out of precedent but in ways open to change. Qualities may become entrenched and hard to tweak, because of all that becomes built on top of them. But making claims about what’s “innate” remains dubious, always liable to prejudice and generalization. There is thus no fate — nor congenital destiny or doom — for historical things, be they living lineages or planetary civilizations. What they are today is the product of a past that could have gone otherwise, and what they will be tomorrow depends on what happens next, which can also go differently, hinging on chance or choice. This opens us up to accident; it also carves space for agency.
So, perhaps we should take the lesson of Jingo — not to prejudge the capacities, fates and dignities of other animals as foreordained — and apply it to our own kind. Let’s not conclude that humans are irredeemably or inherently evil, nor that human interventions must only ever vandalize this world.
We may have been the first and only animal to have pilloried others for having perished. But we are also the first and only animal to begin apprehending the wrong of doing so. History reveals that this knowledge has only just started to seep into our consciousness. It has hardly yet stopped the wholesale destruction of ecosystems. But it is cause for urgency: to produce a viable planetary civilization that, rather than parasitizing the planet that birthed it, enters into symbiosis with it. After all, we no longer have even the meager excuse of our recent ancestors: ignorance.