As Albert Schweitzer said, “Man has lost the capacity to foresee and to forestall. He will end by destroying the earth.” After years of emphasis on science and technology, we are closer than ever to midnight. Yikes!
In the next few months modern man (Homo sapiens sapiens) will enter a new epoch in the history of the world, as soon as the International Union of Geological Sciences formally announces a date for the end of the Holocene and the beginning of the Anthropocene.1 The latter term has been in use for a number of years, but it has not yet been formally ratified or precisely dated and is thus not an officially recognized subdivision of geological time. It has, however, been defined:
The Anthropocene . . . is a proposed geological epoch dating from the commencement of significant human impact on Earth’s geology and ecosystems, including, but not limited to, anthropogenic climate change.
This definition invites one to speculate about what event or development might make a useful date for “the commencement of significant human impact on Earth’s geology and ecosystems. . . .”
The very earliest candidates for “significant human impact” would be, I suppose, the introduction of agriculture and the domestication of wild animals (c. 10,000 BC) or the start of our first chemical-industrial process, the production of charcoal (c. 4000 BC?). Both of these human activities had significant environmental impacts, particularly the deforestation of wide tracts of the earth’s surface. A number of academics in departments other than the earth sciences have suggested origins for the Anthropocene that highlight their own particular expertise: academic feminists have put forward the origin of patriarchy (c. 7000 BC?), sociologists the beginnings of capitalism, social historians the industrial revolution, and so on.
Interesting as these proposals may be, they are all for various reasons inadequate. The earliest ones simply rechristen most of the Holocene epoch (commencing c. 12,000 years ago) rather than baptizing a significantly new epoch, and it is not at all clear that in those prehistoric times there was any significant “anthropogenic climate change.” Moreover, none of them satisfies the demand of the earth scientists for a clear marker in the stratigraphic record that indicates precisely the transition into a new epoch. For these reasons (among others), the commencement of the Anthropocene will almost certainly be defined by the radioactive fallout from the atomic bombs tested and detonated in the 1940s and 1950s (this, in fact, is the recommendation of one subcommittee charged with examining the options). These explosions released cesium and plutonium isotopes, which will serve as a Global Boundary Stratotype Section and Point (aka a ‘Golden Spike’), marking everywhere in the world, for thousands of years at least, the end of the Holocene and the onset of the Anthropocene.
This transition point also fortuitously marks (more or less) the beginning of an age of increasing anxiety about our environment, a state of mind that is not merely a reaction to the threat of atomic annihilation (important as that is) but also an acknowledgement that our preoccupation with scientific-technological progress has been changing the world in some potentially catastrophic ways. The alarm bells were first sounded most notably by Rachel Carson, Silent Spring (1962), and Paul Ehrlich, The Population Bomb (1968). For a few decades, the key environmental issue in the public imagination was pollution, but the emphasis changed dramatically about twenty years ago to include climate change, a shift zealously promoted by Al Gore’s film (and companion book) An Inconvenient Truth (2006), which highlighted the issue of global warming and placed it at the centre of acrimonious scientific and political debates. Right-wing politicians pooh-poohed the entire notion as a publicity stunt by Gore, whose case (they claimed) was riddled with scientific errors. One Republican senator, Jim Inhofe, Chairman of the Environment and Public Works Committee, who appeared in the film, compared it (without taking the trouble to attend a viewing) to Hitler’s Mein Kampf. The general public around the world, however, reacted rather differently: in a 47-country Internet survey conducted by Nielsen2 and Oxford University, “66% of those respondents who said they had seen [the film] stated that it had ‘changed their mind’ about global warming and 89% said it had made them more aware of the problem”. Nowadays, seventeen years after the film first appeared, international conferences on the threats of climate change are routine, and the Secretary-General of the United Nations, António Guterres, issues one jeremiad after another urging us all to act quickly before it is too late to avoid “a mass exodus of entire populations on a biblical scale.”3
In the last fifty years or more, as we have grown increasingly concerned about what lies ahead, we have also witnessed a growing scepticism about the scientific-technological enterprise, thanks in large part to the questionable activities of scientists, their paymasters, and their government-appointed regulators. Thirty years before the appearance of Gore’s film and book, scientists at Exxon (later ExxonMobil) “correctly rejected the prospect of a coming ice age, accurately predicted when human-caused global warming would first be detected, and reasonably estimated the ‘carbon budget’ for holding warming below 2°C.”4 Yet the company refused to publicize these findings, deliberately “overemphasizing uncertainties, denigrating climate models, mythologizing global cooling, feigning ignorance about the discernibility of human-caused warming . . . .” DuPont poisoned the world by continuing to sell Teflon long after its scientists had discovered that it was extremely hazardous to human and animal health. Hooker Chemical—with permission from the federal government—filled Love Canal with toxic waste. And so on: Mesothelioma (1960-present); Minamata (1956 and 1965); Three Mile Island (1979); Bhopal (1984); Chernobyl (1986); Exxon Valdez (1989); Aral Sea (1960-87); Seveso (1976); Tokaimura (1999); Monsanto’s Roundup (1990s); Baia Mare (2000); Opioids (2000s); Kingston Fossil Plant (2008); and Deepwater Horizon (2010), to name some of the better known.
Of course, it is only fair to point out that the work of countless ethical scientists was essential to identifying the causes of these disasters, conducting the cleanups, punishing the major culprits, and providing some relief to the victims. The work of the scientific community also appears to have dealt successfully with the second great global scare of the post-war age (after atomic radiation), the ozone hole over the Antarctic, which is now predicted to “mostly recover by 2040.”5 And there has been no lack of dramatic scientific success stories in the last seventy-five years–the race to the moon and beyond, the identification and treatment of lethal epidemics and pandemics (Polio, Asian Flu, Cholera, Hong Kong Flu, Bird Flu, Smallpox, HIV/AIDS, SARS, H1N1, MERS, Ebola, Zika, Covid-19, and Monkeypox), the resurrection of Lake Erie, the Green Revolution, antibiotics, transplants, transistors, computer chips, cloning, and on and on and on.
Presiding over this rapid succession of disasters and triumphs is an imaginary timepiece established by the Bulletin of the Atomic Scientists, the Doomsday Clock6, “a metaphor for threats to humanity from unchecked scientific and technological advances.” Midnight on the clock face represents global catastrophe; the hands show, in minutes and seconds, how far we are from that catastrophe in the opinion of the Bulletin, which in January of each year decides whether or not to adjust the previous year’s position. The time was initially set (in 1947) at seven minutes to midnight, and in the years since, the setting has been changed twenty-five times: eight times backwards and seventeen times forwards. The reading furthest from midnight is 17 minutes (in 1991), and the closest is 90 seconds (in January 2023).
Not that one would get a sense of an imminent crisis from observing how the vast majority of us have been behaving as we creep towards midnight. We are still losing our forests (47 million hectares in the last decade); global temperatures in 2022 broke previous records; the consumption of fossil fuels continues to increase; brutal wars rage on in South Sudan and Ukraine with no end in sight; Germany is reopening coal-fired power plants; the UK is set to open its first new coal mine in 30 years; in the US and Canada methane, hydrogen sulfide, benzene, and arsenic continue to leak from almost four million abandoned oil wells; in the US serious chemical accidents are occurring at a rate of one every two days; and the steadily accumulating mass of space debris is leading inexorably to the Kessler Syndrome7 (“a phenomenon in which the amount of junk in orbit around Earth reaches a point where it just creates more and more space debris so that . . . the distribution of debris in orbit could render space activities and the use of satellites in specific orbital ranges difficult for many generations”).

Meanwhile, our earth is already in the midst of its sixth or seventh mass extinction (with species disappearing at a faster rate than in any of the previous die-offs); the oceans continue to gather garbage and plastic; and toxic “forever chemicals” (which do not break down in the environment and accumulate in the bodies of animals and human beings) have been found at about 17,000 sites in Europe and the UK. It is increasingly clear that the modern scientific enterprise launched in the 16th and 17th centuries has turned out to be considerably more problematic than we had imagined or been led to believe.
Right from the start of that enterprise, its proselytizers set a clear goal: the aim was to make us “masters and possessors of nature” (René Descartes) and to endow human life “with new discoveries and powers”; “the furthest end of knowledge [is] . . . a rich storehouse for the glory of the Creator and the relief of man’s estate” (Francis Bacon). In pursuit of this goal, they redefined our understanding of nature, which was no longer a divine mystery to be respected and treated with care or left alone, but rather a mechanical puzzle to be solved by the application of a new method based on rigorous empirical testing and re-testing of hypotheses, without appeals to ancient authorities (Biblical or pagan or Scholastic). This new method, they insisted, was easy to understand and to practise, and in their popular writings (written not only in Latin, the language of the scholars, but also in the vernacular) they invited their readers to participate.
Who would guide this ambitious new project? That question (crucial for their Medieval ancestors) the new scientists either ignored or brushed aside: “We need guides in forests and in unknown lands,” Galileo remarked, “but on plains and in open places only the blind need guides. . . . [A]nyone with eyes in his head and his wits about him could serve as a guide for them” (Dialogue Concerning the Two Chief World Systems, 1632). Like many of his fellow countrymen, Francis Bacon fudged and took refuge in an appeal to the argument from design (unless the pagan reference is ironical):
It is an assured truth, and a conclusion of experience, that . . . when the second causes, which are next unto the senses, do offer themselves to the mind of man . . . it may induce some oblivion of the highest cause; but when a man passeth on farther, and seeth the dependence of causes, and the works of Providence, then, according to the allegory of the poets, he will easily believe that the highest link of nature’s chain must needs be tied to the foot of Jupiter’s chair. (The Advancement of Learning, 1605)
This supreme confidence in the rightness of the modern scientific enterprise has invited comparisons with ancient Greek tragedy (see, for example, Michael Davis, Ancient Tragedy and the Origins of Modern Science, 1988). At first, the comparison may appear incongruous, given the difference between the passionately egocentric, strong-willed Greek heroes and heroines (Agamemnon, Antigone, Medea, Oedipus, Ajax, and so on) and the promoters of the new scientific method, who were, for the most part, calm, reasonable, god-fearing, affable gentlemen, eager to reassure their audience that their work was serving the public good. However, ancient tragedy and modern science were both inspired by a desire for autonomy (i.e., freedom from fate), and both displayed considerable hubris. The latter quality was muted in the case of the natural scientists because they did not trumpet their heroic status as individuals but stressed the collective and collegiate nature of their project. Moreover, the experiments they conducted and the inventions they came up with were in many cases easy to understand and astonishing in their effects, none more so than Edward Jenner’s discovery of vaccination (1796).
Religious authorities in England were receptive enough to the new science. Many welcomed it as an inducement to religious belief: the wonderful designs discovered in nature were evidence (they claimed) of a divine designer. The fiercest critics of the new science were the Tory satirists, conservative Christian writers alarmed at the lack of moral control over the scientific enterprise and afraid that this dangerous assertion of human pride would lead to disastrous consequences. That prospect made the more astute of them very gloomy, for they could see that the horse had already bolted from the barn. All they could do now was lament. The intensity of Jonathan Swift’s attacks on the Yahoos in Gulliver’s Travels (1726), for example, stems both from his sense of how these creatures will inevitably act once they acquire weapons more lethal than lumps of their own shit to throw at each other and from his recognition that it was too late to do anything about it.
This notion that developments in science and technology, no matter how astonishing, are potentially tragic because they are steadily increasing the power of the Yahoos without making them any more virtuous is an insight we have by and large brushed aside in favour of more optimistic visions of Homo sapiens. But we have yet to find a reasonable way to make people, especially our legislators and capitalists, reasonable, and so our sanguine hopes are often disappointed. For example, it is remarkable how inventors of new weapons can deceive themselves about human nature. Such is the case with the invention of our two favourite weapons of mass destruction: machine guns and explosives. Richard Gatling’s invention of the first prototypical machine gun (1861) was prompted by his desire to benefit mankind: “It occurred to me that if I could invent a machine—a gun—which could by its rapidity of fire, enable one man to do as much battle duty as a hundred, that it would, to a large extent supersede the necessity of large armies, and consequently, exposure to battle and disease [would] be greatly diminished” (quoted in The Gatling Gun, 1971). And Alfred Nobel, inventor of dynamite and gelignite, defended himself against the charge of being “the merchant of death” (an accusation he read in his own obituary, published in a newspaper which mistook the death of Nobel’s brother Ludvig for the death of Alfred): “My dynamite will sooner lead to peace than a thousand world conventions. As soon as men will find that in one instant, whole armies can be utterly destroyed, they surely will abide by golden peace” (1892). Twenty-five years after Nobel is alleged to have made that remark, my grandfather was setting the charges for the most powerful man-made non-nuclear explosion in history (“The Big Bang Heard on Downing Street”). The detonation took place on June 7, 1917, in the opening phase of the Battle of Messines, and killed up to 10,000 German soldiers. The battle was considered a victory for the British, albeit one that had little influence on the outcome of the war.
One should not, however, be too quick to condemn Nobel’s logic in claiming that his explosives would lead to peace. He simply overestimated the reluctance of human beings to kill each other in large numbers. To prove his sentiment correct, we merely had to multiply the power of the explosion many times over. Once that was accomplished by the invention of atomic weapons, a policy of Mutually Assured Destruction became an effective way of preserving an admittedly nervous peace among the nuclear powers, a situation best described by its acronym, MAD.
The commencement of the Anthropocene epoch may well mark the age in which human beings finally realized that some of their most cherished beliefs about nature and science needed to change in significant ways–and quickly. We have, I think, already lost our faith in the notion that there is any effective control of the scientific enterprise (other than money). In the years of my scientific education, the mantra constantly repeated was “Science will deliver us power, and the humanities will teach us how to use that power properly,” a magical pronouncement that was the theme of the convocation speech delivered by Adlai Stevenson in 1959 at McGill University, when I received my BSc degree. And that was the reassuring message I carried out into the world when I began my teaching career. Few people cling to that idea any more. Scientific research follows the money that our legislators and our business leaders supply, and the motives for allocating those funds are not always as altruistic as we have been led to believe.
Our view of nature as a mechanical puzzle we have had to adjust, for nature has turned out to be somewhat more complicated than that reductive metaphor suggests. Perhaps we should pay more attention to that well-known apophthegm, variously attributed to J. B. S. Haldane, Werner Heisenberg, and Arthur Eddington: “Nature is not only stranger than we imagine; it is also stranger than we can possibly imagine.” In the spirit of these words, many environmentalists have conceded that the most effective way to a better future for ourselves is to leave nature alone: by encouraging proforestation and rewilding and, in general, by severely limiting or strictly prohibiting human access to delicate environments. Such measures we should undertake not as a favour to Nature but as steps conducive to our own survival.
In this rapidly approaching crisis, is there a special role for eco-humanists? I have no specific recommendations here other than to suggest that all humanists should dust off copies of their canonical text, Charles Darwin’s On the Origin of Species, and reacquaint themselves with the brutal realities of Darwin’s account of Nature, the most authoritative we have. Natural selection is such an innocuous-sounding phrase, until one’s own species or one’s own culture is at risk. At that point we realize more clearly that Nature is neither our friend nor our enemy. Nature is wholly indifferent to us (“extravagant without limit, indifferent without limit, without purposes and consideration, without pity and justice, simultaneously fruitful, desolate, and unknown,” Nietzsche called it), and if we do not sufficiently adapt to a constantly changing environment, we know what will happen: we will follow the example of Easter Island and cease to exist as a viable culture.
It already appears almost inevitable that cultural collapse will overtake some of those countries without the resources to adjust to the rising sea levels, droughts, fires, diseases, famines, and floods and to cope with the social upheavals brought about by climate change and severe weather events. The richer countries have pledged to help poorer lands, but when will these commitments be discharged? How can they ever be enough? The richer nations–who bear the major responsibility for creating the crisis–will be too busy saving their own citizens, if necessary behind sea walls or in hazmat suits or both.
One final point. Our earth has been here before, facing lethal global warming brought about by volcanic eruptions that burned through deposits of fossil fuels and produced excessive amounts of carbon dioxide. The result was the Permian-Triassic extinction event8, “the largest mass extinction in the earth’s history . . . in which nearly 81% of marine species and 71% of terrestrial species died out.” Land ecosystems eventually recovered, but it took a while–in the case of land vertebrates, about 30 million years.
1. https://en.wikipedia.org/wiki/Anthropocene
2. Nielsen, July 2, 2007; https://en.wikipedia.org/wiki/An_Inconvenient_Truth
3. https://www.theguardian.com/environment/2023/feb/14/rising-seas-threaten-mass-exodus-on-a-biblical-scale-un-chief-warns
4. https://www.science.org/doi/10.1126/science.abk0063
5. NASA Earth Observatory website
6. https://en.wikipedia.org/wiki/Doomsday_Clock
7. https://www.space.com/kessler-syndrome-space-debris
8. https://en.wikipedia.org/wiki/Permian%E2%80%93Triassic_extinction_event