I like the concept of the Anthropocene. It finesses or postpones at least some of the conflict around the idea of climate change, broadens the conversation to include all human impact on the environment, and grounds thinking in geological (heh!) time without overloading it with burdensome sentiments like guilt or fear. The term leaves the future open to both positive and negative possibilities. It acknowledges human agency as the most powerful force currently reshaping the planet without getting too judgmental about what that means.
I find existing definitions of the Anthropocene unsatisfying though. Most of them, reasonably enough, focus on planet-scale external markers, ranging from the birth of agriculture to the first nuclear tests and climate change. But this seems too open to narrative arbitrariness and not open enough to insight. If we turn inward, though, there is a rather natural and fertile definition that immediately suggests itself:
The Anthropocene begins when survival in the built environment is as cognitively demanding as survival in the natural environment of evolutionary adaptation.
Note that “as cognitively demanding” is not the same thing as “as hard across the board”. It means you have to think as hard for the same survival probability, but many other things might get easier.
A good illustration of this is life in a major city versus life in a small town. The former is more cognitively demanding but many things besides thinking become a lot easier. Nobody ever moved to a bigger city in search of a simpler life. A less emotionally stressful life, perhaps. A less impoverished life, perhaps. A more comfortable and convenient life, perhaps. But not a simpler one.
Now let’s apply that reasoning at civilizational history scale.
The motivation for my definition is simple: all our impact on the environment, beyond that of other species with a comparable planetary range and biomass, can be traced to our runaway brain development and the impact of the associated cognitive surplus.
We used our brains to create cognitive environments for ourselves that are less demanding than nature red in tooth and claw, and then turned the newly idled brain capacity to new devilry in our new, easier lives. Cognitive surplus induces an evolving and strengthening boundary between nature, the locus where cognitive capacity is maximally loaded by the basic demands of survival, and civilization, where that capacity is minimally loaded. This is a consequential boundary, with a survival probability differential across it. Given the same amount of thinking, odds of survival have historically been higher inside the boundary than outside. That’s why we’ve been putting up with each other’s shit since we were just another kind of trooping ape.
The virtuous cycle was driven by a mix of Jevons paradox (demand for our own cognitive capacity increasing as we learned to use it more efficiently) and a sort of Moore’s law driving a falling absolute cost of human computation in built environments due to technological leverage.
It was a nice virtuous cycle while it lasted, since it produced more cognitive surplus than it consumed with ape politics. But civilization, as I once noted, is the process of turning the incomprehensible into the arbitrary, and at some point natural and artificial become equally demanding contexts to think within (the point I previously labeled hackstability). Past this point, it makes sense to think of the built environment as a wild space, at least in cognitive terms.
Whatever your preferred evolutionary explanation for the historic cognitive surplus (see this excellent 2017 paper by Robin Dunbar and Susanne Shultz for your options), it seems reasonable to suppose that the Anthropocene ought to be defined in terms of that surplus going away. Demand absorbing supply in a new general equilibrium. Or at least a new sort of far-from-equilibrium dissipative structure.
This is, by the way, the exact opposite understanding of cognitive surplus to Clay Shirky’s original one (afaik). We’re losing a surplus, not gaining it, because life — civilized life — is getting harder faster than we’re getting smarter.
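To make that dynamic a little more concrete, here is a minimal toy sketch of the argument, and nothing more than that: assume cognitive supply grows slowly, per-task cognitive cost falls with technological leverage (the Moore’s-law-like effect), and the number of tasks civilization demands grows faster still (the Jevons effect). Every number and growth rate below is invented purely for illustration; the only point is the shape of the curve, in which surplus rises for a while, peaks, and then crosses zero, with that zero crossing as the boundary under the definition above.

```python
# Toy model of the cognitive-surplus dynamic described above (illustrative only;
# every number and growth rate here is invented, not taken from the post).
#
# supply:  total human cognitive capacity, growing slowly
# cost:    cognitive cost per "civilizational task", falling with technological
#          leverage (the Moore's-law-like effect)
# tasks:   number of tasks civilization demands, growing faster than cost falls
#          (the Jevons-paradox effect)
#
# surplus = supply - tasks * cost. Under the definition in this post, the
# Anthropocene boundary is where surplus crosses zero.

def surplus(t, supply0=100.0, supply_growth=0.005,
            tasks0=10.0, task_growth=0.04,
            cost0=5.0, cost_decline=0.03):
    supply = supply0 * (1 + supply_growth) ** t
    demand = tasks0 * (1 + task_growth) ** t * cost0 * (1 - cost_decline) ** t
    return supply - demand

# Surplus rises at first (the "nice virtuous cycle"), then demand growth wins.
peak = max(range(500), key=surplus)
crossing = next(t for t in range(1000) if surplus(t) < 0)
print(f"surplus peaks around t={peak}, goes negative around t={crossing}")
```

The specific crossover year means nothing; the qualitative point is just that a surplus produced by a multiplicative race between demand growth and falling costs can disappear as quietly as it appeared.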
More people having the free time to write bad novels or make bad music is not a sign that we have more cognitive surplus. Any more than people having more money to spend on avocado toast is a sign that they have more of a financial surplus (they might simply not be able to afford mortgages despite preferring home ownership to avocado toast). By analogy, more cognitive resources being devoted to bad novels might mean we don’t have enough left over to figure out (for instance) carbon capture.
There are still good reasons to prefer built environments to natural ones, but having to think less is no longer one of them.
Evolution does not stop with this equalization of survival potential of course, but continues in ways that are increasingly indifferent to imagined boundaries between “civilized” and “wild” (and everything real and imaginary those boundaries imply, such as enemy tribes, enemy nations, landfills, and rainforests).
How hard does thinking have to get before it is as hard as survival in the wild? At what point do you need to be as smart to survive inside civilization as outside of it? At what point do the distinctions among civilized, barbarian, and savage begin to blur?
I think we’re already there, and have been for a decade. It’s what I’ve been calling the cyberpaleo condition.
When I was a teenager growing up in the 80s, we were advised to “think global, act local” to thrive in modern, post-Cold-War conditions. This was harder than being an Organization Man type at the local industrial firm, but still easy enough that many of us have managed it quite well, with enough left over for a lot of Netflix-watching. But if I had to give comparable survival advice to teenagers growing up today, it would be this:
Think entangled, act spooky.
Acting on this advice will be as hard as surviving in the wild by yourself.
The world has, of course, always been entangled and spooky, but while we had a comfortable cushion of cognitive surplus to work with, we could pretend it was simpler than it was by carving it up in ways where concerns and consequences could be made largely local, and causal reasoning could ignore “externalities” (that word itself is revealing — a general definition would be: residual computational work left over after using some simplistic local approximation to solve a problem, which can be ignored because it can be turned into Somebody Else’s Problem or, worse, Some Other Species’ Problem; the tribalist version of map-reduce).
Now we are at the Pareto frontier where we cannot make the world both more comprehensible and safer for ourselves at the same time. We have to choose one or the other, and it appears we have chosen to make it less comprehensible. Or to put it another way, we have accepted the need to think harder to make it work.
The recommended world metaphor for think entangled, act spooky is neither Hobbes’ Leviathan (too sentient, anthropomorphic, and intentional), nor Benjamin Bratton’s Stack (too lifeless) but Yggdrasil, the world tree of Norse mythology.
Vegetable is the sweet spot between animal and mineral when it comes to major, general mental models for the Anthropocene.
Yggdrasil has evolutionary tendencies and homeostatic dispositions of its own — that’s where the spooky-action-at-a-distance potentialities come from — but not a mind of its own. We have to supply the mind or it all falls apart. At least for us. The cockroaches might be fine.
Life in Yggdrasil is entangled in weird, non-local ways. Spooky action at a distance is the rule rather than the exception. There are no externalities and everything is internal.
Dealing with it all is going to be a hard enough challenge to absorb all human cognitive surplus. Even accounting for silicon leverage.
You got it backwards (intentionally?). The slogan was:
Think globally, act locally.
The other way around is more like what is happening now….
Sorry, typo. Fixed. Your way is how I remember it too.
I like this framing of the Anthropocene: it doesn’t get entangled in precise historical periods or risk being overdetermined, insofar as it describes a cognitive relationship rather than a strictly technical one. The period of anthropogenic survival begins once our socially constructed eco-cognitive niches fully overcode the base environmental niche, and affordances are no longer merely given but become hidden and entail hidden risks; the cognitive economy achieved through the delegation of chances and inferences to the built environment (affordances) is eventually overcome by sign density and the unanticipated effects of complex chains of operations. If we adopt Yggdrasil as an eco-cognitive model, then the goal moving forward would be to adopt instrumental means of intervention in the world that anticipate hidden affordances (cognitive leverage of latent environmental relations) as well as ones that increase legibility (discovery of entailed local risks and global communication of relevant intel). To take the tree metaphor too far, we need to leverage and enhance the fungal networks that allow real cross-platform communication and sufficiently reactive defences.
Ok, this is a cool idea, but it does kind of miss the point of how geological periods are defined. A marker event needs to be chosen — a “golden spike”. The proposal of either a particular geologically visible CO2 threshold (CCD shoaling maybe?) or the Big Obvious Isotopic Event of the first nuclear tests meets that criterion in the same way that the iridium anomaly (Cretaceous-Paleogene) or the Treptichnus layer (Ediacaran-Cambrian) does. What you’re doing here is more like trying to speculate about what the major distinguishing ecological feature of the Anthropocene is going to be. Still interesting, for sure, but don’t mix up your boundaries with your internal features.
I’m deliberately ignoring that point, not missing it 😎