Pascal’s Market

(Part 3 of a series. See Part 1 and Part 2.)

Every publisher insists upon its own epistemic solvency.

Image credit: Scott Herren

Given so many competing epistemic reserve banks (i.e., publishers), which ones keep enough truth in reserve to cover their reserve notes (articles)?


MJD 59,169

This entry is part 12 of 21 in the series Captain's Log

If you remember your high-school physics, free energy is the energy available to do work. Energy is conserved, but free energy is not. For example, when a heavy ball drops from a height, its free energy stays roughly constant (ignoring drag) right up to the moment of impact. The free energy lost in each inelastic collision is proportional to the height lost between successive peaks of the bounce; the lost energy is turned into useless heat. With each bounce, more free energy is lost, until finally all of it is gone. In a world without non-conservative forces like friction (which dissipate free energy), a ball could bounce forever. Satellites orbiting Earth approximate this: orbital motion is useful work that can be continued indefinitely for free.
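The decay can be sketched numerically. This is a minimal illustration, with a made-up drop height and coefficient of restitution (neither is from the post): rebound speed is e times impact speed, so each peak height is e² times the previous one, and the free energy mgh at the peak decays with it.

```python
def bounce_heights(h0, e, n):
    """Peak heights after each of the first n bounces of a ball dropped from h0."""
    heights = [h0]
    for _ in range(n):
        heights.append(heights[-1] * e**2)  # peak height scales by e^2 per bounce
    return heights

g = 9.81   # m/s^2
m = 1.0    # kg, illustrative
hs = bounce_heights(h0=2.0, e=0.8, n=5)
free_energy = [m * g * h for h in hs]    # J available to do work at each peak
heat = [free_energy[i] - free_energy[i + 1] for i in range(len(hs) - 1)]
# Each bounce turns m*g*(height lost) into heat; the running total of heat
# approaches the initial m*g*h0 as the bouncing dies out.
```

The geometric decay is the whole story: nothing in the loop ever adds free energy back, which is why the bouncing always dies out.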

***


MJD 59,163

This entry is part 11 of 21 in the series Captain's Log

Moore’s Law was first proposed in 1965, then again in revised form in 1975. Assuming an 18-month average doubling period for transistor density (it was ~1 year early on, and lately has been closer to ~3 years), there have been about 40 doublings since the first IC in 1959. If you ever go to Intel headquarters in Santa Clara, you can visit the public museum there that showcases this evolution.
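The "about 40 doublings" figure is easy to sanity-check as arithmetic, using the post's own assumptions (1959 start, 18-month average period):

```python
# Back-of-the-envelope check: 1959 (first IC) to 2020, one
# transistor-density doubling every 18 months on average.
years = 2020 - 1959                   # 61 years
doublings = years * 12 / 18           # ~40.7 doubling periods
growth_factor = 2 ** int(doublings)   # 2**40, roughly a trillion-fold gain
print(round(doublings, 1), growth_factor)
```

Forty doublings is a factor of about 1.1 trillion, which is why no other technology curve in history feels comparable.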

The future of Moore’s law seems uncertain, but it looks like we’ll at least get to 1-3 nanometer chips in the next decade (we were at 130nm at the beginning of the century, and the first new computer I bought had a 250nm Celeron processor). Beyond 1-3nm, perhaps we’ll get to different physics with different scaling properties, or quantum computing. Whatever happens, I think we can safely say Gen X (1965-80) will have had lives nearly exactly coincident with Moore’s Law (we’ll probably die off between 2045-85).

While there have been other technologies in history with spectacular price/performance curves (interchangeable parts technology for example), there is something special about Moore’s Law, since it applies to a universal computing substrate that competes with human brains.

GenXers are Moore’s Law people. We came of age during its heyday. Between 1978-92 or so, the personal computer (pre-internet) was growing up along with us. The various 6502-based home computers, and the 8088, 286 “AT”, 386, 486, and Pentium were milestones of my childhood and teenage years. During that period, performance was synonymous with frequency, so there was a single number to pace our own adolescence. Those computers were underpowered enough that we could feel the difference from power increases even with simple applications. Today, you have to design stress tests with hungry apps to detect the performance limits of new computers.

After the Pentium, things got complicated, and growth was no longer a simple function of frequency. There was register size, watts, core count, RISC vs. CISC…

Life also got complicated for X-ers, and growth was no longer about growing taller and buying higher-frequency computers. Moore’s Law shifted regimes from micrometers to nanometers (in a decade, it should be in the picometer regime).

There’s an Apple event going on today, featuring Apple’s own silicon for the Mac for the first time: the 5nm M1 chip. But Moore’s Law is not in the spotlight. Apple’s design is.

I think some of the message of the silicon medium rubbed off on us Gen X’ers. We got used to the primary thing in our lives getting better and cheaper every single year. We acquired exponential-thinking mindsets. Thinking in terms of compounding gains came naturally to us. For me personally, it has shown up most in my writing. At some level, I like the idea of producing more words per year (instructions per cycle, IPC?) with less effort (watts). This is why anytime a new medium appears that seems to make it easier to pump up quantity — Twitter, Roam Research — I jump on it. Quantity has a quality all its own, as Stalin reportedly said. We are lucky to live in an age when we can expect the fundamental tradeoffs of writing to change several times in a single lifetime. A few centuries ago, you could live an entire lifetime without writing technology changing at all.

But like Moore’s Law, I too am slowing down. The natural instinct when you feel yourself slowing down is to switch gears from quantity to quality. I think this is a mistake, at least for me. Quantity is still the most direct road to quality, as the parable of the pottery class suggests. But as with semiconductors, it doesn’t just happen. You have to consciously develop the next “process node” (like the upcoming jump from 5nm to 3nm), work out the kinks in it, increase “yield rate” (the fraction of usable chips you get out of a wafer of silicon, a function of defects, design, etc.), and then architect for that scale. Each jump to a new process node takes longer, and you face new tradeoff curves.

But each jump begins the same way: stripping away complexity from your current process and going back to the basics of words each time. You can’t add volume to complexity. You can only add complexity to volume.

For writing, sometimes the new “process node” is a whole new medium (moving from blogs to Twitter threads); other times, it is primarily a matter of rearchitecting the way you write, like with my blogchains and now this headline-free Julian-date-numbered shtick. It’s always about pushing quantity, not quality. Right now, I’m trying to engineer my next “process.” I don’t think I’ll ever produce the sheer volume of words I used to 10 years ago, but I suspect I can be vastly more efficient with my energy if I arrange the toolchain right. More words per year per watt, that’s the thing to shoot for.
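Incidentally, the Julian-date-numbered headers in this log are Modified Julian Dates: whole days elapsed since midnight on November 17, 1858. A minimal conversion sketch:

```python
from datetime import date, timedelta

MJD_EPOCH = date(1858, 11, 17)  # MJD 0 by definition

def mjd_to_date(mjd: int) -> date:
    """Convert a Modified Julian Date to a calendar date."""
    return MJD_EPOCH + timedelta(days=mjd)

def date_to_mjd(d: date) -> int:
    """Convert a calendar date to a Modified Julian Date."""
    return (d - MJD_EPOCH).days

# The entry headed "MJD 59,163" falls on 2020-11-10, the day of the
# Apple M1 event it describes.
print(mjd_to_date(59163))  # 2020-11-10
```

The scheme trades human readability for a monotonic counter: entries sort and difference trivially, with no month or year boundaries to think about.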

MJD 59,151

This entry is part 10 of 21 in the series Captain's Log

It’s been a busy week or so in space. NASA found water on the Moon (at concentrations lower than in the Sahara desert, but perhaps enough to extract and turn into hydrogen for fuel?). The OSIRIS-REx mission took a bounce-by biopsy of the asteroid Bennu, which makes me think mining might be closer than we think. And perhaps most interestingly, SpaceX is making Starlink subscribers sign a Terms of Service document agreeing that Mars will be outside of Earth jurisdiction.

Speaking of Mars, here’s a picture I took (Canon SLR attached to a 4.5″ Newtonian). I still haven’t figured out the fancy image-stacking techniques to produce more detailed output from my ongoing raw photography, but I’ll get there. Mars made its closest approach to Earth on Oct 6 (opposition was Oct 13), and it is still unusually large and bright, so it’s a great time to observe it. And it is, I think, worth everybody’s time to do so — a reminder that there really is a universe out there beyond Earth. It’s not fantasy. There’s an actual neighboring ball of dirt, a pale red dot, in our cosmic backyard. I’ve seen it more directly than I have Antarctica.

The Starlink news is not a joke or merely of academic-legal interest. Starlink technology could easily be adapted to provide broadband internet coverage on Mars. We really might see at least robotic space settlements in our lifetimes, and these terms of service will matter. I’m really glad SpaceX is forcing this conversation early, with what I think are the right initial conditions.


MJD 59,145

This entry is part 9 of 21 in the series Captain's Log

The terms public and private seem to form a balanced opposition, but they don’t really. In modern usage, private is a bounded and circumscribed domain, while public is an open-ended space defined via negation as non-private. It was supposedly the opposite in ancient Greece, at least by Hannah Arendt’s account. In her version of events, public was a bounded and circumscribed domain, and private was an open-ended survival warfront against nature. I’ve come to see her version of events as mistaken on crucial points, due to her over-indexing on the Greek origin myth for the notion of the public. A bunch of islands in a third-generation civilization is not a good prototype for civilization in general, which mostly arose in continental interiors along river valleys.


MJD 59,143

This entry is part 8 of 21 in the series Captain's Log

I’ve been thinking about creative pivots. Discontinuous reorientations in your pattern of creative production, possibly accompanied by a change in the audience for your creative work (lose one kind of reader, gain a new kind of reader). I don’t think I’ve ever really executed a true creative pivot. The kind that’s an abrupt, lossy, high-entropy reorientation maneuver in response to a changing environment.

While my writing has changed over the years, it’s mostly been gentle, smooth turns in response to my own gradually shifting interests, against the backdrop of a world that was changing and aging much more slowly than I was. The turns were powered by picking up new tricks while tiring of old ones, rather than new understandings of the world. For the first time in 22 years of writing online though, I feel like I’m in the middle of a true creative pivot. One driven by sharp changes in the zeitgeist rather than in my own interests. One that will for sure lose me a certain subset of readers, but hopefully gain me a new subset of readers. Of course, my own interests are continuing to shift at the same rate as before, but the broader artistic mood is shifting much more sharply.

One big era is yielding to another. The shift was already underway before Covid (the Great Weirding of 2015-19), but it has now passed some sort of event horizon.

We’re headed into what the future will likely judge to be a Lost decade, much like the 1920s. A temporally dislocated oxbow lake by the river of history. The 1920s were the Roaring Twenties, a decadal pause between Victorian/Edwardian (1837-1920, extending Edwardian to include WW1) and Late/Post-modernist (1930-2020). The 2020s will be the Searing Twenties, a decadal pause between the Late/Post-modernist era and whatever comes next. It too will be a Lost decade. Oddly enough, despite the dramatic nature of historical events, the ends of World War 2 and the Cold War did not trigger lost decades (except in Japan, which is on some sort of alternative timeline). Apparently it takes a pandemic to administer the coup de grace to an age.

The pause of a Lost decade is a grand-narrative pause, between big world stories that persist for 3-4 generations, and span all living memory. You can think of the prevailing mood as one where it is much harder to make up extended universe type stories. The age in decline is a fatally flawed and unraveled reference reality, but the emerging age is too ill-defined to serve as a new reference reality. And since we’re talking 100-year windows, nobody alive has a different reference point to offer. Talking to grandma doesn’t help; she doesn’t remember a truly different reality either. So larger-scale imagination gets hamstrung. I did a thread about this a few days ago. The good news is, if you make it past 2030, you’ll be able to tell kids being born today all about how the world used to be different once.

Lost-decade pauses typically feature anti-grand-narratives, like H. P. Lovecraft’s Cthulhu mythos from the 1920s or the more restricted Godzilla mythos from post-World War 2 Japan. Such anti-grand-narratives induce shrunken rather than extended universes in the imagination. They center human helplessness in the face of larger powers, rather than human agency and universe-denting powers. It’s not that extended universes cannot be imagined, but that they cannot be anthropomorphized and imagined as belonging to humans. The human sphere is temporarily reduced to a footnote in larger cosmic dramas starring non-human forces. Spiritual tendencies get amplified, new religions and cults form, new artistic and literary movements take off. These last are disposed to take a very hard look at the assumptions of the receding age.

To the extent creative production is a way to stay alive to the world, the mood shift is an imperative to either change your pattern of production or grow increasingly dead to the world. So if you are a writer or other sort of creative producer, you have to pivot with the times, and establish a new relationship with the shifting mood.

But because the shift will take an unsettled decade for the world at large to navigate, your new relationship will be a pattern of active negotiation with shifting realities rather than a decoupled one-shot response to them. You’ll be pivoting towards either greater engagement or greater detachment. You’ll either help invent the future, or retreat with the declining age and turn into a producer of nostalgia.

Clockmaking: 2

This entry is part 2 of 2 in the series Clockmaking

Well, I finished building my ROKR kit clock, and it works. Fully wound up, it runs for about 5-6 hours before friction defeats it. It makes a pleasantly organic tick-tock sound that I’m now addicted to. Makes me feel a bit like a GOD who created LIFE out of lifeless bits of matter! 😎😇

It strikes me (heh!) that the default identity of a clock should be the signal it generates, translated to a sound or flow, rather than what it looks like physically. So I am going to lead with a tick-tock sound clip, the hello world of this clock…

hello world dot clock

I highly recommend this kit if you are in a mood to grok the headspace of the early scientific revolution circa 1600-1660, between Galileo and Huygens. Having built this clock, suddenly the Big Mood of that era feels a lot clearer.

I’m going to walk through the highlights of the build in this post, mainly with an eye to interesting appreciative reflections, but I’ll also share a few frustration-avoidance tips for those who might want to actually build this kit. While I was tweeting about this, a few people mentioned they’d bought and abandoned this kit halfway, or built it but failed to get it running, which is a pity, since it is a very satisfying build.


MJD 59,128

This entry is part 7 of 21 in the series Captain's Log

Covid is the first global downgrade in the average human quality of life since World War 2. Some of the individual downgrades are adaptive for climate change as well, and will likely get locked in for a longer term. Lowered global human mobility at all scales, from local driving to international flying, feels like the biggest downgrade to me. It feels sad. Sad like a pay cut, but also sad like the ending of The Lord of the Rings. The end of an era. A paradise lost.

One sense seems to mitigate the sadness, and that is the sense of a closer connection to distant histories and futures. The world just got bigger in space, but smaller in time.

Reading about the Spanish Flu 100 years ago (1918-19), or the Black Death 670 years ago (1348-50), feels in some ways like reviewing my own memories from 6 months ago. Equally, the future 100 or 670 years out suddenly feels a lot more real. I now feel a lot closer to 2120, when Covid will merely be yet another endemic seasonal sniffle, and climate change impacts will be peaking. And to 2690, when the climate wars will likely have settled as a distant memory of a war won (or at least nobly or ignobly fought and survived by a few).

Thanks to Covid, we can now live more fully in what the Stewart Brand crowd calls the long now. It is one of the few tastes that’s easier to satisfy, rather than harder, post-Covid.

After a few decades of collective historic presentism, attended by a certain historical amnesia and future-blindness, we are once again tapping into historical currents that connect us to lives and events far in the past and future. An abstract sense of history and the future has suddenly turned as visceral as personal memory. Living humans, mostly born 1930-2020, have clumsily merged their stories into the larger story of humanity, 20,000 BC to 3000 AD. Through the merge conflict, those of us alive today can, if we choose, outgrow the sense of temporal exceptionalism that has been the human default for decades.

Paul Erdős was good at living in the long now:

In 1970 I preached in Los Angeles on ‘my first two-and-a-half billion years in mathematics.’ When I was a child, the earth was said to be two billion years old. Now scientists say it’s four and a half billion. So that makes me two-and-a-half billion. The students at the lecture drew a time line that showed me riding a dinosaur. I was asked, ‘How were the dinosaurs?’ Later, the right answer occurred to me: ‘You know, I don’t remember, because an old man only remembers the very early years, and the dinosaurs were born yesterday, only a hundred million years ago.’

(from The Man Who Loved Only Numbers, by Paul Hoffman)

Before Covid, I was 45 years old, in the middle of say a 90-year life (if I’m lucky). Post-Covid I suddenly feel 670 years old, looking forward to a personal future that extends another 670 years out. I’m in the middle of a 1340-year long now. Some of my old posts in this vein (Immortality in the Ocean of Infinite Memories, 2014 and A Beginner’s Guide to Immortality, 2013) suddenly feel a lot more real.

From a 1340-year long-now perspective, looking out at world events somehow feels less sad. The world eventfully existed long before you and I were around, and will continue to exist long after. We may be just passing through, playing at most a small part overall, but we don’t have to restrict our presence in the world to our lifespans. We can expand it to occupy the temporal span of all events we are capable of viscerally feeling.

Still, we can’t actually live for 1340 years, even if we get viscerally better at inhabiting such a long now. Living in the long now means feeling more time than you can touch. That begets a longing. Long-nowing means longing-now.

Mansionism 2: Bungalows

This entry is part 2 of 2 in the series Mansionism

Though I’m big on climate-resilient futures, I have an ambivalent relationship with density as a means to achieve them. I mostly grew up in company bungalows on generous-sized lots, and loved it. Both the word and the architectural style are Indian in origin. The style originated in feudal-era Bengal and spread across north India during the British Raj. In North Indian languages, the word bangla refers both to the style of house and the Bengali language.

As a prominent Bengali nobleman, Rabindranath Tagore likely had mansion-scale super-bungalows in mind when he wrote (emphasis mine):

Where knowledge is free
Where the world has not been broken up into fragments by narrow domestic walls.

I don’t mean to be snarky, but this is the mansion whose narrow domestic walls he was born and raised in:

Jorasanko Thakurbari, now Rabindra Bharati University (P. K. Niyogi, CC BY-SA 3.0)

Tagore is an anglicization of Thakur, which is a feudal title like Lord, not a last name. The Tagore family mansion pictured above, Jorasanko Thakurbari, is now part of the campus of Rabindra Bharati University in Calcutta, a public university dedicated to a Tagoresque tradition of education.


Notes: The Marshall Plan by Benn Steil

This entry is part 5 of 6 in the series Book Notes

I read this next book, The Marshall Plan: Dawn of the Cold War, by Benn Steil, in an attempt to take the idea of a “Marshall Plan for post-Covid recovery” seriously.

I’m glad I did, because I apparently had an entirely misguided understanding of what the plan was, the context in which it was undertaken, how it worked, and how well it worked.

In the decades since the OG plan arguably saved postwar Europe from collapse, the idea of a “Marshall Plan for X” has become something of a cliche in policy circles, and an event like the Covid-19 pandemic is perhaps the most tempting sort of binding for X. I myself tweeted on March 28 that maybe we should shoot for a “bottom-up OODA Marshall Plan.”

Now, having read the book, I have to say, the Marshall Plan is perhaps not the best precedent to look at for today’s needs, even though there are elements worth learning from, mostly in the what-not-to-do department. If there are lessons here for post-Covid, they are not the obvious ones. Here is the original thread. On to the notes.
