“The Internationalists” and making war illegal

At Astral Codex Ten, there’s a great review essay on The Internationalists, a book about “the 1928 Kellogg-Briand Peace Pact” (I hadn’t heard of it either), which sought to “declare war illegal.” There are some obvious ways in which war has continued, but the thrust of The Internationalists and the essay seems to be that things have overall been moving in the right direction. Even authoritarian countries like Russia work to play down their warfare and conquest aims, particularly to their own populations. Part of the reason countries appear to have historically gone to war is to get rich by stealing things from other people, and to get more “land” for one’s people. These reasons haven’t made sense for many decades, if they ever did; today, the largest companies in the world are tech companies, and you can’t steal Apple, Google, Microsoft, or Amazon through invasion. Even if these companies were in Ukraine, attempting to “steal” them through invasion wouldn’t work because the vast majority of their value is in their people and systems, who would flee (in the case of people) and which would disintegrate (in the case of systems) in the event of invasion.

China has gotten rich in the last few decades by making stuff people want, not by attempting to forcibly steal things through invasion. China might change this strategy through invading Taiwan, and in the process destroy companies like TSMC, but it’s almost certainly not going to get richer in the process, and will likely achieve the opposite. In many countries, including the United States, we could immediately become vastly richer by changing some of our laws, rather than invading other countries: Hsieh and Moretti, for example, “quantify the amount of spatial misallocation of labor across US cities and its aggregate costs. Misallocation arises because high productivity cities like New York and the San Francisco Bay Area have adopted stringent restrictions to new housing supply, effectively limiting the number of workers who have access to such high productivity. Using a spatial equilibrium model and data from 220 metropolitan areas we find that these constraints lowered aggregate US growth by 36 percent from 1964 to 2009.”

36 percent! That’s a huge amount of growth—imagine making 36% more than you do right now. Like a lot of countries (though not Japan), we can dramatically increase aggregate wealth by liberalizing land-use laws. Essentially all countries have plenty of “space” for people—if we choose to let landowners do what they want with their land. We’ve decided to be collectively poorer by not doing so, which seems unwise to me, but I’m one guy.

In most countries, too, birthrates are now at or below replacement levels. We’re not collectively reproducing ourselves, let alone needing to go find more “space” for a growing population. Polling consistently shows American women want two or three kids, but most are having one or two, perhaps because they feel they can’t afford more. Maybe we should try to make the cost of living lower, so that more people can enjoy it—that is, the “living.” Instead, we’re perversely doing the opposite. “Perversity” may be the theme of this essay.

The anonymous reviewer says that “The US keeps starting or engaging in wars, like in Libya, Afghanistan, and Iraq,” but he or she doesn’t go further: there’s an interesting counterfactual history of the United States in which we don’t invade Iraq and don’t spend around $2 trillion (“trillion” with a “t”) doing so. Let’s say we spend 10% of that, or $200 billion, on other things, such as true energy independence. Although Iraq wasn’t really about “stealing” Iraqi oil, Iraq—like Russia and Iran—wouldn’t have the money to create globally significant mischief without selling oil. What could we have done instead of invading Iraq? We could have invested substantially in battery technology and manufacturing, driving down the cost of batteries for car applications five to ten years earlier than actually happened—and we could’ve cut gas and oil usage far faster than we did. We’d get environmental benefits, too, on top of the geopolitical ones.

There are arguments like this around nuclear fusion power plants:

“Fusion is 30 years away and always will be.”

What happened? Why has fusion failed to deliver on its promise in the past?

By the 1970s, it was apparent that making fusion power work is possible, but very hard. Fusion would require Big Science with Significant Support. The total cost would be less than the Apollo Program, similar to the International Space Station, and more than the Large Hadron Collider at CERN. The Department of Energy put together a request for funding. They proposed several different plans. Depending on how much funding was available, we could get fusion in 15-30 years.

How did that work out?

[Chart: proposed fusion funding plans compared with actual US fusion funding]

Along with the plans for fusion in 15-30 years, there was also a reference: ‘fusion never’. This plan would maintain America’s plasma physics facilities, but not try to build anything new.

Actual funding for fusion in the US has been less than the ‘fusion never’ plan.

The reason we don’t have fusion already is because we, as a civilization, never decided that it was a priority. Fusion funding is literally peanuts: In 2016, the US spent twice as much on peanut subsidies as on fusion research.

We’ve been consistently spending less on fusion than we did in the ’70s. The largest fusion project, the International Thermonuclear Experimental Reactor (ITER), is now going to cost around $21 billion—or about half of the $40 billion in weapons we’re shipping to Ukraine (Russia is a petro state and, without income from oil and gas sales, it would be unlikely to be able to fund a true war effort). $21 billion is also about 1% of what we’ve spent on the Iraq war. Maybe we wouldn’t have working, commercially viable nuclear fusion here in 2022, but we’d be far closer than we are. Instead of investing in true energy independence, we’ve been investing in warfare, which seems like a bad trade-off. mRNA vaccines have made the world billions if not trillions of dollars richer, apart from saving a million lives in the United States alone. Maybe we should do more of that (I’m using the word “maybe” with some archness).
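To make the scale concrete, here’s a minimal back-of-the-envelope sketch of those cost ratios; the dollar figures are the rough ones cited above, not precise budget data:

```python
# Rough cost comparison using the approximate figures cited above.
# All values are illustrative assumptions, not precise budget data.
iter_cost = 21e9        # ITER's estimated total cost (USD)
ukraine_weapons = 40e9  # approximate value of weapons shipped to Ukraine (USD)
iraq_war = 2e12         # approximate total Iraq war spending (USD)

print(f"ITER vs. Ukraine weapons shipments: {iter_cost / ukraine_weapons:.0%}")  # roughly half
print(f"ITER vs. Iraq war spending: {iter_cost / iraq_war:.1%}")                 # roughly 1%
```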

There’s a world in which we take the long view in an attempt to stop funding authoritarian regimes and stop invading them, and we instead focus on trying to get to the future faster. Most of the wars involving the United States in the last 30 years have been at least partially traceable to oil and gas (Saudi Arabia being the home of 15 of the 19 9/11 attackers, and being a putative ally of the U.S. but not exactly the good guys). Instead of saying, “Hey, maybe we ought to think about this relationship between warfare and gas,” we’ve decided to keep fighting random wars piecemeal. As of this writing, we’re not fighting Russia directly, but we’re not not fighting Russia. Simultaneously, had Germany invested heavily in conventional nuclear fission plants, it would’ve imported billions less in gas from Russia, and it would be poised to switch to electric vehicles. Russia’s warfare capabilities would likely be far worse than they are. Germany’s emissions could be far lower than they are. (France, to its credit, gets most of its electricity from nuclear sources: contrary to stereotype, the country isn’t composed entirely of Houellebecqian bureaucrats, sex workers, and waiters.)

Making war illegal is good, but making it uneconomical is also good, and the latter may help encourage the former. War is dumb and people get richer without it—one hopes the Chinese Communist Party (CCP) sees this, as we did not during 2001 – 2003. Making war even more uneconomical than it is now requires a civilization that thinks further than a few months into the future. Maybe we should get on that. Things that are illegal and dumb aren’t very enticing.

The death of literary culture

At The Complete Review, Michael Orthofer writes of John Updike that

Dead authors do tend to fade fast these days — sometimes to be resurrected after a decent interval has passed, sometimes not –, which would seem to me to explain a lot. As to ‘the American literary mainstream’, I have far too little familiarity with it; indeed, I’d be hard pressed to guess what/who qualifies as that.

Orthofer is responding to a critical essay that says: “Much of American literature is now written in the spurious confessional style of an Alcoholics Anonymous meeting. Readers value authenticity over coherence; they don’t value conventional beauty at all.” I’m never really sure what “authenticity” and its cousin “relatability” mean, and I have an unfortunate suspicion that both reference some lack of imagination in the speaker; still, regarding the former, I find The Authenticity Hoax: How We Get Lost Finding Ourselves persuasive.

But I think Orthofer and the article are subtly pointing towards another idea: literary culture itself is mostly dead. I lived through its final throes—perhaps like someone who, living through the 1950s, saw the end of religious Christianity as a dominant culture, since it was essentially gone by the 1970s—though many claimed its legacy for years after the real thing had passed. What killed literary culture? The Internet is the most obvious, salient answer, and in particular the dominance of social media, which is in effect its own genre—and, frequently, its own genre of fiction. Almost everyone will admit that their own social media profiles attempt to showcase a version of their best or ideal selves, and, thinking of just about everyone I know well, or even slightly well, the gap between who they really are and what they are really doing, and what appears on their social media, is so wide as to qualify as fiction. Determining the “real” self is probably impossible, but determining the fake selves is easier, and the fake is everywhere. Read much social media as fiction and performance and it will make more sense.

Everyone knows this, but admitting it is rarer. Think of all the social media photos of a person ostensibly alone—admiring the beach, reading, sunbathing, whatever—but the photographer is somewhere. A simple example, maybe, but also one without the political baggage of many other possible examples.

Much of what passes for social media discourse makes little or no sense, until one considers that most assertions are assertions of identity, not of factual or true statements, and many social media users are constructing a quasi-fictional universe not unlike the ones novels used to create. “QAnon” might be one easy modern example, albeit one that will probably go stale soon, if it’s not already stale; others will take its place. Many of these fictions are the work of group authors. Numerous assertions around gender and identity might be a left-wing-valenced version of the phenomenon, for readers who want balance, however spurious balance might be. Today, we’ve in some ways moved back to a world like that of the early novel and the early novelists, when “fact” and “fiction” were much more disputed, interwoven territories, and many novels claimed to be “true stories” on their cover pages. The average person has poor epistemic hygiene for most topics not directly tied to income and employment, but the average person has a very keen sense of tribe, belonging, and identity—so views that may be epistemically dubious nonetheless succeed if they promote belonging (consider also The Elephant in the Brain by Robin Hanson and Kevin Simler for a more thorough elaboration on these ideas). Before social media, did most people really belong, or did they silently suffer through the feeling of not belonging? Or was something else at play? I don’t know.

In literary culture terms, the academic and journalistic establishment that once formed the skeletal structure upholding literary culture has collapsed, while journalists and academics have become modern clerics, devoted more to spreading ideology than to exploring the human condition, or to art, or to aesthetics. Academia has become more devoted to telling people what to think than to helping them learn how to think, and students are responding to that shift. Experiments like the Sokal Affair and its successors show as much. The cult of “peer review” and “research” fits poorly in the humanities, but it’s been grafted on anyway, and the graft is poor.

Strangely, many of the essays lamenting the fall of the humanities ignore the changes in the content of the humanities, in both schools and universities. The number of English majors in the U.S. has dropped by about 50% from 2000 to 2021:

[Chart: decline in the number of English majors]

History and most other humanities majors obviously show similar declines. Meanwhile, the number of jobs in journalism has approximately halved since the year 2000; academic jobs in the humanities cratered in 2009, from an already low starting point, and have never recovered; even high school humanities teaching has a much more ideological, rather than humanistic, cast than it did ten years ago. What’s taken the place of reading, if anything? Instagram, Snapchat, TikTok, and, above all, Twitter.

Twitter, in particular, seems to promote negative feedback and fear loops, in ways that media and other institutions haven’t yet figured out how to resist. The jobs that supported the thinkers, critics, starting-out novelists, and others aren’t there. Whatever might have replaced them, like Twitter, isn’t equivalent. The Internet doesn’t just push most “content” (songs, books, and so forth) towards zero—it also changes what people do, including the people who used to make up what I’m calling literary culture or book culture. The cost of housing also makes teaching a non-viable job for a primary earner in many big cities and suburbs.

What power and vibrancy remains in book culture has shifted towards nonfiction—either narrative nonfiction, like Michael Lewis, or data-driven nonfiction, with too many examples to cite. It still sells (sales aren’t a perfect representation of artistic merit or cultural vibrancy, but they’re not nothing, either). Dead authors fade fast today not solely or primarily because of their work, but because literary culture is going away fast, if it’s not already gone. When John Updike was in his prime, millions of people read him (or they at least bought Couples and could spit out some light book chat about it on command). The number of writers working today who the educated public, broadly conceived of, might know about is small: maybe Elena Ferrante, Michel Houellebecq, Sally Rooney, and perhaps a few others (none of those three are American, I note). I can’t even think of a figure like Elmore Leonard: someone writing linguistically interesting, highly plotted material. Bulk genre writers are still out there, but none I’m aware of with any literary ambition.

See some evidence for the decline of literary culture in the decline of book advances; the Authors Guild, for example, claims that “writing-related earnings by American authors [… fell] to historic lows to a median of $6,080 in 2017, down 42 percent from 2009.” The kinds of freelancing that used to exist have largely disappeared too, or become economically untenable. In “If You Absolutely Must,” Freddie deBoer warns would-be writers that “Book advances have collapsed.” Money isn’t everything, but the collapse of the already-shaky foundations of book writing is notable, and quantifiable. Publishers appear to survive and profit primarily off very long copyright terms; their “backlist” keeps the lights on. Publishers seem, like journalists and academics, to have become modern-day clerics, at least for the time being, as I noted above.

Consider a more vibrant universe for literary culture, as mentioned in passing here:

From 1960 to 1973, book sales climbed 70 percent, but between 1973 and 1979 they added less than another six percent, and declined in 1980. Meanwhile, global media conglomerates had consolidated the industry. What had been small publishers typically owned by the founders or their heirs were now subsidiaries of CBS, Gulf + Western (later Paramount), MCA, RCA, or Time, Inc. The new owners demanded growth, implementing novel management techniques. Editors had once been the uncontested suzerains of title acquisition. In the 1970s they watched their power wane.

A world in which book sales (and advances) are growing is very different from one of decline. It’s reasonable to respond that writing has rarely been a path to fame or fortune, but it’s also reasonable to note that, even against the literary world of 10 or 20 years ago, the current one is less remunerative and less culturally central. Writers find the path to making any substantial money from their writing harder, and more treacherous. Normal people lament that they can’t get around to finishing a book; they rarely lament that they can’t get around to scrolling Instagram (that’s a descriptive observation of change).

At Scholar’s Stage, Tanner Greer traces the decline of the big book and the big author:

the last poet whose opinion anybody cared about was probably Allen Ginsberg. The last novelist to make waves outside of literary circles was probably Tom Wolfe—and he made his name through nonfiction writing (something similar could be said for several of the other prominent essayists turned novelists of his generation, like James Baldwin and Joan Didion). Harold Bloom was the last literary critic known outside of his own field; Allan Bloom, the last with the power to cause national controversy. Lin-Manuel Miranda is the lone playwright to achieve celebrity in several decades.

I’d be a bit broader than Greer: someone like Gillian Flynn writing Gone Girl seemed to have some cultural impact, but even books like Gone Girl seem to have stopped appearing. The cultural discussion rarely if ever revolves around books any more. Publishing and the larger culture have stopped producing Stephen Kings. Publishers, oddly to my mind, no longer even seem to want to try producing popular books, preferring instead to pursue insular ideological projects. The most vital energy in writing has been routed to Substack.

I caught the tail end of a humane and human-focused literary culture that’s largely been succeeded by a political and moral-focused culture that I hesitate to call literary, even though it’s taken over what remains of those literary-type institutions. This change has also coincided with a lessening of interest in those institutions: very few people want to be clerics and scolds—many fewer than wonder about the human condition, though the ones who do want to be clerics and scolds form the intolerant minority in many institutions. Shifting from the one to the other seems like a net loss to me, but also a net loss that I’m personally unable to arrest or alter. If I had to pick a date range for this death, it’d probably be 2009 – 2015: the Great Recession eliminated many of the institutional jobs and professions that once existed, along with any plausible path into them for all but the luckiest, and by 2015 social media and scold culture had taken over. Culture is hard to define but easy to feel as you exist within and around it. By 2010, Facebook had become truly mainstream, and everyone’s uncle and grandma weren’t just on the Internet for email and search engines, but for other people and their opinions.

Maybe mainstream literary culture has been replaced by some number of smaller micro-cultures, but those microcultures don’t add up to what used to be a macroculture.

In this essay, I write:

I’ve been annoying friends and acquaintances by asking, “How many books did you read in the last year?” Usually this is greeted with some suspicion or surprise. Why am I being ambushed? Then there are qualifications: “I’ve been really busy,” “It’s hard to find time to read,” “I used to read a lot.” I say I’m not judging them—this is true, I will emphasize—and am looking for an integer answer. Most often it’s something like one or two, followed by declamations of highbrow plans to Read More In the Future. A good and noble sentiment, like starting that diet. Then I ask, “How many of the people you know read more than a book or two a year?” Usually there’s some thinking, and rattling off of one or two names, followed by silence, as the person thinks through the people they know. “So, out of the few hundred people you might know well enough to know, Jack and Mary are the two people you know who read somewhat regularly?” They nod. “And that is why the publishing industry works poorly,” I say. In the before-times, anyone interested in a world greater than what’s available around them and on network TV had to read, most often books, which isn’t true any more and, barring some kind of catastrophe, probably won’t be true again.

Reading back over this I realize it has the tone and quality of a complaint, but it’s meant as a description, and complaining about cultural changes is about as effective as shaking one’s fist at the sky: I’m trying to look at what’s happening, not whine about it. Publishers go woke, see the sales of fiction fall, and respond by doubling down, but I’m not in the publishing business or privy to the intra-business signaling that goes on there. One could argue the changes noted are for the better. Whining about aggregate behavior and choices has rarely, if ever, changed them. I don’t think literary culture will ever return, any more than Latin, epic poetry, classical music, opera, or any number of other once-vital cultural products and systems will.

In some ways, we’re moving backwards, towards a cultural fictional universe with less clearly demarcated lines between “fact” and “fiction” (I remember being surprised, when I started teaching, by undergrads who didn’t know that novels or short stories are fiction, or who called nonfiction works “novels”). Every day, each of us is helping whatever comes next, become. The intertwined forces of technology and culture move primarily in a single direction. The desire for story will remain, but the manifestations of that desire aren’t static. Articles like “Leisure reading in the U.S. is at an all-time low” appear routinely. It’s hard to have literary culture among a population that doesn’t read.

See also:

* What happened with Deconstruction? And why is there so much bad writing in academia?

* Postmodernisms: What does that mean?

Where are the woke on Disney and China?

I have sat through numerous talks and seen numerous social media messages about the evils of imperialism, and in particular western imperialism—so where’s the mass outrage over China today, and over the efforts by Disney and Hollywood to court China? China is a literal, real-world imperialist power, today; China has crushed Hong Kong’s independence, imprisoned perhaps a million of its own people based on their race and religion, and invaded and occupied Tibet—and Taiwan may be next. But I never read “imperialist” or “racist” critiques from the usual suspects. Why not?

Search for “imperialism” on Twitter, for example, and you’ll find numerous people denouncing what they take to be “imperialism” or various kinds of imperialisms, but few dealing with China. This bit about Bob Iger’s complicity with Chinese government repression got me thinking about why some targets draw much “woke” ire while others don’t. My working hypothesis is that China seems far away from the United States and too different to understand—even though companies and individuals are regularly attacked for their associations with other Americans, they rarely seem to be for their associations with China. The NBA, to take another example, fervently favors police reform in the United States, but is largely silent on China (to be sure, I don’t agree with all the posturing at the link, but pay attention to the underlying point). The situation between the woke and China may be analogous to the way that comparisons to your wife’s sister’s husband’s income can create a lot of jealousy while comparisons to the truly wealthy don’t.

In addition, could it be that Disney’s specialty in child-like, Manichaean stories of simple good versus evil appeals to the same people, or kinds of person, most likely to be attracted to the quasi-religious “woke” mindset? To my knowledge, I’ve not seen these questions asked, and Disney products, like Star Wars movies and TV shows, seem to remain broadly popular, including on the far left. It’s also worth emphasizing that some have spoken about Disney’s actions; the Twitter thread about Iger links to “Why Disney’s new ‘Mulan’ is a scandal.” But the issue seems to elicit relatively little ire and prominence, compared to many others. Few sustained movements or organizations are devoted to these issues.

What views make someone a pariah, and why? What associations make someone a pariah, and why? What views and associations elicit intense anger, and why? I don’t have full answers to any of these questions but think them worth asking. No one seems to be calling for boycotts of Disney, even though Disney is toadying to an actual imperialist state.

Dissent, insiders, and outsiders: Institutions in the age of Twitter

How does an organization deal with differing viewpoints among its constituents, and how do constituents dissent?

Someone in Google’s AI division was recently fired, or the person’s resignation accepted, depending on one’s perspective, for reasons related to a violation of process and organizational norms, or something else, again depending on one’s perspective. The specifics of that incident can be disputed, but the more interesting level of abstraction might ask how organizations process conflict and what underlying conflict model participants have. I recently re-read Noah Smith’s essay “Leaders Who Act Like Outsiders Invite Trouble;” he’s dealing with the leadup to World War II but also says: “This extraordinary trend of rank-and-file members challenging the leaders of their organizations goes beyond simple populism. There may be no word for this trend in the English language. But there is one in Japanese: gekokujo.” And later, “The real danger of gekokujo, however, comes from the establishment’s response to the threat. Eventually, party bosses, executives and other powerful figures may get tired of being pushed around.”

If you’ve been reading the news, you’ll have seen gekokujo, as institutions are being pushed by the Twitter mob, and by the Twitter mob mentality, even when the mobbing person is formally within the institution. I think we’re learning, or going to have to re-learn, things like “Why did companies traditionally encourage people to leave politics and religion at the door?” and “What’s the acceptable level of discourse within the institution, before you’re not a part of it any more?”

Colleges and universities in particular seem to be susceptible to these problems, and some are inculcating environments and cultures that may not be good for working in large groups. One recent example of these challenges occurred at Haverford College, but here too the news has many other examples, and the Haverford story seems particularly dreadful.

The basic idea that organizations have to decide who’s inside and who’s outside is old: Albert Hirschman’s Exit, Voice, and Loyalty: Responses to Decline in Firms, Organizations, and States is one great discussion. Organizations also used to unfairly exclude large swaths of the population based on demographic factors, and that’s (obviously) bad. Today, though, many organizations have in effect, if not intent, decided that it’s okay for some of their members to attack the good faith of other members, and to attack the coherence of the organization itself. There are probably limits to how much this can be done while still retaining a functional organization, let alone a maximally functional one.

The other big change involves the ability to coordinate relatively large numbers of people: digital tools have made this easier, in a relatively short time—thus the “Twitter mob” terminology that came to mind a few paragraphs ago; I kept the term, because it seems like a reasonable placeholder for that class of behavior. Digital tools make it easy for a small percentage of people to add up to a large absolute number of people. For example, if 100,000 people are interested in or somehow connected to an organization, and one percent of them want to fundamentally disrupt the organization, change its direction, or arrange an attack, that’s 1,000 people—which feels like a lot. It’s far above the Dunbar number and too many for one or two public-facing people to deal with. In addition, in some ways journalists and academics have become modern-day clerics, and they’re often eager to highlight and disseminate news of disputes of this sort.
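A minimal sketch of that arithmetic, with the population size, disruptive share, and Dunbar figure all illustrative assumptions rather than data:

```python
# Small-percentage, large-absolute-number arithmetic.
# All figures are illustrative assumptions, not measurements.
connected_people = 100_000  # people loosely connected to an organization
disruptive_share = 0.01     # the one percent who want to push it around
dunbar_number = 150         # rough limit on stable personal relationships

disruptors = int(connected_people * disruptive_share)
print(f"{disruptors} coordinated people")                      # 1,000 people
print(f"~{disruptors / dunbar_number:.0f}x the Dunbar number")  # roughly 7x
```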

Over time, I expect organizations are going to need to develop new cultural norms if they’re going to maintain their integrity in the face of coordinated groups that represent relatively small percentages of people but large absolute numbers of people. The larger the organization, the more susceptible it may be to these kinds of attacks. I’d expect more organizations to, for example, explicitly say that attacking other members of the organization in bad faith will result in expulsion, as seems to have happened in the Google example.

Evergreen State College, which hosted an early example of this kind of attack (on a biology professor named Bret Weinstein), has seen its enrollment drop by about a third.

Martin Gurri’s book The Revolt of The Public and the Crisis of Authority in the New Millennium examines the contours of the new information world, and the relative slowness of institutions to adapt to it. Even companies like Google, Twitter, and Facebook, which have enabled sentiment amplification, were founded before their own user bases became so massive.

Within organizations, an excess of conformity is a problem—innovation doesn’t occur from simply following orders—but so is an excess of chaos. Modern intellectual organizations, like tech companies or universities, probably need more “chaos” (in the sense of information transfer) than, say, old-school manufacturing companies, which primarily needed compliance. “Old-school” is a key phrase, because from what I understand, modern manufacturing companies are all tech companies too, and they need the people closest to the process to be able to speak up if something is amiss or needs to be changed. Modern information companies need workers to speak up and suggest new ideas, new ways of doing things, and so on. That’s arguably part of the job of every person in the organization.

Discussion at work of controversial identity issues can probably function if all parties assume good faith from the other parties (Google is said to have had a freewheeling culture in this regard from around the time of its founding up till relatively recently). Such discussions probably won’t function without fundamental good faith, and good faith is hard to describe, but most of us know it when we see it, and defining every element of it would probably be impossible, while cultivating it as a general principle is desirable. Trying to maintain such an environment is tough: I know that intimately because I’ve tried to maintain it in classrooms, and those experiences led me to write “The race to the bottom of victimhood and ‘social justice’ culture.” It’s hard to teach, or run an information organization, without a culture that lets people think out loud, in good faith, with relatively little fear of arbitrary reprisal. Universities, in particular, are supposed to be oriented around new ideas and discussing ideas. Organizations also need some amount of hierarchy: without it, decisions can’t or don’t get made, and the organizational processes themselves don’t function. Excessive attacks lead to the “gekokujo” problem Smith describes. Over time organizations are likely going to have to develop antibodies to the novel dynamics of the digital world.

A lot of potential learning opportunities aren’t happening, because we’re instead dividing people into inquisitors and heretics, when very few should be the former, and very few truly are the latter. One aspect of “Professionalism” might be “assuming good faith on the part of other parties, until proven otherwise.”

On the other hand, maybe these cultural skirmishes don’t matter much, like brawlers in a tavern across the street from the research lab. AlphaFold, from Google’s sibling company DeepMind, has made a huge leap in protein folding efforts (Google reorganized itself, so technically both Google and DeepMind are part of the “Alphabet” parent company). Waymo, another Alphabet endeavor, may be leading the way towards driverless cars, and it claims to be expanding its driverless car service. Compared to big technical achievements, media fights are minor. Fifty years from now, driverless cars will be taken for granted, along with customizable biology, and people will struggle to understand what was at stake culturally, in much the way most people today don’t get what the Know-Nothing Party or the Hundred Years’ War were really about, but take electricity and the printing press for granted.

EDIT: Coinbase has publicly taken a “leave politics and religion at the door” stand. They’re an innovator, or maybe a back-to-the-future company, in these terms.


Personal epistemology, free speech, and tech companies

The NYT describes “The Problem of Free Speech in an Age of Disinformation,” and in response Hacker News commenter throwaway13337 says, in part, “It’s not unchecked free speech. Instead, it’s unchecked curation by media and social media companies with the goal of engagement.” There’s some truth to the idea that social media companies have evolved to seek engagement, rather than truth, but I think the social media companies are reflecting a deeper human tendency. I wrote back to throwaway13337: “Try teaching non-elite undergrads, and particularly assignments that require some sense of epistemology, and you’ll discover that the vast majority of people have pretty poor personal epistemic hygiene—it’s not much required in most people, most of the time, in most jobs.”

From what I can tell, we evolved to form tribes, not to be “right”: Jonathan Haidt’s The Righteous Mind: Why Good People Are Divided by Politics and Religion deals with this topic well and at length, and I’ve not seen any substantial rebuttals of it. We don’t naturally take to tracking the question, “How do I know what I know?” Instead, we naturally seem to want to find “facts” or ideas that support our preexisting views. In the HN comment thread, someone asked for specific examples of poor undergrad epistemic hygiene, and while I’d prefer not to get super specific for reasons of privacy, I’ve had many conversations that take the following form: “How do you know article x is accurate?” “Google told me.” “How does Google work?” “I don’t know.” “What does it take to make a claim on the Internet?” “Um. A phone, I guess?” A lot of people—maybe most—will uncritically take as fact whatever happens to be served up by Google (it’s always Google and never Duck Duck Go or Bing), and most undergrads whose work I’ve read will, again uncritically, accept clickbait sites and similar as accurate. Part of the reason is that undergrads’ lives are minimally affected by being wrong or incomplete about some claim made in a short assignment imposed by some annoying professor toff standing between them and their degree.

The gap between elite information discourse and everyday information discourse, even among college students, who may be more sophisticated than their peer equivalents, is vast—so vast that I don’t think most journalists (who mostly talk to other journalists and to experts) or other people who work with information, data, and ideas really, truly understand it. We’re all living in bubbles. I don’t think I did, either, before I saw the epistemic hygiene most undergrads practice, or don’t practice. This is not a “kids these days” rant, either: many of them have never really been taught to ask themselves, “How do I know what I know?” Many have never really learned anything about the scientific method. It’s not happening much in most non-elite schools, so where are they going to get epistemic hygiene from?

The United States alone has 320 million people in it. Table DP02 in the Census at data.census.gov estimates that 20.3% of the population age 25 and older has a college bachelor’s degree, and 12.8% have a graduate or professional degree. Before someone objects, let me admit that a college degree is far from a perfect proxy for epistemic hygiene or general knowledge, and some high school dropouts perform much better at cognition, metacognition, statistical reasoning, and so forth, than do some people with graduate degrees. With that said, though, a college degree is probably a decent approximation for baseline abstract reasoning skills and epistemic hygiene. Most people, though, don’t connect with or think in terms of aggregated data or abstract reasoning—one study, for example, finds that “Personal experiences bridge moral and political divides better than facts.” We’re tribe builders, not fact finders.
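To put those shares in rough absolute terms, here’s a small sketch; the size of the 25-and-older population is an assumed round number for illustration, not a census figure:

```python
# Rough headcounts implied by the cited shares.
# The 25-and-older population is an assumed round number, not census data.
adults_25_plus = 220_000_000  # assumed US population age 25 and older
bachelors_share = 0.203       # share with a bachelor's degree (as cited)
graduate_share = 0.128        # share with a graduate or professional degree (as cited)

print(f"Bachelor's degrees: ~{adults_25_plus * bachelors_share / 1e6:.0f} million")
print(f"Graduate degrees:   ~{adults_25_plus * graduate_share / 1e6:.0f} million")
```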

Almost anyone who wants a megaphone in the form of one of the many social media platforms available now has one. The number of people motivated by questions like “What is really true, and how do I discern what is really true? How do I enable myself to get countervailing data and information into my view, or worldview, or worldviews?” is not zero, again obviously, but it’s not a huge part of the population. And many very “smart” people in an IQ sense use their intelligence to build better rationalizations, rather than to seek truth (and I may be among the rationalizers: I’m not trying to exclude myself from that category).

Until relatively recently, almost everyone with a media megaphone had some kind of training or interest in epistemology, even if they didn’t call it “epistemology.” Editors would ask, “How do you know that?” or “Who told you that?” or that sort of thing. Professors have systems that are supposed to encourage greater-than-average epistemic hygiene (these systems were not and are not perfect, and nothing I have written so far implies that they were or are).

Most people don’t care about the question, “How do you know what you know?” and are fairly surprised if it’s asked, implicitly or explicitly. Some people are intrigued by it, but most aren’t, and view questions about sources and knowledge as a hindrance. This is less likely to be true of people who aspire to be researchers or work in other knowledge-related professions, but that describes only a small percentage of undergraduates, particularly at non-elite schools. And the “elite schools” thing drives a lot of the media discourse around education. One of the things I like about Professor X’s book In the Basement of the Ivory Tower is how it functions as a corrective to that discourse.

For most people, floating a factually incorrect conspiracy theory online isn’t going to negatively affect their lives. If someone is a nurse and gives a patient the wrong medication, that person is not going to be a nurse for long. If the nurse states or repeats a factually incorrect political or social idea online, particularly but not exclusively under a pseudonym, that nurse’s life likely won’t be affected. There’s no truth feedback loop. The same is true for someone working in, say, construction, or engineering, or many other fields. The person is free to state things that are factually incorrect, or incomplete, or misleading, and doing so isn’t going to have many negative consequences. Maybe it will have some positive consequences: one way to show that you’re really on team x is to state or repeat falsehoods that show you’re on team x, rather than on team “What is really true?”

I don’t want to get into daily political discourse, since that tends to raise defenses and elicit anger, but the last eight months have demonstrated many people’s problems with epistemology, and in a way that can have immediate, negative personal consequences—but not for everyone.

Pew Research data indicate that a quarter of US adults didn’t read a book in 2018; this is consistent with other data indicating that about half of US adults read zero or one books per year. Again, yes, there are surely many individuals who read other materials and have excellent epistemic hygiene, but this is a reasonable mass proxy, given the demands that reading makes on us.

Many people driving the (relatively) elite discourse don’t realize how many people are not only not like them, but wildly not like them, along numerous metrics. It may also be that we don’t know how to deal with gossip at scale. Interpersonal gossip is all about personal stories, while many problems at scale are best understood through data—but the number of people deeply interested in data and data’s veracity is small. And elite discourse has some of its own possible epistemic falsehoods, or at least uncertainties, embedded within it: some of the populist rhetoric against elites is rooted in truth.

A surprisingly large number of freshmen don’t know the difference between fiction and nonfiction, or that novels are fiction. Not a majority, but I was surprised when I first encountered confusion around these points; I’m not any longer. I don’t think the majority of freshmen confuse fiction and nonfiction, or genres of nonfiction, but enough do for the confusion to be a noticeable pattern (modern distinctions between fiction and nonfiction only really arose, I think, during the Enlightenment and the rise of the novel in the 18th Century, although off the top of my head I don’t have a good citation for this historical point, apart perhaps from Ian Watt’s work on the novel). Maybe online systems like Twitter or Facebook allow average users to revert to an earlier mode of discourse in which the border between fiction and nonfiction is more porous, and the online systems have strong fictional components that some users don’t care to segregate.

We are all caught in our bubble, and the universe of people is almost unimaginably larger than the number of people in our bubble. If you got this far, you’re probably in a nerd bubble: usually, anything involving the word “epistemology” sends people to sleep or, alternately, scurrying for something like “You won’t believe what this celebrity wore/said/did” instead. Almost no one wants to consider epistemology; to do so as a hobby is rare. One person’s disinformation is another person’s teambuilding. If you think the preceding sentence is in favor of disinformation, by the way, it’s not.

Bringing Up Bébé – Pamela Druckerman

This is really a book about how to do things, and about how the way we do things says something about who we are. Fiction is often about culture, and so is Bringing Up Bébé. Cross-cultural comparisons are (still) underrated and we should do more of them; you can think of Michel Houellebecq’s work as being about the dark side of France and Druckerman’s as being about the light side of France (noting that she’s a transplanted American). Bringing Up Bébé is a parenting book, yes, but also a living book—that is, a book about how to live. I bought it, let it sit around for a while, and only started it when I couldn’t find anything else to read, only to be delighted and surprised. Let me quote from the book; each paragraph below comes from a separate section, but put them together and one can see the differences between American-style families and French-style families:

French experts and parents believe that hearing “no” rescues children from the tyranny of their own desires.

As with teaching kids to sleep, French experts view learning to cope with “no” as a crucial step in a child’s evolution. It forces them to understand that there are other people in the world, with needs as powerful as their own.

French parents don’t worry that they’re going to damage their kids by frustrating them. To the contrary, they think their kids will be damaged if they can’t cope with frustration.

Walter Mischel says that capitulating to kids starts a dangerous cycle: “If kids have the experience that when they’re told to wait, that if they scream, Mommy will come and the wait will be over, they will very quickly learn not to wait. Non-waiting and screaming and carrying on and whining are being rewarded.”

“You must teach your child frustration” is a French parenting maxim.

As with sleep, we tend to view whether kids are good at waiting as a matter of temperament. In our view, parents either luck out and get a child who waits well or they don’t.

Since the ’60s, American parents seem to have become less inclined to say no and let kids live with some frustration, and yet we need some frustration and difficulty in order to become whole people. I’m sure many teachers and professors are reading the quotes above and connecting them to their own classroom experiences. The tie into Jean Twenge’s book iGen and Jonathan Haidt’s The Coddling of the American Mind is almost too obvious to state; Haidt and Twenge’s books concern what smartphones are doing to the state of education, educational discourse, and educational institutions, and, while they cover smartphones and social media, those two technologies aren’t occurring in isolation. Excessive permissiveness appears to create neuroticism, unhappiness, and fragility, and excessive permissiveness seems to start for American parents somewhere between a few weeks and a few months after birth—and it never ends. But most of us don’t recognize it in the absence of an outside observer, the same way we often don’t recognize our psychological challenges in the absence of an outside observer.

In Druckerman’s rendition, French parents are good at establishing boundaries, saying “no” and, with babies, implementing “the pause”—that is, not rushing to the baby’s aid every time the baby makes some small noise or sound. She writes about how many children become “stout,” to use the French euphemism for “fat,” from not having established mealtimes and instead snacking continuously, in part because parents won’t say “No, you need to wait” to their kids.

Failing to create reasonable boundaries from an early age leads to the failure to develop emotional resilience. “Reasonable” is an important word: it is possible to be too strict or to let kids struggle too much, just as it’s possible to do the opposite, and the right mix will likely depend on the kid and the situation.

French parenting culture spills into schools:

When Benoît took a temporary posting at Princeton, he was surprised when students accused him of being a harsh grader. “I learned that you had to say some positive things about even the worst essays,” he recalls. In one incident, he had to justify giving a student a D. Conversely, I hear that an American who taught at a French high school got complaints from parents when she gave grades of 18/20 and 20/20. The parents assumed that the class was too easy and that the grades were “fake.”

The whines I got from students also make sense: in many U.S. schools, there’s not as strong a culture of excellence as there is a culture of “gold stars for everyone.” I understand the latter desire, having felt it myself in many circumstances, but it’s also telling how important a culture of excellence is once the school train tracks end and the less-mapped wilderness of the “real world” (a phrase that is misused at times) begins.

I routinely get feedback that class is too hard, likely because most classes and professors have no incentive to fight grade inflation, and the easiest way to get along is for students to pretend to learn and for us to pretend to teach. Real life, however, is rarely an “everybody gets an A” experience, and almost no one treats it that way: most people who eat bad food at a bad restaurant complain about it; most people whose doctor misses a diagnosis complain about the miss (and want excellence, not just kindness); most people prefer the best consumer tech products, like MacBook Airs or Dell XPS laptops, not the “good try” ones. Excellence itself is a key aspect of the real world but is often underemphasized in the current American education system (again, it is possible to over-emphasize it as well).

In my own work as a grant writing consultant, “good job” never occurs if the job is not good, and “you suck” sometimes occurs even if the job is good. Clients demand superior products and most people can’t write effectively, so they can’t do what I do. I’m keen to impart non-commodity skills that will help differentiate students from the untrained and poorly educated masses, but this demands a level of effort and precision beyond what most American schools seem to expect.

Having read Bringing Up Bébé, I’m surprised it’s not become a common read among professors and high school teachers—I think because it’s pitched as more of a parenting book and a popular “two different cultures” book. But it’s much subtler and more sociological than I would have thought, so perhaps I bought into its marketing too. There is also much in the book about how to teach and how to think about teaching. The French are arguably too strict and too mean to students. Americans are probably not strict enough, not demanding enough, and don’t set adequate standards. The optimal place is likely somewhere between the extremes.

Druckerman is also funny: “I realize how much I’ve changed when, on the metro one morning, I instinctively back away from the man sitting next to the only empty seat, because I have the impression that he’s deranged. On reflection, I realize my only evidence for this is that he’s wearing shorts.” Could shorts not be an indication of derangement? And Druckerman cops to her own neuroticisms, which a whole industry of parenting guides exists to profit from:

What makes “Is It Safe?” so compulsive is that it creates new anxieties (Is it safe to make photocopies? Is it safe to swallow semen?) but then refuses to allay them with a simple “yes” or “no.” Instead, expert respondents disagree with one another and equivocate.

Bébé is a useful contrast to the France depicted in Houellebecq novels. Same country, very different vantages. In Druckerman’s France, the early childhood education system works fairly well, not having to have a car is pleasant, food isn’t a battle, and pleasant eroticism seems to fuel most adults’ lives—including parents’. “Pleasant” is used twice deliberately. In Houellebecq’s France, empty nihilism reigns, most people are isolated by their attachment to machines, and most actions are or feel futile.

So who’s right? Maybe both writers. But Druckerman may also point to some reasons why France, despite pursuing many bad economic policies at the country level, is still impressively functional and in many ways a good place to live. The country’s education system is functioning well and so are its transit systems—for example, Paris’s Metro is being massively expanded, at a time when the U.S. is choking on traffic and struggling with absurdly high subway costs that prevent us from building out alternatives. New York’s last main trunk subway line was completed before World War II. Small and useful extensions have been completed since, but there is no substitute for opening a dozen or more new stations and 10+ miles at a time. Improved subway access reduces the need for high-cost cars and enables people to live better lives—something France is doing but the U.S. seems unable to achieve. AAA estimates the average total cost of owning an American car to be $9,282 per year. If French people can cut that to, say, $3,000 (taxes included) for subways, the French may be able to do a lot more with less.
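A rough sketch of the implied savings, treating both numbers as annual figures; the $9,282 is the AAA estimate cited above and the $3,000 is the hypothetical transit figure, not data:

```python
# Back-of-the-envelope annual transportation savings.
# Both inputs are the rough figures cited above, not precise data.
us_car_cost_per_year = 9_282     # AAA's estimated annual cost of car ownership (USD)
french_transit_per_year = 3_000  # hypothetical annual transit spending, taxes included (USD)

savings = us_car_cost_per_year - french_transit_per_year
print(f"Rough annual savings per person: ${savings:,}")  # about $6,282
```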

France’s bad macro policies and overly rigid labor market may be offset by good childcare and transit policies; Bébé could help explain why that is. Druckerman says, “Catering to picky kids is a lot of work” (“cater” appears four times in Bébé). If the French don’t do that, Americans may be spending a lot of hours on work, rather than leisure, that the French aren’t spending—thereby raising the total quality of French life. Mismeasurement is everywhere, and, while I don’t want to praise France too much on the basis of a single work, I can see aspects of French culture that make sense and aspects of American culture that, framed correctly, don’t.

Have journalists and academics become modern-day clerics?

This guy was wrongly and somewhat insanely accused of sexual impropriety by two neo-puritans; stories about individual injustice can be interesting, but this one seems like an embodiment of a larger trend, and, although the story is long and some of the author’s assumptions are dubious, I think there’s a different, conceivably better, takeaway than the one implied: don’t go into academia (at least the humanities) or journalism. Both fields are fiercely, insanely combative for very small amounts of money; because the money is so bad, many people get or stay in them for non-monetary ideological reasons, almost the way priests, pastors, or other religious figures used to choose low incomes and high purpose (or “purpose” if we’re feeling cynical). Not only that, but clerics often know the answer to the question before the question has even been asked, and they don’t need free inquiry because the answers are already available—attributes that are very bad, yet seem to be increasingly common, in journalism and academia.

Obviously journalism and academia have never been great fields for getting rich, but the business model for both has fallen apart in the last 20 years. The people willing to tolerate the low pay and awful conditions must have other motives (a few are independently wealthy) to go into them. I’m not arguing that other motives have never existed, but today you’d have to be absurdly committed to those other motives. That there are new secular religions is not an observation original to me, but once I heard that idea a lot of other strange-seeming things about modern culture clicked into place. Low pay, low status, and low prestige occupations must do something for the people who go into them.

Once an individual enters the highly mimetic and extremely ideological space, he becomes a good target for destruction—and makes a good scapegoat for anyone who is not getting the money or recognition they think they deserve. Or for anyone who is simply angry or feels ill-used. The people who are robust or anti-fragile stay out of this space.

Meanwhile, less ideological and much wealthier professions may not have been, or be, immune from the cultural psychosis in a few media and academic fields, but they’re much less susceptible to mimetic contagions and ripping-downs. The people in them have greater incomes and resources. They have a greater sense of doing something in the world that is not primarily intellectual, and thus probably not primarily mimetic and ideological.

There’s a personal dimension to these observations, because I was attracted to both journalism and academia, but the former has shed at least half its jobs over the last two decades and the latter became untenable post-2008. I’ve had enough interaction with both fields to get their cultural tenor, and smart people largely choose more lucrative and less crazy industries. Like many people attracted to journalism, I read books like All the President’s Men in high school and wanted to model myself on Woodward and Bernstein. But almost no reporters today are like Woodward and Bernstein. They’re more likely to be writing Buzzfeed clickbait, and nothing generates more clicks than outrage. Smart people interested in journalism can do a minimal amount of research and realize that the field is oversubscribed and should be avoided.

When I hear students say they’re majoring in journalism, I look at them cockeyed, regardless of gender; there’s fierce competition coupled with few rewards. The journalism industry has evolved to take advantage of youthful idealism, much like fashion, publishing, film, and a few other industries. Perhaps that is why these industries attract so many writers to insider satires: the gap between idealistic expectation and cynical reality is very wide.

Even if thousands of people read this and follow its advice, thousands more will keep attempting to claw their way into journalism or academia. It is an unwise move. We have people like David Graeber buying into the innuendo and career-attack culture. Smart people look at this and do something else, something where a random smear is less likely to cost an entire career.

We’re in the midst of a new-puritan revival, and yet large parts of the media ecosystem are ignoring this fact, often because they’re part of it.

It is grimly funny to have read the first story linked next to a piece that quotes Solzhenitsyn: “To do evil a human being must first of all believe that what he’s doing is good, or else that it’s a well-considered act in conformity with natural law. . . . it is in the nature of a human being to seek a justification for his actions.” Ideology is back, and destruction is easier than construction. Our cultural immune system seems not to have figured this out yet. Short-form social media like Facebook and Twitter arguably encourage black-and-white thinking, because there’s not enough space to develop nuance. There is enough space, however, to say that the bad guy is right over there, and that we should go attack that bad guy for whatever thought crimes or wrongthink they may have committed.

Ideally, academics and journalists come to a given situation or set of facts without knowing the answer in advance; in an ideal world, they try to figure out what’s true and why. “Ideal” appears twice because, historically, departures from the ideal have been common, but ideological neutrality and an investigatory posture are preferable to knowing the answer in advance and judging people based on demographic characteristics and prearranged prejudices, yet the latter traits seem to have seeped into academic and journalistic culture.

Combine this with a present-day youth culture that equates feelings with facts and felt harm with real harm, and you get a pretty toxic stew—“toxic” being a favorite word of the new clerics. See, further, America’s New Sex Bureaucracy. If you feel it’s wrong, it must be wrong, and probably illegal; if you feel it’s right, it must be right, and therefore desirable. This kind of thinking has generated some backlash, but not enough to save the demographic undesirables who wander into the kill zone of journalism or academia. Meanwhile, loneliness seems more acute than ever, and we’re stuck wondering why.

Is literature dead?

Is Literature Dead? The question can be seen as “more of the same,” and I’ll answer no: plenty of people, myself included, still find most video-based material boring. It’s not sufficiently information-dense, and it represents human interiority and thought poorly. A reasonable number of people in their teens or 20s feel the same way, despite growing up in iGen. Fewer, maybe, than in previous generations, but still some, and still enough to matter.

Literature has probably always been a minority pursuit; it certainly has been for as long as I’ve been alive and cognizant. It’ll continue being a minority pursuit—but I don’t think it will go away, in part for aesthetic reasons and in part for practical ones. Reading fiction is still a powerful tool for understanding other people, their drives, their uncertainties, their strengths—all vital components of organizations and organizational structures. TV and movies can replace some fraction of that, but not all of it, and it’s notable how often video media launch from literary ones, like a parasite consuming its host.

That said, the marginal value of literature may have shrunk because there’s a lot of good written material in non-literary forms—more articles and more essays, more easily available and more easily read. All that nonfiction means that literature, while still valuable, has more competition. I’ve also wondered whether the returns to reading fiction diminish at some point: after the thousandth novel, does each additional one stop being as meaningful? Do you see “enough” of the human drama? If you’ve seen 92%, does getting to 92.5% mean anything? I phrase this as a question, not an answer, deliberately.

The biggest problem, in my view, is that a lot of literature is just not that good. Competition for time and attention is greater than it was even 20 or 30 years ago. Literature needs to recognize that and strive to be better: better written, better plotted, better thought-out. Too often it does not achieve those things, and the fault is not all with Instagram-addled persons. I still find readers in the most unlikely of places. They—we—will likely keep showing up there.

The college bribery scandal vs. Lambda School

Many of you have seen the news, but, while the bribery scandal is sucking up all the attention in the media, Lambda School is offering a $2,000/month living stipend to some students and Western Governors University continues to grow quietly. The Lambda School story is a useful juxtaposition with the college-bribery scandal. Tyler Cowen has a good piece on the bribery scandal (although to me the scandal looks pretty much like business as usual among colleges, which are wrapped up in mimetic rivalry, rather than a scandal as such, unless the definition of a scandal is “when someone accidentally tells the truth”):

Many wealthy Americans perceive higher education to be an ethics-free, law-free zone where the only restraint on your behavior is whatever you can get away with.

This may be an overly cynical take, but to what extent do universities act like ethics-free, law-free zones? They accept students (and their student-loan payments) who are unlikely to graduate; they have no skin in the game regarding student loans; insiders understand the “paying for the party” phenomenon, while outsiders don’t; and too frequently, universities don’t defend free speech or inquiry. In short, many universities exploit information asymmetries between themselves and their students and those students’ parents—especially the weakest and worst-informed students. Discrimination against Asians in admissions is common at some schools and is another open secret, albeit less secret than it once was. When you realize what colleges are doing to students and their families, why is it a surprise when students and their families reciprocate?

To be sure, this is not true of all universities, not all the time, not all parts of all universities, so maybe I am just too close to the sausage factory. But I see a whole lot of bad behavior, even when most of the individual actors are well-meaning. Colleges have evolved in a curious set of directions, and no one attempting to design a system from scratch would choose what we have now. That is not a reason to imagine some kind of perfect world, but it is worth asking how we might evolve out of the current system, despite the many barriers to doing so. We’re also not seeing employers search for alternate credentialing sources, at least from what I can ascertain.

See also “I Was a College Admissions Officer. This Is What I Saw.” In a social media age, why are we not seeing more of these pieces? (EDIT: Maybe we are? This is another one, scalding and also congruent with my experiences.) Overall, I think colleges are really, really good at marketing, and arguably marketing is their core competency. A really good marketer, however, can convince you that marketing is not their core competency.
