The death of literary culture

At The Complete Review, Michael Orthofer writes of John Updike that “Dead authors do tend to fade fast these days — sometimes to be resurrected after a decent interval has passed, sometimes not –, which would seem to me to explain a lot. As to ‘the American literary mainstream’, I have far too little familiarity with it; indeed, I’d be hard pressed to guess what/who qualifies as that.” He’s responding to a critical essay that says: “Much of American literature is now written in the spurious confessional style of an Alcoholics Anonymous meeting. Readers value authenticity over coherence; they don’t value conventional beauty at all.” I’m never really sure what “authenticity” and its cousin “relatability” mean; regarding the former, I find The Authenticity Hoax: How We Get Lost Finding Ourselves persuasive.

Literary culture itself is mostly dead, and I lived through its final throes—perhaps like someone who, living through the 1950s, saw the end of religious Christianity as a dominant culture, since it was essentially gone by the 1970s—though many claimed its legacy for years after the real thing had passed. What killed literary culture? The Internet is the most obvious, salient answer, and in particular the dominance of social media, which is in effect its own genre—and, frequently, its own genre of fiction. Almost everyone will admit that their own social media profiles mostly showcase a version of their best or ideal selves, and, thinking of just about everyone I know well, or even slightly well, the gap between who they really are and what they are really doing, and what appears on their social media, is so wide as to qualify as fiction. Determining the “real” self is probably impossible, but determining the fake selves is easier, and the fake is everywhere.

Everyone knows this, but admitting it is rarer. Think of all the social media photos of a person ostensibly alone—admiring the beach, reading, sunbathing, whatever—but the photographer is somewhere. A simple example, maybe, but also one without the political baggage of many other possible examples.

Much of what passes for social media discourse makes little or no sense, until one considers that most assertions are assertions of identity, not of factual or true statements, and many social media users are constructing a quasi-fictional universe not unlike the ones novels used to create. “QAnon” might be one easy modern example, albeit one that will probably go stale soon, if it’s not already stale; others will take its place. Many of these fictions are the work of group authors. Numerous assertions around gender and identity might be a left-wing-valenced version of the phenomenon, for readers who want balance, however spurious balance might be. Today, we’ve in some ways moved back to a world like that of the early novel and the early novelists, when “fact” and “fiction” were much more disputed, interwoven territories, and many novels claimed to be “true stories” on their cover pages. Today, the average person has poor epistemic hygiene for most topics not directly tied to income and employment, but the average person has a very keen sense of tribe, belonging, and identity—so views that may be epistemically dubious nonetheless succeed if they promote belonging (consider also The Elephant in the Brain by Robin Hanson and Kevin Simler for a more thorough elaboration on these ideas). Before social media, did most people really belong, or did they silently suffer through the feeling of not belonging? Or was something else at play? I don’t know.

In literary-culture terms, the academic and journalistic establishment that once formed the skeletal structure upholding that culture has collapsed, and today journalists and academics have become modern clerics. The number of jobs in journalism has approximately halved since the year 2000; academic jobs in the humanities cratered in 2009, from an already low starting point, and never recovered; even high school humanities teaching jobs have a much more ideological, rather than humanistic, cast than they did ten years ago. What’s taken their place, if anything? Instagram, Snapchat, TikTok, and, above all, Twitter. Twitter, in particular, seems to promote negative feedback and fear loops, in ways that media and other institutions haven’t yet figured out how to resist. The jobs that supported the thinkers, critics, and starting-out novelists aren’t there. Whatever might have replaced them, like Twitter, isn’t equivalent. The Internet doesn’t just push the price of most “content” (songs, books, and so forth) towards zero—it also changes what people do, including the people who used to make up what I’m calling literary culture or book culture.

Today, most of the power and vibrancy in book culture has shifted towards nonfiction—either narrative nonfiction, like Michael Lewis’s, or data-driven nonfiction, with too many examples to cite. It still sells (sales aren’t a perfect representation of artistic merit or cultural vibrancy, but they’re not nothing, either). Dead authors fade fast today not solely or primarily because of their work, but because literary culture is going away fast, if it’s not already gone. When John Updike was in his prime, millions of people read him (or they at least bought Couples and could spit out some light book chat about it on command). The number of writers working today who the educated public, broadly conceived of, might know about is small: maybe Elena Ferrante, Michel Houellebecq, Sally Rooney, and perhaps a few others (none of those three are American, I note). I can’t even think of a figure like Elmore Leonard: someone writing linguistically interesting, highly plotted material. Bulk genre writers are still out there, but none I’m aware of have any literary ambition.

I caught the tail end of a humane and human-focused literary culture that’s largely been succeeded by a politically and morally focused culture that I hesitate to call literary, even though it’s taken over what remains of those literary-type institutions. It’s not surprising to me that this change has also coincided with a lessening of interest in those institutions: very few people want to be clerics and scolds—many fewer than wonder about the human condition. Shifting from the one to the other seems like a net loss to me, but also a net loss that I’m mostly unable to arrest or alter. If I had to pick a date range for this death, it’d probably be 2009 – 2015: the Great Recession eliminated many of the institutional jobs and professions that once existed, along with any plausible path into them for all but the luckiest, and by 2015 social media and scold culture had taken over. Culture is hard to define but easy to feel as you exist within and around it. By 2010, Facebook had become truly mainstream, and everyone’s uncle and grandma weren’t just on the Internet for email and search engines, but for other people and their opinions.

Maybe mainstream literary culture has been replaced by some number of smaller micro-cultures, but those microcultures don’t add up to what used to be a macroculture.

Reading back over this I realize it has the tone and quality of a complaint, but it’s meant mostly as a description: I’m trying to look at what’s happening, not whine about it. One could argue this change is for the better. Whining about aggregate behavior and choices has rarely, if ever, changed them. I don’t think literary culture will ever return, any more than Latin, epic poetry, classical music, opera, or any number of other once-vital cultural products and systems will. In some ways, we’re moving backwards, towards a cultural fictional universe with less clearly demarcated lines between “fact” and “fiction” (I remember being surprised, when I started teaching, by undergrads who didn’t know a novel or short stories are fiction, or who called nonfiction works “novels”). Every day, each of us is helping whatever comes next, become. The intertwined forces of technology and culture move primarily in a single direction. The desire for story will remain, but its manifestations will change.

Where are the woke on Disney and China?

I have sat through numerous talks and seen numerous social media messages about the evils of imperialism, and in particular western imperialism—so where’s the mass outrage over China today, and the efforts by Disney and Hollywood to court China? China is a literal, real-world imperialist power, today; China has crushed Hong Kong’s independence, imprisoned perhaps a million of its own people based on their race and religion, and invaded and occupied Tibet—and Taiwan may be next. But I never read “imperialist” or “racist” critiques from the usual suspects. Why not?

Search for “imperialism” on Twitter, for example, and you’ll find numerous people denouncing what they take to be “imperialism” or various kinds of imperialisms, but few dealing with China. This bit about Bob Iger’s complicity with Chinese government repression got me thinking about why some targets draw much “woke” ire while others don’t. My working hypothesis is that China seems far away from the United States and too different to understand—even though companies and individuals are regularly attacked for their associations with other Americans, they rarely seem to be for their associations with China. The NBA, to take another example, fervently favors police reform in the United States, but is largely silent on China (to be sure, I don’t agree with all the posturing at the link, but pay attention to the underlying point). The situation between the woke and China may be analogous to the way that comparisons to your wife’s sister’s husband’s income can create a lot of jealousy while comparisons to the truly wealthy don’t.

In addition, could it be that Disney’s specialty in childlike, Manichaean stories of good versus evil appeals to the same people, or kinds of people, most likely to be attracted to the quasi-religious “woke” mindset? To my knowledge, I’ve not seen these questions asked, and Disney products, like Star Wars movies and TV shows, seem to remain broadly popular, including on the far left. It’s also worth emphasizing that some have spoken about Disney’s actions; the Twitter thread about Iger links to “Why Disney’s new ‘Mulan’ is a scandal.” But the issue seems to elicit relatively little ire and prominence, compared to many others. Few sustained movements or organizations are devoted to these issues.

What views make someone a pariah, and why? What associations make someone a pariah, and why? What views and associations elicit intense anger, and why? I don’t have full answers to any of these questions but think them worth asking. No one seems to be calling for boycotts of Disney, even though Disney is toadying to an actual imperialist state.

Dissent, insiders, and outsiders: Institutions in the age of Twitter

How does an organization deal with differing viewpoints among its constituents, and how do constituents dissent?

Someone in Google’s AI division was recently fired, or had a resignation accepted, for reasons related to a violation of process and organizational norms, or something else, depending on one’s perspective. The specifics of that incident can be disputed, but the more interesting level of abstraction might ask how organizations process conflict and what underlying conflict model participants have. I recently re-read Noah Smith’s essay “Leaders Who Act Like Outsiders Invite Trouble”; he’s dealing with the leadup to World War II but also says: “This extraordinary trend of rank-and-file members challenging the leaders of their organizations goes beyond simple populism. There may be no word for this trend in the English language. But there is one in Japanese: gekokujo.” And later, “The real danger of gekokujo, however, comes from the establishment’s response to the threat. Eventually, party bosses, executives and other powerful figures may get tired of being pushed around.”

If you’ve been reading the news, you’ll have seen gekokujo, as institutions are pushed by the Twitter mob, and by the Twitter-mob mentality, even when the mobbing person is formally within the institution. I think we’re learning, or re-learning, things like “Why did companies traditionally encourage people to leave politics and religion at the door?” and “What’s the acceptable level of discourse within the institution, before you’re not a part of it any more?”

Colleges and universities in particular seem to be susceptible to these problems, and some are inculcating environments and cultures that may not be good for working in large groups. One recent example of these challenges occurred at Haverford College, but here too the news has many other examples, and the Haverford story seems particularly dreadful.

The basic idea that organizations have to decide who’s inside and who’s outside is old: Albert Hirschman’s Exit, Voice, and Loyalty: Responses to Decline in Firms, Organizations, and States is one great discussion. Organizations also used to unfairly exclude large swaths of the population based on demographic factors, and that’s (obviously) bad. Today, though, many organizations have in effect, if not intent, decided that it’s okay for some of their members to attack the good faith of other members of the organization, and to attack the coherence of the organization itself. There are probably limits to how much this can be done, and still retain a functional organization, let alone a maximally functional organization.

The other big change involves the ability to coordinate relatively large numbers of people: digital tools have made this easier, in a relatively short time—thus the “Twitter mob” terminology that came to mind a few paragraphs ago; I kept the term because it seems like a reasonable placeholder for that class of behavior. Digital tools make it easy for a small percentage of people to add up to a large absolute number of people. For example, if 100,000 people are interested in or somehow connected to an organization, and one percent of them want to fundamentally disrupt the organization, change its direction, or arrange an attack, that’s 1,000 people—which feels like a lot. It’s far above the Dunbar number and too many for one or two public-facing people to deal with. In addition, in some ways journalists and academics have become modern-day clerics, and they’re often eager to highlight and disseminate news of disputes of this sort.
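A minimal sketch of that arithmetic (the community size and dissent rate are the hypothetical figures from the paragraph above, not data):

```python
# Back-of-the-envelope: a small share of a large community is still a crowd.
DUNBAR_NUMBER = 150  # rough cognitive limit on stable social relationships

def dissenters(community_size: int, dissent_rate: float) -> int:
    """Absolute head count implied by a small percentage of a big group."""
    return round(community_size * dissent_rate)

n = dissenters(community_size=100_000, dissent_rate=0.01)
print(n)                  # 1000
print(n > DUNBAR_NUMBER)  # True: more people than anyone can personally know
```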

Over time, I expect organizations are going to need to develop new cultural norms if they’re going to maintain their integrity in the face of coordinated groups that represent relatively small percentages of people but large absolute numbers of people. The larger the organization, the more susceptible it may be to these kinds of attacks. I’d expect more organizations to, for example, explicitly say that attacking other members of the organization in bad faith will result in expulsion, as seems to have happened in the Google example.

Evergreen State College, which hosted an early example of this kind of attack (on a biology professor named Bret Weinstein), has seen its enrollment drop by about a third.

Martin Gurri’s book The Revolt of The Public and the Crisis of Authority in the New Millennium examines the contours of the new information world, and the relative slowness of institutions to adapt to it. Even companies like Google, Twitter, and Facebook, which have enabled sentiment amplification, were founded before their own user bases became so massive.

Within organizations, an excess of conformity is a problem—innovation doesn’t occur from simply following orders—but so is an excess of chaos. Modern intellectual organizations, like tech companies or universities, probably need more “chaos” (in the sense of information transfer) than, say, old-school manufacturing companies, which primarily needed compliance. “Old-school” is a key phrase, because from what I understand, modern manufacturing companies are all tech companies too, and they need the people closest to the process to be able to speak up if something is amiss or needs to be changed. Modern information companies need workers to speak up and suggest new ideas, new ways of doing things, and so on. That’s arguably part of the job of every person in the organization.

Discussion at work of controversial identity issues can probably function if all parties assume good faith from the other parties (Google is said to have had a freewheeling culture in this regard from around the time of its founding up till relatively recently). Such discussions probably won’t function without fundamental good faith. Good faith is hard to describe, but most of us know it when we see it; defining every element of it would probably be impossible, while cultivating it as a general principle is desirable. Trying to maintain such an environment is tough: I know that intimately because I’ve tried to maintain it in classrooms, and those experiences led me to write “The race to the bottom of victimhood and ‘social justice’ culture.” It’s hard to teach, or run an information organization, without a culture that lets people think out loud, in good faith, with relatively little fear of arbitrary reprisal. Universities, in particular, are supposed to be oriented around new ideas and discussing ideas. Organizations also need some amount of hierarchy: without it, decisions can’t or don’t get made, and the organizational processes themselves don’t function. Excessive attacks lead to the “gekokujo” problem Smith describes. Over time, organizations are likely going to have to develop antibodies to the novel dynamics of the digital world.

A lot of potential learning opportunities aren’t happening, because we’re instead dividing people into inquisitors and heretics, when very few should be the former, and very few truly are the latter. One aspect of “professionalism” might be “assuming good faith on the part of other parties, until proven otherwise.”

On the other hand, maybe these cultural skirmishes don’t matter much, like brawlers in a tavern across the street from the research lab. Google’s AlphaFold has made a huge leap in protein folding efforts (Google reorganized itself, so technically AlphaFold comes from DeepMind, and both Google and DeepMind are part of the “Alphabet” parent company). Waymo, another Google endeavor, may be leading the way towards driverless cars, and it claims to be expanding its driverless car service. Compared to big technical achievements, media fights are minor. Fifty years from now, driverless cars and customizable biology will be taken for granted, and people will be struggling to understand what was at stake culturally, in much the way most people don’t get what the Know-Nothing Party or the Hundred Years’ War were really about, even as they take electricity and the printing press for granted.

EDIT: Coinbase has publicly taken a “leave politics and religion at the door” stand. They’re an innovator, or maybe a back-to-the-future company, in these terms.


Personal epistemology, free speech, and tech companies

The NYT describes “The Problem of Free Speech in an Age of Disinformation,” and in response Hacker News commenter throwaway13337 says, in part, “It’s not unchecked free speech. Instead, it’s unchecked curation by media and social media companies with the goal of engagement.” There’s some truth to the idea that social media companies have evolved to seek engagement, rather than truth, but I think the social media companies are reflecting a deeper human tendency. I wrote back to throwaway13337: “Try teaching non-elite undergrads, and particularly assignments that require some sense of epistemology, and you’ll discover that the vast majority of people have pretty poor personal epistemic hygiene—it’s not much required in most people, most of the time, in most jobs.”

From what I can tell, we evolved to form tribes, not to be “right”: Jonathan Haidt’s The Righteous Mind: Why Good People Are Divided by Politics and Religion deals with this topic well and at length, and I’ve not seen any substantial rebuttals of it. We don’t naturally take to tracking the question, “How do I know what I know?” Instead, we naturally seem to want to find “facts” or ideas that support our preexisting views. In the HN comment thread, someone asked for specific examples of poor undergrad epistemic hygiene, and while I’d prefer not to get super specific for reasons of privacy, I’ve had many conversations that take the following form: “How do you know article x is accurate?” “Google told me.” “How does Google work?” “I don’t know.” “What does it take to make a claim on the Internet?” “Um. A phone, I guess?” A lot of people—maybe most—will uncritically take as fact whatever happens to be served up by Google (it’s always Google and never Duck Duck Go or Bing), and most undergrads whose work I’ve read will, again uncritically, accept clickbait sites and similar as accurate. Part of the reason is that undergrads’ lives are minimally affected by being wrong or incomplete about some claim made in a short assignment imposed by some annoying professor standing between them and their degree.

The gap between elite information discourse and everyday information discourse, even among college students, who may be more sophisticated than their non-college peers, is vast—so vast that I don’t think most journalists (who mostly talk to other journalists and to experts) and other people who work with information, data, and ideas really, truly understand it. We’re all living in bubbles. I don’t think I did, either, before I saw the epistemic hygiene most undergrads practice, or don’t practice. This is not a “kids these days” rant, either: many of them have never really been taught to ask themselves, “How do I know what I know?” Many have never really learned anything about the scientific method. It’s not happening much in most non-elite schools, so where are they going to get epistemic hygiene from?

The United States alone has 320 million people in it. Table DP02 in the Census at data.census.gov estimates that 20.3% of the population age 25 and older has a college bachelor’s degree, and 12.8% have a graduate or professional degree. Before someone objects, let me admit that a college degree is far from a perfect proxy for epistemic hygiene or general knowledge, and some high school dropouts perform much better at cognition, metacognition, statistical reasoning, and so forth, than do some people with graduate degrees. With that said, though, a college degree is probably a decent approximation for baseline abstract reasoning skills and epistemic hygiene. Most people, though, don’t connect with or think in terms of aggregated data or abstract reasoning—one study, for example, finds that “Personal experiences bridge moral and political divides better than facts.” We’re tribe builders, not fact finders.
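To put those percentages in rough absolute terms (a minimal sketch; the 220-million figure for the 25-and-older population is my assumption, not from the cited table):

```python
# Rough head counts implied by the Census percentages quoted above.
ADULTS_25_PLUS = 220_000_000  # assumed size of the 25-and-older population; not from Table DP02
BACHELORS_RATE = 0.203        # share with a bachelor's degree (as quoted)
GRAD_RATE = 0.128             # share with a graduate or professional degree (as quoted)

bachelors = ADULTS_25_PLUS * BACHELORS_RATE
grad = ADULTS_25_PLUS * GRAD_RATE
print(f"~{bachelors / 1e6:.0f}M bachelor's, ~{grad / 1e6:.0f}M graduate")
# -> ~45M and ~28M: large absolute numbers, but small shares of the whole country
```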

Almost anyone who wants a megaphone in the form of one of the many social media platforms available now has one. The number of people motivated by questions like “What is really true, and how do I discern what is really true? How do I enable myself to get countervailing data and information into my view, or worldview, or worldviews?” is not zero, again obviously, but it’s not a huge part of the population. And many very “smart” people in an IQ sense use their intelligence to build better rationalizations, rather than to seek truth (and I may be among the rationalizers: I’m not trying to exclude myself from that category).

Until relatively recently, almost everyone with a media megaphone had some kind of training or interest in epistemology, even if they didn’t call it “epistemology.” Editors would ask, “How do you know that?” or “Who told you that?” or that sort of thing. Professors have systems that are supposed to encourage greater-than-average epistemic hygiene (these systems were not and are not perfect, and nothing I have written so far implies that they were or are).

Most people don’t care about the question, “How do you know what you know?” and are fairly surprised if it’s asked, implicitly or explicitly. Some people are intrigued by it, but most aren’t, and view questions about sources and knowledge as a hindrance. This is less likely to be true of people who aspire to be researchers or work in other knowledge-related professions, but that describes only a small percentage of undergraduates, particularly at non-elite schools. And the “elite schools” thing drives a lot of the media discourse around education. One of the things I like about Professor X’s book In the Basement of the Ivory Tower is how it functions as a corrective to that discourse.

For most people, floating a factually incorrect conspiracy theory online isn’t going to negatively affect their lives. If someone is a nurse and gives a patient the wrong medication, that person is not going to be a nurse for long. If the nurse states or repeats a factually incorrect political or social idea online, particularly but not exclusively under a pseudonym, that nurse’s life likely won’t be affected. There’s no truth feedback loop. The same is true for someone working in, say, construction, or engineering, or many other fields. The person is free to state things that are factually incorrect, or incomplete, or misleading, and doing so isn’t going to have many negative consequences. Maybe it will have some positive consequences: one way to show that you’re really on team x is to state or repeat falsehoods that show you’re on team x, rather than on team “What is really true?”

I don’t want to get into daily political discourse, since that tends to raise defenses and elicit anger, but the last eight months have demonstrated many people’s problems with epistemology, and in a way that can have immediate, negative personal consequences—but not for everyone.

Pew Research data indicate that a quarter of US adults didn’t read a book in 2018; this is consistent with other data indicating that about half of US adults read zero or one books per year. Again, yes, there are surely many individuals who read other materials and have excellent epistemic hygiene, but this is a reasonable mass proxy, given the demands that reading makes on us.

Many people driving the (relatively) elite discourse don’t realize how many people are not only not like them, but wildly not like them, along numerous metrics. It may also be that we don’t know how to deal with gossip at scale. Interpersonal gossip is all about personal stories, while many problems at scale are best understood through data—but the number of people deeply interested in data and data’s veracity is small. And elite discourse has some of its own possible epistemic falsehoods, or at least uncertainties, embedded within it: some of the populist rhetoric against elites is rooted in truth.

A surprisingly large number of freshmen don’t know the difference between fiction and nonfiction, or that novels are fiction. Not a majority, but enough for the confusion to be a noticeable pattern; I was surprised when I first encountered it, and I’m not any longer (modern distinctions between fiction and nonfiction only really arose, I think, during the Enlightenment and the rise of the novel in the 18th century, although off the top of my head I don’t have a good citation for this historical point, apart perhaps from Ian Watt’s work on the novel). Maybe online systems like Twitter or Facebook allow average users to revert to an earlier mode of discourse in which the border between fiction and nonfiction is more porous, and the online systems have strong fictional components that some users don’t care to segregate.

We are all caught in our bubble, and the universe of people is almost unimaginably larger than the number of people in our bubble. If you got this far, you’re probably in a nerd bubble: usually, anything involving the word “epistemology” sends people to sleep or, alternately, scurrying for something like “You won’t believe what this celebrity wore/said/did” instead. Almost no one wants to consider epistemology; to do so as a hobby is rare. One person’s disinformation is another person’s teambuilding. If you think the preceding sentence is in favor of disinformation, by the way, it’s not.

Have journalists and academics become modern-day clerics?

This guy was wrongly and somewhat insanely accused of sexual impropriety by two neo-puritans; stories about individual injustice can be interesting, but this one seems like an embodiment of a larger trend, and, although the story is long and some of the author’s assumptions are dubious, I think there’s a different, conceivably better, takeaway than the one implied: don’t go into academia (at least the humanities) or journalism. Both fields are fiercely, insanely combative for very small amounts of money; because the money is so bad, many people get or stay in them for non-monetary ideological reasons, almost the way priests, pastors, or other religious figures used to choose low incomes and high purpose (or “purpose” if we’re feeling cynical). Not only that, but clerics often know the answer to the question before the question has even been asked, and they don’t need free inquiry because the answers are already available—attributes that are very bad, yet seem to be increasingly common, in journalism and academia.

Obviously journalism and academia have never been great fields for getting rich, but the business model for both has fallen apart in the last 20 years. The people willing to tolerate the low pay and awful conditions must have other motives (a few are independently wealthy) to go into them. I’m not arguing that other motives have never existed, but today you’d have to be absurdly committed to those other motives. That there are new secular religions is not an observation original to me, but once I heard that idea a lot of other strange-seeming things about modern culture clicked into place. Low pay, low status, and low prestige occupations must do something for the people who go into them.

Once an individual enters the highly mimetic and extremely ideological space, he becomes a good target for destruction—and makes a good scapegoat for anyone who is not getting the money or recognition they think they deserve. Or for anyone who is simply angry or feels ill-used. The people who are robust or anti-fragile stay out of this space.

Meanwhile, less ideological and much wealthier professions may not be immune from the cultural psychosis afflicting some media and academic fields, but they’re much less susceptible to mimetic contagions and ripping-downs. The people in them have greater incomes and resources. They have a greater sense of doing something in the world that is not primarily intellectual, and thus probably not primarily mimetic and ideological.

There’s a personal dimension to these observations, because I was attracted to both journalism and academia, but the former has shed at least half its jobs over the last two decades and the latter became untenable post-2008. I’ve had enough interaction with both fields to get their cultural tenor, and smart people largely choose more lucrative and less crazy industries. Like many people attracted to journalism, I read books like All the President’s Men in high school and wanted to emulate Woodward and Bernstein. But almost no reporters today are like Woodward and Bernstein. They’re more likely to be writing Buzzfeed clickbait, and nothing generates more clicks than outrage. Smart people interested in journalism can do a minimal amount of research and realize that the field is oversubscribed and should be avoided.

When I hear students say they’re majoring in journalism, I look at them cockeyed, regardless of gender; there’s fierce competition coupled with few rewards. The journalism industry has evolved to take advantage of youthful idealism, much like fashion, publishing, film, and a few other industries. Perhaps that is why these industries attract so many writers to insider satires: the gap between idealistic expectation and cynical reality is very wide.

Even if thousands of people read this and follow its advice, thousands more will keep attempting to claw their way into journalism or academia. It is an unwise move. We have people like David Graeber buying into the innuendo and career-attack culture. Smart people look at this and do something else, something where a random smear is less likely to cost an entire career.

We’re in the midst of a new-puritan revival and yet large parts of the media ecosystem are ignoring this idea, often because they’re part of it.

It is grimly funny to have read the first story linked next to a piece that quotes Solzhenitsyn: “To do evil a human being must first of all believe that what he’s doing is good, or else that it’s a well-considered act in conformity with natural law. . . . it is in the nature of a human being to seek a justification for his actions.” Ideology is back, and destruction is easier than construction. Our cultural immune system seems not to have figured this out yet. Short-form social media like Facebook and Twitter arguably encourage black-and-white thinking, because there’s not enough space to develop nuance. There is enough space, however, to say that the bad guy is right over there, and we should go attack that bad guy for whatever thought crimes or wrongthink they may have committed.

Ideally, academics and journalists come to a given situation or set of facts and don’t know the answer in advance. In an ideal world, they try to figure out what’s true and why. “Ideal” appears twice because, historically, departures from the ideal are common, but ideological neutrality and an investigatory posture are preferable to knowing the answer in advance and judging people based on demographic characteristics and prearranged prejudices; yet those traits seem to have seeped into the academic and journalistic cultures.

Combine this with present-day youth culture that equates feelings with facts and felt harm with real harm, and you get a pretty toxic stew—”toxic” being a favorite word of the new clerics. See further, America’s New Sex Bureaucracy. If you feel it’s wrong, it must be wrong, and probably illegal; if you feel it’s right, it must be right, and therefore desirable. This kind of thinking has generated some backlash, but not enough to save some of the demographic undesirables who wander into the kill zone of journalism or academia. Meanwhile, loneliness seems to be more acute than ever, and we’re stuck wondering why.

Is literature dead?

Is Literature Dead? The question can be seen as “more of the same,” and I’ll answer no: plenty of people, myself included, still find most video-based material boring. It’s not sufficiently information-dense and represents human interiority and thought poorly. A reasonable number of people in their teens or 20s feel the same way, despite growing up in iGen. Fewer, maybe, than in previous generations, but still some and still enough to matter.

Literature has probably always been a minority pursuit, and it has been for as long as I’ve been alive and cognizant. It’ll continue being a minority pursuit—but I don’t think it will go away, in part for aesthetic reasons and in part for practical ones. Reading fiction is still a powerful tool for understanding other people, their drives, their uncertainties, their strengths—all vital components of organizations and organizational structures. TV and movies can replace some fraction of that but not all of it, and it’s notable how often video media launch from literary ones, like a parasite consuming its host.

That said, the marginal value of literature may have shrunk because there’s a lot of good written material in non-literature form—more articles, more essays, more easily available and read. All that nonfiction means that literature, while still valuable, has more competition. I’ve also wondered if the returns to reading fiction diminish at some point: after the thousandth novel, does each one thereafter stop being as meaningful? Do you see “enough” of the human drama? If you’ve seen 92%, does getting to 92.5% mean anything? I phrase this as a question, not an answer, deliberately.

The biggest problem in my view is that a lot of literature is just not that good. Competition for time and attention is greater than it was even 20 or 30 years ago. Literature needs to recognize that and strive to be better: better written, better plotted, better thought-out; too often it does not achieve those things. The fault is not all with Instagram-addled persons. I still find readers in the most unlikely of places. They—we—will likely keep showing up there.

The college bribery scandal vs. Lambda School

Many of you have seen the news, but, while the bribery scandal is sucking up all the attention in the media, Lambda School is offering a $2,000/month living stipend to some students and Western Governors University is continuing to quietly grow. The Lambda School story is a useful juxtaposition with the college-bribery scandal. Tyler Cowen has a good piece on the bribery scandal (although to me the scandal looks pretty much like business-as-usual among colleges, which are wrapped up in mimetic rivalry, rather than a scandal as such, unless the definition of a scandal is “when someone accidentally tells the truth”):

Many wealthy Americans perceive higher education to be an ethics-free, law-free zone where the only restraint on your behavior is whatever you can get away with.

This may be an overly cynical take, but to what extent do universities act like ethics-free, law-free zones? They accept students (and their student loan payments) who are unlikely to graduate; they have no skin in the game regarding student loans; insiders understand the “paying for the party” phenomenon, while outsiders don’t; too frequently, universities don’t seem to defend free speech or inquiry. In short, many universities are exploiting information asymmetries between themselves and their students and those students’ parents—especially the weakest and worst-informed students. Discrimination against Asians in admissions is common at some schools and is another open secret, albeit less secret than it once was. When you realize what colleges are doing to students and their families, why is it a surprise when students and their families reciprocate?

To be sure, this is not true of all universities, not all the time, not all parts of all universities, so maybe I am just too close to the sausage factory. But I see a whole lot of bad behavior, even when most of the individual actors are well-meaning. Colleges have evolved in a curious set of directions, and no one attempting to design a system from scratch would choose what we have now. That is not a reason to imagine some kind of perfect world, but it is worth asking how we might evolve out of the current system, despite the many barriers to doing so. We’re also not seeing employers search for alternate credentialing sources, at least from what I can ascertain.

See also “I Was a College Admissions Officer. This Is What I Saw.” In a social media age, why are we not seeing more of these pieces? (EDIT: Maybe we are? This is another one, scalding and also congruent with my experiences.) Overall, I think colleges are really, really good at marketing, and arguably marketing is their core competency. A really good marketer, however, can convince you that marketing is not their core competency.

The elite case against big product “x” (today it’s Facebook)

For most of my life I’ve been reading well-structured, well-supported, well-written, and well-cited pieces arguing for why and how people should not do extremely popular thing x, where x can change based on the person making the argument. Often the argument is quite good but doesn’t create mass behavior change on the ground. I often agree with the argument, but whether I agree with it or not is less relevant than whether the majority of the population changes its behavior in measurable ways (for truly popular products and services, they don’t). Today, the x is Facebook.

Based on past examples of “the elite case against ‘x,'” I predict that today’s NYT and BBC articles do very little to change real-world, measurable behavior around Facebook and social media. To the extent people move away from Facebook, it will be toward some other Facebook property like Instagram or toward some other system that still has broadly similar properties, like Discord, Snapchat, etc. Today’s case against Facebook, or social media more generally, reminds me of the elite case against:

* TV. TV rots your brain and is worse than reading books. It destroys high culture and is merely a vehicle for advertising. Sophisticated pleasures are better than reality TV and the other “trash” on TV. Yet TV remains popular. Even in 2017, “Watching TV was the leisure activity that occupied the most time (2.8 hours per day).” And 2.8 hours per day is lower than the “four hours per day” figure I’ve seen quoted elsewhere. Today, though, most people, even cultural elites, don’t even bother arguing against TV.

* Fast food, especially McDonald’s, Taco Bell, etc. It’s filled with sugar and, rather than being called “food,” it should probably be called “an edible food-like substance.” There is also an elite case against factory farming and animal torture, which pretty much all fast food suppliers engage in. Yet McDonald’s, Taco Bell, and similar companies remain massive. Michael Pollan has done good work articulating the elite case against fast food.

* Oil companies. Oil use has led us to more than 400ppm CO2 in the atmosphere. We’re on the way to cooking ourselves. Yet the market response to hybrid vehicles has been to ignore them. Almost no one walks or bikes to work. Again, I would argue that more people should do these things, but what I think people should do, and what people do, are quite different. We like to attack oil companies instead of the consumer behavior that supports oil companies.

Oddly, I see the elite case against car companies and airplane companies much less frequently than I do against oil companies.

* Tobacco. It gives you lung cancer and smoking cigarettes isn’t even that good. While it appears that smoking rates have been declining for decades, 15.5% of adults still smoke. Taxation may be doing more to drive people away from tobacco than recitations of all the ways tobacco is bad.

* Video games. They’re a way to evade the real world and perform activities that feel like fitness-enhancing activities but are actually just mental masturbation, but without the physical limits imposed by actual masturbation. They simulate the social world in a way that makes us more isolated and frustrated than ever before.

What other examples am I missing?

Today, we have the elite case against social media. It may be accurate. It’s generated good books, like Cal Newport’s Deep Work and Nicholas Carr’s The Shallows, along with lots of op-eds and parenting guides. Some individuals have announced publicly that they’re deleting their Facebook or Instagram pages, yet Facebook is a public company and keeps reporting massive levels of use and engagement.

It turns out that what people want to do is quite different from what The New York Times thinks people should do.

The most despicable sentences I’ve read recently

In November, NASA announced it would be conducting a “cultural assessment study” of SpaceX and Boeing to ensure the companies were meeting NASA’s requirements of “adherence to a drug-free environment.” The Washington Post reported that officials had indicated “the review was prompted by the recent behavior of SpaceX’s founder, Elon Musk.”

From this piece. Boeing is good at hewing to bureaucratic edicts issued by bureaucratic organizations but bad at recovering rocket stages and decreasing the price of space launch. SpaceX is great at, you know, putting shit into space, which is what both companies putatively exist to do. For Boeing, compliance with infinite rules and regulations takes precedence over lowering the cost of space access.

The quoted paragraph reminds me of Peter Thiel’s point in Zero to One: as HP floundered, it was still really good at “following the rules,” but really terrible at building products people want. Senior administrators were adept at process but novices at results. Many people who are good at results do not care for excessive process.

Perhaps we should focus less on virtue signaling and demographics, and more on results. I suspect the NASA of the 1960s was not terribly interested in its employees’ private lives, but it was very interested in putting a man on the moon. Today, NASA seems unable to do the latter but very good at the former.

We need fewer bureaucrats and bureaucratic barriers and more people with a piratical gleam in their eye trying new things. Elon Musk has that piratical gleam, and that is part of what makes him a hero, despite his flaws (which are real). Online, it is easy to tear people down (The Revolt of The Public and the Crisis of Authority in the New Millennium describes how the Internet enables nihilism and tearing people down while doing too little real building of new things—a comprehensive post on this important book will be forthcoming). It costs a billion dollars a mile to build new urban rail in the United States, since contractors must specialize in placating politicians, employing too many people at too-high wages (“In his exposé, Rosenthal talked about labor problems: severe overstaffing, with some workers doing jobs that are no longer necessary, and wages well into six figures”), and dealing with lawsuits, rather than specializing in building shit quickly. We need to find our way to a new, better equilibrium that de-emphasizes drug testing for harmless substances and emphasizes getting the thing done.

Giving and receiving books

Tyler Cowen writes, “Why you should hesitate to give books as gifts and instead just throw them out,” which is a fine post, but I’d note that many people are cost-constrained when it comes to books, and many used books now end up on Amazon, where they must be specifically sought out. And I love to give friends books (and receive books), but the following rules for giving books must be obeyed:

1. Zero expectation. The sender must not expect the receiver to read or even consider the book. Books should only be given, never returned, particularly in the age of Amazon. Amazon has made book scarcity a thing of the past. It is even possible to rapidly scan books, using the right equipment, which may be relatively inexpensive. The majority of books I give or send are probably never read, and that’s fine with me.

2. Despite “zero expectation,” the sender must think the book will interest the receiver or be at least as good as the median book the receiver might otherwise read.

3. This is my own idiosyncrasy, but I very rarely throw out books, though I will donate unwanted ones in batches. Someone with different inclinations and hourly rates might automate the process of selling older books on Amazon. The net take from selling a book for even $10 or $12 on Amazon is like $4 – $6—not worth it for me (a rough sketch of that arithmetic follows this list).

4. I like writing in books and like it when my friends do. Receiving a book my friend has annotated is like getting the pleasure of the book and the pleasure of conversation.

5. “Zero expectation” also means “zero expectation” in terms of time. I mail books in batches whenever there are enough and it’s convenient for me. It may be months after I finish a book, and that’s okay. I have a stack sitting around right now, waiting to go out.

6. I like it when publishers send me books! But they often send emails first asking if I’ll promise a review, etc. My stock reply is always the same: Send the book, but I promise nothing.

7. When I was younger I thought I’d be rich when I had the money to buy all the books I could read. Now I have to limit the number of physical books I own due to space and practicality constraints. Large numbers of physical books are not compatible with high levels of mobility. This is very annoying but also true. Bad city zoning makes this problem worse by artificially increasing the price per square foot most people pay for housing in a given locale. Would we have better media if writers had more space for books and consequently read more?
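As promised in item 3, a rough sketch of the used-book arithmetic (every fee and shipping figure below is an illustrative assumption, not Amazon’s actual fee schedule):

```python
# Illustrative net take from selling a used book on a marketplace.
# Every rate below is an assumption for the sketch, not a quoted fee.
def net_take(sale_price: float,
             referral_rate: float = 0.15,   # assumed percentage fee on books
             closing_fee: float = 1.80,     # assumed flat per-item fee
             shipping_credit: float = 2.00, # assumed partial shipping credit
             shipping_cost: float = 3.20,   # assumed postage
             packaging: float = 0.50) -> float:
    fees = sale_price * referral_rate + closing_fee
    return sale_price - fees + shipping_credit - shipping_cost - packaging

for price in (10, 12):
    print(f"${price} sale -> about ${net_take(price):.2f} net")
# -> $5.00 and $6.70 under these assumptions: roughly the $4 – $6 range
#    mentioned above, before counting time spent listing, packing, and mailing.
```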

“How good is the very best next book that you haven’t read but maybe are on the verge of picking up? So many choices in life hinge on that neglected variable.” I say my problem today is finding the best book, which I no longer do so well on my own; if the five best readers I know would send me more books, I would be very happy, even if only one works for me.

It’s striking to me how many people with nothing to say get on social media to say it, rather than simply reading more or learning more. We have all these communication media and too little to fill them with, in my view. It could be that I’m guilty of that right now.

A good rule is, “Would you buy this friend a beer or coffee?” If yes, why not a book? I’d like to see book-giving become more of a social norm, like getting a round of drinks.
