Life: Self-aggrandizement edition

“‘What rules the world is ideas,’ Kristol once wrote, ‘because ideas define the way reality is perceived.’”

—Quoted in “Can the G.O.P. Be a Party of Ideas?” This is sort of true, but, alternatively, idea producers and disseminators may want it to be true because it flatters them and raises their own status.

Kristol’s view is plausible but I remain unconvinced.

Links: Happiness Advice, Writing Tips, Nymphomaniac, Legal Drugs, “Rape Culture” Hysteria, and more

* “The dream-crushing grind of the academic job market.” I really ought to stop reading (and posting) articles like this, but the same almost-subconscious impulse that draws the eye to car crashes and nude photos draws mine to them.

* “Advice for a Happy Life by Charles Murray: Consider marrying young. Be wary of grand passions. Watch ‘Groundhog Day’ (again). Advice on how to live to the fullest,” most of which may apply more to the author than to everyone else.

* “101 Practical Writing Tips From Hollywood Screenwriter Brian Koppelman.”

* “Lars’s Real Girl: Charlotte Gainsbourg on Nymphomaniac and Working With von Trier.” Unfortunately, the movie adds up to very little.

* My Amazon review of Madison Young’s surprisingly dull book, “Daddy: A Memoir.”

* “The Drugging of the American Boy: By the time they reach high school, nearly 20 percent of all American boys will be diagnosed with ADHD.” Most diagnoses are probably wrong.

* “The Value Of An Engineering Degree.”

* Someone found this blog by searching for “swear word count in book asking anna by jake seliger.” I can’t imagine why anyone would want to know this. Someone else found this blog by searching for “gandalf sex,” which may make even less sense.

* Crowd funding is market research.

* “It’s Time to End ‘Rape Culture’ Hysteria.”

Why “tit-for-tat” might be so hard to implement in a romantic/dating context

The other day a friend with love problems described them, and I offered a solution applicable to a wide range of similar issues: tit-for-tat, in which you respond to another person’s response. If the other person is cooling off, cool off in turn; if the other person is heating up, heat up in turn. This avoids wasted effort in pursuing someone unavailable and also prevents the (frequently) unattractive behavior of being too available.*
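Tit-for-tat comes from iterated game theory (Axelrod’s prisoner’s-dilemma tournaments made it famous), and the whole strategy fits in a few lines. Here’s a minimal sketch in Python—the numeric “effort” levels and function names are my own illustration, not a formal model of anything:

```python
def tit_for_tat(their_efforts, opening_effort=1.0):
    """Mirror the other person's last observed effort level.

    their_efforts: the other person's effort signals over time, e.g.
    1.0 = warm, 0.5 = lukewarm, 0.0 = gone silent. The numbers are
    illustrative stand-ins for texts returned, dates proposed, etc.
    """
    my_effort = opening_effort  # open warmly: someone has to move first
    my_history = [my_effort]
    for theirs in their_efforts:
        my_effort = theirs  # respond in kind, never more, never less
        my_history.append(my_effort)
    return my_history

# They start warm, then cool off; TFT cools off in turn instead of chasing.
print(tit_for_tat([1.0, 0.6, 0.2, 0.0]))  # -> [1.0, 1.0, 0.6, 0.2, 0.0]
```

The point of the toy: your next move is a function of their last move, not of your hopes.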

There’s a large challenge with TFT, however: it’s really hard to implement, even for people who know, intellectually, that it’s a good idea. We often want the world to arrange itself according to our wishes. In most endeavors increased effort leads to increased reward. But there is a class of endeavors—getting a job, finding romance, shopping book proposals—where too much effort is a negative signal that shows desperation or low status.**

In that problem class, TFT is a pretty good way of checking a sense of hope against the reality of a situation. In the real world we can’t control what other people do but we can control our reactions. That’s not a new idea but it is a really important one, and one that a lot of people (especially when they’re romantically inexperienced) fail to really understand.

I suspect that the roots of misunderstanding romantic behavior start in childhood. When you’re a child your parents love you unconditionally and tell you that you’re special (because you are, to them), and your teachers, for the most part, try to help you and encourage you even when you fuck up. If you show your parents or family or teachers that you’re really trying hard or care or whatever, they usually reward you.

But eventually you hit puberty, get some hair on your beanbag or a righteous set of jugs, and you start splashing around with dating. Except in that domain, a lot of the people you’re interested in don’t care about you, no matter what you do or how much you care. You care so much—why don’t they? If you’re overly demonstrative about it, at best you’ll be taken advantage of and at worst you’ll be ignored.

The real solution is to realize that you can’t force other people to be romantically (or otherwise) interested in you. In a romantic context, extended ambiguity sucks, and one effective way to end it may be to introduce a rival: find some guy or girl and make sure the real target knows. If that doesn’t spur the love interest to action, nothing will, because it says, “Hey, either take this spot or lose it.”

That’s not quite TFT, but it is one way to force decisions.

In books and movies, almost no one employs TFT, and things tend to work out anyway—but that’s because most books and movies are fantasies that give us what we wish were true, rather than what is true. Which may be why inexperienced people have so much trouble: their only guidelines are really poor.

Most of the stuff I imbibed from pop culture between birth and age 16 or so, for example, did absolutely nothing to prepare me for the real world, and if anything it was harmful. Part of this was my own fault—I had a penchant for pulp fantasy novels in which not only the dragons were imaginary but so too were the female characters—but not all of it. Consequently, almost everyone has to discover the same lessons for themselves, over and over again, often without any useful guidance whatsoever. Parents are of little help because their interests diverge in systematic ways from their children’s. Peers are often equally ignorant. Non-parent adults by and large don’t interact with highly inexperienced teens or early 20-somethings. So people are left with pop culture and its wish-fulfillment fantasies.

There are some people building a theory of reality—like Esther Perel or Roosh—but little of it has filtered into the culture at large so far. Maybe it never will.


* I’m not the first to notice these issues: “Sexual Attraction and Game Theory” popped up in my RSS feed about a week after the discussion.

** This post had its origins in a much more specific (and explicit!) email, but it’s been generalized and (somewhat) sanitized.

The modern art (and photography) problem

In “Modern art: I could have done that… so I did: After years of going to photography exhibitions and thinking he could do better, Julian Baggini gave it a go. But could he convince The Royal West of England Academy with his work?”, Baggini writes:

there are times when we come across something so simple, so unimpressive, and so devoid of technical merit that we just can’t help believing we could have done as well or better ourselves.

He’s right—except that this happens entirely too often and helps explain much of modern art’s bogosity. I’m not the only person to have noticed—in Glittering Images, Camille Paglia writes:

the big draws [for museums] remain Old Master or Impressionist painting, not contemporary art. No galvanizing new style has emerged since Pop Art, which killed the avant-garde by embracing commercial culture. Art makes news today only when a painting is stolen or auctioned at a record price.

She’s right too; many people have noticed this, but few in the art world itself apparently have—a world that seems to have become more interested in marketing than making (a problem afflicting the humanities in academia too). But there are enough people invested in and profiting from propagating bogosity that they can remain indifferent to countervailing indifference.

Years ago I was at the Seattle Art Museum looking at various pieces of modern supposed “art” that consisted mostly of a couple of lines or splotches and whatnot, and they made me think: “There’s a hilarious novel in here about a director who surreptitiously hangs her own work—and no one notices.” Unfortunately, I’ve since realized that people have already done this, or things like it, in the real world—and no one cared. It’s barely possible to generate scandal in the art world anymore; conservatives have mostly learned about the Streisand effect and thus don’t react to the latest faux provocation. The artists themselves often lack both anything to say and any coherent way of saying it.

To the extent people respond to art, they respond to the art that people made when it took skill to be an artist.

Photography has a somewhat similar problem, except that its version has been created by technology. Until relatively recently it took a lot of time, money, and patience to become a reasonably skilled photographer. Now it doesn’t take nearly as much of any of those things: last year’s cameras and lenses still work incredibly well; improvements in autofocus, auto-exposure, and related technologies make photos look much better; and it’s possible to take, review, and edit hundreds or thousands of photos at a time, shrinking the distance from “I took a picture” to expert.

The results are obvious for anyone who pays attention. Look through Flickr, or 500px, or any number of other sites and you’ll see thousands of brilliant, beautiful photos. I won’t say “anyone can do it,” but many people can. It’s also possible to take great photos by accident, with the machine doing almost all the work apart from the pointing and clicking. Adding a little bit of knowledge to the process is only likely to increase the keeper rate. Marketing seems to be one of the primary differentiators among professional photographers; tools like Lightroom expand the range of possibility for recovering from error.

One of the all-time top posts on Reddit’s photography section is “I am a professional photographer. I’d like to share some uncomfortable truths about photography,” where the author writes that “It’s more about equipment than we’d like to admit” and “Photography is easier than we’d like to admit.”

The profession is dying, for reasons not identical to painting’s but adjacent to them. In photography, we’re drowning in quality. In fine art, we’re drowning in bogosity, and few people appear interested in rescuing the victim.

One definition of brilliance: willingness to appear to be the fool

From The Making of the Atomic Bomb:

[In 1939] Szilard told Einstein about the Columbia secondary-neutron experiments and his calculations toward a chain reaction in uranium and graphite. Long afterward he would recall his surprise that Einstein had not yet heard of the possibility of a chain reaction. When he mentioned it Einstein interjected, “Daran habe ich gar nicht gedacht!”—“I never thought of that!” He was nevertheless, says Szilard, “very quick to see the implications and perfectly willing to do anything that needed to be done. He was willing to assume responsibility for sounding the alarm even though it was quite possible that the alarm might prove to be a false alarm. The one thing most scientists are really afraid of is to make fools of themselves. Einstein was free from such a fear and this above all is what made his position unique on this occasion.”

“Einstein was free from such a fear”: are you?

(Incidentally, as Eric R. Weinstein points out, “Over the past two decades I have been involved with the war on excellence”:

In the past, many scientists lived on or even over the edge of respectability with reputations as skirt chasing, hard drinking, bigoted, misogynistic, childish, slutty, lazy, politically treacherous, incompetent, murderous, meddlesome, monstrous and mentally unstable individuals such as von Neumann, Gamow, Shockley, Watson, Einstein, Curie, Smale, Oppenheimer, Crick, Ehrenfest, Lang, Teller and Grothendieck (respectively) who fueled such epithets with behaviors that indicated they appeared to care little for what even other scientists thought of their choices.

But such disregard, bordering on deviance and delinquency, was often outweighed by feats of genius and heroism. We have spent the last decades inhibiting such socially marginal individuals or chasing them to drop out of our research enterprise and into startups and hedge funds. As a result our universities are increasingly populated by the over-vetted specialist to become the dreaded centers of excellence that infantilize and uniformize the promising minds of greatest agency.

Are you part of that war? I suspect Einstein cared little for respectability except when it came to being right.)

Looks matter and always will because they convey valuable information, and a note about the media

In “The Revolution Will Not Be Screen-Printed on a Thong” Maureen O’Connor laments that people judge each other based on looks (“Why can’t we just not obsess about bodies?”), and then kind of answers her own question:

I ask that in earnest — it’s possible that we actually can’t stop, that this compulsive corporeal scrutiny is some sort of biological imperative, or species-wide neurosis left over from millennia of treating women as chattel.

We judge each other based on looks because, as Geoffrey Miller describes in Spent and others have described elsewhere, looks convey a lot of useful information about age, fertility, and health. Beyond that, women compete with each other in this domain because they know (correctly) that men judge them based on looks (among other things).

In addition, as Tim Harford discusses in The Logic of Life, speed-dating and other research shows that women reject about 90% of the men at any given speed-dating event, and men reject about 80% of the women. Both men and women usually report that they want similar things—men want youth and beauty; women want height and humor. But researchers devised clever experiments in which the dating pools changed systematically—for example, events stocked entirely with very tall men or with very short men. Yet the rate at which men and women accept or decline dates remains the same.

That implies “compulsive corporeal scrutiny” is based partly on the knowledge that any particular person will be judged relative to the other people around them.
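A toy simulation (mine, not Harford’s) shows why constant rejection rates point to relative rather than absolute judgment: if daters accept only the top slice of whatever pool is in front of them, the acceptance rate is fixed by construction no matter how the pool shifts. The heights and the 20% threshold below are invented for illustration:

```python
import random

def acceptance_rate(pool, top_fraction=0.2):
    """Accept only candidates in the top slice of *this* pool.

    If judgment is relative (rank within the visible pool) rather than
    absolute, the acceptance rate is fixed by construction.
    """
    cutoff = sorted(pool)[int(len(pool) * (1 - top_fraction))]
    return sum(c >= cutoff for c in pool) / len(pool)

tall_pool = [random.gauss(190, 5) for _ in range(1000)]   # only tall men (cm)
short_pool = [random.gauss(165, 5) for _ in range(1000)]  # only short men (cm)
print(acceptance_rate(tall_pool), acceptance_rate(short_pool))  # both ~0.2
```

Shift the whole pool up or down and the acceptance rate never budges—which is exactly the pattern the experiments found.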

I don’t bring this up merely to correct a point in an article; it’s also to observe that a lot of the stuff one reads online is based on limited knowledge. As I get older I increasingly get the impression that a lot of journalists would be better served, at least intellectually speaking, by spending more time reading books and less time… doing other things?

One thing I like about journalists or journalist-blogger hybrids like Megan McArdle and Matt Yglesias is their wide, deep reading, and their willingness to connect that reading with the subjects they write about. One might disagree with them for ideological or other reasons, but they at least know what they’re talking about and usually try to learn when they don’t. Too much of the media—whether in The Seattle Times or The Wall Street Journal or New York Magazine—is just making noise.*

Given the choice between most media and books, choose books. The challenge, of course, is finding them.

EDIT: Maybe Ezra Klein’s new mystery venture will solve some of the complaints above; he mentions “the deficiencies in how we present information” and promises “context.” I hope so, and certainly I’m not the first person to notice the many problems with the way much of the media works.


* Granted, I may be contributing to this in my own small way by contributing a link and possibly hits to a noise-making article that should be better than it is.

Love, Actually: Make a move already

After reading a spate of essays about Love, Actually (“Loathe, Actually,” “The Six Cinematic Crimes of ‘Love Actually,’” “Love Actually Is the Least Romantic Film of All Time”) I watched the movie and realized that none of the writers nails what started bothering me halfway through. It isn’t the gender politics of the movie, which are primarily disliked for the usual reason: people in reality behave differently than essayists and feminists would like them to. It’s that the characters in Love Actually are stuck at age 15.

The movie’s plot is essentially a series of attraction deferments: someone feels attraction, often quite strongly, and then doesn’t act on it. Instead of going up and saying, “Let’s get a drink later” or “Let’s see a movie,” they blush and stutter and wonder. One character says it “takes me ages to get the courage up” to even talk to the other one. That’s a real problem I had when I was, say, 15, and would respond to attraction by hiding.

Why do teenagers do this? They’re stuck in a nasty social situation: high school. They’re inexperienced idiots. That described me fairly well.* There also might be good evolutionary reasons to avoid making romantic moves unlikely to be requited: for most of human history, humans lived in relatively small bands, and making a romantic move was probably a dangerous, potentially life-changing act. Today it’s relatively minor, and if one person says no you just move on to the next; the humiliation is small and generally forgotten by everyone except the person turned down. We live in a world so different from our ancestral environment that it’s hard to remember how poorly adapted we are to modern life.

The above paragraph might be wrong—it’s a just-so story, and I’m not even sure how to test these ideas—but it is plausible. Still, most of us realize what’s effective in modern life and start doing that as we get older, rather than persisting in endless crushes. In many domains a “no” is actually better than not knowing, or a “maybe,” since a “no” means that you can go on to find someone who says “yes.” Getting to “no” has value in itself.

In life most of us realize that missed opportunities just sort of suck—so when they arise, you seize them. The characters in Love, Actually instead pointlessly defer them; in real life, the other party often turns up with a boyfriend or girlfriend in the interim. But in movie-land, it all works out, and everyone gets laid. The fellow with the hot Portuguese flatmate should’ve tried speaking the language of love while he was there.

One definition of stupidity is the failure to learn from experience. But the experiences of the characters in the movie are so limited that there isn’t enough screen time for the ups and downs more typical of romantic comedies. All the characters, regardless of age, also seem to have very little life experience. The 50-year-olds are mentally 16, but with wrinkles. They lack the forthrightness that is uncommon in teens but fairly common by… let me make up a number and say 24.

Love Actually isn’t a terrible movie—I laughed sometimes, most often when the exasperated, washed-up singer did his hilarious bit—but I can’t see wanting to watch it again. It was also British, which meant more nude scenes than an equivalent American movie would have, and those are always welcome. I also get that its characters are, if not caricatures, then at least “broadly drawn.”


* Some people would argue that it still does describe me.

Antifragile: Things That Gain From Disorder — Nassim Taleb

The Black Swan is so good that I’ve been running around telling everyone to read it, which naturally led me to its successor, Antifragile. It, by contrast, is an excellent book to get from the library (per this accurate warning) and an excellent book to read skeptically, given its many dubious claims and stories.

The top-level idea of Antifragile is a good one (many random events trade small gains for tremendous losses, and vice versa; focus on making sure you can sustain small losses in exchange for big gains, which makes you “antifragile,” as opposed to merely “robust”), but rarely have I read a book with such a correct thesis and so many misrepresentations, needless ad-hominem attacks, and dubious stories that may not demonstrate what the author thinks they demonstrate. Many amuse along the way but could have been removed; for a guy who is fond of the term “narrative fallacy,” Taleb is awfully fond of narratives that could be called fallacious.
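To make the top-level idea concrete, here is a toy Monte Carlo of my own (not Taleb’s, and the numbers are arbitrary): a “fragile” bet earns steadily and blows up on a rare event, while an “antifragile” bet bleeds steadily and pays off hugely on the same rare event. The shape of the payoff, not the expected value, is the point:

```python
import random

def average_outcome(payoff, trials=100_000, p_rare=0.01, seed=42):
    """Average result of repeatedly taking a bet exposed to a rare event."""
    rng = random.Random(seed)
    return sum(payoff(rng.random() < p_rare) for _ in range(trials)) / trials

def fragile(rare):      # small steady gains, rare ruinous loss
    return -100 if rare else 1

def antifragile(rare):  # small steady losses, rare huge gain
    return 100 if rare else -1

print(average_outcome(fragile))      # ~ -0.01: looks fine daily; the tail ruins it
print(average_outcome(antifragile))  # ~ +0.01: bleeds daily; the tail pays for it
```

The fragile bet looks like a winner on almost every trial, which is exactly why people keep taking it.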

I meant to write a real review, but I’ve been beaten to it and would direct you instead to the link, where David Runciman does a better job than I’m likely to. If The Black Swan is an unexpectedly fascinating and insightful work with a deceptively simple main idea that is helpfully explained and elaborated on with virtually every page, Antifragile is the sort of book that can be better read through the reviews than the book itself. As noted in the first paragraph, if you feel the need to verify this claim, at least get Antifragile from the library.

Let’s take one small example of a dubious claim: on pages 83 – 84, Taleb tells a parable about two men, one a banker and one a taxi driver. In that parable the taxi driver differs from the banker:

Because of the variability of his income, [the taxi driver] keeps moaning that he does not have the job security of his brother—but in fact this is an illusion, for he has a bit more. [. . .] Artisans, say, taxi drivers, prostitutes (a very, very old profession), carpenters, plumbers, tailors, and dentists, have some volatility in their income but they are rather robust to a minor professional Black Swan, one that would bring their income to a complete halt.

But taxi drivers are an interesting example because we’re approaching the point at which self-driving cars may become common, which would be a major professional Black Swan for taxi drivers. The Industrial Revolution was hell on artisans, who today still find it very hard to compete with factories. To be sure, the Internet has made it easier for artisans to do their thing by letting them sell on their own websites and on aggregators like Etsy, but artisans as a group are never going to be as important as they were in, say, 1700.

There are also paragraphs so stupid that they defy rational explanation:

both governments and universities have done very, very little for innovation and discovery, precisely because, in addition to their blinding rationalism, they look for the complicated, the lurid, the newsworthy, the narrated, the scientistic, and the grandiose, rarely for the wheel on the suitcase. Simplicity, I realized, does not lead to laurels.

Governments and universities have been pivotal in everything from computers to nuclear power to medicine, and saying they “have done very, very little for innovation and discovery” is incredibly, stupidly wrong—the sort of wrong that tempts one to disregard the entire rest of the book. It is useful to have outsiders throwing intellectual stones at academic insiders, but only when the outsiders know more than the insiders. In this case, Taleb is just a crank. A few pages later he does qualify the quoted paragraph, but he shouldn’t have written it in the first place.

Elsewhere, Taleb writes that “The intellectual today is vastly more powerful and dangerous than before,” which I find flattering but also unlikely; I also suspect many if not most intellectuals would agree with that assessment, given how many of them write lamentations about their lack of influence.

There are moments like this one, which is fascinating: “Criticism, for a book, is a truthful, unfaked badge of attention, signaling that it is not boring: and boring is the only very bad thing for a book,” after which Taleb considers books we now admire that were banned or controversial, like Madame Bovary. But he doesn’t consider other highly criticized books that are now shunned for good reason, like Mein Kampf or The Protocols of the Elders of Zion. Nor does he consider books that are wrong yet important and should be relegated to the history of ideas, like Das Kapital, which still gets read and taken seriously in some academic precincts. All three are bad books for reasons that have been much discussed, and there is something else that can be very bad for a book: it can inspire people to steal from or hurt others, which is what all three encourage.

That’s four examples. And yet he also produces gems like “men of leisure become slaves to inner feelings of dissatisfaction and interests over which they have little control.” In many ways I am reminded of Camille Paglia, who also has much to say about the ruthlessness of nature, often mentions prostitutes, and often goes further than her evidence or ideas merit. Yet as far as I and Google know, very few others have observed the connection.

Let’s talk some more about the positive: one chapter in Antifragile, “Skin in the Game,” describes an especially important way to assess the world and its risks. Taleb quotes Hammurabi’s code:

If a builder builds a house and the house collapses and causes the death of the owner of the house—the builder shall be put to death. If it causes the death of the son of the owner of the house, a son of the builder shall be put to death. If it causes the death of a slave of the owner of the house—he shall give to the owner of the house a slave of equal value.

Someone who puts people at risk should be at equal risk themselves. CNN just published “Yemen says U.S. drone struck a wedding convoy, killing 14,” in which 14 people were murdered by the United States; the dead are probably classified internally as “collateral damage” or by some similarly Orwellian euphemism (although “U.S. officials declined to comment on the report,” because why does the truth matter, anyway?). Imagine that those who ordered or authorized the strike had their own husbands and wives and children killed in proportion to the number of civilians they killed. No one would order such strikes.

Nor would politicians authorize such strikes if the struck could vote, or if CNN and Fox News covered them for weeks or months at a time, as they would if something similar happened in the U.S. As Conor Friedersdorf correctly observed in The Atlantic, “If a Drone Strike Hit an American Wedding, We’d Ground Our Fleet.” But those who are launching the missiles have no skin in the game, to use Taleb’s favored phrase.

If someone does something wrong, what bad thing happens to them? If a doctor screws up badly enough, he at least gets sued. But if, say, a teacher’s students can’t read or do math, nothing bad happens to the teacher. Long before I read Taleb, I remember explaining to students that, if they emerge from my classes unable to write effectively, nothing bad happens to me—I’ve never heard of a U of Arizona T.A. being fired for incompetence. The bad things happen only to them, in that they won’t be as good at writing or reading as they otherwise would be. That seemed to be a revelation: I don’t think they’d considered how they, not their teachers, bear the risks of a lousy education. Those risks are even more pronounced at the high-school level, where most public school teachers are unionized and can’t really be fired.

Academia and government share the property of not having much skin in the game. As Taleb says, “An academic is not designed to remember his opinions because he doesn’t have anything at risk from them.” You often don’t get tenure for being right; you get tenure for publishing, regardless of whether what you’ve said is right.

In his books Taleb is weirdly reluctant to address global warming and climate change, which may be the ultimate nonlinear Black Swan system of our age—prone to sudden shocks that may have catastrophic results. There is a brief mention of the issue on page 415 of Antifragile, but he doesn’t discuss it in any detail. The collective response of the world to these dangers is to shrug, make minor changes, and hope for the best.

The danger is real, and yet almost no one does anything significant to mitigate it—though the risk of catastrophic change is extraordinarily high and the things that could be done to reduce it are relatively easy compared to what may come. In this section, Taleb implicitly endorses action when catastrophic risks are high, even when their probabilities are low:

which is more dangerous, to mistake a bear for a stone, or mistake a stone for a bear? It is hard for humans to make the first mistake; our intuitions make us overreact to the smallest probability of harm and fall for a certain class of false patterns—those who overreact upon seeing what may look like a bear have had a survival advantage, those who made the opposite mistake left the gene pool.

The metaphor is clear, yet he barely addresses, in either The Black Swan or Antifragile, the way we might be making the global climate extremely fragile through our use of fossil fuels. In the U.S., some of the obvious means of mitigating fossil-fuel usage, like building denser urban cores or switching toward nuclear power, are barely on the national agenda and, even when they are, can easily be blocked by short-sighted local NIMBYs. Nuclear power is particularly curious, since coal emissions kill far more people in the West than any form of nuclear power ever has. But coal kills people slowly, over time, and mostly invisibly; it never ends up in the news, while any problem with nuclear power sears the media’s collective eyes.

Overall, with climate change, we may be mistaking a bear for a stone, and we may collectively pay the price.

Since Taleb has cultivated an outsider’s persona and portrayed himself, often accurately, as a teller of truths no one else wants to hear, or whose logic others attack despite its accuracy, he seems to have decided that attacks on his logic or his perceived truths automatically make them correct. But that is a simple error in itself: the attacks are not sufficient to show that he is right, and any much-attacked thinker risks becoming so impervious to outside criticism that the work suffers. I suspect Taleb has crossed that line, and there may be an interesting psychological meta-narrative in how he moved from his initial outsider’s (but intellectually rigorous) position to a hybrid insider-outsider one, in which he no longer feels compelled to write as tightly and correctly as he did before fame (justifiably) found him.

Success breeds the danger of surrounding yourself with yes-men, but a good editor should be the opposite and should tell you hard truths you don’t necessarily want to hear. Taleb, one feels, is not the sort of guy who looks for constructive disagreement. Yet despite the mess, some of the book’s ideas are important, and if I were a philosopher I’d be more willing to excuse bad writing and dubious execution. Since I’m not, it’s hard to get past the many moments at which I went, “That’s not right” or “That doesn’t belong.” Still, Antifragile can’t be dismissed outright, because its best ideas are rarely discussed elsewhere; given the form they take here, I doubt they’ll get the attention they deserve.

Politics and pernicious expectations

Last month there was a long and mostly stupid discussion about “The Cheapest Generation,” and in it someone mentioned delaying having children because “everyone i know considers the world far too precarious to start a family.” I replied that, “By virtually every metric, the world is a much safer, healthier place than it used to be, as Steven Pinker observes in The Better Angels of our Nature.” Someone else replied, “Safer? Yes. Healthier? Yes. Stable? No. Between income/job volatility and lack of proper social safety nets (at least in the US), its a dangerous gamble to start a family unless you’re in the right situation.”

I don’t know what “the right situation” is, but I think that person underestimates just how hard most people have worked throughout history and how low their material expectations were—and, by contrast, how high ours are now. If you expect two cars and a large house with a room for each child and a spare room too, in a sweet coastal city like L.A., San Francisco, or Seattle, then yeah, things are tough (though much of that is because of land-use policy, not because of “intrinsic” home prices). If you have lower expectations, however, it’s possible to move to Texas and buy a $140,000 house that maybe isn’t in the world’s best location but is okay and has room for kids, if not tons of extra space. The “right” situation is much cheaper.

Most of the commentariat, however, is looking at NYC / L.A. / Seattle / etc., and wants the “best” schools, and wants a BMW, and interesting vacations to foreign countries, and, and, and… all those things add up. If you radically scale back expectations, a lot of things become more possible. If you realize what people used to expect, your expectations might change too. My grandparents barely escaped the Holocaust and, according to family lore, never really made it into the American middle class. Tales of living in Minneapolis without heat in the winter were and are common.

Along similar lines, Megan McArdle tells this story:

My grandfather worked as a grocery boy until he was 26, in the depths of the Great Depression. For six years, he supported a wife on that salary — and no, it’s not because You Used To Be Able To Support A Family On A Grocery Boy’s Wages Until These Republicans Ruined Everything. He and my grandmother moved into a room in his parents’ home, cut a hole through the wall for their stovepipe and set up housekeeping. They got married on Thanksgiving, because that was the only day he could get off. My grandmother spent six years carefully piecing his tiny salary into envelopes — so much for food, for rent, for gasoline for the car he needed to get eight miles into town. And they stayed married for 67 years, until my grandfather’s death in 2004.

“We didn’t have a dramatic increase in unwed childbearing back in the Great Depression,” sociologist Brad Wilcox told me. “That’s in part because we had a very different understanding of family life and sex and marriage back then. That tells us that it’s not just economic. It’s also about culture and law.”

By modern standards that sounds really crappy, but McArdle’s grandparents managed to have kids and be more-or-less okay in material conditions that would strike most contemporary Americans as a level of shocking privation. Yet the commenter above mentions “income/job volatility” without noting that, in many circumstances, we have very high incomes—we just choose to spend them instead of saving them (note that I’m guilty of this too and am not throwing stones from outside my own glass house). In an environment of low, zero, or highly uneven growth, that may be a tremendous problem, and the problem has both individual and political components—and responses.

To return to McArdle, this time in “How to Put the Brakes on Consumers’ Debt:”

[. . .] this is a conflict between what Walter Russell Mead calls the blue social model and the red-state world where Ramsey lives and finds most of his listeners. We can quibble about this or that [. . .] But what it boils down to is that Olen thinks that rising economic insecurity calls for a massive expansion of the blue social model, while Ramsey thinks it calls for getting more entrepreneurial and adjusting your lifestyle to meet reduced income expectations. How well you think this works is probably closely connected to where you live.

(Emphasis added)

The whole piece is worth reading, but I think the debate between the two gurus McArdle cites is really about a large-scale public response to current conditions versus how a particular individual or family should respond: Olen is arguing politics; Ramsey is arguing the personal. I also think we’re going to see this norm change: “But for blue-state professionals, that’s something close to suggesting that they should abandon their kids in the street (or have them take out $150,000 in student loans, which is not much better). The social norm is that you send your kid to the best college he or she can get into, by any means necessary”—because, for one thing, I’m not convinced any school is worth $150,000 in student loans. Maybe one could make that argument for the very, very elite schools, but not many others.

The problem with arguing for a political response is that most individuals can’t do much on their own to change policy. But they can decide to say, “No, I can’t afford that house or vacation or dinner or whatever.” I also suspect that very few individuals have any coherent idea of how public policies work, as Bryan Caplan’s The Myth of the Rational Voter shows. My own pet peeve is urban land-use controls, which raise prices by preventing new development, yet very few people connect high prices with supply limits. That’s basic econ—sketched below—but almost no one acknowledges it. If we collectively can’t even understand that, I’m not optimistic about Olen-style political solutions, which mostly seem to boil down to taking a lot of money and dumping it into systems and institutions that aren’t necessarily working all that well.
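Here’s the back-of-the-envelope version, with made-up linear supply and demand curves (nothing is calibrated to any real market; the numbers only exist to show the mechanism):

```python
# Invented linear curves, for illustration only.
def demand_price(q):  # what buyers will pay when q units exist
    return 1000 - 2 * q

def supply_price(q):  # what it costs to bring the q-th unit to market
    return 100 + 1 * q

# Unconstrained equilibrium: demand_price(q) == supply_price(q).
q_star = (1000 - 100) / (2 + 1)   # 300 units
print(demand_price(q_star))       # market price: 400

# A zoning cap below the equilibrium quantity: price is now set by what
# buyers will pay for the scarce units, not by what they cost to build.
q_cap = 200
print(demand_price(q_cap))        # capped price: 600
```

Same buyers, same construction costs; the only change is the supply limit, and the price jumps by half.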

Education is one of those systems, and I suspect a basic idea is taking hold: a lot of higher education has become increasingly exploitative over time, with student loans fueling the binge. There is very little incentive for most institutions to say, “We’re going to forget about diversity staff and counseling staff and subsidized dorms; we’re just going to provide professors, classrooms, and labs, and you buy the rest, unsubsidized, if you want it.” Still, we’re going to see some non-elite schools go for radical cost reductions, and we’ll see whether people go for them. If the price is good enough they might.

In the meantime, lots of people are moving to places like Texas, where land-use controls are loose, and, on average, they’re leaving places like New York and California. For people with medium or low incomes and high material expectations, that totally makes sense. Historically speaking, it’s easy to forget how low material expectations used to be: until recently, houses were shockingly smaller than they are today. In 1975 the median new home was 1,535 square feet; now it’s 2,169 square feet—a roughly 40 percent increase—even as family size and children per woman have declined.

To return to the original point in the first paragraph of this post, job volatility might not matter so much for someone who decides to live in a 1,535 square foot house instead of a 2,169 square foot house, and that basic dynamic can be extrapolated across a range of purchases. Most of us, however, ask ourselves, “Why not take the Vicuna?” Then we complain when the world doesn’t conform to our material expectations.

What makes a person special: Name of the Rose edition

“But there is no precise rule: it depends on the individuals, on the circumstances. This holds true also for the secular lords. Sometimes the city magistrates encourage the heretics to translate the Gospel into the vernacular: the vernacular by now is the language of the cities, Latin the language of Rome and the monasteries. And sometimes the magistrates support the Waldensians, because they declare that all, men and women, lowly and mighty, can teach and preach, and the worker who is a disciple after ten days hunts for another whose teacher he can become.”
“And so they eliminate the distinction that makes clerics irreplaceable!”

That’s from Umberto Eco’s The Name of the Rose, and we can see a similar situation happening now among many professional, privileged, and credentialed classes: with the Internet, the cost of being able to “teach and preach” goes down; anyone motivated can learn, or start to learn, almost anything, and anyone inclined to teach can start writing or making videos on whatever topic they believe themselves expert in. The key, of course, is motivation, which is in scant supply now and probably always will be.

Whether the existing power structures want to encourage self-learning, like many of the “secular lords” and “city magistrates,” or want to preserve existing institutions, depends on the person speaking and their aims. But “the distinction that makes clerics irreplaceable” is similar to the one that makes professors or other professional teachers irreplaceable, and it matters less than the knowledge and skill underlying it. Some people with the distinction are not very good at their jobs, and some without it are incredibly skilled. Those lines are blurring—slowly, to be sure. The language of knowledge is spreading. The issue of credentialing remains, but the number of jobs in which work product is a better examination than formal credentials is probably growing.

Does the average software startup want a famous degree, or an extensive GitHub repository? Right now I’m sifting through freelance fiction editors, and I’ve asked zero of them where they got their degrees or whether they have any; I’m very interested in their sample edits and the other novels they’ve edited. Clients almost never ask Seliger + Associates about formal degrees—they want to know if we can get the job done.

In writing this post, I am also conforming to the second of Umberto Eco’s “three ways” of reading The Name of the Rose:

The first category of readers will be taken by the plot and the coups de scène, and will accept even the long bookish discussions and the philosophical dialogues, because it will sense that the signs, the traces and the revelatory symptoms are nesting precisely in those inattentive pages. The second category will be impassioned by the debate of ideas, and will attempt to establish connections (which the author refuses to authorize) with the present. The third will realize that this text is a textile of other texts, a ‘whodunnit’ of quotations, a book built of books.

Eco published this novel in 1980, around the dawn of the personal-computer age and long before the consumer Internet. Whatever connections existed in the 1970s between The Name of the Rose and that era—the ones Eco presumably had in mind, whatever his view of authorization—are not the ones I most notice. That the novel’s correspondences can grow and change with the decades is what makes it so powerful and deep. Few works of art transcend their immediate context. This one does: it deals with the eternities much more than the news, though its author has demonstrated in his essays an interest in the daily news.

If someone had told me before I read The Name of the Rose that a novel set in 1327 and utterly enmeshed in the recondite politics of Christianity would become one of my favorite novels, I would’ve scoffed. Religion as a subject is of little interest to me, except in a meta sense. But sufficiently great novels transcend their context, even as they adapt the language, rhetoric, and world of that context. As Eco’s third category of reader indicates, the novel is composed of many other novels, books, articles, and speeches. He has, it seems, 800 years of literary history composted into a single work. Few novels contain so much, and fewer still combine it with an actual plot.