Are you more than a consumer? “The Once and Future Liberalism” and some answers

This is one of the most insightful things I’ve read about an unattractive feature of American society: we put an “emphasis on consumption rather than production as the defining characteristic of the good life.” It’s from “Beyond Blue 6: The Great Divorce,” where, in Walter Russell Mead’s reading, “Americans increasingly defined themselves by what they bought rather than what they did, and this shift of emphasis proved deeply damaging over time.” I’m not convinced this has happened equally for everybody, all the time, but it rings awfully true.

Which brings us back to the point made in the title: are you producing more than you consume? Are you focused on making things, broadly imagined, instead of “consuming” them? Is there more to your identity than the music you like and the clothes you wear? (“More” might mean things you know, or know how to do, or know how to make.) Can you do something or somethings few others can? If the answers are “no,” you might be feeling the malaise Mead is describing. In Anything You Want, Derek Sivers writes:

When you want to learn how to do something yourself, most people won’t understand. They’ll assume the only reason we do anything is to get it done, and doing it yourself is not the most efficient way.

But that’s forgetting about the joy of learning and doing.

If you never learn to do anything yourself—or anything beyond extremely basic tasks everyone else knows—you’re not going to lead a very satisfying life. Almost as bad, you probably won’t know it. You’ll only have that gnawing feeling you can’t name, a feeling that’s easy—too easy—to ignore most of the time. You can’t do everything yourself, and it would be madness to try. But you should be thinking about expanding what you can do. I’ve made a conscious effort to resist being defined by what I buy rather than what I do, and that effort has intensified since I read Paul Graham’s essay “Stuff”; notice especially where he says, “Because the people whose job is to sell you stuff are really, really good at it. The average 25 year old is no match for companies that have spent years figuring out how to get you to spend money on stuff. They make the experience of buying stuff so pleasant that ‘shopping’ becomes a leisure activity.” To me it’s primarily tedious.

But this tedious activity is everywhere, and in Spent: Sex, Evolution, and Consumer Behavior, Geoffrey Miller describes how companies and advertisers have worked to exploit evolved human systems for mating and status in order to convince you that you need stuff. Really, as he points out, you don’t: five minutes of conversation does more signaling than almost all the stuff in the world. Still, I don’t really take a moral view of shopping, in that I don’t think disliking shopping somehow makes me more virtuous than someone who does like shopping, but I do think the emphasis on consumption is a dangerous one for people’s mental health and well-being. And I wonder if these issues are also linked to larger ones.

A lot of us are suffering from an existential crisis and a search for meaning in a complex world that often appears to lack it. You can see evidence in the Western world’s high suicide rates, in Viktor Frankl’s book Man’s Search for Meaning (he says, “I do not at all see in the bestseller status of my book so much an achievement and accomplishment on my part as an expression of the misery of our time: if hundreds of thousands of people reach out for a book whose very title promises to deal with the question of a meaning to life, it must be a question that burns under the fingernails”), in Irvin Yalom’s Existential Psychotherapy (especially the chapter on despair), in Mihaly Csikszentmihalyi’s Flow: The Psychology of Optimal Experience, in All Things Shining: Reading the Western Classics to Find Meaning in a Secular Age, in The Joy of Secularism: 11 Essays for How We Live Now, in the work of Michel Houellebecq. I could keep going. The question isn’t merely about the number of responses to present conditions, but about what those present conditions are, how they came about, what they say about contemporary politics (Mead makes the political connection explicit in “The Once and Future Liberalism: We need to get beyond the dysfunctional and outdated ideas of 20th-century liberalism”), and what they say about how the individual should respond.

People respond in all kinds of ways. Despair is one. Fanaticism, whether towards sports teams or political parties or organized religion, is another, with religion being especially popular. You can retreat to religious belief, but most dogmatic religious beliefs are grounded in pre-modern beliefs and rituals, and too many religions are surrounded by fools (did Heinlein say, “It’s not God I have a problem with, it’s his fan club”? Google yields many variations). Those kinds of answers don’t look very good, at least to me. You have to look harder.

I think part of the answer has to lie in temperament, attitude, and finding a way to be more than a consumer. For a very long time, people had to produce a lot of what they consumed—including their music, food, and ideas. I don’t want to lapse into foolish romanticism about the pre-modern, pre-specialized world, since such a world would be impossible to recreate and ugly if we did. People conveniently forget about starvation and warfare when they discuss the distant past. Plus, specialization has too many benefits—like the iMac I’m looking at, the chair I’m sitting in, the program I’m using to write this, the tasty takeout I can order if I want it, the tea in my kitchen, the condoms in my bedroom, or the camera on my tripod. For all its virtues, though, I’m increasingly convinced that specialization has psychic costs that few of us are really confronting, even if many of us feel them, and those costs bear on how we relate to meaning and work.

According to Mead, in the 19th Century, families “didn’t just play together and watch TV together; they worked together to feed and clothe themselves.” Today, disparate activities drive specialization even within the family, and family life has become an increasingly consumption- and status-oriented experience. To Mead, “If we wonder why marriage isn’t as healthy today in many cases, one reason is surely that the increasing separation of the family from the vital currents of economic and social life dramatically reduces the importance of the bond to both spouses – and to the kids.” We’ve gotten wealthier as a society, and wealth enables us to make different kinds of choices. Marriage is much more of a consumer good: we choose it, rather than being forced into it because the alternative is a distressing loss of resources. Charles Murray observes some effects this has on marriage in Coming Apart: The State of White America, 1960-2010, since getting and staying married has enormous positive effects on income—even if “the vital currents of economic and social life” conspire to make spouses less dependent on each other.

Kids are less economically useful and simultaneously more dependent on their parents. It also means they’re separated from the real world for a very long time. To Mead, part of this is education:

As the educational system grew more complex and elaborate (without necessarily teaching some of the kids trapped in it very much) and as natural opportunities for appropriate work diminished, more and more young people spent the first twenty plus years of their lives with little or no serious exposure to the world of work.

It starts early, this emphasis on dubious education and the elimination of “natural opportunities for appropriate work”:

Historically, young people defined themselves and gained status by contributing to the work of their family or community. Childhood and adulthood tended to blend together more than they do now. [. . .] The process of maturation – and of partner-seeking – took place in a context informed by active work and cooperation.

In the absence of any meaningful connection to the world of work and production, many young people today develop identities through consumption and leisure activities alone. You are less what you do and make than what you buy and have: what music you listen to, what clothes you wear, what games you play, where you hang out and so forth. These are stunted, disempowering identities for the most part and tend to prolong adolescence in unhelpful ways. They contribute to some very stupid decisions and self-defeating attitudes. Young people often spend a quarter century primarily as critics of a life they know very little about: as consumers they feel powerful and secure, but production frightens and confuses them.

I’m familiar with those “stunted, disempowering identities” because I had one for a long time. Most teenagers don’t spend their adolescence becoming expert hackers, like Mark Zuckerberg or Bill Gates, and they don’t spend it becoming expert musicians either. They spend their adolescences alienated.

I’m quoting so many long passages from Mead because they’re essential, not incidental, to understanding what’s going on. The result of an “absence of any meaningful connection to the world of work and production” is Lord of the Flies meets teen drama TV and movies. Paul Graham gets this; in one of my favorite passages from “Why Nerds Are Unpopular,” he writes:

Teenage kids used to have a more active role in society. In pre-industrial times, they were all apprentices of one sort or another, whether in shops or on farms or even on warships. They weren’t left to create their own societies. They were junior members of adult societies.

Teenagers seem to have respected adults more then, because the adults were the visible experts in the skills they were trying to learn. Now most kids have little idea what their parents do in their distant offices, and see no connection (indeed, there is precious little) between schoolwork and the work they’ll do as adults.

And if teenagers respected adults more, adults also had more use for teenagers. After a couple years’ training, an apprentice could be a real help. Even the newest apprentice could be made to carry messages or sweep the workshop.

Now adults have no immediate use for teenagers. They would be in the way in an office. So they drop them off at school on their way to work, much as they might drop the dog off at a kennel if they were going away for the weekend.

What happened? We’re up against a hard one here. The cause of this problem is the same as the cause of so many present ills: specialization. As jobs become more specialized, we have to train longer for them. Kids in pre-industrial times started working at about 14 at the latest; kids on farms, where most people lived, began far earlier. Now kids who go to college don’t start working full-time till 21 or 22. With some degrees, like MDs and PhDs, you may not finish your training till 30.

But “school” is so often bad that 30% of teenagers drop out—against their own economic self-interest. Only about a third of people in their twenties have graduated from college. What gives? Part of it must be information asymmetry: teenagers don’t realize how important school is. But the other part of the problem is what Graham describes: how dull school seems, and how disconnected it is from what most people eventually do. And that disconnection is real.

So, instead of finding connections to skills and making things, teenagers pick up status cues from music and other forms of professionally produced entertainment. Last year, I was on a train from Boston to New York and sat near a pair of 15-year-olds. We talked a bit, and one almost immediately asked me what kind of music I liked. The question struck me because it had been so long since I’d been asked it so early in a conversation with a stranger. In high school and early college, I was asked it all the time: high school-aged people sort themselves into tribes and evaluate others based on music. In college, the first question is, “What’s your major?”, and in the real world it’s, “What do you do?” The way people ask those early questions reveals a lot about the asker’s underlying assumptions.

Now: I like music as much as the next guy, but after high school I stopped using it to sort people. Why should high school students identify themselves primarily based on music, as opposed to some other metric? It’s probably because they have nothing better to signal who they are than music. It would make sense to discuss music if you are a musician or a genuine music aficionado, but I wasn’t one and most of the people I knew weren’t either. Yet the “What’s your favorite music?” question always arose. Now, among adults, it’s more often “What do you do?”, which seems to me an improvement, especially given its proximity to the questions, “What can you do?” and “What do you know?”

But that’s not a very important question for most high school students. They aren’t doing anything hard enough that errors matter. And in some ways, mistakes don’t matter much in most modern walks of life: they don’t cause people to die, or to really live, or to do things differently. So finding a niche where mistakes do matter—as they do when you run your own business, or in certain parts of the military, or in some parts of medicine, or as an individual artist accountable to fans—can lead to a fuller, more intensely lived life. But that requires getting off the standard path. Few of us have the energy to bother. Instead, we feel underutilized, with the best parts of ourselves rusting from disuse, or perhaps gone altogether, because we never tried to develop them. That might explain, almost as much as my desire to tell stories, why I spend so much time writing fiction that, as of this writing, has mostly been fodder for agents and friends, and why I persist in the face of indifference.

Individuals have to learn to want something more than idle consumption. They have to want to become artists, or hackers, or to change the world, or to make things, all of which are facets of the same central application of human creativity (to me, the art / science divide is bullshit for similar reasons). For much of the 20th Century, we haven’t found “something” in work:

Since work itself was so unrewarding for so many, satisfaction came from getting paid and being able to enjoy your free time in the car or the boat that you bought with your pay. It was a better deal than most people have gotten through history, but the loss of autonomy and engagement in work was a cost, and over time it took a greater and greater toll.

A friend once told me about why he left a high-paying government engineering job for the hazards and debts of law school: at his engineering job, everyone aspired to a boat or a bigger TV. Conversations revolved around what people had bought or were planning to buy. No one thought about ideas, or anything beyond consumption. So he quit to find a place where people did. I mean, who cares that you buy a boat? Maybe it makes getting laid marginally easier, at least for guys, but that time, money, and energy would probably be better spent going out and meeting people, rather than acquiring material objects.

I’ve seen people who have virtually no money be extraordinarily happy and extraordinarily successful with the sex of their choice, and people in the exact opposite condition. The people with no money and lots of sex tend to get that way because of their personalities and their ability to be vibrant (again: see Miller’s book Spent). Even if you’re bad at being vibrant, you can learn to be better: The Game is, at bottom, about how to be vibrant for straight men, and the many women’s magazines (like Cosmo) are, at bottom, about how to be vibrant for women. Neither, unfortunately, really teaches one to be tolerant of other people’s faults, which might be the most important thing in the game of sex, but perhaps that comes through in other venues.

I don’t wish to deify Mead or his argument; when he says, “There was none of the healthy interaction with nature that a farmer has,” I think he’s missing how exhausting farming was, how close farmers were to starvation for much of agricultural history, and how nasty nature is when you’re not protected from it by modern amenities (we only started to admire nature in the late eighteenth century, when it stopped being so dangerous to city dwellers). It’s easy to romanticize farming when we don’t have to do it. Likewise, Mead says:

A consumption-centered society is ultimately a hollow society. It makes people rich in stuff but poor in soul. In its worst aspects, consumer society is a society of bored couch potatoes seeking artificial stimulus and excitement.

But I have no idea what he means by “poor in soul.” Are Mark Zuckerberg or Bill Gates “poor in soul”? Is Stephen King? Tucker Max? I would guess not, even though all four are “rich in stuff.” We’ve also been “a consumption-centered society” for much of the 20th century, if not earlier, and, all other things being equal, I’d rather have the right stuff than no stuff, even if the mindless acquisition of stuff is a growing hazard. The solution might be the mindful acquisition of stuff, but even that is hard and takes a certain amount of discipline, especially given how good advertisers are at selling. I would also include politicians among advertisers these days.

Contemporary politics are (mostly) inane, for the structural reasons Bryan Caplan describes in The Myth of the Rational Voter. So I’m predisposed to like explanations along these lines:

Nobody has a real answer for the restructuring of manufacturing and the loss of jobs to automation and outsourcing. As long as we are stuck with the current structures, nobody can provide the growing levels of medical and educational services we want without bankrupting the country. Neither “liberals” nor “conservatives” can end the generation-long stagnation in the wage level of ordinary American families. Neither can stop the accelerating erosion of the fiscal strength of our governments at all levels without disastrous reductions in the benefits and services on which many Americans depend.

Most people on the right and the left have “answers” about contemporary problems that miss large aspects of those problems or the inherent trade-offs involved. A lot of the debate that does occur is dumb, sometimes militantly and sometimes inadvertently, but dumb nonetheless. As Mead says: “We must come to terms with the fact that the debate we have been having over these issues for the past several decades has been unproductive. We’re not in a ‘tastes great’ versus ‘less filling’ situation; we need an entirely new brew.” Yet we’re getting variations on old brews, in which liberals look like conservatives in their defense of 1930s-era policies, and conservatives look like conservatives in their veneration of 19th century-style free-market policies. Only a few commentators, like Tyler Cowen in The Great Stagnation, even try earnestly to identify real problems and discuss those problems in non-partisan terms.

This post started as a pair of links, but it ended in an essay because Mead’s essays are so important in the way they get at an essential aspect of contemporary life. If you’re a writer, you can’t afford to ignore what’s happening on the ground, unless you want to be, at best, irrelevant. And I wonder if one reason nonfiction may be outpacing fiction in the race for importance is that nonfiction sidesteps questions of meaning by focusing on real things with real effects, rather than on how people can’t or won’t find meaning in a world where most of us succeed, at least materially, by following a conventional path.

Naturally, I also think about this in the context of fiction. A while ago, I wrote this to a friend: “Too much fiction is just about dumb people with dumb problems doing dumb things that the application of some minor amount of logic would solve. Bored with life because you’re a vaguely artistic hipster? Get a real job, or learn some science, or be a real artist, or do something meaningful. The world is full of unmet needs and probably always will be. But so many characters wander around protected by their own little bubbles. Get out! The world is a big place.” Mead, I think, would agree.

It’s hard to disentangle the individual, education, acquisition, ideas, society, and politics. I’ve somewhat conflated them in my analysis above because one inevitably leads to another: talking about how you as a person should respond leads to questions about how you were educated, education as a mass process leads to society, and so forth. But I, as an individual, can’t really change the larger systems in which I’m embedded, though I can do a limited amount to observe how those systems work and how I respond to them (which often entails writing like this and linking to other writers).

Why professors don’t bother

When I was an undergrad, I noticed that professors were often reluctant to deeply engage with students; when I got students of my own, I realized why and wrote “How to get your Professors’ Attention — along with Coaching or Mentoring” to explain it. Since then, I’ve noticed one other facet of this general phenomenon: when I do engage, or spend a lot of time offering advice or guidance, students often ignore it—making me feel like I wasted my time. Paul Graham’s footnote in A Word to the Resourceful catalyzed this realization for me:

My feeling with the bad groups [of tech startup founders from Y Combinator] is that coming into office hours, they’ve already decided what they’re going to do and everything I say is being put through an internal process in their heads, which either desperately tries to munge what I’ve said into something that conforms with their decision or just outright dismisses it and creates a rationalization for doing so. They may not even be conscious of this process but that’s what I think is happening when you say something to bad groups and they have that glazed over look. I don’t think it’s confusion or lack of understanding per se, it’s this internal process at work.

This happens with students too. A few weeks ago a former student wrote to me about career choices and whether she should major in biochem or English; she started with biochem but struggled (which isn’t at all unusual in science classes). A friend majored in biochem, so together we wrote a thorough response that turned into an essay called “How to think about science, becoming a scientist, and life” that should go up soon. After spending a couple hours detailing an array of issues, we sent the e-mail, and I got back a response saying . . . she’s going to go to law school and “become a judge.”

So all of the considered reasoning and description and discussion was merely “put through an internal process in” her head. (She’s not the only student to have done this, but she’s merely the most recent example.) Reading her response was painful because she has no ability to understand what being a lawyer or judge is actually like and no ability to project what she’s going to feel like or want in a couple of years, let alone ten, let alone twenty. She’s not alone in this: most people can’t anticipate what they’ll want in the future, and most of us can’t even remember what we were like in the past; we tend to imagine ourselves always having been more or less as we are now. That’s one of Daniel Gilbert’s remarkable insights in Stumbling on Happiness.

Now, I might be overwrought about this, and I might be wrong; one commenter said:

I’m not saying your student didn’t have a pre-filter as you describe. On the other hand, you may have been just one source of advice for your student. Asking for advice doesn’t mean that taking it is always the best course, it’s information to be weighed against all other advice and information.

This is certainly true, but I haven’t gotten the sense that most students are doing this. My sense is that most are trying “to munge what I’ve said into something that conforms with their decisions,” or they just “outright dismiss it and create a rationalization for doing so.” The worst part isn’t even that they’re doing so: the worst part is that they’re probably not even aware they’re doing it.

(Observing this phenomenon also makes me wonder about how much I listened when I was an undergrad or just out of college; I may have been no better than the student I’m describing above.)

There’s a second reason why I suspect professors don’t bother and build intellectual moats, and it relates to “25 Things I Learned From Opening a Bookstore;” someone in a Hacker News thread about it said, “Turns out mild loathing towards users isn’t unique to software.”

I suspect that, in retailing, 95% of the customers are fine, but that last 5% take up a disproportionate amount of time and mental energy, whether because they’re clueless or morons or mean or whatever. That’s how I think jaded teachers and professors develop: most students are okay, but that small percentage of “story” students create all kinds of artificial barriers and special exceptions and so on that wear the teacher or professor down. (I won’t defend the exact percentages of 95 and 5 in teaching, but I will say that the vast majority of students are okay and thus not terribly memorable, while the bad ones or the jerks are entirely too easy to recall.)

One jerk makes a vastly larger impression than twenty nice students, customers, or waiters. The jerk sticks in your mind as an example, and the more you build defenses against the jerk, the worse you’re going to react to the average, reasonable student, customer, or waiter, because you’re calibrating your defaults to dealing with the tiny minority who are jerks or irrational or irrationally demanding, when you should try to ignore those experiences with the jerk minority. If you don’t, you’re going to be overly brusque or defensive, corroding the quality of your teaching, selling, or life. The rules you make to deal with the jerks also apply to the normal, pleasant students or customers. Paul Graham discusses this at the scale of companies in The Other Half of “Artists Ship”:

The gradual accumulation of checks in an organization is a kind of learning, based on disasters that have happened to it or others like it. After giving a contract to a supplier who goes bankrupt and fails to deliver, for example, a company might require all suppliers to prove they’re solvent before submitting bids.

As companies grow they invariably get more such checks, either in response to disasters they’ve suffered, or (probably more often) by hiring people from bigger companies who bring with them customs for protecting against new types of disasters.

It’s natural for organizations to learn from mistakes. The problem is, people who propose new checks almost never consider that the check itself has a cost.

Over time, businesses and governments accrete rules designed to prevent mistakes, but those rules themselves can eventually become so onerous that they stifle legitimately good ideas. As professors or other people with power and knowledge begin building defenses based on the 5%, a lot of the 95% are harmed too—which is unfortunate. I’m also not sure there’s anything that can be done about this at the institutional level, because the incentives point to the value of building a moat. But by reminding individuals of the cost of the moat, and implicitly telling students how to get over it, perhaps a few people will have a better overall experience.

EDIT: Here’s Graham on funding startups: “The reason we want to fund the most successful founders is that they’re the most fun to work with. It’s exhausting trying to pep up founders who aren’t really cut out for startups, whereas talking to the best founders is net energizing.” Replace “founder” with “student” and “startup” with your field, and the same thing applies. So if you’re a student, you want to at least look, and ideally be, energetic and resourceful.


It’s natural for organizations to learn from mistakes. The problem is, people who propose new checks almost never consider that the check itself has a cost.

Over time, business and government accrete rules designed to prevent mistakes, but those rules themselves can eventually become so onerous that they stifle legitimately good ideas. As professors or other people with power and knowledge begin building defenses based on the 5%, a lot of the 95% are harmed too—which is unfortunate. I’m also not sure there’s anything that can be done about this at the institutional level, because the incentives point to the value of building a moat. But by reminding individuals of the cost of the moat, and implicitly telling students how to get over it, perhaps a few people will have a better overall experience.

EDIT: Here’s Graham on funding startups: “The reason we want to fund the most successful founders is that they’re the most fun to work with. It’s exhausting trying to pep up founders who aren’t really cut out for startups, whereas talking to the best founders is net energizing.” Replace “founder” with “student” and “startup” with your field, and the same thing applies. So if you’re a student, you want to at least look, and ideally be, energetic and resourceful.

Paul Graham and not being as right as he could be in “The Age of the Essay”

Paul Graham often challenges people who say that he’s wrong to cite a particular sentence that is untrue; see, for example, this: “Can you give an example of something I said that you think is false?” Elsewhere, although I can’t find a link at the moment, he says that most people who say he’s said something wrong aren’t actually referring to something he’s said, but to something they think he’s said, or imagine he might say. Hence my italicization of “something I said:” Internet denizens often extrapolate from or simplify his often nuanced positions in an attempt to pin ideas to him that he hasn’t explicitly endorsed. So I’m going to try not to do that, but I will nonetheless look at some of what he’s said about writing and writing education and describe some of my attempts to put his implied criticisms into action.

While I think Graham is right the vast majority of the time, I also think he’s off the mark regarding some of his comments about how writing is taught in schools. I wouldn’t call him wrong, exactly, but I would say that trying some of the things he suggests or implicitly suggests hasn’t worked out nearly as well as I’d hoped, especially when applied to full classrooms of students drawn from a wide spectrum of ability and interest.

I’ve long been bothered by the way writing and related subjects are taught in school. They’re made so boring and lifeless most of the time. Part of the problem, and perhaps the largest part, is the teachers. I’ve spent a lot of time contemplating how to improve the writing class experience. Some of that effort appears to be paying off: a surprisingly large number of students will say, either to me directly or in their evaluations, that they usually hate English classes but really like this one. Yes, I’m sure some are sucking up, but I don’t care about sucking up and suspect students can detect as much. I really care about what happens on their papers. But some of my experiments haven’t worked, and I’ll talk about them here.

In “The Age of the Essay,” Graham starts:

Remember the essays you had to write in high school? Topic sentence, introductory paragraph, supporting paragraphs, conclusion. The conclusion being, say, that Ahab in Moby Dick was a Christ-like figure.

Oy. So I’m going to try to give the other side of the story: what an essay really is, and how you write one. Or at least, how I write one.

Graham doesn’t say so explicitly, but the implication of “the other side of the story” and “what an essay really is” is that essay writing in school should be more like real essay writing. To some extent he’s right, but trying to make school essay writing like real essay writing doesn’t yield the kinds of results I’d hoped for. True, Graham hasn’t directly said that school writing should be more like real writing, but it’s an obvious inference from this and other sections of “The Age of the Essay,” which I’ll discuss further below. He also does a lot with the word “Oy:” it expresses skepticism and distaste wrapped in one little word.

The way Graham puts it, writing a school essay sounds pretty bad; concluding “that Ahab in Moby Dick was a Christ-like figure” in a pre-structured essay is tedious, if for no other reason than because a million other students and a much smaller number of teachers and professors have already concluded or been forced to conclude the same thing. I think that a) teaching literature can be a much better experience and still serves some institutional purposes, and b) teaching writing in the context of other subjects might not be any better.

Passion and interest

Graham:

The most obvious difference between real essays and the things one has to write in school is that real essays are not exclusively about English literature. Certainly schools should teach students how to write. But due to a series of historical accidents the teaching of writing has gotten mixed together with the study of literature. And so all over the country students are writing not about how a baseball team with a small budget might compete with the Yankees, or the role of color in fashion, or what constitutes a good dessert, but about symbolism in Dickens.

I’d love to get well-developed essays on baseball, economics, and fashion. But most students either don’t appear to have the kind of passion that would be necessary to write such essays or don’t appear able to express it. Alternately, they have passion but not the knowledge behind it: someone who’d read Moneyball and other baseball research could put together this kind of essay, but almost no students have, and even those with the passion don’t have much knowledge behind it. I’ve been implicitly testing this theory for the past three and a half years: on my assignment sheets, I always include a line telling students they can write on “a book or subject of your own choosing. If you write on a book or idea of your own, you must clear your selection with me first.” Almost none exercise this choice.

Now, one could argue that students have been brainwashed by 12 years of school by the time I’ve got them, and to some extent that’s probably true. But if a student were really, deeply interested in a subject, I think she’d be willing to say, “Hey, what if I mostly write about the role of imagination among physicists,” and I’d probably say yes. This just doesn’t happen often.

I think it doesn’t happen because students don’t know where to start, and they aren’t skilled enough to closely read a book or even an article on their own. They don’t know how to compare and contrast passages well—the very thing I’m doing here. So I could assign a book about baseball and work through the “close reading” practice in class, but most people aren’t that interested in the subject, and then the people interested in fashion or math will be left out (and most students who say they’re “interested in fashion” appear to mean they skim Cosmo and Vogue).

If you’re going to write about a big, somewhat vague idea, like money in baseball, you need a lot more knowledge and many more sources than you do to write about “symbolism in Dickens.” Novels and stories have the advantage of being self-contained. That’s part of what got the New Criticism technique of “close reading” so ingrained in schools: you could give students 1984 and rely on the text itself to argue about the text. This has always been a bit of a joke, of course, because knowing about the lead-up to World War II and the beginnings of the Cold War will give a lot of contextual information about 1984, but one can still read the novel and analyze it on its own terms more easily than one can analyze more fact-based material. So a lot of teachers rely on closely reading novels, which I’ll come back to in a bit.

There may be more to the story of why students are writing about 1984 and not “what constitutes a good dessert” beyond “a series of historical accidents.” Those accidents are part of the story, but not all.

Amateurs and experts

What’s appropriate for amateurs may not be appropriate for experts; Daniel Willingham makes this point at length in his book Why Don’t Students Like School: A Cognitive Scientist Answers Questions About How the Mind Works and What It Means for the Classroom; he says that “Cognition early in training is fundamentally different from cognition late in training” and, furthermore, “[. . .] years of practice make a qualitative, not quantitative, difference in the way [scientists, artists, and others] think compared to how a well-informed amateur thinks.” We don’t get there right away: “Experts don’t think in terms of surface features, as novices do; they think in terms of functions, or deep structure.” It takes years of dedicated practice to become an expert, and ten often appears to be the number: “There’s nothing magical about a decade; it just seems to take that long to learn the background knowledge to develop” what one really needs to do the new, interesting, creative work that defines an expert.

Graham is an expert writer. He, like other expert writers, can write differently than amateurs and still produce excellent work. Novice writers usually can’t write effectively without a main point of some sort in mind. I couldn’t, either, when I was a novice (though I tried). Graham says:

The other big difference between a real essay and the things they make you write in school is that a real essay doesn’t take a position and then defend it. That principle, like the idea that we ought to be writing about literature, turns out to be another intellectual hangover of long forgotten origins.

He’s right in the sense that real essays don’t have to take a position and defend it, but teachers insist on thesis statements for the same reason bikes for three-year olds have training wheels: otherwise the student-writer will fall over. If you don’t get students to take a position, you’ll get—maybe—summarization. If you don’t ask for and emphasize thesis statements, which are basically the position to be defended, you’ll get wishy-washy essays that don’t really say much of anything. And it’s not that they don’t say much of anything because they’re trying to explore a complex problem: they don’t say much of anything because the writer doesn’t have anything to say, or is afraid of saying anything, or doesn’t know how to explore a problem space. If you want an academic-ized version of what essays are, Wolfgang Holdheim says in The Hermeneutic Mode: Essays on Time in Literature and Literary Theory that “[…] works in the essay genre (rather than presenting knowledge as a closed and often deceptively finished system) enact cognition in progress, knowledge as the process of getting to know.” Students don’t have the cognition in progress they need to enact Graham-style essays. They haven’t developed enough to write without the scaffolding of a thesis statement.

When I started teaching, I didn’t emphasize thesis statements and got a lot of essays that didn’t enact cognition or make a point. The better ones instinctively made a point of some kind; the worse ones summarized. After a while I realized that I could avoid a lot of heartache on the part of my students by changing the way I was offering instruction, because students weren’t ready to write essays without taking a position and defending it.

So now I teach thesis statements more or less like every other English instructor. I try to avoid boring theses and encourage deep ones, but it’s nonetheless true that I’ve realized I was wrong and have consequently moved on. I consider the no-thesis-emphasized experiment just that: an experiment that taught me how I should teach. In the future, I might try other experiments that could lead me away from emphasizing thesis statements. But for now, I do teach students to take a perspective and defend it. Many don’t end up doing so—their papers end up more exploratory than disputatious—but the overall effect of telling them to take a point of view and defend it is a positive one.

I’m not the first one to have noticed the problem. In Patrick Allitt’s I’m the Teacher, You’re the Student, he says this of student writing in a history class:

Certain errors are so common as to be almost universal. The first one is that almost no student really knows how to construct an argument and then deploy information to support and substantiate it. Usually student papers describe what happened, more or less, then throw in an indignant moral judgment or two before stopping abruptly.

I know the feeling: students, when they start my class, mostly want to summarize what they’ve read. And, as Allitt notes, they badly want to moralize, to castigate other people, or to valorize their own difference from the writer’s perceived weakness. I find the moralizing most puzzling, especially because it makes me think I’m teaching a certain number of people who a) are hypocrites or b) lack the empathy to understand where other writers come from, even if they don’t agree with said writer. They use ad-hominem attacks. When I assign Graham’s essays “What You’ll Wish You’d Known” and “What You Can’t Say,” a surprisingly large number of students say things like, “Who is this guy?”

When I tell them something along the lines of, “He started an early Internet store generator called Viaweb and now writes essays and runs an early-stage startup investment program,” their follow-up questions are usually a bit incoherent but boil down to a real question: Who gives him the authority to speak to us? They’re used to reading much-lauded if often boring writers in school. When I say something like, “Who cares who he is?” or “Shouldn’t we judge people based on their writing, not on their status?” they eye me suspiciously, like six-year olds might eye an eight-year old who casts aspersions on the Tooth Fairy.

They’ve apparently been trained by school to think status counts for a lot, and status usually means being a) old, b) dead, c) critically acclaimed by some unknown critical body, and d) between hard or soft covers, ideally produced by a major publisher. I’m not against any of those things: many if not most of my favorite writers fit those criteria. But it’d be awfully depressing if every writer had to. More importantly, assuming those are the major criteria for good writing is fairly bogus since most old dead critically acclaimed writers who are chiefly found between hard covers were once young firebrands shaking up a staid literary, social, political, or journalistic establishment with their shockingly fresh prose and often degenerate ideas. If we want to figure out who the important dead people will be in the future, we need some way of assessing living writers right now. We need something like taste, which is incredibly hard to teach. Most schools don’t even bother: they rely on weak fallback criteria that are wrapped up in status. I’d like my students to learn how to do better, no matter how hard.

Some of the “Who is this guy?” questions regarding Graham come from a moralizing perspective: students think or imply that someone who publishes writing through means other than books is automatically somehow a lesser writer than those whose work is published primarily between hard covers (Graham published Hackers & Painters, as well as technical books, but the students aren’t introduced to him in that fashion; I actually think it useful not to mention those books, in order to present the idea that writing published online can be valid and useful).

Anyway, trying to get students to write analytically—to be able to understand and explain a subject before they develop emotional or ethical reactions to it—is really, incredibly difficult (Allitt mentions this too). And having them construct and defend thesis statements seems to help this process. Few students understand that providing analysis and interpretation is a better, subtler way of eventually convincing others of whatever emotional or ethical point of view you might hold. They want to skip the analysis and interpretation and go straight to signaling what kind of person they want the reader to imagine them to be.

Not all students have all these problems, and I can think of at least one student who didn’t have any of them, and probably another dozen or so (out of about 350) who had none or very few of these problems when they began class. I’m dealing with generalizations that don’t apply to each individual student. But class requires some level of generalization: 20 to 30 students land in a room with me for two and a half hours per week, and I, like all instructors, have to choose some level of baseline knowledge and expectation and some level of eventual mastery, while at the same time ensuring that writing assignments are hard enough to be a challenge and stretch one’s abilities while not being so hard that they can’t be completed. When I see problems like the ones described throughout this essay, I realize the kinds of things I should focus on—and I also realize why teachers do the things they do the way they do them, instead of doing some of the things Graham implies.

Reading Allitt makes me realize I’m not alone, and he has the same issues in history I have in English. His other problems—like having students who “almost all use unnecessarily complicated language”—also resonate; I talk a lot about some of the best and pithiest writing advice I’ve ever read (“Omit unnecessary words”), but that advice is much easier to state than implement (my preceding sentence began life saying, “much easier to say than to implement,” but I realized I hadn’t followed my own rule).

Graham again:

I’m sometimes accused of meandering. In defend-a-position writing that would be a flaw. There you’re not concerned with truth. You already know where you’re going, and you want to go straight there, blustering through obstacles, and hand-waving your way across swampy ground. But that’s not what you’re trying to do in an essay. An essay is supposed to be a search for truth. It would be suspicious if it didn’t meander.

But defend-a-position essays, if they’re taught and written well, shouldn’t be completely opposed to meandering, and they’re not about “blustering through obstacles.” They’re about considering what might be true, raising possible objections, addressing those objections, building roads over “swampy ground,” changing your mind if necessary, and so on—eventually getting to something like truth. If defend-a-position essays are taught as Graham conceives of them, the result is probably going to be lousy. The same is likely true of students who are taught the “hand-waving your way” method of writing. They should be taught that, if they discover their thesis is wrong, they should change their thesis and paper via the magic of editing. I think Graham is really upset about the quality of teaching.

Thesis statements also prevent aimless wandering. Graham says that “The Meander (aka Menderes) is a river in Turkey. As you might expect, it winds all over the place. But it doesn’t do this out of frivolity. The path it has discovered is the most economical route to the sea.” Correct. But students do this out of frivolity and tend to get nowhere. Students don’t discover “the most economical route to the sea;” they don’t have a route at all. They’re more like Israelites wandering in the desert. Or a body of water that simply drains into the ground.

Why literature?

Graham:

It’s no wonder if this [writing essays about literature] seems to the student a pointless exercise, because we’re now three steps removed from real work: the students are imitating English professors, who are imitating classical scholars, who are merely the inheritors of a tradition growing out of what was, 700 years ago, fascinating and urgently needed work.

We may have gotten to teaching students how to write through literature via the means Graham describes, but I don’t think the practice persists solely because of the history. It persists because teaching through literature offers a couple of major conveniences: literature can be studied as a self-contained object via close reading and offers a narrower focus for students than larger subjects that require more background.

The rise of literature in university departments started in the nineteenth century and really took off in the first half of the twentieth. It was helped enormously by the rise of “close reading,” a method that had two major advantages: the trappings of rigor and a relative ease of application.

The “trappings of rigor” part is important because English (and writing) needed to look analytical and scientific; Louis Menand covers this idea extensively in a variety of forums, including The Marketplace of Ideas: Reform and Resistance in the American University, where he says that the argument “that there is such a thing as specifically literary language, and that literary criticism provides an analytical toolbox for examining it—was the basis for the New Criticism’s claim to a place in the structure of the research university.” So students look at literature because teachers and professors believe there is “specifically literary language” that’s different from other kinds of language. I used to not think so. Now I’m not so sure. After having students try to write analyses of various kinds of nonfiction, I can see the attraction in teaching them fiction that doesn’t have a specific message it’s trying to impart, primarily because a lot of students simply don’t have sufficient background knowledge to add anything to most of the nonfiction they read. They don’t read nonfiction very carefully, which means they have trouble making statements other than bald assertions, and they frequently say things that can be countered through appeals to the text itself. Getting them to read carefully by asking detailed questions is both hard and tedious.

Enter close reading. It supplies literature with a rationale, as stated above, but it also works pretty well in classrooms. As a method, it only requires knowledge of the tool and some text to apply it to. Like literature. To do close reading, you have to know you should pay attention to the text and to how its writer or speaker uses language. From there, the text becomes what Umberto Eco calls “a machine conceived for eliciting interpretations” in a way that a lot of nonfiction isn’t.

Paul Graham’s essay “What You’ll Wish You’d Known,” which I teach in my first unit, almost always generates vastly worse papers than James Baldwin’s short story “Sonny’s Blues” because Graham has deliberately covered most of the interesting territory relating to his subject. “Sonny’s Blues,” on the other hand, is just trying to tell a story, and the possible meanings of that story extend incredibly far outward, and they can be generated through close readings and relatively little other knowledge. Students who want to discuss “What You’ll Wish You’d Known” intelligently need a vast amount of life experience and other reading to even approach it cogently.

Students who want to discuss “Sonny’s Blues” intelligently need to pay attention to how the narrator shifts over the course of the story, how sound words recur, what music might mean, and a host of other things that are already mostly contained in the story. Students seem to have much more difficulty discovering this. When I teach Joyce Carol Oates’ short story “Where Are You Going, Where Have You Been?”, students almost never realize how the story subtly suggests that Connie is actually in a dream that plays out her anxieties regarding puberty, adulthood, and encroaching sexuality. It offers a lot more substance for discussion and decent papers than Graham’s essays and a lot of other nonfiction.

Perhaps the bad papers on Graham are my own fault, but I’ve tried a lot of ways to get students to write better papers on nonfiction, usually without much success. I’ve begun to suspect they’re just not ready. Students can be taught close reading that, in an ideal world, then gets applied to nonfiction. The reading of literature, in other words, is upwind of the reading of nonfiction, however useful or interesting that nonfiction might be. If you’re dealing with not-very-bright high school teachers and students who know even less than college students, the advantages of close reading literature as a method are magnified.

This is a relatively new affair, too; here’s Louis Menand discussing where English departments came from and how T.S. Eliot influenced them:

The English department is founded on the belief that people need to be taught how to read literature. This is not a self-evident proposition. Before there were English departments, people read stories, poems, and plays without assuming that special training was required. But most English professors think that people don’t intuitively get the way that literary writing works. Readers think that stories and poems are filled with symbols that ‘stand for’ something, or that the beliefs expressed in them are the author’s own, or that there is a hidden meaning they are supposed to find. They are unable to make sense of statements that are not simple assertions of fact. People read literature too literally.

Now, maybe people don’t “need to be taught how to read literature” as literature. But they do need to be taught how to read closely, because most people are really bad at it, and literature offers advantages to doing so.

Most students don’t have very good reading skills. They can’t synthesize information from books and articles effectively. So if you turn them loose on a library without direction, they’ll dutifully look some stuff up, and you’ll get back a lot of papers with citations from pages three to nine. Not very many cite page 221. And the citations they have feel random, rather than cohesive. In a structured class, one can spend a lot of time close reading: what does the author mean here? Why this sentence, why this turn of phrase? How is the piece structured? If it’s a story, who’s speaking? These skills are hard to build—I’m still building mine—and most freshmen simply don’t have them, and they don’t have the energy to engage with writing on its own terms in an unstructured environment.

Giving them a topic and telling them to write is akin to taking a random suburbanite, dropping them in northern Canada, and wishing them luck in finding their way back to civilization. Sure, a few hardy ones will make it. But to make sure most make it, you’ll have to impart a lot of skills first. That’s what good high school and undergrad classes should do. The key word in the preceding sentence, of course, is “good:” lots of humanities classes are bad and don’t teach much of anything, which gives the humanities themselves a bad rap, as people recall horrific English or history teachers. But one bad example doesn’t mean the entire endeavor is rotten, even if the structure of schools isn’t conducive to identifying and rewarding good teachers of the sort who will teach writing well.

Bad Teaching and the Real Problem with Literature

English, like most subjects, is easy to do badly. Most English teachers teach their subjects poorly; that’s been my experience, anyway, and it seems to be the experience of most people in school. I’m not sure broadening the range of subjects will help all that much if the teacher himself is lousy, or uninterested in class, or otherwise mentally absent.

It’s also easy to understand why English teachers eventually come to scorn their students: the students aren’t perfect, have interests of their own, aren’t really willing to grant you the benefit of the doubt, aren’t interested in your subject, and don’t understand your point of view. Notice that last one: students don’t understand the teacher’s point of view, but after a while the teacher stops trying to understand the students’ point of view. “What?” the teacher thinks. “Not everyone finds The Tempest and Middlemarch as fascinating as I do?” Er, no. And that kind of thing bleeds into papers. The world might be a better place if teachers could choose more of their own material; I’ve read most of Middlemarch and find it pretty damn tedious. Perhaps giving teachers more autonomy to construct their own curriculum around works students like better would solve some of the literature problem. But if the median student doesn’t read anything for pleasure, what then?

Too many teachers also don’t have a sense of openness and possibility toward various readings. They don’t have the deft touch necessary to apply both rigor and openness to their own readings and their students’ readings. Works of art don’t have a single meaning (and if they did, they’d be rather boring). But that doesn’t equate to “anything can mean anything and everything is subjective.” In teaching English, which is often the process of teaching interpretation, one has to balance these two scales. No one balances them perfectly, but too many teachers don’t seem to balance them at all, or acknowledge that they exist, or care that they exist. So you get those essays that find, “say, that Ahab in Moby Dick was a Christ-like figure.” Which is okay and probably true, but I wouldn’t want to read 30 papers that come to that conclusion, and I wouldn’t order my students to come to that conclusion. I’d want them to figure out what’s going on in the novel (then again, in composition classes I teach a lot of stuff outside the realm of “English literature”).

Not being a bogus teacher is really hard. Teachers aren’t incentivized to not be bogus: most public high school teachers effectively can’t be fired after two or three years, thanks to teachers’ unions, except in the case of egregious misconduct. Mediocrity, tedium, torpor, and the like aren’t fireable or punishable offenses. Students merely have to suffer through until they get to college, although some get lucky and find passionate, engaged teachers. But it’s mostly a matter of luck, and teaching seems to actively encourage the best to leave and the worst to stay. Even at college, however, big public schools incentivize professors and graduate students to produce research (or, sometimes “research,” but that’s a topic for another essay), not to teach. So it’s possible to go through 16 years of education without encountering someone who is heavily incentivized to teach well. Some people teach well because they care about teaching well—I’d like to think I’m one—but again, that’s a matter of luck, not a matter of systematic efforts to improve the education experience for the maximum number of students.

Teachers can, and do, however, get in trouble for being interesting. So there’s a systematic incentive to be boring.

In an essay that used to be called “Good Bad Attitude” and now goes by “The Word ‘Hacker,’” Graham says that “Hackers are unruly. That is the essence of hacking. And it is also the essence of American-ness.” Writers are unruly too. At least the good ones are. But many teachers hate unruliness and love conformity. So they teach writing (and reading—you can’t really do one without the other) on the factory model, where a novel or whatever goes in one end and is supposed to emerge on the other like a car, by making sure every step along the way is done precisely the same way. But writing (and, to some extent, reading) doesn’t really work that way, and students can sense as much in some inchoate way. Graham, too, senses that the way we teach writing and reading is busted, and he’s right that we’d be better off encouraging students to explore their own interests more. That’s probably less important than cultivating a sense of openness, explicitly telling students when you’re ordering them to do something for training-wheel purposes, admitting what you don’t know, acknowledging that there’s an inherent level of subjectivity to writing, and working on enumerating principles that can be violated instead of iron-clad rules that are almost certainly wrong.

Most students aren’t interested in English or writing; one can do a lot to make them interested, but the effort is necessarily imperfect. A lot of classrooms are unsatisfying to very bright people (like Graham and, I would guess, a lot of his readers), but that’s in part because classrooms are set up to hit the broad middle. And the broad middle needs thesis statements, wouldn’t know how to start with a wide-open prompt, and isn’t ready for the world of writing that Graham might have in mind.

While a series of historical accidents might’ve inspired the teaching we get now, I don’t think they’re solely responsible for the continuation of teaching literature. Teaching literature and close reading through literature continue to serve pedagogical purposes. So Graham isn’t wrong, but he’s missing a key piece of the story.

Writing this essay

When you’re thinking about a topic, start writing. I began this essay right after breakfast; I started thinking about it while making eggs and thinking about the day’s teaching. I had to interrupt it to go to class and do said teaching, but I got the big paragraph about “status” and a couple notes down. If you’re not somewhere you can write, use a notebook—I like pretentious Rhodia Webbies, but any notebook will do. If you don’t have a notebook, use a cell phone. Don’t have a phone? Use a napkin. Whatever. Good ideas don’t always come to you when you’re at your computer, and they often come while you’re doing something else. Paul Graham gets this: in “The Top Idea in Your Mind,” he wrote:

I realized recently that what one thinks about in the shower in the morning is more important than I’d thought. I knew it was a good time to have ideas. Now I’d go further: now I’d say it’s hard to do a really good job on anything you don’t think about in the shower.

Everyone who’s worked on difficult problems is probably familiar with the phenomenon of working hard to figure something out, failing, and then suddenly seeing the answer a bit later while doing something else. There’s a kind of thinking you do without trying to. I’m increasingly convinced this type of thinking is not merely helpful in solving hard problems, but necessary. The tricky part is, you can only control it indirectly.

Most students don’t do this and don’t think this way. If they did, or could be instructed to, I suspect Graham’s ideas would work better.

Knowing it

Students themselves, if they’re intellectually honest, intuit a lot of the advice in this essay. One recent paper writer said in a reflection: “My first draft does not have a direction or a point, but my final draft does.” Not all writing needs a point, but if you read student writing, you find that very little of it lacks a point because the author is trying to discover something or explore something about the world. It lacks a point because it’s incoherent or meandering. Again: that’s not me trying to be a jerk, but rather a description of what I see in papers.

Here’s another: “You were correct in telling me that writing a paper by wrapping evidence around big ideas rather than literary analysis would be difficult, and I found that out the hard way.” These writers could be trying to suck up or tell me what I want to hear, but enough have said similar things in a sufficient number of different contexts to make me think their experiences are representative. And I offer warnings, not absolute rules: if students want to write “big idea” papers, I don’t order them not to, though many suffer as a result. Suffering can lead to growth. A few thrive. But such students show why English instructors offer the kinds of guidance and assignments they do. These can be parodied, and we’ve all had lousy English classes taught by the incompetent, inept, and burned out.

If I had given students assignments closer to the real writing that Graham does, most simply wouldn’t be able to do them. But I am pushing students in the direction of real writing—which is part of the reason I tell the ones who want to really write to read “The Age of the Essay.” I love the essay: it’s only some of the reasoning about why schools operate the way they do that bothers me, and even then I only came to discover why things are done the way they are by doing them.

If you think you can teach writing better, I encourage you to go try it, especially in a public school or big college. I thought I could. Turned out to be a lot harder than I thought. Reality has a surprising amount of detail.

EDIT: In A Jane Austen Education, William Deresiewicz writes:

My professor taught novels, and Catherine was mistaught by them, but neither he nor Austen was finally concerned with novels as such. Learning to read, they both knew, means learning to live. Keeping your eyes open when you’re looking at a book is just a way of teaching yourself to keep them open all the time.

Novels are tricky in this way: they’re filled with irony, which, at its most basic, means saying one thing while meaning something else, or saying multiple things and meaning multiple things. That’s part of what “learning to live” consists of, and fiction does a unique job of training people to keep their eyes “open all the time.” Most teachers are probably bad at conveying this, but I do believe that this idea, or something like it, lies underneath novels as tools for teaching students how to live, in a way that essays and other nonfiction probably don’t.

A lot of people seem very eager to stop learning how to live as quickly as possible. They might have the hardest time of all.

Facebook, go away—if I want to log in, I know where to find you

Facebook keeps sending me e-mails about how much I’m missing on Facebook; see the image at the right for one example. But I’m not convinced I’m missing anything, no matter how much Facebook wants me to imagine I am.

In “Practical Tips on Writing a Book from 23 Brilliant Authors,” Ben Casnocha says that writers need to “Develop a very serious plan for dealing with internet distractions. I use an app called Self-Control on my Mac.” Many other writers echo him. We have, all of us, a myriad of choices every day. We can choose to do something that might provide some lasting meaning or value. Or we can choose to tell people who are often effectively strangers what we ate for dinner, or that we’re listening to Lynyrd Skynyrd and Lil’ Wayne, or our ill-considered, inchoate opinions about the political or social scandal of the day, which will be forgotten by everybody except Wikipedia within a decade, if not a year.

Or we can choose to do something better—which increasingly means we have to control distractions—or, as Paul Graham puts it, “disconnect” them. Facebook and other entities that make money from providing distractions are, perhaps not surprisingly, very interested in getting you more interested in their distractions. That’s the purpose of their e-mails. But I’ve become increasingly convinced that Facebook offers something closer to simulacra than real life, and that the people who are going to do something really substantial are, increasingly, going to be the people who can master Facebook—just as the people who did really substantial things in the 1960 – 2005 period learned to master TV.

Other writers in the “Practical Tips” essay discuss the importance of setting work times (presumably distinct from Facebook times) or developing schedules or similar techniques to make sure you don’t let, say, six hours pass, then wonder what happened during those six hours—probable answers might include news, e-mail, social networks, TV, dabbling, rearranging furniture, cleaning, whatever. All things that might be worthwhile, but only in their place. And Facebook’s place should be small, no matter how much the site itself encourages you to make it big. I’ll probably log on to Facebook again, and I’m not saying you should never use Facebook, or that you should always avoid the Internet. But you should be cognizant of what you’re doing, and Facebook is making it increasingly easy not to be cognizant. And that’s a danger.

I was talking to my Dad, who recently got on Facebook—along with Curtis Sittenfeld joining, this is a sure sign Facebook is over—and he was creeped out by having Pandora find his Facebook account with no active effort on his part; the same thing happened when he was posting to TripAdvisor under what he thought was a pseudonym. On the phone, he said that everyone is living in Neuromancer. And he’s right. Facebook is trying to connect you in more and more places, even places you might not necessarily want to be connected. This isn’t a phenomenon unique to Facebook, of course, but my Dad’s experience shows what’s happening in the background of your online life: companies are gathering data from you that will reappear in unpredictable places.

There are defenses against the creeping power of master databases. I’ve begun using Ghostery, a brilliant extension for Firefox, Safari, and Chrome that lets one see web bugs, beacons, and third-party sites that follow your movements around the Internet. Here’s an example of the stuff Salon.com, a relatively innocuous news site, loads every time a person visits:

What is all that stuff? It’s like the mystery ingredients in so much prepackaged food: you wonder what all those polysyllabic substances are but still know, on some level, they can’t be good for you. In the case of Salon.com’s third-party tracking software, Ghostery can at least tell you what’s going on. It also gives you a way to block a lot of the tracking—hence the strikethroughs on the sites I’ve blocked. The more astute among you will note that I’m something of a hypocrite when it comes to a data trail—I still buy stuff from Amazon.com, which keeps your purchase history forever—but at least one can, to some extent, fight back against the companies who are tracking everything you do.

But fighting back technologically, through means like Ghostery, is only part of the battle. After I began writing this essay, I began to notice things like this, via a Savage Love letter writer:

I was briefly dating someone until he was a huge asshole to me. I have since not had any contact with him. However, I have been Facebook stalking him and obsessing over pictures of the guys I assume he’s dating now. Why am I having such a hard time getting over him? Our relationship was so brief! He’s a major asshole!

I don’t think Facebook is making it easier for the writer to get over him or improve their life. It wouldn’t be a great stretch to think Facebook is making the process harder. So maybe the solution is to get rid of Facebook, or at least limit one’s use, or unfriend the ex, or some combination thereof. Go to a bar, find someone else, reconnect with the real world, find a hobby, start a blog, realize that you’re not the first person with these problems. Optimal revenge, if you’re the sort of person who goes in that direction, is a life well-lived. Facebook stalking is the opposite: it’s a life lived through the lives of others, without even the transformative power of language that media like the novel offer.

Obviously, obsessive behavior predated the Internet. But the Internet and Facebook make it so much easier to engage in obsessive behavior—you don’t even have to leave your house!—that the lower friction costs make the behavior easier to indulge. One solution: remove the tool by which you engage in said obsessive behavior. Dan Savage observes, “But it sounds like you might still have feelings for this guy! Just a hunch!” And if those feelings aren’t reciprocated, being exposed to the source of those feelings on a routine basis, even in digital form, isn’t going to help. What is going to help? Finding an authentic way of spending your time; learning to get in a state of flow; building or making stuff that other people find useful. Notice that Facebook is not on that list.

Some of you might legitimately ask why I keep a Facebook account, given my ambivalence, verging on antipathy. The answers are several: the most honest is probably that I’m a hypocrite. The next-most honest is that, if / when my novels start coming out, Facebook might be useful as an ad tool. And some people use Facebook and only Facebook to send out messages about events and parties. It’s also useful, when I’m going to a random city, for figuring out who might’ve moved there. Those people you lost touch with back in college suddenly become much closer when you’re both strangers somewhere.

But those are rare needs. The common needs that Facebook fulfills—to quasi-live through someone else’s life, to waste time, to feel like you’re on an anhedonic treadmill of envy—shouldn’t be needs at all. Facebook is encouraging you to make them needs. I’m encouraging you to realize that the real answers to life aren’t likely to be found on Facebook, no matter how badly Facebook wants to lure you to that login screen—they’re likely going to be found within.


By the way, I love “Practical Tips on Writing a Book from 23 Brilliant Authors.” I’ve read it a couple times and still love it. It’s got a lot of surface area for such a short post, which is why I keep linking to it in various contexts.

From the Department of “No Shit:” technology and computers are not silver bullets for education

File this New York Times article under “no shit:”

But to many education experts, something is not adding up — here and across the country. In a nutshell: schools are spending billions on technology, even as they cut budgets and lay off teachers, with little proof that this approach is improving basic learning.

This conundrum calls into question one of the most significant contemporary educational movements.

There is no silver bullet and no New Jesus for education. There never will be, but the search goes on, because it’s easier to search for the magic methodology that will solve seemingly intractable problems than it is to admit the thing Google, Facebook, and Microsoft have realized about software engineers: talent, motivation, and tenacity vary greatly among individuals, and you can’t merely take someone who lacks all three, put them through some kind of system or give them some kind of tool, and expect everyone to be equally good on the other side. That kind of thing works passably well if you’re building widgets on an assembly line, but it works terribly for any kind of creative or intellectual work.

Nonetheless, it’s much easier to search for that magic methodology to improve very old skills that are surprisingly resistant to methodology: reading, writing, and math. None of those fundamental skills has changed much in the last century. Yet we keep searching for that formula that doesn’t exist, because teaching and learning are inherently hard, like software engineering, math, writing, and any number of other complex but vital skills. Technologies—now we have special blackboards! They blink! They light up! They’re new!—might mask the essential difficulty of the task, but they can’t remove it, much as some programming IDEs try to hide some of the essential difficulty of coding. But fields that are essentially difficult can’t be mastered by sleight-of-hand or new whizzy gadgets.

They can only be mastered by people who are dedicated to the craft and to continuous self-improvement. The people who, because they believe they can, can. The ones whose tenacity is boundless and who aren’t willing to blame external circumstances.

You need a weird set of skills to teach effectively: you need to empathize with your students—to be able to see from their point of view—without becoming mired in their point of view. You need to master your field, but not in such a way that you lose the beginner’s mindset required to master the field in the first place. You need the stoic’s attitude of realizing you can’t control everything while still having the achiever’s mindset that you must strive to do the best you can, no matter what. You need to be willing to try new things and ideas while not leaving behind the old ones that work. You need to remember not everyone is interested in the things you’re interested in, and you need to do whatever it takes to make subjects more interesting than they would be otherwise. You need to find that profitable zone of challenge for most students—something hard enough to make them struggle but not so hard that it’s impossible to accomplish. It’s reasonable to expect college freshmen to be able to read a story or article on their own, but it’s not reasonable to expect them to pick up and digest every nuance on their own. Some will. Most won’t. You need to be enthusiastic, because enthusiasm is as contagious as boredom, but your job isn’t to be a cheerleader, and enthusiasm can’t substitute for knowledge. You need, in other words, a bunch of paradoxical traits that balance each other.

You also need to realize that students need things broken down in steps, and need to learn by example and through discussion. Last week I taught Neal Stephenson’s 2005 New York Times opinion piece, “Turn On, Tune In, Veg Out.” Whenever I do, I let the class talk for a while at the beginning; when discussion dies, I ask students to do a simple activity: write the essay’s main point in a sentence or two. Then I come around and look at the sentences.

It should be simple, right? Read the piece, find the main point. But it’s not simple. It’s actually quite hard, and most people are bad readers (myself included). When I go around and look at sentences, lots of students get caught up on the distinction between geeking and vegging out. Others think the piece is primarily about Star Wars. Only a few—usually around five of fifty—get the essential elements of the main point.

Stephenson basically says, twice, that he’s using Star Wars as a metaphor: once in the third paragraph: “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” and once more in the last paragraph: “If the “Star Wars” movies are remembered a century from now, it’ll be because they are such exact parables for this state of affairs” (emphasis added). But most students haven’t learned how to think metaphorically, as writers do. Metaphor is one of those essential ways of thinking that people need to be effective writers. In On Writing Stephen King says:

The use of simile and other figurative language is one of the chief delights of fiction—reading it and writing it, as well. When it’s on target, a simile delights us in much the same way meeting an old friend in a crowd of strangers does. By comparing two seemingly unrelated objects—a restaurant bar and a cave, a mirror and a mirage—we are sometimes able to see an old thing in a new and vivid way. Even if the result is mere clarity instead of beauty, I think writer and reader are participating together in a kind of miracle. Maybe that’s drawing it a little strong, but yeah—it’s what I believe.

In How Fiction Works, James Wood says:

Metaphor is analogous to fiction, because it floats a rival reality. It is the entire imaginative process in one move. If I compare the slates on a roof to an armadillo’s back, or – as I did earlier – the bald patch on the top of my head to a crop circle (or on very bad days, to the kind of flattened ring of grass that a helicopter’s blades make when it lands in a field), I am asking you to do what Conrad said fiction should make you do – see. I am asking you to imagine another dimension, to picture likeness. Every metaphor or simile is a little explosion of fiction within the larger fiction of the novel or story.

Again: that’s hard. And technology isn’t going to make it any easier to start thinking about metaphors, which is probably a precursor to writing in a way that uses metaphor deftly. Before you can do that, you’re probably going to need to recognize when other writers are doing it, and yet, even though Stephenson says twice that he is, most students don’t pick up on it. This isn’t to blame them, by the way—a lot of my graduate seminars are still about what the writer actually says. Some of you are probably getting caught up on this discussion of metaphor, thinking that I’m really writing about how important it is for students to learn, when this is only a subsidiary point supporting my main point about the place of technology in classrooms. Here’s Wood again on the subject of learning to read:

You only have to teach literature to realise that most young readers are poor noticers. I know from my own old books, wantonly annotated twenty years ago when I was a student, that I routinely underlined for approval details and images and metaphors that strike me now as commonplace, while serenely missing things which now seem wonderful. We grow, as readers, and twenty-year-olds are relative virgins.

If he wasn’t a good noticer at 20, what hope is there for the rest of us? And how is having a laptop going to help someone become a better noticer? Consider too one other thing to notice: in “Turn On, Tune In, Veg Out,” Stephenson isn’t using any complex or convoluted vocabulary. His sentence structure isn’t very complex; there aren’t lots of nasty nested clauses you have to mentally sort out to figure out what’s being talked about, as there often are in abstruse literary theory and philosophy. His piece isn’t hard to read. But it’s still evidently hard for many freshmen to understand. So I spend a lot of time working towards understanding, towards reading for detail, towards asking, “Where do you see that?” Technology isn’t going to help that process very much. It may even hurt it by offering a proliferating number of distractions: if you interrupt your reading of “Turn On, Tune In, Veg Out” four times for text messages and once for an e-mail, are you going to remember how Stephenson said “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” by the end?

I’m teaching honors students, which is easier than teaching standard classes, which is in turn easier than teaching in tough inner-city schools. So I don’t face the same challenges as some of the teachers mentioned in the NYT article. But sometimes I think about Daniel Singal’s Atlantic article, “The Other Crisis in American Education: A college professor looks at the forgotten victims of our mediocre educational system–the potentially high achievers whose SAT scores have fallen, and who read less, understand less of what they read, and know less than the top students of a generation ago.” As the subtitle implies, he argues that the best students aren’t as challenged as they once were. I can’t tell if he’s right or if he’s hearkening back to a mythical golden age, but I do think about his work sometimes when I see what’s going on around me: other grad students and professors want to watch movies in class, or they aren’t even focused on imparting and enhancing basic reading and writing skills—the same ones pretty much everyone needs. Are the strongest students really getting something out of their classes? Is the technology really helping? If not, could it be part of what’s actually causing “our mediocre educational system?” I’m not saying it does, but I am saying it’s worth pondering.

Still, I think the strongest thinkers and learners—the ones who are now running Google and Facebook, the ones who are now partners in law firms and building their own businesses—are doing fine. Better than ever, maybe. Generation X was supposed to be the slacker generation, but its members built large blocks of the Internet—the same set of technologies you’re almost certainly using to read this. But I wonder if there’s not a growing bifurcation between the people who are doing really well and the ones who aren’t. In income terms, that’s certainly true, but I wonder if it’s happening in intellectual terms too. Stephenson thinks so: “Nothing is more seductive than to think that we, like the Jedi, could be masters of the most advanced technologies while living simple lives: to have a geek standard of living and spend our copious leisure time vegging out.” But the people who spend all that time vegging out aren’t going to create whatever the next iPod and Facebook will be. And they won’t reap those rewards, either. They’re the ones who might be “learning less,” as Singal has it. The people who make the next iPod and Facebook will be the ones who are focused on “geeking out” regarding important topics. The ones who will, I hope, have teachers—whether in honors or not—who are focused on the essential questions that imparting knowledge involves.

By the way, I’m not trying to beat up college freshmen—if I were, I wouldn’t have the empathy necessary to be good. A lot of college seniors are little better than my freshmen, which I found out by working for Steven Klein at the Steven Klein LSAT Company. The LSAT is mostly a test of reading. If you can read effectively, you’ll do pretty well. But a lot of 22 – 24-year-old college graduates had a lot of trouble on reading comprehension because they couldn’t or hadn’t been trained to look at every word, evaluate it in relation to other words and in relation to the context of the passage, and understand what it means. I think back to those experiences when I read books like Richard Arum and Josipa Roksa’s Academically Adrift: Limited Learning on College Campuses or articles like the one about cool whizzy tech stuff in classrooms. The whizzy tech stuff isn’t going to help readers when they’re facing the LSAT.

A second “by the way” is in order: I’m neither trying to denigrate technology nor be a luddite—I say so as the guy typing on a fancy keyboard, ergonomic chair, and 27″ iMac, with a bunch of Textmate windows open. Computers make the mechanical process of writing easier, so that the hard stuff—the stuff that goes on in the mind—can dominate. Technology is great—in its place. The University of Arizona has computers and projectors and other neat stuff in many classrooms, and if that neat stuff is available, I use it.

But technology complements other skills; it doesn’t substitute for them. You can only use computers effectively to the extent you can read, write, and do simple math effectively—try programming without algebra. Or try to extract information from man pages without strong reading comprehension skills; hell, I like to imagine myself as being at least moderately literate, and I find some of them tough. So this is not one of those tedious essays in which Old Man Withers shakes his cane and complains about the kids with those damn beeping gizmos and sending those darned pictures of each others’ privates around and get off my damn lawn. Plus, I’m too young to shake my cane; I ran a modest but real number of miles yesterday. Even when I do have a cane someday, I hope that it 1) has a hidden sword, because that kind of thing is cool and 2) that I haven’t ossified to the point where I’m not willing to learn new things.

But this is an essay that points out how basic skills and the means of imparting those basic skills haven’t changed so much, as Amanda Ripley’s Atlantic article, “What Makes a Great Teacher?” makes clear in its discussion of what great teachers do:

First, great teachers tended to set big goals for their students. They were also perpetually looking for ways to improve their effectiveness. [. . .] Superstar teachers had four other tendencies in common: they avidly recruited students and their families into the process; they maintained focus, ensuring that everything they did contributed to student learning; they planned exhaustively and purposefully—for the next day or the year ahead—by working backward from the desired outcome; and they worked relentlessly, refusing to surrender to the combined menaces of poverty, bureaucracy, and budgetary shortfalls.

Notice the thing absent from this list: use computers, iPads, and so forth. Sure, those great teachers could use technology, but they don’t need to. And the technology is not going to automatically make an indifferent teacher set big goals or recruit families or maintain focus or plan. Used poorly, it’s just going to provide some flash and pizazz and some distractions. Check out this Marginal Revolution discussion of a study looking at how introducing computers in poor households actually decreased student grades because students spent more time playing games on them than doing homework:

Not surprisingly, with all that game playing going on, the authors find that the voucher program actually resulted in a decline in grades although there was also some evidence for an increase in computer proficiency and perhaps some improvement in a cognitive test.

See also “Computers at Home: Educational Hope vs. Teenage Reality:” “Students posted significantly lower math test scores after the first broadband service provider showed up in their neighborhood, and significantly lower reading scores as well when the number of broadband providers passed four.” These reports should give technology cheerleaders pause: you aren’t going to get better results simply by lashing a computer on a teacher’s back and telling him to use it.*

To be a good teacher, you still need that weird skill set and mindset mentioned above. If you don’t have it or aren’t willing to develop it, I doubt anything else imposed on an individual teacher from the outside, like mandates to use technology, is going to do much for that teacher or for his or her students. If you want to really improve teaching, you’ll need to take an approach similar to the one Facebook and Google take to hiring hackers, which means a relentless focus not on degrees that offer dubious value in predicting achievement but on finding the best people and making sure they stay. Finding the best teachers is different from finding programmers—you probably can’t tell who’s going to be a good teacher before they hit the classroom—but you can at least acknowledge that you’re not going to get people who are good merely by saying, “use iPads in the classroom.” Steve Jobs and Bill Gates didn’t have iPads in their classrooms growing up, and maybe that’s part of what made Jobs able to have the vision necessary to Turn On, Geek Out, and make the iPad.


* I had a computer in middle and early high school that I used to master Starcraft and various other computer games, until I somehow realized I was wasting my life and smashed my Starcraft disks in the driveway. I sometimes use this analogy when I explain the situation to friends: some people can handle snorting the occasional line of coke without getting addicted; it’s just a fun way of spending a Saturday night. Some people can handle computer games in the same way. I discovered, at the time, that I’m not one of them, and, worse, I’ll never get those three or so wasted years back. Now I tend to find video games boring on average and can’t play for longer than half an hour to an hour at a stretch, while I’ve trained myself up to being able to write effectively for three to six hours at a time. The first draft of this essay, for example, took me about two hours.

From the Department of “No Shit:” technology and computers are not silver bullets for education

File this New York Times article under “no shit:”

But to many education experts, something is not adding up — here and across the country. In a nutshell: schools are spending billions on technology, even as they cut budgets and lay off teachers, with little proof that this approach is improving basic learning.

This conundrum calls into question one of the most significant contemporary educational movements.

There is no silver bullet and no New Jesus for education. There never will be, but the search goes on, because it’s easier to search for the magic methodology that will solve seemingly intractable problems than it is to admit the thing Google, Facebook, and Microsoft have realized about software engineers: talent, motivation, and tenacity vary greatly among individuals, and you can’t merely take someone who lacks all three, put them through some kind of system or give them some kind of tool, and expect everyone to be equally good on the other side. That kind of thing works passably well if you’re building widgets on an assembly line, but it works terribly for any kind of creative or intellectual work.

Nonetheless, it’s much easier to search for that magic methodology to improve very old skills that are surprisingly resistant to methodology: reading, writing, and math. None of those fundamental skills has changed much in the last century. Yet we keep searching for that formula that doesn’t exist, because teaching and learning are inherently hard, like software engineering, math, writing, and any number of other complex but vital skills. Technologies—now we have special blackboards! They blink! They light up! They’re new!—might mask the essential difficulty of the task, but they can’t remove it, much as some programming IDEs try to hide some of the essential difficulty of coding. But fields that are essentially difficult can’t be mastered by sleight-of-hand or new whizzy gadgets.

They can only be mastered by people who are dedicated to the craft and to continuous self-improvement. The people who, because they believe they can, can. The ones whose tenacity is boundless and who aren’t willing to blame external circumstances.

You need a weird set of skills to teach effectively: you need to empathize with your students—to be able to see from their point of view—without becoming mired in their point of view. You need to master your field, but not in such a way that you lose the beginner’s mindset required to master the field in the first place. You need the stoic’s attitude of realizing you can’t control everything while still having the achiever’s mindset that you must strive to do the best you can, no matter what. You need to be willing to try new things and ideas while not leaving behind the old ones that work. You need to remember not everyone is interested in the things you’re interested in, and you need to do whatever it takes to make subjects more interesting than they would be otherwise. You need to find that profitable zone of challenge for most students—something hard enough to make them struggle but not so hard that it’s impossible to accomplish. It’s reasonable to expect college freshmen to be able to read a story or article on their own, but it’s not reasonable to expect them to pick up and digest every nuance on their own. Some will. Most won’t. You need to be enthusiastic, because enthusiasm is as contagious as boredom, but your job isn’t to be a cheerleader, and enthusiasm can’t substitute for knowledge. You need, in other words, a bunch of paradoxical traits that balance each other.

You also need to realize that students need things broken down in steps, and need to learn by example and through discussion. Last week I taught Neal Stephenson’s 2005 New York Times opinion piece, “Turn On, Tune In, Veg Out.” Whenever I do, I let the class talk for a while at the beginning; when discussion dies, I ask students to do a simple activity: write the essay’s main point in a sentence or two. Then I come around and look at the sentences.

It should be simple, right? Read the piece, find the main point. But it’s not simple. It’s actually quite hard, and most people are bad readers (myself included). When I go around and look at sentences, lots of students get caught up on the distinction between geeking and vegging out. Others think the piece is primarily about Star Wars. Only a few—usually around five of fifty—get the essential elements of the main point.

Stephenson basically says, twice, that he’s using Star Wars as a metaphor: once in the third paragraph: “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” and once more in the last paragraph: “If the “Star Wars” movies are remembered a century from now, it’ll be because they are such exact parables for this state of affairs” (emphasis added). But most students haven’t learned how to think metaphorically, as writers do. Metaphor is one of those essential ways of thinking that people need to be effective writers. In On Writing Stephen King says:

The use of simile and other figurative language is one of the chief delights of fiction—reading it and writing it, as well. When it’s on target, a simile delights us in much the same way meeting an old friend in a crowd of strangers does. By comparing two seemingly unrelated objects—a restaurant bar and a cave, a mirror and a mirage—we are sometimes able to see an old thing in a new and vivid way. Even if the result is mere clarity instead of beauty, I think writer and reader are participating together in a kind of miracle. Maybe that’s drawing it a little strong, but yeah—it’s what I believe.

In How Fiction Works, James Wood says:

Metaphor is analogous to fiction, because it floats a rival reality. It is the entire imaginative process in one move. If I compare the slates on a roof to an armadillo’s back, or – as I did earlier – the bald patch on the top of my head to a crop circle (or on very bad days, to the kind of flattened ring of grass that a helicopter’s blades make when it lands in a field), I am asking you to do what Conrad said fiction should make you do – see. I am asking you to imagine another dimension, to picture likeness. Every metaphor or simile is a little explosion of fiction within the larger fiction of the novel or story.

Again: that’s hard. And technology isn’t going to make it any easier to start thinking about metaphors, which is probably a precursor to writing in a way that uses metaphor deftly. Before you can do that, you’re probably going to need to recognize when other writers are doing it, and yet, while Stephenson says twice that he is, most students don’t pick up on it. This isn’t to blame them, by the way—a lot of my graduate seminars are still about what the writer actually says. Some of you are probably getting caught up in this discussion of metaphor and thinking that I’m really writing about how important it is for students to learn it, when this is only a subsidiary point supporting my main point about the place of technology in classrooms. Here’s Wood again on the subject of learning to read:

You only have to teach literature to realise that most young readers are poor noticers. I know from my own old books, wantonly annotated twenty years ago when I was a student, that I routinely underlined for approval details and images and metaphors that strike me now as commonplace, while serenely missing things which now seem wonderful. We grow, as readers, and twenty-year-olds are relative virgins.

If he wasn’t a good noticer at 20, what hope is there for the rest of us? And how is having a laptop going to help someone become a better noticer? Consider too one other thing to notice: in “Turn On, Tune In, Veg Out,” Stephenson isn’t using any complex or convoluted vocabulary. His sentence structure isn’t very complex; there aren’t lots of nasty nested clauses you have to mentally sort out to figure out what’s being talked about, as there often are in abstruse literary theory and philosophy. His piece isn’t hard to read. But it’s still evidently hard for many freshmen to understand. So I spend a lot of time working towards understanding, towards reading for detail, towards asking, “Where do you see that?” Technology isn’t going to help that process very much. It may even hurt it by offering a proliferating number of distractions: if you interrupt your reading of “Turn On, Tune In, Veg Out” four times for text messages and once for an e-mail, are you going to remember how Stephenson said “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” by the end?

I’m teaching honors students, which is easier than teaching standard classes, which is in turn easier than teaching in tough inner-city schools. So I don’t face the same challenges as some of the teachers mentioned in the NYT article. But sometimes I think about Daniel Singal’s Atlantic article, “The Other Crisis in American Education: A college professor looks at the forgotten victims of our mediocre educational system–the potentially high achievers whose SAT scores have fallen, and who read less, understand less of what they read, and know less than the top students of a generation ago.” As the subtitle implies, he argues that the best students aren’t as challenged as they once were. I can’t tell if he’s right or if he’s hearkening back to a mythical golden age, but I do think about his work sometimes when I see what’s going on around me: other grad students and professors want to watch movies in class, or they aren’t even focused on imparting and enhancing basic reading and writing skills—the same ones pretty much everyone needs. Are the strongest students really getting something out of their classes? Is the technology really helping? If not, could it be part of what’s actually causing “our mediocre educational system?” I’m not saying it does, but I am saying it’s worth pondering.

Still, I think the strongest thinkers and learners—the ones who are now running Google and Facebook, the ones who are now partners in law firms and building their own businesses—are doing fine. Better than ever, maybe. Generation X was supposed to be the slacker generation, but its members built large blocks of the Internet—the same set of technologies you’re almost certainly using to read this. But I wonder if there’s not a growing bifurcation between the people who are doing really well and the ones who aren’t. In income terms, that’s certainly true, but I wonder if it’s happening in intellectual terms too. Stephenson thinks so: “Nothing is more seductive than to think that we, like the Jedi, could be masters of the most advanced technologies while living simple lives: to have a geek standard of living and spend our copious leisure time vegging out.” But the people who spend all that time vegging out aren’t going to create whatever the next iPod and Facebook will be. And they won’t reap those rewards, either. They’re the ones who might be “learning less,” as Singal has it. The people who make the next iPod and Facebook will be the ones who are focused on “geeking out” regarding important topics. The ones who will, I hope, have teachers—whether in honors or not—who are focused on the essential questions that imparting knowledge involves.

By the way, I’m not trying to beat up college freshmen—if I were, I wouldn’t have the empathy necessary to be good. A lot of college seniors are little better than my freshmen, which I found out by working for Steven Klein at the Steven Klein LSAT Company. The LSAT is mostly a test of reading. If you can read effectively, you’ll do pretty well. But many 22-to-24-year-old college graduates had a lot of trouble with reading comprehension because they couldn’t, or hadn’t been trained to, look at every word, evaluate it in relation to other words and to the context of the passage, and understand what it means. I think back to those experiences when I read books like Richard Arum and Josipa Roksa’s Academically Adrift: Limited Learning on College Campuses or articles like the one about cool whizzy tech stuff in classrooms. The whizzy tech stuff isn’t going to help readers when they’re facing the LSAT.

A second “by the way” is in order: I’m neither trying to denigrate technology nor be a luddite—I say so as the guy typing at a fancy keyboard, in an ergonomic chair, in front of a 27″ iMac, with a bunch of TextMate windows open. Computers make the mechanical process of writing easier, so that the hard stuff—the stuff that goes on in the mind—can dominate. Technology is great—in its place. The University of Arizona has computers and projectors and other neat stuff in many classrooms, and if that neat stuff is available, I use it.

But technology complements other skills; it doesn’t substitute for them. You can only use computers effectively to the extent that you can read, write, and do simple math—try programming without algebra. Or try to extract information from man pages without strong reading comprehension skills; hell, I like to imagine myself as at least moderately literate, and I find some of them tough. So this is not one of those tedious essays in which Old Man Withers shakes his cane and complains about the kids with those damn beeping gizmos who send those darned pictures of each others’ privates around—and get off my damn lawn. Plus, I’m too young to shake my cane; I ran a modest but real number of miles yesterday. Even when I do have a cane someday, I hope 1) that it has a hidden sword, because that kind of thing is cool, and 2) that I haven’t ossified to the point where I’m not willing to learn new things.

But this is an essay that points out how basic skills and the means of imparting those basic skills haven’t changed so much, as Amanda Ripley’s Atlantic article, “What Makes a Great Teacher?” makes clear in its discussion of what great teachers do:

First, great teachers tended to set big goals for their students. They were also perpetually looking for ways to improve their effectiveness. [. . .] Superstar teachers had four other tendencies in common: they avidly recruited students and their families into the process; they maintained focus, ensuring that everything they did contributed to student learning; they planned exhaustively and purposefully—for the next day or the year ahead—by working backward from the desired outcome; and they worked relentlessly, refusing to surrender to the combined menaces of poverty, bureaucracy, and budgetary shortfalls.

Notice the thing absent from this list: use computers, iPads, and so forth. Sure, those great teachers could use technology, but they don’t need to. And the technology is not going to automatically make an indifferent teacher set big goals or recruit families or maintain focus or plan. Used poorly, it’s just going to provide some flash and pizazz and some distractions. Check out this Marginal Revolution discussion of a study looking at how introducing computers in poor households actually decreased student grades because students spent more time playing games on them than doing homework:

Not surprisingly, with all that game playing going on, the authors find that the voucher program actually resulted in a decline in grades although there was also some evidence for an increase in computer proficiency and perhaps some improvement in a cognitive test.

See also “Computers at Home: Educational Hope vs. Teenage Reality:” “Students posted significantly lower math test scores after the first broadband service provider showed up in their neighborhood, and significantly lower reading scores as well when the number of broadband providers passed four.” These reports should give technology cheerleaders pause: you aren’t going to get better results simply by lashing a computer on a teacher’s back and telling him to use it.*

To be a good teacher, you still need that weird set of skills and mindsets mentioned above. If you don’t have it or aren’t willing to develop it, I doubt anything else imposed on an individual teacher from the outside, like a mandate to use technology, is going to do much for that teacher or for his or her students. If you want to really improve teaching, you’ll need to take an approach similar to the one Facebook and Google take to hiring hackers, which means a relentless focus not on degrees that offer dubious value in predicting achievement but on finding the best people and making sure they stay. Finding the best teachers is different from finding programmers—you probably can’t tell who’s going to be a good teacher before they hit the classroom—but you can at least acknowledge that you’re not going to get people who are good merely by saying, “use iPads in the classroom.” Steve Jobs and Bill Gates didn’t have iPads in their classrooms growing up, and maybe that’s part of what made Jobs able to have the vision necessary to Turn On, Geek Out, and make the iPad.


* I had a computer in middle and early high school that I used to master Starcraft and various other computer games, until I somehow realized I was wasting my life and smashed my Starcraft disks in the driveway. I sometimes use this analogy when I explain the situation to friends: some people can handle snorting the occasional line of coke without getting addicted; it’s just a fun way of spending a Saturday night. Some people can handle computer games in the same way. I discovered, at the time, that I’m not one of them, and, worse, I’ll never get those three or so wasted years back. Now I tend to find video games boring on average and can’t play for longer than half an hour to an hour at a stretch, while I’ve trained myself up to being able to write effectively for three to six hours at a time. The first draft of this essay, for example, took me about two hours.

Sexting and society: How do writers respond?

In a post on the relative quality of fiction and nonfiction, I mentioned that fiction should be affected by how society and social life change. That doesn’t mean writers should read the news du jour and immediately copy plot points, but it does mean paying attention to what’s different in contemporary attitudes and expression. I got to thinking about “sexting,” an unfortunate but useful portmanteau, because it’s an example of a widespread, relatively fast cultural change enabled by technology. (Over a somewhat longer term, “From shame to game in one hundred years: An economic model of the rise in premarital sex and its de-stigmatisation” describes “a revolution in sexual behaviour,” which may explain why a lot of contemporary students find a lot of nineteenth century literature dealing with sexual mores to be tedious.)

Laws that cover sexting haven’t really caught up with what’s happening on the ground. Penelope Trunk wrote an article called “The Joys of Adult Sexting,” in which she does it and thinks:

And what will his friends think of me? Probably nothing. Because they have women sending nude photos of themselves. It’s not that big a deal. You know how I know? Because the state of Vermont, (and other states as well) is trying to pass a law that decriminalizes sending nude photos of oneself if you are underage. That’s right: For years, even though kids were sending nude photos of themselves to someone they wanted to show it to, the act was illegal—an act of trafficking in child pornography.

But sending nude photos is so common today that lawmakers are forced to treat it as a mainstream courting ritual and legalize it for all ages.

Sending a naked photo of yourself is an emotionally intimate act because of the implied trust you have in the recipient. When you act in a trusting way—like trusting the recipient of the photo to handle it with care and respect—you benefit because being a generally trusting person is an emotionally sound thing to do; people who are trusting are better judges of character.

Trunk’s last paragraph explains why, despite all the PSAs and education and whatever in the world, people are going to keep doing it: because it shows trust, and we want significant others to prove their trust and we want to show significant others we trust them. You can already imagine the dialogue in a novel: “Why won’t you send me one? Don’t you trust me?” If the answer is yes, send them; if the answer is no, then why bother continuing to date? The test isn’t fair, of course, but since when are any tests in love and lust fair?

Over time, as enough kids of legislators and so forth get caught up in sexting scandals and as people who’ve lived with cell phone cameras grow up, I think we’ll see larger change. For now, the gap between laws / customs and reality makes a fruitful space for novels, even those that don’t exploit present circumstances well, like Helen Schulman’s This Beautiful Life. Incorporating these kinds of social changes in literature is a challenge and will probably remain so; as I said above, that doesn’t mean novelists should automatically say, “Ah ha! Here’s today’s headlines; I’m going to write a novel based on the latest sex scandal/shark attack/celebrity bullshit,” but novelists need to be aware of what’s going on. I wrote a novel called Asking Alice that got lots of bites from agents but no representation, and the query letter started like this:

Maybe marriage would be like a tumor: something that grows on you with time. At least that’s what Steven Deutsch thinks as he fingers the ring in his pocket, trying to decide whether he should ask Alice Sherman to marry him. Steven is almost thirty, going on twenty, and the future still feels like something that happens to other people. Still, he knows Alice won’t simply agree to be his long-term girlfriend forever.

When Steven flies to Seattle for what should be a routine medical follow up, he brings Alice and hits on a plan: he’ll introduce her to his friends from home and poll them about whether, based on their immediate judgment, he should ask Alice. But the plan goes awry when old lovers resurface, along with the cancer Steven thought he’d beaten, and the simple scheme he hoped would solve his problem does everything but.

Asking Alice asks questions about changes in dating and marriage; if you wrote a novel today about the agonies of deciding whom to marry, with all the metaphysical angst such a choice engendered in the nineteenth century, most people would find it absurd and untrue: if you marry a Casaubon, you divorce him and end up in about the same circumstances you were in six months before you started. But a lot of people still get married or want to get married, and the question is still important even if it can’t drive the plot of a novel very well. It can, however, provide a lot of humor, and that’s what Asking Alice does.

A lot of literature, like a lot of laws, is also based on the premise that women don’t like sex as much as men, don’t or won’t seek it out, and are automatically harmed by it or by wanting it. This is a much more tenuous assertion than it used to be, especially as women write directly about sex. A novel like Anita Shreve’s Testimony, discussed extensively by Caitlin Flanagan here and by me here, engages that idea and finds it somewhat wanting. So does the work of Belle de Jour (now revealed as Dr. Brooke Magnanti), who basically says, “I worked as a hooker for a long time, didn’t mind it, and made a shit ton of money because I made a rational economic decision.” A lot of academic fiction premised on professors having sex with students examines the idea that female students can want/use sex just as much as men; this is how Francine Prose’s Blue Angel works, and Prose is a canny observer of what’s going on and how it connects to the past.

Note that women wrote all these examples, which I don’t think is an accident, since they’re probably less likely to put other women on pedestals than men are. I’ve been reading a lot of sex memoirs / novels written by women (Never the Face; Nine and a Half Weeks; two of Mary Karr’s memoirs, which are good but overrated; Abby Lee (British sex blogger); Elisabeth Eaves’ Bare) in part because I want to write better female characters. After reading a lot of this stuff, I’m even less convinced than I was that there are stereotypically “male” or “female” ways of thinking or writing about the world, but knowledge itself never hurts and I don’t regret the time spent. On a similar note, Janice Radway’s Reading the Romance is totally fascinating, even when Radway tries to explain away retrograde features of romances or how women are often attracted to high-powered, high-status men.

Radway wrote in a time before sexting, but I wonder if she’s thought about doing a Young Adult version using a similar methodology today. For writers and others, sexting shows that teenagers can make their own decisions too, even if those are arguably bad decisions. To me, this is another generational-gap issue, and one that will probably close naturally over time. One older agent said on the phone that maybe I needed a younger agent, because her assistant loved Asking Alice but she didn’t want to rep it.

Damn.

I’m old enough to have lived through a couple medium-scale social changes: when I was in high school, people still mostly talked to each other on the phone. In college, people called using cell phones and often communicated via IM. After college, I kept using phones primarily for voice, especially to arrange drinks / quasi-dates, until I realized that most girls have no ability to talk on the phone anymore (as also described by Philip Zimbardo and the ever-changing dynamics of sexual politics). As a result, I’d now use text messages if I were arranging drinks and so forth. Around the time I was 23, I realized that even if I did call, women would text back. That doesn’t mean one should race out and change every phone conversation in a novel that features a contemporary 19-year-old to a text conversation (which would be tedious in and of itself; in fiction I write, I tend not to quote texts very often), but it’s the kind of change that I register. Things changed between the time I was 16 and 23.

I’m in the McLuhan camp: “the medium changes what can be said,” which means that texting is probably changing things in ways not immediately obvious or evident. Sexting is one such way; it lowers the cost of transmitting nude pictures to the extent that you can now do so almost instantly. Laws are predicated on the idea that balding, cigar-chomping, lecherous 40-year-old men will try to coerce 16-year-old girls outside cheer practice, not on ubiquitous cell phone cameras. Most parents will instinctively hate the cigar-chomping 40-year-old. They will not hate their own 14-year-old. So you get all sorts of amusement where laws, putative morals, conventional wisdom, technology, and desire meet. Still, when pragmatics meet parents, expect parental anger / protectiveness to win for the moment but not for all time. Nineteenth and twentieth century American culture is not the only kind out there. As Melvin Konner wrote in The Evolution of Childhood:

Contrary to some claims of cultural historians, anthropologists find that liberal premarital sex mores are not new for a large proportion of the cultures of the ethnological record and that liberal sexual mores and even active sexual lives among adolescents do not necessarily produce pregnancies. In fact, a great many cultures permit or at least tolerate sex play in childhood (Frayser 1994). Children in these cultures do not play ‘doctor’ to satisfy their anatomical curiosity—they play ‘sex.’ They do play ‘house’ as Western children do, but the game often includes pretend-sex, including simulated intercourse. Most children in non-industrial cultures have opportunities to see and hear adult sex, and they mimic and often mock it.

Perhaps our modern aversion to sex among adolescents stems in part from the likelihood of pregnancy, economic factors, and the like. Given the slow but real outcry from places like the Economist and elsewhere, this might eventually change. That’s pretty optimistic, however. A lot of social and legal structures merely work “good enough,” and the justice system is certainly one of those: we’ve all heard by now about cases where DNA evidence resulted in the exoneration of people accused of murder or rape. So maybe we’re now heading towards a world in which laws about sexting are unfair, especially given current practice, but the laws remain anyway because the law doesn’t have to be optimal: it has to be good enough, and most people over 18 probably don’t care much about it unless it happens to be their son or daughter who gets enmeshed in a legal nightmare for behavior that doesn’t result in tangible harm.

Something like a quarter to a third of American adults have smoked pot, but we still have anti-pot laws. America can easily afford moral hypocrisy, at least for now, and maybe sexting will be something like weed: widely indulged in, a rite of passage, and not likely to result in arrest unless you happen to be unlucky or in the wrong situation at the wrong time. The force generating the prohibition—that is, parents engaged in daughter-guarding—might be much stronger than the force of individual rights, utilitarianism, or pragmatic observations about the enforcement of laws against victimless crimes that do not result in physical harm.

There’s more on the legal challenges around this in Ars Technica’s article “14-year old child pornographers? Sexting lawsuits get serious,” which should replace “serious” with “ridiculous.” In the case, a 14-year-old girl sent a 14-year-old boy a video of herself masturbating, and then her family sued his. But how can a 14-year-old be guilty of “the sexual exploitation of children,” as the girl’s family claims? If a 14-year-old can’t consent to this kind of activity, then a 14-year-old also can’t have the state of mind necessary to exploit another one. Paradoxes pile up, of the sort described in Regulating Sex: The Politics of Intimacy and Identity, where the writers show how the age of consent has been rising as the age of being tried as an adult has been falling. Somewhere inside that fact, or pair of facts, there’s a novel waiting to be written.

Questions like “What happens when people do things sexually that they’re not supposed to? How does the community respond? How do they respond?” are the stuff novelists feed on. They motivate innumerable plots, from the beginnings of the English novel in Pamela and Clarissa all the way to the present. When Rose and Pinkie are first talking to each other in Brighton Rock, Rose lies about her age: “ ‘I’m seventeen,’ she said defiantly; there was a law which said a man couldn’t go with you before you were seventeen.” Brighton Rock was published in 1938. People have probably been evading age-of-consent laws for as long as there have been such laws, and they will probably continue to do so—whether those laws affect sex or depictions of the body.

Adults have probably been reinforcing prohibitions for as long as they’ve existed. Consider this quote, from the Caitlin Flanagan article about Testimony linked above:

Written by a bona fide grown-up (the author turned 63 last fall), Testimony gives us not just the lurid description of what a teen sex party looks like, but also an exploration of the ways that extremely casual sex can shape and even define an adolescent’s emotional life. One-night stands may be perfectly enjoyable exercises for two consenting adults, but teenagers aren’t adults; in many respects, they are closer to their childhoods than to the adult lives they will eventually lead. Their understanding of affection and friendship, and most of all their innocent belief, so carefully nurtured by parents and teachers, that the world rewards kindness and fairness, that there is always someone in authority to appeal to if you are being treated cruelly or not included in something—all of these forces are very much at play in their minds as they begin their sexual lives.

In Testimony, the sex party occurs at the fictional Avery Academy; Shreve imagines Siena, the girl at the center of the event, as a grifter, eager to exploit her new status as victim so that she can write a killer college essay about it, or perhaps even appear on Oprah. For the most part, the boys are callous and self-serving.

Flanagan has no evidence that “teenagers aren’t adults” other than bald assertion. That “they are closer to their childhoods than to the adult lives they will eventually lead” has more to do with culture than with biology, as Robert Epstein argues in The Case Against Adolescence: Rediscovering the Adult in Every Teen and as Alice Schlegel and Herbert Barry argue in Adolescence: An Anthropological Inquiry; even then, it depends on when a particular person hits puberty, how they react, and how old they are: nineteen-year-olds are probably closer to their adult selves than thirteen-year-olds. Saying that teenagers believe, according to an ethos created by teachers, that “the world rewards kindness and fairness” indicates that Flanagan must have had a very different school experience than I and a lot of other people did (for more, see “Why Nerds are Unpopular”). As I recall, school was capricious, arbitrary, and often stupid; the real world rewards fulfilling the desires of others, whether artistically, financially, sexually, or otherwise, while the school world rewards jumping through hoops and mindless conformity. If I don’t like the college I go to, I can transfer; if I don’t like my job, I can quit; if I don’t like some other milieu, I can leave it. In contrast, school clumps everyone together based on an accident of geography.

In Testimony, Shreve misses, or chooses not to emphasize, that Sienna enjoys the attention, and that she hasn’t got much going on beyond it. She says, “I’m going to start a new life. I can be, like, Sienna. I can be whoever I want” (Shreve, Testimony, 27). In Rob’s voice, Sienna is described this way:

I remember that Sienna started moving to the beat, a beer in her hand, as if she were in a world of her own, just slowly turning this way and that, and moving her hips to the music, and little by little the raucous laughter started to die down, and we were all just watching her. She was the music, she was the beat. Her whole little body had become this pure animal thing. She might have been dancing alone in her room. She didn’t look at any of us, even as she seemed to be looking at all of us. There was no smile on her face. If it was a performance, it was an incredible one. I don’t think anyone in the room had ever seen anything like it. She was in this light-blue halter top with these tight jeans. The heels and her little jacket were gone already. You just knew. Looking at her, you just knew.

She took off her own clothes, and “We watched as she untied her halter top at the neck. The blue cloth fell to reveal her breasts. They were beautiful and firm and rounded like her face. You knew at that moment you were in for good [. . .]” Later, he says “It was group seduction of the most powerful kind.” Given how Mike, the headmaster, describes the video in the first section, it’s hard to see Sienna as lacking agency, or as someone who’s coerced into her actions. That, in the end, is what I think makes the Caitlin Flanagans of the world so unhappy: if the Siennas will perform their dances and give it up freely and happily, does that mean other girls will have to chase the market leader? Will they have to acknowledge that a reasonably large minority of girls like the action, like the hooking up, like the exploring? If so, a lot of Western narratives about femininity go away, if they haven’t already. If you’re a novelist, you have to look at the diversity of people out there and the diversity of their desires. Shreve does this quite well. So does Francine Prose in Blue Angel. If you’re writing essays or polemics, though, you can questionably claim that teenagers are closer to their childhood selves all you want.

I like Flanagan’s writing because she’s good at interrogating what’s going on out there, but I’m not the first to notice her problems with politics; William Deresiewicz is more concise than I am in “Two Girls, True and False,” but the point is similar. Flanagan wants to imply that all people, or all girls, are the same. They aren’t. The ones unhappy with the hookup culture are certainly out there, and they might be the majority. But the Siennas are out there too. To deny them agency because they’re 14 is foolish. Matthew, J. Dot’s father, says that “The irony was that if a few kids had done something similar at the college, they’d be calling it an art film.” He’s right. Things don’t magically change at 18; our culture and legal system are designed around the fiction that they do, when the real changes happen much earlier. The gap between puberty and 18, however, is fertile ground for novelists looking for cultural contradictions.

Why we need the third way: “What Are You Going to Do With That” and the need for imagination

In “What Are You Going to Do With That?,” William Deresiewicz tells the freshman class at Stanford:

In the journey toward the success that you all hope to achieve, you have completed, by getting into Stanford, only the first of many legs. Three more years of college, three or four or five years of law school or medical school or a Ph.D. program, then residencies or postdocs or years as a junior associate. In short, an ever-narrowing funnel of specialization. You go from being a political-science major to being a lawyer to being a corporate attorney to being a corporate attorney focusing on taxation issues in the consumer-products industry. You go from being a biochemistry major to being a doctor to being a cardiologist to being a cardiac surgeon who performs heart-valve replacements.

But he goes on to point out why and how these kinds of defined professional paths—the ones high school and college students are so often told constitute “success”—might not be optimal, for either the person on the path or society in general. If you “simply go with the flow,” you can end up merely being defined by what someone else has laid out. Perhaps not surprisingly, Deresiewicz goes on to say, “There is an alternative.” He calls it “moral imagination” and defines it this way: “Moral imagination means the capacity to envision new ways to live your life.” I would call it something else: the “third way.”

Deresiewicz’s essay shows why we need more talk about the third way: there are more options out there than further advanced schooling. Stanford in particular is a good place to be reminded of this. Obviously, Deresiewicz doesn’t say you must choose grad school or the professions, but the absence of any acknowledgment that you might, say, start your own company implies that those are the two primary choices.

I’ve heard similar talk. In my interview with him, Tucker Max describes the primary speech he gives at colleges:

[. . . W]hen you’re an undergrad, generally you think you can do two things. You’re gonna have to get a job after you graduate or you gotta go do more school. Because everyone who’s giving you advice or telling you how to live your life are people who’ve done one of those two things.

He describes a “third way,” with his two normal paths defined a lot like Deresiewicz’s, but in a lower register:

You don’t generally have anyone in your life who has gone out on their own and done something entrepreneurial or done something artistic or truly risky or truly taken the path less traveled, because those people [. . .] don’t work in academics. And don’t become cubicle monkeys. So what I try and explain in my speeches is that there’s a third way. Because a lot of people—I think most people—want to do something besides those two things.

A lot of people want to do something else, but that something else is, in some ways, harder to do than the normal path. Yet the people who go the third way often talk about it as being more satisfying, and the people who go the “two paths” often speak wistfully of the third—despite the difficulty one is likely to encounter. A friend wrote this to me: “I know for a fact that I’d hate [Tucker] Max’s writing, but he’s dead right about how few students are aware that they can do something artistic or creative or entrepreneurial.” Too few students are aware of this—and too few people in general are. You can consider this post a very small step in the direction of increasing awareness.

So far I’ve noted two examples. Paul Graham talks about the problem of standard paths too, in “A Student’s Guide to Startups:” “Till recently graduating seniors had two choices: get a job or go to grad school. I think there will increasingly be a third option: to start your own startup.” His answer is more defined than Deresiewicz’s or Max’s, but the language he uses is similar. He’s also got a way of generating the “third way:” funding startups. Instead of merely telling people to find a third way, he’s building one for them to follow, which might be the most valuable contribution of all, at least for the technically inclined.

I think all three of these disparate writers—Deresiewicz, Max, and Graham—are pointing to a more fundamental need for the imagination necessary to exit the obvious paths that so often end up going nowhere. Of the three, Graham has done the most to institutionalize this process and make it available to others by starting Y Combinator. Max has probably done the most to be a living embodiment of an unusual third way. Deresiewicz is pointing to the possibility from within a well-defined path (and the same one I’m on) from undergrad to graduate school to being a professor. Taken together, they diagnose and offer treatment for the same malady that can’t quite be identified yet comes from so many sources and has so many symptoms: Dilbert, cubicles, malaise, ennui, fluorescent lights, midlife crises, 20-somethings with advanced degrees working as baristas, waiters, bartenders, or essay writers.

Artistic or creative activities don’t usually come prepackaged in convenient jobs that get handed to college graduates. They get created by people who are artistic and creative, who find a way to turn what they want to do, or their inchoate ideas, into something greater than the idea itself. The “inchoate idea” is important: I suspect most people don’t entirely know what they’re doing when they find a third way. Steven Berlin Johnson has a term for this in his book Where Good Ideas Come From: The Natural History of Innovation: the slow hunch, which happens when something you’ve been gnawing on slowly develops over time. Johnson describes it much more fully, of course, but a lot of my ideas in writing novels or academic work come from slow hunches. Writing fiction isn’t an activity that comes packaged in convenient job form: it is made by each practitioner individually. People who succeed as writers sometimes do so not through conventional publishing but through alternative routes—as Max did with his website, or as J.A. Konrath apparently does with his blog, “A Newbie’s Guide to Publishing.”

Like Deresiewicz and Max, I don’t really have a solution to the problem other than to encourage you to think imaginatively. But who’s against thinking imaginatively? Partners are probably telling their third-year associates the same thing, even as the associates put in soul-killing seventy-hour weeks under those menacing fluorescent lights. The other part of my solution is to be aware of the problem. I’ll also channel Graham in “What You’ll Wish You’d Known” and encourage you to stay upwind:

In the graduation-speech approach, you decide where you want to be in twenty years, and then ask: what should I do now to get there? I propose instead that you don’t commit to anything in the future, but just look at the options available now, and choose those that will give you the most promising range of options afterward.

It’s not so important what you work on, so long as you’re not wasting your time. Work on things that interest you and increase your options, and worry later about which you’ll take.

Suppose you’re a college freshman deciding whether to major in math or economics. Well, math will give you more options: you can go into almost any field from math. If you major in math it will be easy to get into grad school in economics, but if you major in economics it will be hard to get into grad school in math.

Flying a glider is a good metaphor here. Because a glider doesn’t have an engine, you can’t fly into the wind without losing a lot of altitude. If you let yourself get far downwind of good places to land, your options narrow uncomfortably. As a rule you want to stay upwind.

“Work on things that interest you and increase your options:” the target of Graham’s essay is nominally high school students, but it’s applicable to a much broader swath of people. Maybe you’re one of them. If so, however, you’ll probably read this and then go back to filling out those TPS reports. Or maybe you’ll be one of the very rare people who realize there is no speed limit and react appropriately. At least you can’t say that no one told you. At least three people have: Deresiewicz, Max, and Graham. Four if you count me, writing a meta essay.
