Getting good with women and how I’ve done almost everything in my life wrong: Part II

This is the second part in a series; the first part is here.

An interview between Tucker Max and me about how I used to suck with women and am now okay just went up on his Mating Grounds podcast. You should go listen or read the transcript. This essay grew out of my notes for that podcast.

Empathy

I did some online dating years ago, primarily in Seattle and a little bit in Tucson, and some of the girls I got to know showed me their message streams. Some messages were disgusting or outright idiotic, but most were boring and poorly thought out: “Hey.” “How are you?” “Your profile is interesting.” Pretty girls get dozens of them every week. Hell, even average girls do. I was thinking, “If I were a girl, I’d get turned off by all this crap too.”

Most of the girls I talked to—even the ones who just wanted to get laid—were in fact tired of all that crap. They got so many low-quality messages, or messages from guys who’d copied and pasted an initially clever come-on but couldn’t follow up. Those women wanted something a little different. They were bored, which is a point I’ll come back to later. Dating for women is different in important ways from dating for men, and I wish I’d understood that sooner.

Reading those messages also explained why I was doing fairly well, since I was deliberately trying to say something non-obvious and ideally slightly lascivious without being gross. That class of message stood out. Being tall and in shape obviously helped too. My photos were pretty good. I didn’t spend much time playing games, and if women didn’t want to meet quickly I would stop messaging them (which often made the ones who were reluctant, for whatever reason, decide they wanted to meet after all).

The girls from the Internet taught me something else useful too: some said they liked online dating because it let them meet guys without their bitchy, judgmental, hypocritical friends around (they didn’t use those words, but that’s what they meant). Without the chorus of shame squawking in their ears, real desires emerge. The real upholders of the sexual double standard are actually women, not men.

Somewhere along the way I realized that lots of women are lonely and looking for connection, and that loosened me up as far as approaching women and asking them out. I’ve asked out women on the street, in buses (if you’re a guy on the prowl you should love public transportation), in grocery lines, on running trails. Usually the conversation starts with something observational, then moves to whatever is going on that day or week. If you have nothing going on, get something going on and get talking about it. Energetic people are on average more attractive than sluggards.

I’m still not inured to rejection—is anyone?—but if a girl on the street says no, it doesn’t matter. Move on. She’ll forget, and perhaps I’ll make some other girl’s day, and she’ll go home and tell her friends that a cute stranger was hitting on her.

This isn’t something I experienced directly, but a friend’s recent adventures helped teach me too. She’s posted to a well-known amateur porn site (without her face in the shots). On this site she gets a lot of responses from viewers, and she’s shown them to my fiancée and me. They’re voluminous, amazingly bad, and unintentionally hilarious. Hundreds of guys write to her, almost all of them saying some variation on “You’re so hot” or “I want to fuck you.” And these guys have no idea where she lives.

The messages were pathetic, and when we were reading them my fiancée said something like, “This is what all women have to deal with.” In that moment so much became clear to me. I knew that intellectually, but seeing the really low-value, unsuccessful messages guys send on the Internet reinforced the point. It’s such a waste of time to send those messages. They’re more a fantasy projection than a real attempt to meet women, and every minute or second spent sending “Ur so hot show me ur butthole” is a minute or second not spent doing something useful. If I were a woman I couldn’t imagine looking for quality men on amateur porn sites. Yet these guys are doing just that, the way they’re doing it is all wrong, and they persist in doing it the wrong way.

And our friend is not the primary motivator for them. She’s reasonably attractive but not spectacular—most guys, and girls who have a taste for other girls, would be happy to date her, but you won’t see her on a Victoria’s Secret runway. Yet she’s getting this kind of response, which is a distillation and intensification of what many women experience elsewhere. In real life most guys won’t go up to women and say “show me ur butthole,” for good reason; they would probably like to, but they can’t or don’t. Online, with the cloak of pseudonymity, they’re willing to.

On a separate subject, reading Norah Vincent’s book Self-Made Man taught me about the lack of empathy women have for men. So did stories told by women about gross and very insistent guys, or nasty comments from parents and other girls. So did reading “The Daughter-Guarding Hypothesis,” which showed how fear and loathing around sexual behavior get inculcated in women from an early age.

Looks count

Looks and style matter, and like many nerds (and especially nerds growing up in nerd-infested places like Seattle) I wanted to believe they didn’t. But people make snap judgments for reasons that I now realize are quite good: we communicate a huge amount of information through what we wear, how we hold ourselves, and so forth. For both men and women, wearing clothes that fit matters. Women learn this almost immediately; it took me until I was 25 to figure it out.

Still, I started lifting and running consistently when I was 14 or 15, relatively early, and that was a definite advantage that continues to be one—not only in dating but in long-term relationships. If you’re old enough to know people who’ve been in long-term relationships, you’ll have seen the pattern in which one or both parties let themselves go, which usually coincides with taking their partner for granted. That’s probably a mistake at any time or place, but it’s really a mistake in contemporary American society, since in this society and culture the rigors of the dating market never really end. That may be a bad thing but it is a thing. You can’t let yourself go, both for your partner’s sake and because you never know when you’re going to be involuntarily dumped back into the market.

Younger people probably shouldn’t be focused on very long-term relationships because they change so much. I didn’t have a somewhat stable, developed personality until I was 24 or so. People evolve through their lives but that evolution is particularly rapid and pronounced from puberty well into the 20s. If you’re 20, chances are you won’t be dating the same person for five years. Understand that you’re going to be on the market a lot, and it’s difficult or impossible to hide from market tests.

Guys who pay attention to their posture, to what they wear, and to their workouts are in the game. Guys who don’t probably aren’t. That doesn’t mean guys have to become obsessed with these issues—I never have been—but it does mean being aware of them and taking care to do them right.

The last part is here.

Journalism, physics and other glamor professions as hobbies

The short version of this Atlantic post by Alexis C. Madrigal is “Don’t be a journalist,” and, by the way, “TheAtlantic.com thinks it can get writers to work for free” (I’m not quoting directly because the article isn’t worth quoting). Apparently The Atlantic is getting writers to work for free, because many writers are capable of producing decent-quality work and the number of paying outlets is shrinking. Anyone reading this and contemplating journalism as a profession should know that they need to seek another way of making money.

The basic problems journalism faces, however, are obvious and have been for a long time. In 2001, I was the co-editor-in-chief of my high school newspaper and thought about going into journalism. But it was clear that the Internet was going to destroy a lot of careers in journalism. It has. The only thing I still find puzzling is that some people want to major in journalism in college, or attempt to be “freelance writers.”

Friends who know about my background ask why I don’t do freelance writing. When I tell them that there’s less money in it than getting a job at Wal-Mart they look at me like I’m a little crazy—they don’t really believe that’s true, even when I ask them how many newspapers they subscribe to (median and mode answer: zero). Many, however, spend hours reading stuff for free online.

In important ways I’m part of the problem, because on this blog I’m doing something that used to be paid most of the time: reviewing books. Granted, I write erratically and idiosyncratically, usually eschewing the standard practices of book reviews (dull, two-paragraph plot summaries are stupid in my view, for instance), but I nonetheless do it and often do it better than actual newspapers or magazines, which I can say with confidence because I’ve read so many dry little book reports in major or once-major newspapers. Not every review I write is a critical gem, but I like doing it and thus do it. Many of my posts also start life as e-mails to friends (as this one did). I also commit far more typos than a decently edited newspaper or magazine, though I do correct them when readers point them out.

The trajectory of journalism is indicative of other trends in American society and indeed the industrialized world. For example, a friend debating whether he should consider physics grad school wrote this to me recently: “I think physics is something that is fun to study for fun, but to try to become a professional physicist is almost like too much of a good thing.” He’s right. Doing physics for fun, rather than trying to get a tenure-track job, makes more sense from a lifestyle standpoint.

A growing number of what used to be paid occupations seem to be moving in this direction. Artists got here first, but others are making their way here. I’m actually going to write a post about how journalism increasingly looks like this too. The obvious question is how far this trend will go—what happens when many jobs that used to be paid become unpaid?

Tyler Cowen thinks we might be headed towards a guaranteed annual income, an idea that was last popular in the 60s and 70s. When I asked Cowen his opinions about guaranteed annual incomes, he wrote back to say that he’d address the issue in a forthcoming book. The book hasn’t arrived yet, but I look forward to reading it. As a side note, apparently Britain has, or had, a concept called the “Dole,” which many people went on, especially poor artists. Geoff Dyer wrote about this some in Otherwise Known as the Human Condition. The Dole subsidized a lot of people who didn’t do much, but it also subsidized a lot of artists, which is pretty sweet; one can see student loans and grad school serving analogous roles in the U.S. today.

Even in programming, which is now the canonical “Thar be jobs!” (pirate voice intentional) profession, some parts—like languages and language development—basically aren’t remunerative. Too many people will do the work for free because it’s fun, like amateur porn. In the 80s there were many language and library vendors, but nearly all have died, and libraries have become either open source or rolled into a few large companies like Apple and Microsoft. Some aspects of language development are cross-subsidized in various ways, like professors doing research, or companies paying for specific components or maintenance, but it’s one field that has, in some ways, become like photography, or writing, or physics, even though programming jobs as a whole are still pretty good.

I’m not convinced that the artist lifestyle of living cheap and being poor in pursuit of some larger goal or glamor profession is good or bad, but I do think it is a thing (that we have a lot of good cheap stuff out there, and especially cheap stuff in the form of consumer electronics, may help: it’s possible to buy or acquire a nearly free, five-year-old computer that works perfectly well as a writing box).* Of course, many starving artists adopt that as a pose—they think it’s cool to say they’re working on a novel or photography project or “a series of shorts” or whatever, but don’t actually do anything, while many people with jobs put out astonishing work. Or at least work, which is usually a precursor to astonishing work.

For some people, the growing ability to disseminate ideas and art forms without being paid is a real win. In the old days, if you wanted to write something and get it out there, you needed an editor or editors to agree with you. Now we have a direct way of resolving questions about what people actually want to read. Of course, the downside is that whole payment thing, but that’s the general downside of the new world in which we live, and, frankly, it’s one that I don’t have a society-wide solution for.

In writing, my best guess is that more people are going to book-ify blogs and try to sell the book for $1 – $5, under the (probably correct) assumption that very few people want to go back and read a blog’s entire archives, but an ebook could collect and organize the material of those archives. If I read a powerful post by someone who seemed interesting, I’d buy a $4 ebook that covered their greatest hits or introduced me to their broader thinking.

This is tied into other issues around what people spend their time doing. My friend also wrote that he read “a couple of articles on Keynes’ predictions of utopia and declining work hours,” but he noted that work still takes up a huge amount of most people’s lives. He’s right, but most reports show that median hours worked in the U.S. have declined, and male labor force participation has declined precipitously. Labor force participation in general is surprisingly low. Ross Douthat has been discussing this issue in The New York Times (a paid gig, I might add), and, like most reasonable people, he has a nuanced take on what’s happening. See also this Wikipedia link on working time for some arguments that working time has declined overall.

Working time, however, probably hasn’t decreased for everyone. My guess is that working time has increased for some smallish number of people at the top of their professions (think lawyers, doctors, programmers, writers, business founders), with people at the bottom often relying more on government or gray-market income sources. Douthat starts his essay by saying that we might expect working hours among the rich to decline first, so they can pursue more leisure, but he points out that the rich are working more than ever.

Though I am tempted to put “working” in scare quotes, because it seems like many of the rich are doing things they would enjoy doing on some level anyway; certainly a lot of programmers say they would keep programming even if they were millionaires, and many of them become millionaires and keep programming. The same is true of writers (though fewer become millionaires). Is writing a leisure or work activity for me? Both, depending. If I self-publish Asking Anna tomorrow and make a zillion dollars, the day after I’ll still be writing something. I would like to get paid but some of the work I do for fun isn’t contingent on me getting paid.

Turning blogs into books and self-publishing probably won’t replace the salaries that news organizations used to pay, but it’s one means for writers or would-be writers to get some traction.

Incidentally, the hobby-ification of many professions makes me feel pretty good about working as a grant writing consultant. No one thinks when they’re 14, “I want to be a grant writer like Isaac and Jake Seliger!”, while lots of people want to be famous actors, musicians, or journalists. There is no glamor, and grant writing is an example of the classic aphorism “Where there’s shit, there’s gold” at work.

Grant writing is also challenging. Very few people have the weird intersection of skills necessary to be good, and it’s a decade-long process to build those skills—especially for people who aren’t good writers already. The field is perpetually mutating, with new RFPs appearing and old ones disappearing, so that we’re not competing with proposals written two years ago (whereas many novelists, for example, are in effect still competing with their peers from the 20s or 60s or 90s).

To return to journalism as a specific example, I can think of one situation in which I’d want The Atlantic or another big publisher to publish my work: if I were worried about being sued. Journalism is replete with stories about heroic reporters being threatened by entrenched interests; Watergate and the Pentagon Papers are the best-known examples, but even small-town papers turn up corruption in city hall and so forth. As centralized organizations decline, individuals are to some extent picking up the slack, but individuals are also more susceptible to legal and other threats. If you discovered something nasty about a major corporation and knew they’d tie up your life in legal bullshit for the next ten years, would you publish, or would you listen to your wife telling you to think of the kids, or your parents telling you to think about your career and future? Most of us are not martyrs. But it’s much harder for Mega Corp or Mega Individual to threaten The Atlantic and similar outlets.

The power and wealth of a big media company has its uses.

But such a use is definitely a niche case. I could imagine some of the bigger foundations, like ProPublica, offering a legal umbrella to bloggers and other muckrakers to mitigate such risks.

I have intentionally elided the question of what people are going to do if their industries turn towards hobbies. That’s for a couple reasons: as I said above, I don’t have a good solution. In addition, the parts of the economy I’m discussing here are pretty small, and small problems don’t necessarily need “solutions,” per se. People who want to turn their hours into a lot of income should try to find ways and skills to do that, and people who want to turn their hours into fun products like writing or movies should try to find ways to do that too. Crying over industry loss or change isn’t going to turn back the clock, and just because someone could make a career as a journalist doesn’t mean they can today.


* To some extent I’ve subsidized other people’s computers, because Macs hold their value surprisingly well and can be sold for a quarter to half of their original purchase price three to five years after they’ve been bought. Every computer replaced by my family or our business has been sold on Craigslist. It’s also possible, with a little knowledge and some online guides, to add RAM and an SSD to most computers made in the last couple of years, which will make them feel much more responsive.

Life: Love edition

“[T]he choice one makes between partners, between one man and another (or one woman and another) stretches beyond romance. It is, in the end, the choice between values, possibilities, futures, hopes, arguments (shared concepts that fit the world as you experience it), languages (shared words that fit the world as you believe it to be) and lives.”

—Zadie Smith, Changing My Mind

Martin Amis, the essay, the novel, and how to have fun in fiction

There’s an unusually interesting interview with Martin Amis in New York Magazine, where he says:

I think what has happened in fiction is that fiction has responded to the fact that the rate of history has accelerated in this last generation, and will continue to accelerate, with more sort of light-speed kind of communications. Those huge, leisurely, digressive, essayistic, meditative novels of the postwar era—some of which were on the best-seller lists for months—don’t have an audience anymore. [. . .]

No one is writing that kind of novel now. Well [. . . ] David Foster Wallace—that posthumous one looks sort of Joycean and huge and very left-field. But most novelists I think are much more aware than they used to be of the need for forward motion, for propulsion in a novel. Novelists are people too, and they’re responding to this just as the reader is.

I think people aren’t reading “essayistic, meditative novels” because the phrase reads like code for “boring.” In addition, we’re living in “The Age of the Essay.” We don’t need novelists to write essays disguised as novels when we can get the real thing in damn near infinite supply.

The discovery mechanisms for essays are getting steadily better. Think of Marginal Revolution, Paul Graham’s essays, Hacker News, The Feature, and others I’m not aware of. Every Saturday, Slate releases a link collection of 5 – 10 essays in its Longform series. Recent collections include the Olympics, startups, madness in Mexico, and disease. The pieces selected tend to be deep, simultaneously intro- and extrospective, substantive, and engaging. They also feel like narrative, and nonfiction writers routinely deploy the narrative tricks and voice that fiction pioneered. The best essay writers have the writing skill of all but perhaps the very best novelists.

As a result, both professional (in the sense of getting paid) and non-professional (in the sense of being good but not earning money directly from the job) writers have an easy means of publishing what they produce. Aggregators help disseminate that writing. A lot of academics who are experts in a particular subject have fairly readable blogs (many have no blogs, or unreadable blogs, but we’ll focus on the readable ones), and the academics who once would have been consigned to journals now have an outlet—assuming they can write well (many can’t).

We don’t need to wait two to five years for a novelist to decide to write a Big Novel on a topic. We often have the raw materials at hand, and the raw material is shaped and written by someone with more respect for the reader and the reader’s time than many “essayistic” novelists. I’ve read many of those, chiefly because they’ve been assigned at various levels of my academic career. They’re not incredibly engaging.

This is not a swansong about how the novel is dead; you can find those all over the Internet, and, before the Internet, in innumerable essays and books (an awful lot of novels are read and sold, which at the very least gives the form the appearance of life). But it is a description of how the novel is, or should be, changing. Too many novels are self-involved and boring. Too many pay too little attention to narrative pacing—in other words, to their readers. Too many novels aren’t about stuff. Too many are obsessed with themselves.

Novels might have gotten away with these problems before the Internet. For the most part, they can’t any more, except perhaps among people who read or pretend to read novels in order to derive status from their status as readers. But being holier-than-thou via literary achievement, if it ever worked all that well, seems pretty silly today. I suppose you could write novels about how hard it is to write novels in this condition—the Zuckerman books have this quality at times, but who is the modern Zuckerman?—but I don’t think anyone beyond other writers will be much interested.

If they’re not going to be essayistic and meditative, what are novels to be? “Fun” is an obvious answer. The “forward motion” and “propulsion” that Amis mentions are good places to start. That’s how novels differ, ideally, from nonfiction.

Novels also used to have a near-monopoly on erotic material and commentary. No more. If you want to read something weird, perverse, and compelling, Reddit does a fine job of providing it (threads like “What’s your secret that could literally ruin your life if it came out?” provide what novels used to).

Stylistically, there’s still the question of how weird and attenuated a writer can make individual sentences before the work as a whole becomes unreadable or boring or both. For at least a century and change, writers could go further and further in breaking grammar, syntax, and point of view rules while still being comprehensible. By the time you get to late Joyce or Samuel Beckett’s novels, however, you start to see the limits of incomprehensibility and rule breaking regarding sentence structure, grammar, or both.

Break enough rules and you have word salad instead of language.

Most of us don’t want to read word salad, though, so Finnegans Wake and Malone Dies remain the province of specialists writing papers to impress other specialists. We want “forward motion” and “propulsion.” A novel must delight in terms of the plot and the language used. Many, many novels don’t. Amis is aware of this—he says, “I’m not interested in making a diagnostic novel. I’m 100 percent committed in fiction to the pleasure principle—that’s what fiction is, and should be.” But I’m not sure his fiction shows this (House of Meetings and Koba the Dread suggest otherwise). Nonetheless, I’m with him in principle, and, I hope, practice.

How to think about science and becoming a scientist

A lot of students want to know whether they should major in the humanities, business, or science, which is a hard choice because most of them have no idea whatsoever about what real science (or being a scientist) is like, and they won’t learn it from introductory lectures and lab classes. So freshmen and sophomores who are picking majors don’t, and can’t, really understand what they’re selecting—or so I’ve been told by a lot of grad students and youngish professors who are scientists.

One former student recently wrote me to say, “I was a biochemistry major with a dream of being a publisher and long story short, I am no longer a biochem major and I am going full force in getting into the publishing field right now” (emphasis added). I discouraged her from going “into” publishing, given that I’m not even convinced there is going to be a conventional publishing industry in five years, and forwarded her e-mail to a friend who was a biochemistry major. My friend’s response started as a letter about how to decide if you want to become a scientist but turned into a meditation on how to face the time in your life when you feel like you have to decide what, if anything, you want to become.


The thing about being “interested” in science is that the undergraduate survey classes rarely confirm if you really are. They’re boring. Rote. Dull. I credit my Bio 101 teacher with making the delicate, complicated mysteries of carbon-based life as engaging as listening to my Cousin “M” discuss the subtle differences among protein powder supplements. I spent most of class surfing Wikipedia on my laptop. The next semester I weaseled my way into an advanced cell bio class that was fast and deep and intellectually stimulating, thanks to an eccentric teacher with a keen mind and a weird tendency to act out enzymatic reactions in a sort of bastardized interpretive dance. I dropped Bio 102, which didn’t cripple my ability to keep up with advanced cell bio in any way (showing that survey classes can be unnecessary, boring, and confusing—confusing primarily because they leave out the details that are supposed to be too “advanced” but in fact clarify what the hell is going on), and got an unpaid research position in a faculty lab that eventually turned into a paid gig. By the way: there is significant pressure to dumb survey courses down and virtually no pressure on professors to teach them well; there are still good ones, but don’t let the bad ones dissuade you.

If any field of scientific inquiry interests you, if you have the impulse to ask your own questions and are excited by the idea that you can go find the answers yourself and use what you’ve discovered to tinker and build and ask new questions—which is to say, if you like the idea of research—you’ve got a much better chance of figuring out if you want to be a scientist. How? Go and be one. Or, at least, play at being a scientist by finding a lab that will train you at doing the work until you wake up one day and realize that you are teaching a new undergrad how to program the PCR machine and your input is being used to develop experiments.

I was a biochemistry undergrad major, and I absolutely deplored the labs that were required by classes, but it turned out I loved the actual work of being in a lab. Classes lacked the creativity that makes science so appealing; they felt designed to discourage interest in science. In class, we had 50 minutes to purify a protein and learn to use the mass spectrometer. Big deal. Can I go now? But put me in front of the PCR machine with a purpose? I’d learn how to use it in an afternoon because doing so meant that I was one step closer to solving a problem no one had solved before. You don’t find that kind of motivation in most classrooms. And you don’t need a Ph.D. to contribute to the field. All you need is intellectual appetite. (For an exception to the “class is boring” rule, check out Richard Feynman’s intro to physics lectures.)

So: I didn’t like glossing over information, memorizing for tests, and being told I had till the end of class to identify how many hydrogen ions were in compound X. I wasn’t excited by my major, but I was excited by my subject—and the longer I spent working in a real lab with a professor who saw that I was there every day and willing to learn (he eventually gave me a pet project), the more engaged I became with biochemistry. Sure, the day-to-day involved a lot of pipetting and making nutrient-agar plates to grow bacteria on, but I was working towards something larger than a grade.

I was splicing the DNA of glucose galactose binding protein and green fluorescent protein to try to make a unique gene that could express a protein which fluoresced when binding to glucose. In essence, a protein flare. Then I built it into an E. coli plasmid so it would self-replicate, while a lab in Japan was trying to get the gene expressed into what effectively turned into glow-under-blacklight-just-add-sugarwater mice. The goal was to get the gene expressed in diabetic people who could wear a fluorimeter watch and check how brightly the genetically engineered freckle on their wrist glowed, in lieu of pricking their finger to check their blood glucose.

Do you have any idea how awesome it was to describe my research at parties? I left out the parts where I had to evacuate the whole lab for four hours after accidentally creating a chlorine cloud and especially the parts where I spent an entire day making 250 yeast-agar plates and went home with “nutrient powder” in my hair and lungs. But even with the accidents and drudgery, the bigger goal was motivating. Being part of the real scientific conversation gave my work purpose that a grade did not. I dreamed of building nanobots and splicing the DNA together to build biological machines. It sure as hell beat memorizing the Krebs cycle in return for a 4.0 GPA and med school.

That is what I love about science: you get to build something, you get to dream it up and ask questions and see if it works and even if you fail you learn something. What I loved was a long way from the dreary crap that characterizes so many undergrad classes. To be fair, the day-to-day isn’t all that whiz bang, but it’s rarely Dilbert-esque and I really liked the day-to-day challenges. There was something zen about turning on music and pipetting for three hours. That was right for me. It might not be for you; if you’re trying to be a scientist or get a feel for what science is like (more on that below), don’t be afraid to expose yourself to multiple labs if the first doesn’t work out for you.

My own heart will always be that of a splice-n-dicer. I’ll always love fiddling with DNA more than purifying compounds over a Bunsen burner. But you don’t know what day-to-day tasks will give you the most pleasure. You don’t yet know that you might find zen at 3 a.m. setting up DNA assays, your mind clear, the fluid motion of your hand pulling you into a state of flow. You find out by doing, and you might be surprised—especially because classes don’t give you a good sense of what the life of a scientist is like. They also don’t introduce you to the graduate students, the post-doctorates and the assistant professors who show you what kind of struggle comes from love, which in turn generates internal motivation. They don’t take you away from your university into summer programs that show you how amazing it is to be in a lab with real money and the resources to make your crazy ideas possible.

Which brings me to choosing a field: If you like science, but don’t know what kind, pick the most substantive one that interests you, with as much math as you’re willing to handle, and just get started (math is handy because it applies virtually everywhere in the sciences). Chemistry, biochem and biology overlap to such a degree that I was working in a biochem lab on a genetics project with the goal of creating a protein, and biology labs worked with us in tandem. When you get into the real work, the lines between fields blur. You can major in biochem and get a Ph.D. in neuroscience, or study organic chemistry and work for a physical chemistry research firm. Other scientists don’t care about what classes you took or what your degree says—they care about what you know, what you can do, and whether what you can do can be applied in a useful way. When in doubt, focus on developing technical skills more than the words on your degree.

One summer I applied to the Mayo Clinic Summer Undergraduate Research Fellowship (something I recommend anyone interested in science do—there are “SURF” programs at almost every major university and research center and they will give you a stipend, housing and exposure to a new lab. It can do amazing things for your CV, your career and your relationship to the larger scientific community. In math and some other fields, your best bet is the NSF’s Research Experiences for Undergraduates (REU) Program). But I didn’t get the job. I had six months in a lab at that point. I had a 3.96 GPA. I had a pretty great “why me” essay. Still, nothing.

A year later I applied again. By that time I’d been in the lab for a year and a half. I knew how to handle most of our major equipment. My CV described the tasks I could perform unsupervised, the problems I tackled by myself, and solutions I’d found. My advisor explained my role and the amount of autonomy I had been given. This time I got the job. When I met with the director of my summer lab in person, he made it clear that there were many fine applicants with stellar GPAs. I’d never even worked with radioactive-iodine-tagged proteins. They picked me because they knew undergrads only had three months to get substantive research done, and they simply didn’t have time to train someone (especially someone who might turn out to lack tenacity). They needed someone who knew how to work in the lab and could adapt quickly. They needed someone who knew how to work the machines my college lab used, and someone who knew how to work with E. coli plasmids. I could do that.

So pick whatever you think you like best, start with that, find a lab, and learn how to be adept at as many basic lab skills as possible. Delve more deeply into the ones associated with your research. Be ready to work when the right opportunity and research lab come along. The new lab will always ask what skills you have and whether they can be applied to the questions their lab is trying to solve, even if you’ve never asked similar questions. A chemistry major could therefore be what a certain biology lab needs at a given time.

A lot of what is frustrating and off-putting about science at first, including working in the research lab, is the same thing that’s frustrating and off-putting about math: to really enter the conversation you have to have the vocabulary, so there’s a lot of memorizing when you start. Which is just obnoxious. But it doesn’t take too long, and if you start interning in a lab early, the memorizing feels justifiable and pertinent, even if you feel initially frustrated at a) not knowing the information and b) not knowing how to apply it. If you don’t get into a lab, however, the memorizing just feels hard and pointless (even though it isn’t).

(Virtually all fields have this learning curve, whether you realize it or not; one of Jake’s pet books is Daniel T. Willingham’s Why Don’t Students Like School?: A Cognitive Scientist Answers Questions About How the Mind Works and What It Means for the Classroom, which describes how people move from no knowledge to shallow knowledge to deep knowledge. It’s bewildering and disorienting to start with no knowledge of a subject, but you have to endure and transcend that state if you’re going to move to deep knowledge. He says that he’s climbed that mountain with regard to writing, which makes writing way more rewarding than it used to be.)

Once you have the language and are able to think about, say, protein folding, the way you would a paragraph of prose, or the rhythm in a popular song, science takes on a whole new life, like Frankenstein’s Monster but without the self-loathing or murder. You start to think about what questions you can ask, what you can build, and what you can do—as opposed to what you can regurgitate. The questions you pose to people in your lab will lead to larger conversations. Feeling like an insider is nice, not only because it’s nice to belong, but because you’ll realize that even being a small part of the conversation means you’re still part of the larger discussion.

Science is exciting, but not until you find a way to break through the barriers and into the real thing, so don’t give up prematurely. Like most things, however, your experience depends on whether you have or make the right opportunities. I went to medical school after a grad school detour. How I feel about that decision is an entirely different essay, and one I’ll post later. I ended up specializing in Emergency Medicine because I had enthusiastic ER docs teaching me. Before residency, I thought I’d do anesthesia, but the profs were boring and it seemed like awful work. I’m on a fabulous anesthesia rotation right now, the medical equivalent of a Riviera cruise, and am thinking, “Hey! Maybe I should have done this.” Same with rehab medicine. It’s a perfect fit for me, but I had two boring weeks of it in a non-representative place and so wasn’t about to sign myself over to a whole career without having any more to base my opinion on.

Some days I think that if I’d had a different lab, which exposed me to different things, or if my Mayo summer had given me different connections, I would be pipetting merrily away at Cold Spring Harbor Laboratory, building a nanobot that would deliver the next big cancer treatment on a cellular level. Or maybe I would be a disgruntled post-doc, wishing that I could finally have a lab of my own. Or working for Pfizer. Anything could have changed my path. And just because you choose to study something you love doesn’t mean you’ll succeed.

But not choosing to study something you love is even worse. Point is, most choices in life are luck and chance, but you shouldn’t discard viable options—especially ones in science—based on a couple of survey courses designed to move the meat. Universities do view freshmen as piggybanks whose tuition dollars fund professors’ salaries and research, which is why they cram 1,000 of you into lecture halls and deliver an industrial-grade product that’s every bit as pleasant as industrial-grade mystery meat. Unfortunately, those classes are often the only real way to be exposed to a subject and find out whether you like it, unless you seek out more representative, real-world opportunities. Most universities won’t go out of their way to shunt you into those opportunities. You have to want them and seek them out. So if you think you like biology? Or physics? Read The Elegant Universe,** The Greatest Show on Earth, or a history of the polio vaccine. See if it stirs you.

That being said, if you don’t like science, you don’t like it; I’m just warning you that what you think you don’t like might simply be due to not quite knowing enough or having negative exposures. Still, you can have all the best intentions, follow my advice, find a great lab, try out different opportunities if the first or second don’t work out, and decide it’s just not for you. You probably can’t force it to be your passion, but you probably also underestimate the extent to which you, like most people, have a range of possible passions. I only caution you to make sure that you aren’t basing your choice on one bad class or a crappy lab advisor. This is good advice in any field.

Here’s an example of possible premature optimization: I received an email from Jake’s former student, saying she was thinking about being a judge as a “backup,” in case a career in publishing didn’t work out. Being a judge, that’s a life goal. A big one. And law does not make good on its promise of a comfortable income the way it once did. For more on that, see the Slate.com article “A Case of Supply v. Demand: Law schools are manufacturing more lawyers than America needs, and law students aren’t happy about it,” which points out that there are too many certified and credentialed “lawyers” for the amount of legal work available. Plus, while society needs a certain number of lawyers to function well, too many lawyers leads to diminishing returns as lawyers waste time ginning up work by suing each other over trivialities or chasing ambulances.

By contrast, an excess of scientists and engineers means more people who will build the stuff that lawyers then litigate over. Scientists and engineers expand the size of the economic pie; lawyers mostly work to divide it up differently. Whenever possible, work to be a person who creates things, instead of a person who tries to take stuff created by someone else. There is an effectively infinite amount of work in science because the universe is big and we don’t really understand it and we probably never will.* New answers to questions in science yield more questions. More lawsuits launched by lawyers just yield fighting over scraps.

Another personal example: I wasn’t just queen of the lab nerds. Sure, I tie-dyed my lab coat and dated a man who liked to hear me read aloud from my organic chemistry textbook, but I also wanted to write: not academic papers and book chapters, but novels and essays. I’d always been dual-minded and never bought the “Two Cultures” idea (one culture scientific and one humanistic) described in C.P. Snow’s eponymous book. This bifurcation is, to speak bluntly, bullshit. As a kid I spent as much time trying to win the science fair as I did submitting poetry to Highlights. May 1994’s Grumpy Dog issue was my first publication. You may have read it and enjoyed the story of “Sarah, the new puppy.” Or you may not have been born yet. That was me as a kid. As an adult, I’m not confined to science either—and neither is any other scientist.

I imagine many of you reading this post who are struggling with whether or not to be a scientist are, fundamentally, not struggling with what you want to major in, but with what you want to be and how your decisions in college influence your options. Many of you are likely creatively-minded, as scientific types often are, despite how “poindexter” characters are portrayed on popular T.V. Staying close to your passions outside the test tube gives you the creative spark that makes your scientific thinking unique and fresh. So you don’t have to pick science and say, “That’s it, I’m a scientist and only a scientist.” You become a scientist and say: Now what do I want to build/ask/figure out?


Jake again:

So what should you do now to get into science? Here’s a list that I, Jake Seliger the non-scientist, wrote, based on the experiences described by friends in the sciences:

0) Look for profs in your department who are doing research in an area in or adjacent to what you might be interested in doing.

1) Read a couple of their recent papers. You probably won’t understand them fully, but you should try to get at least a vague sense of what they’re doing. You may want to prepare a couple of questions you can ask in advance; some profs will try to weed out people who are merely firing off random e-mails or showing up at office hours to beg.

2) Look for a website related to their lab or work, and try to get a sense of whether you might be interested in their work. Chances are you won’t be able to tell in advance. You should also figure out who their grad students are—most science profs will have somewhere between one and dozens of students working under them.

3) Go meet with said prof (or grad students) and say, “I’m interested in X, I’ve read papers W, Y, and Z, and I’d like to work in your lab.” Volunteer, since you probably won’t get paid at first.

4) They might say no. It’s probably not personal (rejection is rarely personal in dating, either, but it takes many people years or decades to figure this out). If the prof says no, go work on the grad students some, or generally make yourself a pest.

5) Try other labs.

6) Don’t give up. This is a persistent theme in this essay for good reason.

7) Keep reading papers in the area you’re interested in, even if you don’t understand them. Papers aren’t a substitute for research, but you’ll at least show that you’re interested and learn some of the lingo. Don’t underestimate the value of knowing a field’s jargon. Knowing the jargon can also be satisfying in its own right.

8) Take a computer science course or, even better, computer science courses. Almost all science labs have programming tasks no one wants to do, and your willingness to do scutwork will make you much more valuable. Simple but tedious programming tasks are the modern lab equivalent of sweeping the floor (a hypothetical sketch of one such task follows this list).
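To make “simple but tedious” concrete, here’s a minimal sketch of the kind of scutwork script a lab might hand a new undergrad. It’s a hypothetical example: the file name, the column names, and the task itself are all invented for illustration, not taken from any particular lab’s pipeline. It just averages replicate absorbance readings from a plate-reader CSV.

```python
# Hypothetical lab scutwork script: summarize replicate absorbance
# readings from a plate-reader CSV with columns "well" and "absorbance".
# The file and column names are invented for illustration.
import csv
import statistics
from collections import defaultdict

def summarize_plate(path):
    """Return {well: (mean, stdev)} for replicate absorbance readings."""
    readings = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            readings[row["well"]].append(float(row["absorbance"]))
    return {
        well: (statistics.mean(values),
               statistics.stdev(values) if len(values) > 1 else 0.0)
        for well, values in readings.items()
    }

if __name__ == "__main__":
    # Print one line per well, e.g. "A1: 0.482 ± 0.013".
    for well, (mean, sd) in sorted(summarize_plate("plate1.csv").items()):
        print(f"{well}: {mean:.3f} ± {sd:.3f}")
```

Nothing here is hard, and that’s the point: a fifteen-minute script like this can save a grad student an afternoon of spreadsheet drudgery, which is exactly why being the person who can write it makes you more valuable.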

If you don’t have bench research experience, you probably won’t get into grad school, or into a good grad school. You might have to pay for an MA or something like that to get in, which is bad. If you’re thinking about grad school, read Louis Menand’s The Marketplace of Ideas as soon as possible. See also Penelope Trunk’s Don’t Try to Dodge the Recession with Grad School and Philip Greenspun’s Women in Science. Ignore the questionable gender comments Greenspun makes and attend to his discussion of what grad school in the sciences is like, especially this, his main point: “Adjusted for IQ, quantitative skills, and working hours, jobs in science are the lowest paid in the United States.”

Another: Alex Tabarrok points out in his book Launching The Innovation Renaissance: A New Path to Bring Smart Ideas to Market Fast that we appear to have too few people working in technical fields and too many majoring in business and dubious arts fields (notice that he doesn’t deal with graduate school, which is where he diverges from Greenspun). In his blog post “College has been oversold,” Tabarrok points out that student participation in fields that pay well and are likely “to create the kinds of innovations that drive economic growth” is flat. On an anecdotal level, virtually everyone I know who majored in the hard sciences and engineering is employed. Many of those who, like me, majored in English, aren’t.

According to a study discussed in the New York Times, people apparently leave engineering because it’s hard: “The typical engineering major today spends 18.5 hours per week studying. The typical social sciences major, by contrast, spends about 14.6 hours.” And:

So maybe students intending to major in STEM fields are changing their minds because those curriculums require more work, or because they’re scared off by the lower grades, or a combination of the two. Either way, it’s sort of discouraging when you consider that these requirements are intimidating enough to persuade students to forgo the additional earnings they are likely to get as engineers.

There’s another way to read these findings, though. Perhaps the higher wages earned by engineers reflect not only what they learn but also which students are likely to choose those majors in the first place and stay with them.

Don’t be scared by low grades. Yes, it’s discouraging to take classes where the exam average is 60, but keep taking them anyway. Low grades might be an indication that the field is more intellectually honest than one with easy, high grades.

While I was writing and editing this essay, the usual panoply of articles about topics like “science majors are more likely to get jobs” was published. You’ve probably read these articles. They’re mostly correct. The smattering linked to here is just the set that happened to catch my attention.

Science grads may not get jobs just because science inherently makes you more employable—it may be that more tenacious, hard-working, and thus employable people are inclined to major in the sciences. But that means you should want to signal that you’re one of them. And healthier countries in general tend to focus on science, respect science, and produce scientists; hence the story about the opposite in “Why the Arabic World Turned Away from Science.”

If you’re leaving science because the intro courses are too hard and your friends majoring in business are having more fun at parties, you’re probably doing yourself a tremendous disservice that you won’t even realize until years later. If you’re leaving science because of a genuine, passionate interest in some other field, you might have a better reason, but it still seems like you’d be better off double majoring or minoring in that other field.


My friend again, adding to what I said above:

As someone who was going to do the science PhD thing before deciding on medical school, I agree with most of what Jake says. Let me emphasize: you will have to volunteer at first because you don’t have the skills to be hired in a lab for a job that will teach you something. Being hired without previous experience usually means the job doesn’t require the skills you want to learn, and so you won’t learn them. So you don’t want that job.

I had a paying job in a lab, so you can get them eventually—but I only started getting paid after I’d worked in the lab for a year, and even then the pay was more like a nice boost: the money just happened to show up because they thought, “What the heck, she’s been helpful.” Think of this time as paying your way into graduate school, because if you don’t have lab work, no matter how good your grades are, you will not get into a good graduate school with funding.

Here’s why: You have a limited amount of time in graduate school, and you’re not just there to do independent research and learn. You’re there to do research with the department, and they need you to start immediately. If you already have years of bench research experience, the department and its professors know you can—and there is no substitute for experience.

The place where you really learn how to work in a lab and develop your skills is in one, not in the lab classes where you learn, at best, some rote things (plus, you need to know if you like the basic, day-to-day experience of working in a lab and the kind of culture you’ll find yourself in; not everyone does). Even if you do learn the tools you need for a certain lab in class, that doesn’t demonstrate that you’re actually interested in research.

The only thing that demonstrates an interest in research, which is all graduate school really cares about, is working in a lab and doing real research. I can’t stress that enough, which is why I’ve repeated it several times in this paragraph. A 4.0 means you can study. It doesn’t mean you can do research. People on grad school committees get an endless number of recommendation letters that say, “This candidate did well in class and got an ‘A.'” Those count for almost nothing. People on grad school committees want letters that say, “This candidate did X, Y, and Z in my lab.”

I recommend starting with your professors—the ones whose classes you’ve liked and who know you from office hours. Hit them up first. Tell them your goal is to be a scientist and that, while academics are nice, you want to start being a scientist now. If they don’t have space for you, tell them to point you in the direction of someone who does. Keep an open mind. Ask everybody. I was interested in nanobots, gene work, molecular what-nots, etc.

I started by asking my orgo [“organic chemistry” to laymen] teacher. Nothing. I asked my Biochem [“biological chemistry” or “biochemistry”] professor and was welcomed with open arms. Point is, if the labs you want have no space, go to another. Don’t give up. Be relentlessly resourceful. Be tenacious—and these aren’t qualities uniquely useful to scientists.

The skills I ended up with in the biochem lab turned out to be almost 100% on point with what I wanted to do later, even though the research was different. The kind of research you end up doing usually springs from the lab skills you have, and it’s much harder to decide what you want and try to find a lab that will give you those skills. So instead of trying to anticipate what research you’ll want to do from a position where you can’t know, just learn some research skills. Any skills are better than none. Then you have something to offer the lab you want when space / funding becomes available. I took what I learned in that biochem lab and spent a summer doing research on protein folding—it wasn’t like my initial research, but the prof needed someone who knew how to do X, Y and Z, which I did, and he was willing to train me on the rest.

You’ll face other decisions. For example, in many fields you’ll have to decide: do you want wet-lab research (this does not refer to adult entertainment) or do you want more clinical research? “Wet lab” means that you’re mucking with chemicals, or big machines, and stuff like that. Clinical research means you’re dealing more with humans, or asking people questions, or something along those lines. I would suggest the wet lab if you think you may be even slightly interested (sort of like how you should experiment with lovers when you’re in college if you think you’re even slightly interested in different sorts of things). In fact, I’d suggest wet-lab work or some sort of computational lab in general, because clinical research skills can be extrapolated from wet lab—but not vice versa.

You can show that you can think if you’re in a clinical lab, but in a wet-lab you need to be able to think and use a pipette. Or that you can use modeling software, if you’re interested in the computer side of things. That’s where the programming comes in handy if you’re capable of doing it; if not, then I feel less strongly than Jake about programming, because often labs need someone with serious computer training, like a post doc, if their research revolves around modeling. But it could come in handy for you, anyway, and certainly couldn’t hurt, so if you’re interested it could be an extra perk.

Once you’re in the lab, if you want to learn skills outside what you’re working with, ask. Ask everyone. Ask the computer guy, ask the woman on the other project. Get whatever experience you can, get good at it, then put it on your C.V. and make sure you can explain it clearly when someone asks; even if you’re not an expert, be able to play one on T.V.

As for #3, about figuring out who their grad students are: I also find that less important. You need to talk to the principal investigator, the guy who runs the lab. If he’s not interested in you, it’s not worth going through grad student channels to convince him to take you. Someone is going to want you, and it’s best to go there in both science and love. Don’t fall for the false lead of the pretty girl in the alluring dress who disappears as soon as you get close. You can always try alternate channels later if you really want to get back into lab #1.

Think of it this way: if you’re struggling just to get a foot in the door, you’re going to struggle to get any research done. Not that the research will feel meaningful at first: you’ll be doing tasks assigned to you. But you should feel like this gets better, that you get more independence. And if that’s not the ethos of the lab to start with, it never will be. As I mentioned before, if I’d gotten into that orgo lab, I’d have been a scut monkey for years.

As Jake said: read your professors’ papers. You probably won’t have any idea what’s going on. I still have no idea what’s going on half the time, but read ’em anyway. It shows you’re worth the effort, especially when you ask for that lab spot. Jake’s 100% right about ways to get your professors’ attention.

Don’t give up. Just don’t give up. Take “no” for an answer and you can kiss grad school (at least a good PhD program with full funding, which is what you want: don’t pay for this under any circumstances) goodbye. Scientists are distinguished by their tenacity, whether they’re in grad school or not. And make sure you know what you’re giving up before you do.

What kind of research are you interested in? What gets you going? Even if you’re not sure, there are a certain number of fundamental skills that, I believe, will get you into whatever lab you want if you’re familiar with them, because they’re used in most labs and show you’re trainable for the other stuff. And you’ll know what science is like, which you simply don’t right now. Giving up on it based on some bad grades as a freshman or sophomore is folly.***


* One snarky but accurate commenter said, “There may be an infinite amount of work in science, but there is a finite (and very unevenly distributed) number of grants.”

** Although a different friend observed, “Books are a step above classes, but in my experience, many aspiring theoretical physicists are really people who like reading popular science books more than they like doing math.”

*** You can download this essay as a .pdf.

Are you more than a consumer? “The Once and Future Liberalism” and some answers

This is one of the most insightful things I’ve read about an unattractive feature of American society: we put an “emphasis on consumption rather than production as the defining characteristic of the good life.” It’s from “Beyond Blue 6: The Great Divorce,” where, in Walter Russell Mead’s reading, “Americans increasingly defined themselves by what they bought rather than what they did, and this shift of emphasis proved deeply damaging over time.” I’m not convinced this has happened equally for everybody, all the time, but it rings awfully true.

Which brings us back to the point made in the title: are you producing more than you consume? Are you focused on making things, broadly imagined, instead of “consuming” them? Is there more to your identity than the music you like and the clothes you wear? (“More” might mean things you know, or know how to do, or know how to make.) Can you do something or somethings few others can? If the answers are “no,” you might be feeling the malaise Mead is describing. In Anything You Want, Derek Sivers writes:

When you want to learn how to do something yourself, most people won’t understand. They’ll assume the only reason we do anything is to get it done, and doing it yourself is not the most efficient way.

But that’s forgetting about the joy of learning and doing.

If you never learn to do anything yourself—or anything beyond extremely basic tasks everyone else knows—you’re not going to lead a very satisfying life. Almost as bad, you probably won’t know it. You’ll only have that gnawing feeling you can’t name, a feeling that’s easy—too easy—to ignore most of the time. You can’t do everything yourself, and it would be madness to try. But you should be thinking about expanding what you can do. I’ve made a conscious effort to resist being defined by what I buy rather than what I do, and that effort has intensified since I read Paul Graham’s essay “Stuff;” notice especially where he says, “Because the people whose job is to sell you stuff are really, really good at it. The average 25 year old is no match for companies that have spent years figuring out how to get you to spend money on stuff. They make the experience of buying stuff so pleasant that “shopping” becomes a leisure activity.” To me it’s primarily tedious.

But this tedious activity is everywhere, and in Spent: Sex, Evolution, and Consumer Behavior, Geoffrey Miller describes how companies and advertisers have worked to exploit evolved human systems for mating and status in order to convince you that you need stuff. Really, as he points out, you don’t: five minutes of conversation does more signaling than almost all the stuff in the world. Still, I don’t really take a moral view of shopping, in that I don’t think disliking shopping somehow makes me more virtuous than someone who does like shopping, but I do think the emphasis on consumption is a dangerous one for people’s mental health and well-being. And I wonder if these issues are also linked to larger ones.

A lot of us are suffering from an existential crisis and a search for meaning in a complex world that often appears to lack it. You can see evidence in the Western world’s high suicide rates, in Viktor Frankl’s book Man’s Search for Meaning (he says, “I do not at all see in the bestseller status of my book so much an achievement and accomplishment on my part as an expression of the misery of our time: if hundreds of thousands of people reach out for a book whose very title promises to deal with the question of a meaning to life, it must be a question that burns under the fingernails”), in Irvin Yalom’s Existential Psychotherapy (especially the chapter on despair), in Mihaly Csikszentmihalyi’s Flow: The Psychology of Optimal Experience, in All Things Shining: Reading the Western Classics to Find Meaning in a Secular Age, in The Joy of Secularism: 11 Essays for How We Live Now, in the work of Michel Houellebecq. I could keep going. The question isn’t merely about the number of responses to present conditions, but about what those present conditions are, how they came about, what they say about contemporary politics (Mead makes the political connection explicit in “The Once and Future Liberalism: We need to get beyond the dysfunctional and outdated ideas of 20th-century liberalism“), and what they say about how the individual should respond.

People respond in all kinds of ways. Despair is one. Fanaticism, whether towards sports teams or political parties or organized religion, is another, with religion being especially popular. You can retreat to religious belief, but most dogmatic religions are grounded in pre-modern ideas and rituals, and too many religions are surrounded by fools (did Heinlein say, “It’s not God I have a problem with, it’s his fan club”? Google yields many variations). Those kinds of answers don’t look very good, at least to me. You have to look harder.

I think part of the answer has to lie in temperament, attitude, and finding a way to be more than a consumer. For a very long time, people had to produce a lot of what they consumed—including their music, food, and ideas. I don’t want to lapse into foolish romanticism about the pre-modern, pre-specialized world, since such a world would be impossible to recreate and ugly if we did. People conveniently forget about starvation and warfare when they discuss the distant past. Plus, specialization has too many benefits—like the iMac I’m looking at, the chair I’m sitting in, the program I’m using to write this, the tasty takeout I can order if I want it, the tea in my kitchen, the condoms in my bedroom, or the camera on my tripod. For all its virtues, though, I’m increasingly convinced that specialization has psychic costs that few of us are really confronting, even if many of us feel them, and those costs relate to how we relate to meaning and work.

According to Mead, in the 19th Century, families “didn’t just play together and watch TV together; they worked together to feed and clothe themselves.” Today, disparate activities drive specialization even within the family, and family life has become an increasingly consumption- and status-oriented experience. To Mead, “If we wonder why marriage isn’t as healthy today in many cases, one reason is surely that the increasing separation of the family from the vital currents of economic and social life dramatically reduces the importance of the bond to both spouses – and to the kids.” We’ve gotten wealthier as a society, and wealth enables us to make different kinds of choices. Marriage is much more of a consumer good: we choose it, rather than being forced into it because the alternative is distressingly severe material deprivation. Charles Murray observes some effects this has on marriage in Coming Apart: The State of White America, 1960-2010, since getting and staying married has enormous positive effects on income—even if “the vital currents of economic and social life” conspire to make spouses less dependent on each other.

Kids are less economically useful and simultaneously more dependent on their parents, which also means they’re separated from the real world for a very long time. To Mead, part of this is education:

As the educational system grew more complex and elaborate (without necessarily teaching some of the kids trapped in it very much) and as natural opportunities for appropriate work diminished, more and more young people spent the first twenty plus years of their lives with little or no serious exposure to the world of work.

It starts early, this emphasis on dubious education and the elimination of “natural opportunities for appropriate work”:

Historically, young people defined themselves and gained status by contributing to the work of their family or community. Childhood and adulthood tended to blend together more than they do now. [. . .] The process of maturation – and of partner-seeking – took place in a context informed by active work and cooperation.

In the absence of any meaningful connection to the world of work and production, many young people today develop identities through consumption and leisure activities alone. You are less what you do and make than what you buy and have: what music you listen to, what clothes you wear, what games you play, where you hang out and so forth. These are stunted, disempowering identities for the most part and tend to prolong adolescence in unhelpful ways. They contribute to some very stupid decisions and self-defeating attitudes. Young people often spend a quarter century primarily as critics of a life they know very little about: as consumers they feel powerful and secure, but production frightens and confuses them.

I’m familiar with those “stunted, disempowering identities” because I had one for a long time. Most teenagers don’t spend their adolescence becoming expert hackers, like Mark Zuckerberg or Bill Gates, and they don’t spend their time becoming expert musicians. They spend their adolescences alienated.

I’m quoting so many long passages from Mead because they’re essential, not incidental, to understanding what’s going on. The result of an “absence of any meaningful connection to the world of work and production” is Lord of the Flies meets teen drama TV and movies. Paul Graham gets this; in one of my favorite passages from “Why Nerds Are Unpopular,” he writes:

Teenage kids used to have a more active role in society. In pre-industrial times, they were all apprentices of one sort or another, whether in shops or on farms or even on warships. They weren’t left to create their own societies. They were junior members of adult societies.

Teenagers seem to have respected adults more then, because the adults were the visible experts in the skills they were trying to learn. Now most kids have little idea what their parents do in their distant offices, and see no connection (indeed, there is precious little) between schoolwork and the work they’ll do as adults.

And if teenagers respected adults more, adults also had more use for teenagers. After a couple years’ training, an apprentice could be a real help. Even the newest apprentice could be made to carry messages or sweep the workshop.

Now adults have no immediate use for teenagers. They would be in the way in an office. So they drop them off at school on their way to work, much as they might drop the dog off at a kennel if they were going away for the weekend.

What happened? We’re up against a hard one here. The cause of this problem is the same as the cause of so many present ills: specialization. As jobs become more specialized, we have to train longer for them. Kids in pre-industrial times started working at about 14 at the latest; kids on farms, where most people lived, began far earlier. Now kids who go to college don’t start working full-time till 21 or 22. With some degrees, like MDs and PhDs, you may not finish your training till 30.

But “school” is so often bad that 30% of teenagers drop out—against their own economic self-interest. Only about a third of people in their twenties have graduated from college. What gives? Part of it must be information asymmetry: teenagers don’t realize how important school is. But the other part of the problem is what Graham describes: how dull school seems, and how disconnected it is from what most people eventually do. And that disconnection is real.

So, instead of finding connections to skills and making things, teenagers pick up status cues from music and other forms of professionally produced entertainment. Last year, I was on a train from Boston to New York and sat near a pair of 15-year-olds. We talked a bit, and one almost immediately asked me what kind of music I liked. The question struck me because it had been so long since I’d been asked it so early in a conversation with a stranger. In high school and early college, I was asked it all the time: high school-aged people sort themselves into tribes and evaluate others based on music. In college, the first question is, “What’s your major?”, and in the real world it’s, “What do you do?” The way people ask those early questions reveals a lot about the assumptions of the person doing the asking.

Now: I like music as much as the next guy, but after high school I stopped using it to sort people. Why should high school students identify themselves primarily based on music, as opposed to some other metric? It’s probably because they have nothing better to signal who they are than music. It would make sense to discuss music if you are a musician or a genuine music aficionado, but I wasn’t one and most of the people I knew weren’t either. Yet the “What’s your favorite music?” question always arose. Now, among adults, it’s more often “What do you do?”, which seems to me an improvement, especially given its proximity to the questions, “What can you do?” and “What do you know?”

But that’s not a very important question for most high school students. They aren’t doing anything hard enough that errors matter. And in some ways, mistakes don’t matter much in most modern walks of life: they don’t cause people to die, or to really live, or do things differently. So finding a niche where mistakes do matter—as they do when you run your own business, or in certain parts of the military, or in some parts of medicine, or as an individual artist accountable to fans—can lead to a fuller, more intensely lived life. But that requires getting off the standard path. Few of us have the energy to bother. Instead, we feel underutilized, with the best parts of ourselves rusting from disuse–or perhaps gone altogether, because we never tried to develop the best parts of ourselves. That might explain, almost as much as my desire to tell stories, why I spend so much time writing fiction that, as of this writing, has mostly been fodder for agents and friends, and why I persist in the face of indifference.

Individuals have to learn to want something more than idle consumption. They have to want to become artists, or hackers, or to change the world, or to make things, all of which are facets of the same central application of human creativity (to me, the art / science divide is bullshit for similar reasons). For much of the 20th Century, we didn’t find that “something” in work:

Since work itself was so unrewarding for so many, satisfaction came from getting paid and being able to enjoy your free time in the car or the boat that you bought with your pay. It was a better deal than most people have gotten through history, but the loss of autonomy and engagement in work was a cost, and over time it took a greater and greater toll.

A friend once told me about why he left a high-paying government engineering job for the hazards and debts of law school: at his engineering job, everyone aspired to a boat or a bigger TV. Conversations revolved around what people had bought or were planning to buy. No one thought about ideas, or anything beyond consumption. So he quit to find a place where people did. I mean, who cares that you buy a boat? Maybe it makes getting laid marginally easier, at least for guys, but that time, money, and energy would probably be better spent going out and meeting people, rather than acquiring material objects.

I’ve seen people who have virtually no money be extraordinarily happy and extraordinarily successful with the sex of their choice, and people in the exact opposite condition. The people with no money and lots of sex tend to get that way because of their personalities and their ability to be vibrant (again: see Miller’s book Spent). Even if you’re bad at being vibrant, you can learn to be better: The Game is, at bottom, about how to be vibrant for straight men, and the many women’s magazines (like Cosmo) are, at bottom, about how to be vibrant for women. Neither, unfortunately, really teaches one to be tolerant of other people’s faults, which might be the most important thing in the game of sex, but perhaps that comes through in other venues.

I don’t wish to deify Mead or his argument; when he says, “There was none of the healthy interaction with nature that a farmer has,” I think he’s missing how exhausting farming was, how close farmers were to starvation for much of agricultural history, and how nasty nature is when you’re not protected from it by modern amenities (we only started to admire nature in the late eighteenth century, when it stopped being so dangerous to city dwellers). It’s easy to romanticize farming when we don’t have to do it. Likewise, Mead says:

A consumption-centered society is ultimately a hollow society. It makes people rich in stuff but poor in soul. In its worst aspects, consumer society is a society of bored couch potatoes seeking artificial stimulus and excitement.

But I have no idea what he means by “poor in soul.” Are Mark Zuckerberg or Bill Gates “poor in soul”? Is Stephen King? Tucker Max? I would guess not, even though all four are “rich in stuff.” We’ve also been “a consumption-centered society” for much of the 20th century, if not earlier, and, all other things being equal, I’d rather have the right stuff than no stuff, even if the mindless acquisition of stuff is a growing hazard. The solution might be the mindful acquisition of stuff, but even that is hard and takes a certain amount of discipline, especially given how good advertisers are at selling. I would also count politicians among advertisers these days.

Contemporary politics are (mostly) inane, for the structural reasons Bryan Caplan describes in The Myth of the Rational Voter. So I’m predisposed to like explanations along these lines:

Nobody has a real answer for the restructuring of manufacturing and the loss of jobs to automation and outsourcing. As long as we are stuck with the current structures, nobody can provide the growing levels of medical and educational services we want without bankrupting the country. Neither “liberals” nor “conservatives” can end the generation-long stagnation in the wage level of ordinary American families. Neither can stop the accelerating erosion of the fiscal strength of our governments at all levels without disastrous reductions in the benefits and services on which many Americans depend.

Most people on the right and the left have “answers” about contemporary problems that miss large aspects of those problems or the inherent trade-offs involved. A lot of the debate that does occur is dumb, sometimes militantly and sometimes inadvertently, but dumb nonetheless. As Mead says: “We must come to terms with the fact that the debate we have been having over these issues for the past several decades has been unproductive. We’re not in a “tastes great” versus “less filling” situation; we need an entirely new brew.” Yet we’re getting variations on old brews, in which liberals look like conservatives in their defense of 1930s-era policies, and conservatives look like conservatives in their veneration of 19th century-style free-market policies. Only a few commentators, like Tyler Cowen in The Great Stagnation, even try earnestly to identify real problems and discuss those problems in non-partisan terms.

This post started as a pair of links, but it ended in an essay because Mead’s essays are so important in the way they get at an essential aspect of contemporary life. If you’re a writer, you can’t afford to ignore what’s happening on the ground, unless you want to be, at best, irrelevant. And I wonder if one reason nonfiction may be outpacing fiction in the race for importance is that nonfiction sidesteps questions of meaning by focusing on real things with real effects, instead of on how people can’t or won’t find meaning in a world where most of us succeed, at least on a material level, by following a conventional path.

Naturally, I also think about this in the context of fiction. A while ago, I wrote this to a friend: “Too much fiction is just about dumb people with dumb problems doing dumb things that the application of some minor amount of logic would solve. Bored with life because you’re a vaguely artistic hipster? Get a real job, or learn some science, or be a real artist, or do something meaningful. The world is full of unmet needs and probably always will be. But so many characters wander around protected by their own little bubbles. Get out! The world is a big place.” Mead, I think, would agree.

It’s hard to disentangle the individual, education, acquisition, ideas, society, and politics. I’ve somewhat conflated them in my analysis, above, because one inevitably leads to the other: talking about how you as a person should respond leads to questions about how you were educated, and education as a mass process leads to society, and so forth. But I, as an individual, can’t really change the larger systems in which I’m embedded, though I can do a limited amount to observe how those systems work and how I respond to them (which often entails writing like this and linking to other writers).

Distrust That Particular Flavor — William Gibson

As with most essay collections, the essays in Distrust That Particular Flavor are uneven: a few feel like period pieces that’ve outlived their period, but most maintain their vitality (Gibson admits as much in the introduction). Gibson knows about the expiration date of predictions and commentary, and having this awareness built into his essays makes them endure better. It’s a useful form of admitting a potential weakness and thus nullifying it. In the place of dubious predictions, Gibson makes predictions about not being able to predict and about how we should respond:

I found the material of the actual twenty-first century richer, stranger, more multiplex, than any imaginary twenty-first century could ever have been. And it could be unpacked with the toolkit of science fiction. I don’t really see how it can be unpacked otherwise, as so much of it is so utterly akin to science fiction, complete with a workaday level of cognitive dissonance we now take utterly for granted.

I’d like to know what that last sentence means: what’s a “workaday level of cognitive dissonance,” as opposed to a high or low level? How do we take it for granted now, in a way we didn’t before? I’d like clarification, but I have some idea of what he means: that things are going to look very different in a couple years, in a way that we can’t predict now. His own novels offer an example of this: in Pattern Recognition, published in 2003, Cayce Pollard is part of a loose collaborative of “footage” fetishists, who hunt down a series of mysterious videos and debate what, if anything, they mean (as so many people do on so many Internet forums: the chatter too often means nothing, as I’ve discovered since starting to read about photography). By 2005, YouTube comes along as the de facto repository of all non-pornographic things video. The “material of the actual twenty-first century” changes from 2003 to 2012. What remains is the weirdness.

In writing and in ideas, though, Gibson is less weird and easier to follow here than in his recent fiction. There are transitions, titles, and short descriptions in italicized blue at the back of each essay, where the contemporary-ish, 2011 Gibson comments on his earlier work. He gets to grade himself on what he’s gotten right and what he hasn’t. He’s self-aware, about both his faults and his mode of work:

A book exists at the intersection of the author’s subconscious and the reader’s response. An author’s career exists in the same way. A writer worries away at a jumble of thoughts, building them into a device that communicates, but the writer doesn’t know what’s been communicated until it’s possible to see it communicated.

After thirty years, a writer looks back and sees a career of a certain shape, entirely unanticipated.

It’s a mysterious business, the writing of fiction, and I thank you all for making it possible.

Comments like this, on the nature of the book and of writing, are peppered throughout Distrust That Particular Flavor. Technology changes but writing remains, though we again get the idea of fundamental unpredictability (“the writer doesn’t know what’s been communicated”), which is the hallmark of our time and perhaps the hallmark of life since the Industrial Revolution. It’s the kind of life that science fiction prepares us for, even when the science fiction is wrong about the particulars. It still gets the temperament right. Hence science fiction as a toolkit for the present and future—and, to some extent, as a toolkit for the past. One could view the past as a series of social disruptions abetted and enabled by technology, disruptions that create winners and losers in the struggle or cooperation for resources, sex, and power:

Much of history has been, often to an unrecognized degree, technologically driven. From the extinction of North America’s mega-fauna to the current geopolitical significance of the Middle East, technology has driven change. [. . .] Very seldom do nations legislate the emergence of new technology.

The Internet, an unprecedented driver of change, was a complete accident, and that seems more often the way of things. The Internet is the result of the unlikely marriage of a DARPA project and the nascent industry of desktop computing. Had nations better understood the potential of the Internet, I suspect they might well have strangled it in its cradle. Emergent technology is, by its very nature, out of control, and leads to unpredictable outcomes.

The first step is recognition, which is part of the work Gibson is doing. Nations also might not “legislate the emergence of new technology,” but they do create conditions more or less favorable to the emergence of technology. Economic historians, general historians, and others have been trying to figure out why the Industrial Revolution emerged from England when it did, as opposed to emerging somewhere else or sometime else. I find the Roman example most tantalizing: the Romans appear to have missed the printing press and gunpowder, two major pre-conditions, since the printing press allows the rapid dissemination of ideas and gunpowder, if used correctly, lowers the cost of defense against barbarians.

I find the idea of history being “technologically driven” intriguing: technology has enabled progressively larger agglomerations of humans, whether in what we now call “countries” or “corporations,” to act in concert. The endgame isn’t obvious and probably never will be, unless we manage to destroy ourselves. We can only watch, participate in, or ignore the show. Most people ignore it, to the extent they can.

I use a fountain pen and notebook and so identify with this:

Mechanical watches are so brilliantly unnecessary.
Any Swatch or Casio keeps better time, and high-end contemporary Swiss watches are priced like small cars. But mechanical watches partake of what my friend John Clute calls the Tamagotchi Gesture. They’re pointless in a peculiarly needful way; they’re comforting precisely because they require tending.

Much of life, especially cultural life, beyond food, shelter, and sex might be categorized as “brilliantly unnecessary;” it’s awfully hard to delineate where the necessary ends and the superfluous begins—as the Soviet Union discovered. To me, haute couture is stupidly unnecessary, but a lot of fashion designers would call fountain pens the same. Necessity changes. Pleasure varies by person. Being able to keep “better time” isn’t the sole purpose of a watch, which is itself increasingly an affectation, given the ubiquity of computers with clocks embedded (we sometimes call these computers “cell phones”). We want to tend. Maybe we need to. Maybe tending is part of what makes us who we are, part of what makes us different from the people who like hanging out with their friends, watching TV, and shopping. Gibson also mentions that his relationship, or lack thereof, to TV relates to who he is as a writer:

I suspect I have spent just about exactly as much time actually writing as the average person my age has spent watching television, and that, as much as anything, may be the real secret here.

Notice that word, “may,” weakening his comment, but not fatally. TV is the mostly invisible vampire of time, and it’s only when people like Gibson, or Clay Shirky, point to it as such that we think about it. Doing almost anything even marginally active with the time most people spend watching TV means you’re going to learn a lot more (this is Shirky’s point about the coming “cognitive surplus” enabled by the Internet). Gibson did something different from most people of his generation, which is why we now know who he is, and why his thoughts go deeper. Like this observation, variations of which I’ve read before but which still resonate:

Conspiracy theories and the occult comfort us because they present models of the world that more easily make sense than the world itself, and, regardless of how dark or threatening, are inherently less frightening.

They’re less frightening because they have intentionality instead of randomness, and randomness is really scary to many people, who prefer to see causality where little or none exists. Instead, we have all these large systems with numerous nodes and inherent unpredictability in the changes and interactions between the nodes; one can see this at every scale from the very small to the very large.
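Cellular automata offer a tiny, concrete version of this point: even when every node follows a trivially simple, fully known rule, the global pattern can be effectively unpredictable. Here’s a minimal sketch in Python of Rule 110, a standard elementary automaton (the grid width, step count, and starting cell are arbitrary choices of mine):

```python
# Minimal sketch: Rule 110, an elementary cellular automaton.
# Each cell looks only at itself and its two neighbors, yet the
# global pattern is famously unpredictable (Rule 110 is even
# Turing-complete). Simple local rules, opaque global behavior.

RULE = 110
WIDTH = 64
STEPS = 32

# Encode the rule as a lookup: neighborhood (as 3 bits) -> next state.
rule_table = {i: (RULE >> i) & 1 for i in range(8)}

# Start with a single live cell on the right edge.
cells = [0] * WIDTH
cells[-1] = 1

for _ in range(STEPS):
    print("".join("#" if c else "." for c in cells))
    # Update every cell from its neighborhood (wrap-around boundary).
    cells = [
        rule_table[(cells[(i - 1) % WIDTH] << 2)
                   | (cells[i] << 1)
                   | cells[(i + 1) % WIDTH]]
        for i in range(WIDTH)
    ]
```

The rule is eight bits of information, yet the pattern it produces has to be watched rather than deduced—which is roughly the situation we’re in with large systems generally.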

This is easier to perceive in the abstract, as stated here, than in the concrete, as seen in life. So we get stories, often in “nonfiction” form, about good and evil and malevolent consciousnesses, often wrapped up in political narratives, that don’t really capture reality. They miss the weirdness of reality, to return to a term I used above. Reality is hard to capture, and perhaps that science fiction toolkit gives us a method of doing so better than many others. Certainly better than a lot of the newspaper-story toolkits, or literary-theory toolkits, to name two I’m familiar with (and probably better than religious toolkits, too).

I’m keeping the book; given that I’ve become progressively less inclined to keep books I can’t imagine re-reading, this is a serious endorsement of Distrust That Particular Flavor. I wish Gibson wrote more nonfiction—at least, I wish he did if he could maintain the impressive quality he does here.

Humor as an antidote to frustration, from Christopher Hitchens

I think of Christopher Hitchens more along the lines Katha Pollitt does; she “want[s] to complicate the picture even at the risk of seeming churlish.” And she does. Still, Hitchens was sometimes spectacularly right, as in this introduction to Arguably: Essays:

The people who must never have power are the humorless. To impossible certainties of rectitude they ally tedium and uniformity. Since an essential element in the American idea is its variety, I have tried to celebrate things that are amusing for their own sake, or ridiculous but revealing, or simply of intrinsic interest. All of the above might apply to the subject of my little essay on the art and science of the blowjob, for example [….]

Be almost as wary of the humorless as you are of the people who pride themselves on humor.

Essays: The modern genre, and why writing for the web counts

In writing about Paul Graham’s “The Age of the Essay,” I forgot to mention this:

Up till a few years ago, writing essays was the ultimate insider’s game. Domain experts were allowed to publish essays about their field, but the pool allowed to write on general topics was about eight people who went to the right parties in New York. Now the reconquista has overrun this territory, and, not surprisingly, found it sparsely cultivated. There are so many essays yet unwritten. They tend to be the naughtier ones; the insiders have pretty much exhausted the motherhood and apple pie topics.

This leads to my final suggestion: a technique for determining when you’re on the right track. You’re on the right track when people complain that you’re unqualified, or that you’ve done something inappropriate. If people are complaining, that means you’re doing something rather than sitting around, which is the first step. And if they’re driven to such empty forms of complaint, that means you’ve probably done something good.

This is part of the reason I write a fair amount about sex, sexual politics, sexuality in writing, and so forth: they’re not as deeply mined as other topics, and they’re also changing rapidly in strange, unpredictable ways vaguely reminiscent of cellular automata or Go. A lot of people do complain about writing on those subjects because they’re subjects about which people often have a) very strongly held beliefs that b) are not based on or supported by evidence. So a lot of people will complain that “you’ve done something inappropriate” when you write about them; that was certainly part of the response I got to Status and sex: On women in bands never getting laid, Norah Vincent’s Self-Made Man, and Sexting and society: How do writers respond? Lots of people have written about sex in fiction, the most obvious example being The Joy of Writing Sex, but even that one has a bogus-seeming chapter on HIV. Not too many have written about it like I have (so far as I know).

Plus, almost no one in writing programs or English classes—where I spend a lot of my time—tells you to pay attention to contemporary sexual politics or to how things have changed and are changing—which leaves a lot of space for re-conquistadors. Instead, they want to tell you that you can see parallels between Jane Austen’s world and ours. Which is true, but not very helpful to, say, fiction writers: if your characters have the same relationship to marriage and sex that Austen’s did, you’re probably not writing compelling fiction. You’re writing to standards that have already changed so much that people reading your work will feel like they’ve entered a time warp. Hell, as I read Updike’s work from 1959 – 2008, I can’t help but notice that he seems to be writing about a world that, although it’s closer to me than Jane Austen’s, is still pretty far from the one I grew up with and live in now. He has lots of naughty parts, but also lots of people very concerned with each others’ religions. They also tend to live in suburbs, which was once a big deal but which I now find pretty boring, on average; I tend to write about characters who want to escape or are escaping from the suburbs. Updike is a high-status writer, but I can’t help thinking that a lot of his writing feels like he’s playing an insider’s game.

In “The Research Bust,” Mark Bauerlein implicitly points out the consequences of what happens when “the reconquista has overrun” the major position of people in “New York” or academia. It used to be that you had to be an academic or journalist to write anything that might be read by more than a handful of people. Now that almost anyone can publish for virtually no marginal cost, the academics especially are trapped in a world of diminishing returns: people can read things other than their articles, and academic journals appear to have responded by narrowing their focus even further. Bauerlein says that “after four decades of mountainous publication, literary studies has reached a saturation point.” Literary studies of canonical writers may have “reached a saturation point,” but I see little evidence that people no longer want to read anything; one could argue that, with the advent of the web, many people are reading more than ever. The logical response to that circumstance is to do what Graham advocates: look for something new to write about. A fair number of academics have said or implied that I’m wasting my time writing this blog, since that time could be spent on academic articles. This sounds very close to “inappropriate” to me. Which might mean that I’m on Graham’s right track: by producing work outside the scholarly hothouse, and by not believing in its importance, I’m infinitesimally lowering its value. And that’s a pretty scary thing, if your whole life is based around the model of letting others validate your work. But I’d rather spend time in the “sparsely cultivated” territory of the web than fight for a spot of dubious value off it.

Paul Graham and not being as right as he could be in “The Age of the Essay”

Paul Graham often challenges people who say that he’s wrong to cite a particular sentence that is untrue; see, for example, this: “Can you give an example of something I said that you think is false?” Elsewhere, although I can’t find a link at the moment, he says that most people who claim he’s said something wrong aren’t actually referring to something he’s said, but to something they think he’s said, or imagine he might say. Hence my italicization of “something I said:” Internet denizens often extrapolate from or simplify his often nuanced positions in an attempt to pin ideas to him that he hasn’t explicitly endorsed. So I’m going to try not to do that, but I will nonetheless look at some of what he’s said about writing and writing education and describe some of my attempts to put his implied criticisms into action.

While I think Graham is right the vast majority of the time, I also think he’s off the mark regarding some of his comments about how writing is taught in schools. I wouldn’t call him wrong, exactly, but I would say that trying some of the things he suggests or implicitly suggests hasn’t worked out nearly as well as I’d hoped, especially when applied to full classrooms of students drawn from a wide spectrum of ability and interest.

I’ve long been bothered by the way writing and related subjects are taught in school. They’re made so boring and lifeless most of the time. Part of the problem, and perhaps the largest part, is the teachers. I’ve spent a lot of time contemplating how to improve the writing class experience. Some of that effort appears to be paying off: a surprisingly large number of students will say, either to me directly or in their evaluations, that they usually hate English classes but really like this one. Yes, I’m sure some are sucking up, but I don’t care about sucking up and suspect students can detect as much. I really care about what happens on their papers. But some of my experiments haven’t worked, and I’ll talk about them here.

In “The Age of the Essay,” Graham starts:

Remember the essays you had to write in high school? Topic sentence, introductory paragraph, supporting paragraphs, conclusion. The conclusion being, say, that Ahab in Moby Dick was a Christ-like figure.

Oy. So I’m going to try to give the other side of the story: what an essay really is, and how you write one. Or at least, how I write one.

Graham doesn’t say so explicitly, but the implication of “the other side of the story” and “what an essay really is” is that essay writing in school should be more like real essay writing. To some extent he’s right, but trying to make school essay writing like real essay writing doesn’t yield the kinds of results I’d hoped for. To be fair, Graham hasn’t directly said that school writing should be more like real writing, but it’s an obvious inference from this and other sections of “The Age of the Essay,” which I’ll discuss further below. He also does a lot with the word “Oy:” it expresses skepticism and distaste wrapped in one little word.

The way Graham puts it, writing a school essay sounds pretty bad; concluding “that Ahab in Moby Dick was a Christ-like figure” in a pre-structured essay is tedious, if for no other reason than that a million other students and a much smaller number of teachers and professors have already concluded or been forced to conclude the same thing. I think that a) teaching literature can be a much better experience and still serves some institutional purposes, and b) teaching writing in the context of other subjects might not be any better.

Passion and interest

Graham:

The most obvious difference between real essays and the things one has to write in school is that real essays are not exclusively about English literature. Certainly schools should teach students how to write. But due to a series of historical accidents the teaching of writing has gotten mixed together with the study of literature. And so all over the country students are writing not about how a baseball team with a small budget might compete with the Yankees, or the role of color in fashion, or what constitutes a good dessert, but about symbolism in Dickens.

I’d love to get well-developed essays on baseball, economics, and fashion. But most students either don’t appear to have the kind of passion that would be necessary to write such essays or don’t appear able to express it. Alternately, they have passion, but not knowledge behind the passion: someone who’d read Moneyball and other baseball research could put together this kind of essay, but almost no students have. Even those who do have the passion don’t have much knowledge behind it. I’ve been implicitly testing this theory for the past three and a half years: on my assignment sheets, I always include a line telling students they can write on “a book or subject of your own choosing. If you write on a book or idea of your own, you must clear your selection with me first.” Almost none exercise this choice.

Now, one could argue that students have been brainwashed by 12 years of school by the time I’ve got them, and to some extent that’s probably true. But if a student were really, deeply interested in a subject, I think she’d be willing to say, “Hey, what if I mostly write about the role of imagination among physicists,” and I’d probably say yes. This just doesn’t happen often.

I think it doesn’t happen because students don’t know where to start, and they aren’t skilled enough to closely read a book or even an article on their own. They don’t know how to compare and contrast passages well—the very thing I’m doing here. So I could assign a book about baseball and work through the “close reading” practice in class, but most people aren’t that interested in the subject, and then the people interested in fashion or math will be left out (and most students who say they’re “interested in fashion” appear to mean they skim Cosmo and Vogue).

If you’re going to write about a big, somewhat vague idea, like money in baseball, you need a lot more knowledge and many more sources than you do to write about “symbolism in Dickens.” Novels and stories have the advantage of being self-contained. That’s part of what got the New Criticism technique of “close reading” so ingrained in schools: you could give students 1984 and rely on the text itself to argue about the text. This has always been a bit of a joke, of course, because knowing about the lead-up to World War II and the beginnings of the Cold War will give you a lot of contextual information about 1984, but one can still read the novel and analyze it on its own terms more easily than one can analyze more fact-based material. So a lot of teachers rely on closely reading novels, which I’ll come back to in a bit.

There may be more to the story of why students are writing about 1984 and not “what constitutes a good dessert” beyond “a series of historical accidents.” Those accidents are part of the story, but not all.

Amateurs and experts

What’s appropriate for amateurs may not be appropriate for experts; Daniel Willingham makes this point at length in his book Why Don’t Students Like School: A Cognitive Scientist Answers Questions About How the Mind Works and What It Means for the Classroom; he says that “Cognition early in training is fundamentally different from cognition late in training” and, furthermore, “[. . .] years of practice make a qualitative, not quantitative, difference in the way [scientists, artists, and others] think compared to how a well-informed amateur thinks.” We don’t get there right away: “Experts don’t think in terms of surface features, as novices do; they think in terms of functions, or deep structure.” It takes years of that dedicated practice to become an expert, and ten often appears to be the number: “There’s nothing magical about a decade; it just seems to take that long to learn the background knowledge to develop” what one really needs to do the new, interesting, creative work that defines an expert.

Graham is an expert writer. He, like other expert writers, can write differently than amateurs do and still produce excellent work. Novice writers usually can’t write effectively without a main point of some sort in mind. I couldn’t, either, when I was a novice (though I tried). Graham says:

The other big difference between a real essay and the things they make you write in school is that a real essay doesn’t take a position and then defend it. That principle, like the idea that we ought to be writing about literature, turns out to be another intellectual hangover of long forgotten origins.

He’s right in the sense that real essays don’t have to take a position and defend it, but teachers insist on thesis statements for the same reason bikes for three-year-olds have training wheels: otherwise the student-writer will fall over. If you don’t get students to take a position, you’ll get—maybe—summarization. If you don’t ask for and emphasize thesis statements, which are basically the position to be defended, you’ll get wishy-washy essays that don’t really say much of anything. And it’s not that they don’t say much of anything because they’re trying to explore a complex problem: they don’t say much of anything because the writer doesn’t have anything to say, or is afraid of saying anything, or doesn’t know how to explore a problem space. If you want an academic-ized version of what essays are, Wolfgang Holdheim says in The Hermeneutic Mode: Essays on Time in Literature and Literary Theory that “[…] works in the essay genre (rather than presenting knowledge as a closed and often deceptively finished system) enact cognition in progress, knowledge as the process of getting to know.” Students don’t have the cognition in progress they need to enact Graham-style essays. They haven’t developed enough to write without the scaffolding of a thesis statement.

When I started teaching, I didn’t emphasize thesis statements, and I got a lot of essays that didn’t enact cognition or make a point. The better ones instinctively made a point of some kind; the worse ones summarized. After a while I realized that I could avoid a lot of heartache on the part of my students by changing the way I was offering instruction, because students weren’t ready to write essays without taking a position and defending it.

So now I teach thesis statements more or less like every other English instructor. I try to avoid boring theses and encourage deep ones, but it’s nonetheless true that I’ve realized I was wrong and have consequently moved on. I consider the no-thesis-emphasized experiment just that: an experiment that taught me how I should teach. In the future, I might try other experiments that could lead me away from emphasizing thesis statements. But for now, I do teach students to take a perspective and defend it. Many don’t end up doing so—their papers end up more exploratory than disputatious—but the overall effect of telling them to take a point of view and defend it is a positive one.

I’m not the first one to have noticed the problem. In I’m the Teacher, You’re the Student, Patrick Allitt says this of student writing in a history class:

Certain errors are so common as to be almost universal. The first one is that almost no student really knows how to construct an argument and then deploy information to support and substantiate it. Usually student papers describe what happened, more or less, then throw in an indignant moral judgment or two before stopping abruptly.

I know the feeling: students, when they start my class, mostly want to summarize what they’ve read. And, as Allitt notes, they badly want to moralize, or castigate other people, or valorize their own difference from the writers’ weaknesses. I find the moralizing most puzzling, especially because it makes me think I’m teaching a certain number of people who a) are hypocrites or b) lack the empathy to understand where other writers come from, even if they don’t agree with said writers. They use ad-hominem attacks. When I assign Graham’s essays “What You’ll Wish You’d Known” and “What You Can’t Say,” a surprisingly large number of students say things like, “Who is this guy?”

When I tell them something along the lines of, “He started an early Internet store generator called Viaweb and now writes essays and runs an early-stage startup investment program,” their follow-up questions are usually a bit incoherent but boil down to a real question: Who gives him the authority to speak to us? They’re used to reading much-lauded if often boring writers in school. When I say something like, “Who cares who he is?” or “Shouldn’t we judge people based on their writing, not on their status?” they eye me suspiciously, like six-year-olds might eye an eight-year-old who casts aspersions on the Tooth Fairy.

They’ve apparently been trained by school to think status counts for a lot, and status usually means being a) old, b) dead, c) critically acclaimed by some unknown critical body, and d) between hard or soft covers, ideally produced by a major publisher. I’m not against any of those things: many if not most of my favorite writers fit those criteria. But it’d be awfully depressing if every writer had to. More importantly, assuming those are the major criteria for good writing is fairly bogus, since most old, dead, critically acclaimed writers who are chiefly found between hard covers were once young firebrands shaking up a staid literary, social, political, or journalistic establishment with their shockingly fresh prose and often degenerate ideas. If we want to figure out who the important dead people will be in the future, we need some way of assessing living writers right now. We need something like taste, which is incredibly hard to teach. Most schools don’t even bother: they rely on weak fallback criteria that are wrapped up in status. I’d like my students to learn how to do better, no matter how hard that is.

Some of the “Who is this guy?” questions regarding Graham come from a moralizing perspective: students think or imply that writers who publish through means other than books are automatically somehow lesser than those whose work appears primarily between hard covers (Graham published Hackers & Painters, as well as technical books, but the students aren’t introduced to him in that fashion; I actually think it useful not to mention those books, in order to present the idea that writing published online can be valid and useful).

Anyway, trying to get students to write analytically—to be able to understand and explain a subject before they develop emotional or ethical reactions to it—is really, incredibly difficult (Allitt mentions this too). And having them construct and defend thesis statements seems to help this process. Few students understand that providing analysis and interpretation is a better, subtler way of eventually convincing others of whatever emotional or ethical point of view you might hold. They want to skip the analysis and interpretation and go straight to signaling what kind of person they want the reader to imagine them to be.

Not all students have all these problems, and I can think of at least one student who didn’t have any of them, and probably another dozen or so (out of about 350) who had none or very few of these problems when they began class. I’m dealing with generalizations that don’t apply to each individual student. But class requires some level of generalization: 20 to 30 students land in a room with me for two and a half hours per week, and I, like all instructors, have to choose some level of baseline knowledge and expectation and some level of eventual mastery, while at the same time ensuring that writing assignments are hard enough to be a challenge and stretch one’s abilities while not being so hard that they can’t be completed. When I see problems like the ones described throughout this essay, I realize the kinds of things I should focus on—and I also realize why teachers do the things they do the way they do them, instead of doing some of the things Graham implies.

Reading Allitt makes me realize I’m not alone, and he has the same issues in history I have in English. His other problems—like having students who “almost all use unnecessarily complicated language”—also resonate; I talk a lot about some of the best and pithiest writing advice I’ve ever read (“Omit unnecessary words“), but that advice is much easier to state than implement (my preceding sentence began life saying, “much easier to say than to implement,” but I realized I hadn’t followed my own rule).

Graham again:

I’m sometimes accused of meandering. In defend-a-position writing that would be a flaw. There you’re not concerned with truth. You already know where you’re going, and you want to go straight there, blustering through obstacles, and hand-waving your way across swampy ground. But that’s not what you’re trying to do in an essay. An essay is supposed to be a search for truth. It would be suspicious if it didn’t meander.

But defend-a-position essays, if they’re taught and written well, shouldn’t be completely opposed to meandering, and they’re not about “blustering through obstacles.” They’re about considering what might be true and the possible objections to it, addressing those objections, building roads over “swampy ground,” changing your mind if necessary, and so on—eventually getting to something like truth. In Graham’s conception of defend-a-position essays, the result is probably going to be lousy. The same is likely to be true of students who are taught the “hand-waving your way” method of writing. They should be taught that, if they discover their thesis is wrong, they should change their thesis and paper via the magic of editing. I think Graham is really upset about the quality of teaching.

Thesis statements also prevent aimless wandering. Graham says that “The Meander (aka Menderes) is a river in Turkey. As you might expect, it winds all over the place. But it doesn’t do this out of frivolity. The path it has discovered is the most economical route to the sea.” Correct. But students do this out of frivolity and tend to get nowhere. Students don’t discover “the most economical route to the sea;” they don’t have a route at all. They’re more like Israelites wandering in the desert. Or a body of water that simply drains into the ground.

Why literature?

Graham:

It’s no wonder if this [writing essays about literature] seems to the student a pointless exercise, because we’re now three steps removed from real work: the students are imitating English professors, who are imitating classical scholars, who are merely the inheritors of a tradition growing out of what was, 700 years ago, fascinating and urgently needed work.

We may have gotten to teaching students how to write through literature via the means Graham describes, but I don’t think the practice persists solely because of the history. It persists because teaching through literature offers a couple of major conveniences: literature can be studied as a self-contained object via close reading and offers a narrower focus for students than larger subjects that require more background.

The rise of literature in university departments started in the nineteenth century and really took off in the first half of the twentieth. It was helped enormously by the rise of “close reading,” a method that had two major advantages: the trappings of rigor and a relative ease of application.

The “trappings of rigor” part is important because English (and writing) needed to look analytical and scientific; Louis Menand covers this idea extensively in a variety of forums, including The Marketplace of Ideas: Reform and Resistance in the American University, where he says that the argument “that there is such a thing as specifically literary language, and that literary criticism provides an analytical toolbox for examining it—was the basis for the New Criticism’s claim to a place in the structure of the research university.” So students look at literature because teachers and professors believe there is “specifically literary language” that’s different from other kinds of language. I used to not think so. Now I’m not so sure. After having students try to write analyses of various kinds of nonfiction, I can see the attraction of teaching them fiction that doesn’t have a specific message it’s trying to impart, primarily because a lot of students simply don’t have sufficient background knowledge to add anything to most of the nonfiction they read. They don’t read nonfiction very carefully, which means they have trouble making any statements beyond bald assertion, and they frequently say things that can be countered through appeals to the text itself. Getting them to read carefully by asking detailed questions is both hard and tedious.

Enter close reading. It supplies literature with a rationale, as stated above, but it also works pretty well in classrooms. As a method, it only requires knowledge of the tool and some text to apply it to. Like literature. To do close reading, you have to know to pay attention to the text and to how its writer or speaker uses language. From there, the text becomes what Umberto Eco calls “a machine conceived for eliciting interpretations” in a way that a lot of nonfiction isn’t.

Paul Graham’s essay “What You’ll Wish You’d Known,” which I teach in my first unit, almost always generates vastly worse papers than James Baldwin’s short story “Sonny’s Blues” because Graham has deliberately covered most of the interesting territory relating to his subject. “Sonny’s Blues,” on the other hand, is just trying to tell a story, and the possible meanings of that story extend incredibly far outward, and they can be generated through close readings and relatively little other knowledge. Students who want to discuss “What You’ll Wish You’d Known” intelligently need a vast amount of life experience and other reading to even approach it cogently.

Students who want to discuss “Sonny’s Blues” intelligently need to pay attention to how the narrator shifts over the course of the story, how sound words recur, what music might mean, and a host of other things that are already mostly contained in the story. Still, students have difficulty discovering such readings on their own. When I teach Joyce Carol Oates’ short story “Where Are You Going, Where Have You Been?”, students almost never realize how the story subtly suggests that Connie is actually in a dream that plays out her anxieties regarding puberty, adulthood, and encroaching sexuality. Yet the story offers a lot more substance for discussion and decent papers than Graham’s essays and a lot of other nonfiction.

Perhaps the bad papers on Graham are my own fault, but I’ve tried a lot of ways to get students to write better papers on nonfiction, usually without much success. I’ve begun to suspect they’re just not ready. Students can be taught close reading that, in an ideal world, then gets applied to nonfiction. The reading of literature, in other words, is upwind of the reading of nonfiction, however useful or interesting that nonfiction might be. If you’re dealing with not-very-bright high school teachers and students who know even less than college students, the advantages of close reading literature as a method are magnified.

This is a relatively new affair, too; here’s Louis Menand discussing where English departments came from and how T.S. Eliot influenced them:

The English department is founded on the belief that people need to be taught how to read literature. This is not a self-evident proposition. Before there were English departments, people read stories, poems, and plays without assuming that special training was required. But most English professors think that people don’t intuitively get the way that literary writing works. Readers think that stories and poems are filled with symbols that ‘stand for’ something, or that the beliefs expressed in them are the author’s own, or that there is a hidden meaning they are supposed to find. They are unable to make sense of statements that are not simple assertions of fact. People read literature too literally.

Now, maybe people don’t “need to be taught how to read literature” as literature. But they do need to be taught how to read closely, because most people are really bad at it, and literature offers advantages for teaching that skill.

Most students don’t have very good reading skills. They can’t synthesize information from books and articles effectively. So if you turn them loose on a library without direction, they’ll dutifully look some stuff up, and you’ll get back a lot of papers with citations from pages three to nine. Not very many cite page 221. And the citations they have feel random, rather than cohesive. In a structured class, one can spend a lot of time close reading: what does the author mean here? Why this sentence, why this turn of phrase? How is the piece structured? If it’s a story, who’s speaking? These skills are hard to build—I’m still building mine—and most freshmen simply don’t have them, and they don’t have the energy to engage with writing on its own terms in an unstructured environment.

Giving them a topic and telling them to write is akin to taking a random suburbanite, dropping them in northern Canada, and wishing them luck in finding their way back to civilization. Sure, a few hardy ones will make it. But to make sure most make it, you’ll have to impart a lot of skills first. That’s what good high school and undergrad classes should do. The key word in the preceding sentence, of course, is “good:” lots of humanities classes are bad and don’t teach much of anything, which gives the humanities themselves a bad rap, as people recall horrific English or history teachers. But bad examples don’t mean the entire endeavor is rotten, even if the structure of schools isn’t conducive to identifying and rewarding good teachers of the sort who will teach writing well.

Bad Teaching and the Real Problem with Literature

English, like most subjects, is easy to do badly. Most English teachers teach it poorly; that’s been my experience, anyway, and it seems to be the experience of most people in school. I’m not sure broadening the range of subjects will help all that much if the teacher himself is lousy, or uninterested in the class, or otherwise mentally absent.

It’s also easy to understand why English teachers eventually come to scorn their students: the students aren’t perfect, have interests of their own, aren’t really willing to grant you the benefit of the doubt, aren’t interested in your subject, and don’t understand your point of view. Notice that last one: students don’t understand the teacher’s point of view, but after a while the teacher stops trying to understand the students’ point of view. “What?” the teacher thinks. “Not everyone finds The Tempest and Middlemarch as fascinating as I do?” Er, no. And that kind of thing bleeds into papers. The world might be a better place if teachers could choose more of their own material; I’ve read most of Middlemarch and find it pretty damn tedious. Perhaps giving teachers more autonomy to construct their own curricula around works students like better would solve some of the literature problem. But if the median student doesn’t read anything for pleasure, what then?

Too many teachers also don’t bring a sense of openness and possibility to various readings. They don’t have the deft touch necessary to apply both rigor and openness to their own readings and their students’ readings. Works of art don’t have a single meaning (and if they did, they’d be rather boring). But that doesn’t equate to “anything can mean anything and everything is subjective.” In teaching English, which is often the process of teaching interpretation, one has to balance those two demands. No one balances them perfectly, but too many teachers don’t seem to balance them at all, or acknowledge that they exist, or care that they exist. So you get those essays that find, “say, that Ahab in Moby Dick was a Christ-like figure.” Which is okay and probably true, but I wouldn’t want to read 30 papers that come to that conclusion, and I wouldn’t order my students to come to that conclusion. I’d want them to figure out what’s going on in the novel (then again, in composition classes I teach a lot of stuff outside the realm of “English literature”).

Not being a bogus teacher is really hard. Teachers aren’t incentivized to not be bogus: most public high school teachers effectively can’t be fired after two or three years, thanks to teachers’ unions, except in the case of egregious misconduct. Mediocrity, tedium, torpor, and the like aren’t fireable or punishable offenses. Students merely have to suffer through until they get to college, although some get lucky and find passionate, engaged teachers. But it’s mostly a matter of luck, and teaching seems to actively encourage the best to leave and the worst to stay. Even at college, however, big public schools incentivize professors and graduate students to produce research (or, sometimes “research,” but that’s a topic for another essay), not to teach. So it’s possible to go through 16 years of education without encountering someone who is heavily incentivized to teach well. Some people teach well because they care about teaching well—I’d like to think I’m one—but again, that’s a matter of luck, not a matter of systematic efforts to improve the education experience for the maximum number of students.

Teachers can, and do, however, get in trouble for being interesting. So there’s a systematic incentive to be boring.

In an essay that used to be called “Good Bad Attitude” and now goes by “The Word ‘Hacker,’” Graham says that “Hackers are unruly. That is the essence of hacking. And it is also the essence of American-ness.” Writers are unruly too. At least the good ones are. But many teachers hate unruliness and love conformity. So they teach writing (and reading—you can’t really do one without the other) on the factory model, where a novel or whatever goes in one end and is supposed to emerge on the other like a car, by making sure every step along the way is done precisely the same way. But writing (and, to some extent, reading) doesn’t really work that way, and students can sense as much in some inchoate way. Graham, too, senses that the way we teach writing and reading is busted, and he’s right that we’d be better off encouraging students to explore their own interests more. That’s probably less important than cultivating a sense of openness, explicitly telling students when you’re ordering them to do something for training-wheel purposes, admitting what you don’t know, acknowledging that there’s an inherent level of subjectivity to writing, and working on enumerating principles that can be violated instead of iron-clad rules that are almost certainly wrong.

Most students aren’t interested in English or writing; one can do a lot to make them interested, but the effort is necessarily imperfect, and a lot of classrooms are unsatisfying to very bright people (like Graham and, I would guess, a lot of his readers). That’s in part because classrooms are set up to hit the broad middle. And the broad middle needs thesis statements, wouldn’t know how to start with a wide-open prompt, and isn’t ready for the world of writing that Graham might have in mind.

While a series of historical accidents might’ve inspired the teaching we get now, I don’t think they’re solely responsible for the continuation of teaching literature. Teaching literature and close reading through literature continue to serve pedagogical purposes. So Graham isn’t wrong, but he’s missing a key piece of the story.

Writing this essay

When you’re thinking about a topic, start writing. I began this essay right after breakfast; I’d started thinking about it while making eggs and mulling over the day’s teaching. I had to interrupt it to go to class and do said teaching, but I got the big paragraph about “status” and a couple of notes down. If you’re not somewhere you can write, use a notebook—I like pretentious Rhodia Webbies, but any notebook will do. If you don’t have a notebook, use a cell phone. Don’t have a phone? Use a napkin. Whatever. Good ideas don’t always come to you when you’re at your computer, and they often come while you’re doing something else. Paul Graham gets this; in “The Top Idea in Your Mind,” he wrote:

I realized recently that what one thinks about in the shower in the morning is more important than I’d thought. I knew it was a good time to have ideas. Now I’d go further: now I’d say it’s hard to do a really good job on anything you don’t think about in the shower.

Everyone who’s worked on difficult problems is probably familiar with the phenomenon of working hard to figure something out, failing, and then suddenly seeing the answer a bit later while doing something else. There’s a kind of thinking you do without trying to. I’m increasingly convinced this type of thinking is not merely helpful in solving hard problems, but necessary. The tricky part is, you can only control it indirectly.

Most students don’t do this and don’t think this way. If they did, or could be instructed to, I suspect Graham’s ideas would work better.

Knowing it

Students themselves, if they’re intellectually honest, intuit a lot of the advice in this essay. One recent paper writer said in a reflection: “My first draft does not have a direction or a point, but my final draft does.” Not all writing needs a point, but if you read student writing, you find that very little of it lacks a point because the author is trying to discover or explore something about the world. It lacks a point because it’s incoherent or meandering. Again: that’s not me trying to be a jerk, but rather a description of what I see in papers.

Here’s another: “You were correct in telling me that writing a paper by wrapping evidence around big ideas rather than literary analysis would be difficult, and I found that out the hard way.” These writers could be trying to suck up or tell me what I want to hear, but enough have said similar things in a sufficient number of different contexts to make me think their experiences are representative. And I offer warnings, not absolute rules: if students want to write “big idea” papers, I don’t order them not to, though many suffer as a result. Suffering can lead to growth. A few thrive. But such students show why English instructors offer the kinds of guidance and assignments they do. These can be parodied, and we’ve all had lousy English classes taught by the incompetent, inept, and burned out.

If I had given students assignments closer to the real writing that Graham does, most simply wouldn’t be able to do them. But I am pushing students in the direction of real writing—which is part of the reason I tell the ones who want to really write to read “The Age of the Essay.” I love the essay: it’s only some of the reasoning about why schools operate the way they do that bothers me, and even then I only came to discover why things are done the way they are by doing them.

If you think you can teach writing better, I encourage you to go try it, especially in a public school or big college. I thought I could. Turned out to be a lot harder than I thought. Reality has a surprising amount of detail.

EDIT: In A Jane Austen Education, William Deresiewicz writes:

My professor taught novels, and Catherine was mistaught by them, but neither he nor Austen was finally concerned with novels as such. Learning to read, they both knew, means learning to live. Keeping your eyes open when you’re looking at a book is just a way of teaching yourself to keep them open all the time.

Novels are tricky in this way: they’re filled with irony, which, at its most basic, means saying one thing while meaning something else, or saying multiple things and meaning multiple things. That’s part of what “learning to live” consists of, and fiction does a unique job of training people to keep their eyes “open all the time.” Most teachers are probably bad at conveying this, but I do believe that this idea, or something like it, underlies the use of novels as tools for teaching students how to live, in a way that essays and other nonfiction probably don’t manage.

A lot of people seem very eager to stop learning how to live as quickly as possible. They might have the hardest time of all.