Subjectivity in writing and evaluating writing

This essay started its life as an e-mail to a student who wanted to know if all writing was, on some level, “just subjective,” which would imply that grading is bogus and so is much of what we do in English classes. I didn’t have time to offer a nuanced explanation of what makes good writing good, so I wrote to him later that night. He didn’t reply to the e-mail.

I was thinking about our conversation and realized that I have more to say about subjectivity and skill in writing. As you observed, there’s an element of subjectivity in judging what’s good writing and what isn’t. But it’s also worth noting that dominant opinions change over time: a lot of writing from the 18th and 19th centuries, for example, was considered “good” if it contained long sentences with balanced, nested clauses. Such stylistic preferences are one reason many contemporary students have trouble reading that material today, because most of us now value variety in sentence structure and place less value on complexity.

This is normally the place where I could go off on a rant about social media and cell phones and texting speak and how the kids these days are going to hell, but I’ll avoid that because it doesn’t appear true overall and certainly isn’t true of writing. The trend, even among professional writers writing for other expert writers, has been toward simpler structures and greater informality (which may say something about the culture as a whole).

That being said, if you want to write a paper full of long, windy clauses and abstruse classical allusions, I’m not going to stop or penalize you, and I may even reward you: few if any students write in such a fashion, and I (like most contemporary readers) value novelty. The number of people imitating James Boswell may be too small! As long as the content is strong, I’m willing to roll with somewhat unusual stylistic quirks; I’m fairly pluralistic in my view of language use.

So how do you, the seeker, figure out what good writing is? You practice, you read, you think about it, you practice some more, as you would if you were learning to play a guitar. You look at how the writing of other people works, or doesn’t. I’ve never heard a guitar instructor complain that students claim all music is subjective; playing the guitar is transparently hard, in the sense that you know when you’re bad at it, in a way that writing isn’t. Still, if you’d like to know a lot more about good writing, take a look at Francine Prose’s Reading Like a Writer, James Wood’s How Fiction Works, and Jan Venolia’s Write Right!

When you’re done with those, move on to B. R. Myers’ A Reader’s Manifesto. When you’re done with that, move on to the New York Times’ series Writers on Writing. Collectively, these books will teach you that every word counts and every word choice says something about the writer and the thing the writer is conveying, or trying to convey. Not only that, but every word changes, slightly, the meaning of every word around it. Good writers learn to automatically, subconsciously ask themselves, “Does this word work? Why? Why not? How should I change it? What am I trying to convey here?”

Eventually, over time, skilled writers and thinkers internalize these and other ideas, and their conscious mind moves to other issues, much like a basketball player’s shot happens via muscle memory after it’s been practiced and tweaked over 100,000 repetitions.

Skilled writers are almost always skilled readers, so they have a fairly large, subconscious stock of built-in phrases, ideas, and concepts. Somewhere along the line I’ve read a fair amount about how athletes practice and how athletes become good (perhaps some of that material came from Malcolm Gladwell’s Outliers, or Mihaly Csikszentmihalyi’s Flow: The Psychology of Optimal Experience). I know how important practice and repetition are to any skill-based human endeavor. So I connected skill in writing with skill in basketball, since many students are more familiar with sports than with writing. Where did that analogy come from? I don’t know, exactly, but it’s there now, along with the idea that analogies are good, and explaining what I’m doing is good, and so are many other things.

To return to the athletic analogy, skill in sports also has a subjective element. Is LeBron James now better than Michael Jordan was when Jordan ruled? You can have this argument with morons in bars all day long. I’ve heard it and find it particularly tedious because the outcome is so unimportant. But both players are very clearly good, each at the top of his peers in his respective era. The comparison at least makes sense.

One could also argue about whether Elmore Leonard or Alain de Botton is the better writer, although I would argue that they’re too different to make that a fruitful comparison; Elmore Leonard would be better matched against someone like Raymond Chandler or Patricia Highsmith. But Leonard and de Botton are both fantastically better writers than most freshmen; for one thing, most freshmen haven’t yet mastered the mechanical parts of writing, like how to use commas consistently and correctly (if they wish to), let alone higher questions about vocabulary, metaphor, and so on.

If you really want to get better, spend a lot of time reading, writing, and thinking about those activities. Then look back at your earlier work and judge its quality for yourself. Few students think the first draft of their first paper is as good as the final draft, and I tend to agree. Few people who consciously work throughout their lives think their work as, say, 20-year-old students is as good as their work at age 30.

With regard to thesis statements, good ones tend to say something about how a text (I hate the term “text,” but it fits here) shows something (“Free-indirect speech in ‘She Wasn’t Soft’ . . .”), what the text shows, usually symbolically (“. . . is used to demonstrate how Paula and Jason, despite being a couple, really disdain each other”), and some larger point (“. . . which shows that what people think and how people behave don’t always match”).

That’s not a great thesis statement because I’m doing it quickly and freeform; a better one might say something like, “The use of free-indirect speech in ‘She Wasn’t Soft’ demonstrates that Paula is actually soft, despite her repeated claims to the contrary, and that Jason and Paula’s mutual loathing sustains their relationship, despite what they say.” That’s still not the sort of thesis statement I’d use to write a publishable academic paper, but it’s closer. Many if not most student papers are missing one of those elements. Not every thesis needs all three, but they’re not bad ideas to check for.

Over time and with experience, I’ve developed, and you’ll develop, a fairly good eye for thesis statements. Eventually, when you’re sufficiently practiced, you won’t necessarily use explicit thesis statements—your thesis will be implied in your writing. Neal Stephenson doesn’t really have an explicit thesis statement in “Turn On, Tune In, Veg Out,” although his last line may function as one, and Roland Barthes definitely doesn’t have an explicit one in “The Brain of Einstein.” Thesis statements aren’t necessarily appropriate to all genres, all the time.

When I started teaching, I thought I was going to be a revolutionary and not teach thesis statements at all. I wrote about that experience here. The experiment didn’t work. Most undergrads need thesis statements. So I started teaching them, and student papers got better and more focused, and I’ve been doing so ever since.

Your question or questions are about the inherent challenges of writing, and those don’t have easily summarized answers. The problem also comes from language itself, which is imprecise, or, alternately, layered with meaning; that’s where so much humor and misunderstanding comes from (and humor could be considered a kind of deliberate misunderstanding). I’ve read about how, when computer scientists first tried to build translation systems and natural-language processing systems, they ran into the ambiguity problem—and that problem still hasn’t been fully solved, as anyone who’s tried to use speech-recognition software, or Google Translate, can easily discover (I wish I could find citations or discussions regarding this issue; if you happen to run across any, send them over).

This line of questioning also leads into issues of semiotics—how signs, signaling, and reception function—and the degree of specificity necessary to be good. Trying to specify every ingredient of good writing is like trying to specify every ingredient of a good meal: you get something like McDonald’s. While McDonald’s does a lot of business, I wouldn’t want to eat there, and it’s pretty obvious that something is lost in the process (Joel Spolsky’s article “Big Macs vs. the Naked Chef” also uses McDonald’s as a cautionary tale, this time for software developers; you should definitely read it).

I’m going to interrupt this essay to quote from Joel:

The secret of Big Macs is that they’re not very good, but every one is not very good in exactly the same way. If you’re willing to live with not-very-goodness, you can have a Big Mac with absolutely no chance of being surprised in the slightest.

Bad high school teachers often try to get students to write essays that are not very good in exactly the same way. I’m trying to get students, and myself, to write essays that are good and that a human might want to read. This guarantees that different students will approach the problem space in different ways, some more successfully than others, and that different essays will be good in different ways. I’m trying to get students to think about the process and, more broadly, to think not just about the solutions but about the domain; how you conceptualize the problem domain will change what you perceive as the solution. Learning to conceptualize the problem domain is an essential part of the writing process that’s often left out of high school and even college. That being said, if you ever find yourself in front of 20 or 30 novice writers, you’ll quickly see that some are much better than others, even if there’s wiggle room between a C and a C+.

I don’t get the sense that students who are unhappy with their grades are unhappy out of a deeply felt and considered aesthetic disagreement about fundamental literary or philosophical principles. I suspect I feel this way partially because I have a fairly broad sense of “good” writing—or at least writing good enough to get through undergrad English classes—and because someone with sufficient sophistication and knowledge to make a good argument about aesthetics or the philosophy of writing would be very unlikely to get a mark low enough to want to argue about it. Rather, I think most students who are unhappy about their grades just want better grades, without doing the thinking and writing necessary to get them.

These issues are compounded by a meta-issue: many if not most K–12 English (and other humanities) teachers are bad. And many of them aren’t that smart or knowledgeable (which tends to overlap with “bad”). So a lot of students—especially those on the brighter side—inchoately know that their teachers are bad, and that something stinks, and therefore they conclude that English is bogus anyway, as are related fields. This has a lot of unfortunate consequences on both the individual and societal level; books like C.P. Snow’s The Two Cultures are one manifestation of this larger problem.

In general, I’d like for people to try to get along, see each other’s points of view, and be tolerant—not only in fields like religion and politics, but also across divides like humanities / sciences, or reason / emotion, or any number of the other possibly false binaries that people love to draw for reasons of convenience.

If you think I’m completely wrong about what makes good writing (and what makes writing good), you have a huge world out there and can judge the reaction to your writing. Twilight and The Da Vinci Code are poorly written novels, yet millions of people have read and enjoyed them—far more than have read Straight Man, one of my favorite novels and one that’s vastly better written. Who’s right: the millions of teenage girls who think they’re in love with the stilted, wooden prose that makes up Edward, or me, who sees the humor in a petulant English department? It depends on what you mean by “right.” If I were a literary agent or editor, I would’ve passed on both Twilight and The Da Vinci Code. Definitions of “good” are uncertain, and the ones I embrace and impose on students are worth questioning. If you can at least understand where I’m coming from and why I hold the views I do, however, I’ll consider my work a relative success.

Most people’s conception of “good” differs at different points in their lives; I’m in my 20s and view writing very differently than I did in my teens. I would be surprised if I view writing the same way in my 40s. One major change is that I’ve done so much reading, and will probably do much more. Someone who doesn’t read very much, or doesn’t challenge themselves when they do read, may find that their standards don’t change as much either. I could write much more on this point alone, but for the most part you’ll have to trust me: your tastes will probably change.

This email is a long way of saying, “I’m not trying to bullshit you, but the problem domain itself is hard, and that domain is not easy to explain, without even getting into its solution.” The gap between “fact” and “opinion” is blurry, but writers who attend carefully to what another writer says will have more detailed opinions than those who don’t.

The short version of this email is “trust me,” or, alternatively, spend the next ten years of your life pondering and contemplating these issues while reading about them, and then you’ll have a pretty good grasp of what good writing means. Writing is one of those 10,000-hour skills: it probably takes 10,000 hours of deliberate practice to get good. Start now and you’ll be better in a couple of years.

Sharp Objects — Gillian Flynn

The first time through Sharp Objects I thought it totally absurd, since its characters behave like morons perpetually rolling on ecstasy, or like the faeries from Jonathan Strange & Mr Norrell. The plausibility of the plot is so low that I almost gave up, exasperated.

But I kept reading the first time and was curious enough to reread the second time, when I realized that Sharp Objects is not a realistic story of realistic detection; instead, it’s a mythic-Freudian* work about the anxiety that comes from two related phenomena: transitions to adulthood and the muddying of lines between the generations. Camille, the protagonist, is supposed to be an adult (she’s a reporter for a paper, she covers murders, she pays the rent), but around her mother she acts like a child, and around her 13-year-old sister she acts like a peer.

Once this alternate reading became clear, Sharp Objects became pleasant. It’s not supposed to be realistic (or, if it is, it fails so badly at its purpose that it might as well be read my way). It’s a fairy tale with a bit of media critique thrown in, and it says that girls and women have the dark urges that are often absent from fiction and from the news. Camille needs to reconcile her family relationships and her family’s history in order to understand the murders she’s investigating. Conventional reportorial skills and abilities are of little use; at best one might say she employs some aspects of New or Gonzo Journalism, since she does in fact drop ecstasy at one point.

In the novel Camille is dispatched by her editor to her home town to investigate a murder that becomes a series of murders of girls. The novel signals its intentions early. Camille is describing the home town she came from, and she ends the first chapter with this:

When I was still in grammar school, maybe twelve, I wandered into a neighbor boy’s hunting shed, a wood-planked shack where animals were stripped and split. Ribbons of moist, pink flesh dangled from strings, waiting to be dried for jerky. The dirt floor was rusted with blood. The walls were covered with photographs of naked women. Some of the girls were spreading themselves wide, others were being held down and penetrated. One woman was tied up, her eyes glazed, her breasts stretched and veined like grapes, as a man took her from behind. I could smell them all in the thick, gory air.

At home that night, I slipped a finger under my panties and masturbated for the first time, panting and sick.

The blurred mental lines between sexuality, animals, reproduction, and early age remain a theme that runs through the novel.

Attention is also a scarce resource in the novel: Camille constantly seeks it from her mother, even at the risk of being dangerous, and also seeks it from men (at least at first). Her sister is repeating Camille’s experience. Parents are either absent (from page 21: “I wondered where their mother was”) or overwhelming. Family sexuality recurs; here is one early example, from Camille’s narration:

The Victorians, especially southern Victorians, needed a lot of room to stray away from each other, to duck tuberculosis and flu, to avoid rapacious lust, to wall themselves away from sticky emotions. Extra space is always good.

“Stray” is an exact quote. And if extra space is always good, why then does Camille go to her mother’s house? She returns to a point of danger in search of information, like Little Red Riding Hood entering the Wolf’s house. The novel itself keeps pointing to fairy tales. Amma, Camille’s sister, says:

now we’re reunited. You’re like poor Cinderella, and I’m the evil stepsister. Half sister.

A few pages later, Camille speaks with a boy who says that he saw a “woman” take the second girl, who turns up murdered. She thinks this of him:

What did James Capisi see? The boy left me uneasy. I didn’t think he was lying. But children digest terror differently. The boy saw a horror, and that horror became the wicked witch of fairy tales, the cruel snow queen.

No one believes that the killer is a woman because women don’t behave that way. But wicked and evil women are prominent in fairy tales.

This detail occurs in Camille’s mother’s house:

Walking past Amma’s room, I saw her sitting very properly on the edge of a rocking chair, reading a book called Greek Goddesses. Since I’d been here, she’d played at being Joan of Arc and Bluebeard’s wife and Princess Diana—all martyrs, I realized. She’d find even unhealthier role models among the goddesses. I left her to it.

There are more. These are enough.

Seemingly no one grows up in Sharp Objects. Nearly every woman in Wind Gap still gossips like she’s in high school. Growing up is hard, and harder for some of us than others. Perhaps we never fully leave childhood behind. Camille can’t. Her sister Amma is in some ways eager to leave childhood (she behaves like a pro when it comes to inciting the desires of men) but in other ways wants its protections. In our culture she can, legally at least, get both,** and she behaves in both ways. At one moment Amma is behaving like an infant:

Amma lolled sleepy as a newborn in her blanket, smacking her lips occasionally. It was the first time I’d seen my mother since our trip to Woodberry. I hovered in front of her, but she wouldn’t take her eyes off Amma.

At other moments she doesn’t, as when she says that after her mother takes care of her, “I like to have sex.” Then:

She flipped up her skirt from behind, flashed me a hot pink thong.
“I don’t think you should let boys do things to you, Amma. Because that’s what it is. It’s not reciprocal at your age.”

Camille’s counsel is distinctly odd, coming from someone who did similar things at similar ages and, it would appear, for similar reasons. But she doesn’t at this moment have the power to break the familial cycle, with its hints and implications of incest. That waits until later.

Camille’s decision to enter this cauldron of weirdness reinforces the idea that Sharp Objects is more about family patterns and dynamics than detection. In one of the flimsier rationales in the book, Camille stays with her mother, her stepfather, and her adolescent sister, ostensibly to save the paper money, but this decision is insane given her relationship to the family. That she continues to stay as events become more and more macabre and surreal is equally insane and implausible. Camille should leave; that’s obvious to any sane reader and should be obvious to her. That she stays anyway indicates that the story has motives different from the ones I initially assumed.


* Freud has a much stronger mythic element to his work than is commonly supposed—and so I’m justified in using myth and Freud in this way. Much of his work is unfalsifiable, giving what is nominally a scientific body of work a distinctly literary quality, and the supposed universality of many of his concepts (the death drive, the Oedipus complex, etc.) is not supportable.

** Let me reproduce the footnote at the link:

As Judith Levine notes in Harmful to Minors: The Perils of Protecting Children from Sex: “One striking pair of contradictory trends: as we raise the age of consent for sex, we lower the age at which a wrongdoing child may be tried and sentenced as an adult criminal. Both, needless to say, are ‘in the best interests’ of the child and society.” And, as Laurie Schaffner points out in a separate essay collection, “[…] in certain jurisdictions, young people may not purchase alcohol until their twenty-first birthday, or may be vulnerable plaintiffs in a statutory rape case at 17 years of age, yet may be sentenced to death for crimes committed at age 15 [….]”

Laws [. . .] reflect race and gender norms: white girls are the primary target of age-of-consent laws, while African American youth are the target of laws around crime and delinquency. The contradictory trends are readily explained by something rather unpleasant in society.

I didn’t elaborate on what the “unpleasant” thing may be and won’t here, either, but you’re welcome to take a shot at your best interpretation in the comments.

“All American fiction is young adult fiction: Discuss”

Via Twitter Hollis Robbins offers a prompt: “‘[A]ll American fiction is young-adult fiction.’ Discuss.” Her takeoff is A. O. Scott’s excellent “The Death of Adulthood in American Culture,” which you should go read; oddly, it does not mention the show Entourage, which may be the best contemporary narrative artifact / fantasy about the perpetual party.*

American fiction tends toward comedy more than “young-adult” because comedy = tragedy – consequences. AIDS fiction is tragic because people die. Most contemporary heterosexual love stories are comedy because the STIs tend to be curable or not that important, and people who are diligent with birth control rarely get pregnant. Facing death, starvation, or other privations has always been the adult’s lot, and adults who made sufficiently bad choices regarding resource allocation or politics died. Think of the numerous adults who could have fled the area between Russia and Germany in 1914 and didn’t, or the ones who didn’t after 1918 and before the Holocaust. The example is extreme, but it illustrates the principle. Frontier and farm life was relentlessly difficult and perilous.

Today, by contrast, we live in a world of second chances. America is a “victim,” although that is the wrong word, of its own success. If you color more or less inside the lines and don’t do anything horrendous, life can be awesome. People with an agreeable and conscientious disposition can experience intense pleasures and avoid serious pain for decades; not everyone takes to this (see for example the works of Michel Houellebecq), but many do. The literary can write essays, the scientists can do science, the philosophers can argue with each other, the business guys have a fecund environment, and the world’s major problems are usually over “there” somewhere, across the oceans. If we ever get around to legalizing drugs, we’ll immediately stabilize every country from Mexico to Chile.**

What are the serious challenges that Americans face as a whole? In the larger world there are no real or serious—“serious” being a word associated with adulthood—ideological alternatives to democracy or capitalism. Dictatorships still exist, but politics are on the whole progressing instead of regressing, Russia and parts of the Middle East excepted.

One could reframe the question of all American fiction being young adult fiction to: “Why not young adult fiction?” Adults send young people to war to die; adulthood is World War II, us against them, thinking that if we don’t fight them in Saigon we’ll have to fight them in Seattle. Adults brought us Vietnam. Young people brought us rock ‘n’ roll, rap, and EDM. Adults want to be dictators, whether politically or religiously, and the young want to party and snag the girl(s) or guy(s) of their dreams.

Adulthood is associated with boredom, stagnation, suburbs, and death. Responsibility is for someone else, if possible, and those who voluntarily assume responsibility rarely seem to be rewarded for it in the ways that really count (I will be deliberately ambiguous on what those ways are). Gender politics and incentives in the U.S. and arguably Western Europe are more screwed up than many of us would want to admit, and in ways that current chat among the clerisy and intellectual class do not reflect or discuss. If adulthood means responsibility, steady jobs, and intense fidelity, then we’ve been dis-incentivizing it for decades, though we rarely want to confront that.

Many people are so wealthy and safe that they are bored. In the absence of real threats they invent fake ones (vaccines) or worry disproportionately about extremely unlikely events (kidnapping). Being a steady person in a steady (seeming) world is thus often perceived as being dull. In contemporary dating, does the stolid guy or girl win, or does the hot, funny, unreliable guy or girl win?

A lot of guys have read the tea leaves: divorce can be a dangerous gamble, while marriage offers few relationship rewards that can’t be achieved without involving the legal establishment or the state more generally. A shockingly large number of women are willing to bear the children of men they aren’t married to: 40.7% of births now occur to unmarried women, and that number has been rising for decades.

Why take on responsibility when no one punishes you for evading it and arguably active irresponsibility is rewarded in many ways, while safety nets exist to catch those who are hurt by the consequences of their actions? That’s our world, and it’s often the world of young adulthood; in fiction we can give ourselves monsters to fight and true enduring love that lasts forever, doesn’t have bad breath in the morning, and doesn’t get bored of us in four years. Young adult fiction gives us the structure lacking in the rest of our lives.

Moreover, there has always been something childlike in the greatest scientists and artists. Children feel unconstrained by boundaries, and as they grow older they feel boundaries more and more acutely. I’m not about to argue that no one should have boundaries, but I am going to argue that retaining an adult version of the curiosity children have and the freedom they have is useful today and in many cases has always been useful.

The world has gotten so efficient that vast pools of money are available for venture capitalists to fund the future and tech guys to build or make it. The biggest “problem” may be that so many of us want to watch TV instead of writing code, but that may be a totally bunk argument because consumption has probably always been more common and easier than production.

In this world fiction should tend towards comedy, not the seriousness too typically associated with Literature.

If American fiction is young adult fiction, that may be a sign of progress.***


* Another show, Californication, mines similar themes but with (even weaker) plots and total implausibility. Here is an essay disagreeing with Scott: Adulthood Isn’t Dead.

** Breaking Bad and innumerable crime novels would have no driving impetus without drug prohibition. The entire crime sector would be drastically smaller almost overnight were we to legalize drugs and prostitution. That would be a huge win for society but harmful to fiction writers.

*** Usually I eschew polemics but today I make an exception.

The appeal of “pickup” or “game” or “The Redpill” is a failure of education and socialization

Since posting “The inequality that matters II: Why does dating in Seattle get left out?” and “Men are where women were 30 years ago?” I’ve gotten into a couple discussions about why Neil Strauss’s The Game is popular and why adjacent subjects like “pickup” and the “Redpill” have become more popular too. One friend wrote, “It’s so tedious to see how resentful men get—a subject much in the news lately because of the Santa Barbara shooting…”

That’s somewhat true, but underlying, longer-term trends are still worth examining. The world is more complex than it used to be in many respects, and that includes sex and dating. Until relatively recently—probably the late 60s / early 70s—it was common for a guy to marry a local girl, maybe straight out of high school: a girl whose parents he knew and whose parents knew his. Parents, families, and religious authorities probably had a strong effect on what their children did, and a lot of men and women married as virgins. The dating script was relatively easy to follow and relatively many people paired early. In the 60s an explosion of divorces began, and that complicated matters in ways that are still being sorted out.

Today there are more hookups for a longer period of time and fewer universal scripts that everyone follows, or is supposed to be following. Instead, one sees a proliferation of possibilities, from the adventurous player—which is not solely a male role—to early marriage (though those early marriages tend to end in divorce).

Dating “inequality” has probably increased, since the top guys are certainly having a lot more sex than the median or bottom guys. To some extent high-status guys have always had more sex, but now “top” could mean dozens of partners at a relatively early age, and the numerical top is more readily available to guys who want it. In the old regime it was probably possible for almost everyone to find a significant other of some sort (and I think families had more sway and say). Now that may be harder, especially for guys towards the bottom, who don’t want to realize that the women they’re likely to attract are likely to be in around the same place. We don’t all get a Hollywood ending, and Hollywood itself is unrealistic.

Guys who notice that movies, TV shows, and some books portray an unlikely or unrealistic set of dating and marriage patterns should start to wonder what the “real thing” looks like. The Game isn’t bad, though it is dated, and I expect Tucker Max and Geoffrey Miller’s book Mate to be popular for reasons similar to the ones that made The Game popular.

I’ve also noticed an elegiac sense among a weirdly large number of the “pickup artist” or “Red Pill” (sometimes it’s used as two words, sometimes as one) or “manosphere” guys about the past, when it was supposedly relatively easy to find, date, and marry a woman. Much of this is probably mythological, and I don’t think most of them would be happy marrying at 20 or 24 and having two or three kids by 28 or 29.

Like all generalizations, the stereotypes above are riddled with holes and exceptions—see further the oeuvre of John Updike—but I’m examining broad trends rather than specific details. Today almost no one gets married straight out of high school. Routine moves from city to city are normal, and each move often rips someone out of the social networks that provide romantic connections. Families play a smaller and smaller role. Twenty-somethings, and especially women, don’t listen to their parents’ romantic advice.

If you don’t have the infrastructure of school, how do you meet lots of new people? Jobs are one possibility, but looking for romantic prospects at work has obvious pitfalls. Online dating is another, but people who can’t date effectively offline often aren’t any better online—and are often worse.

Technology matters too. Technologies take a long time—decades, at least—to really reach fruition and for their ripples to be felt throughout societies and cultures. Virtually all big ideas start small.* That’s an important lesson from Where Good Ideas Come From, The Great Stagnation, The Enlightened Economy, and similar books about technological, economic, and social history.

A suite of interrelated technologies around birth control (like hormonal birth control itself, better forms of it, and easy condom distribution and acquisition) is still playing out. Same with antibiotics and vaccines against STIs. VOX offers one way to think about this in “From shame to game in one hundred years: An economic model of the rise in premarital sex and its de-stigmatisation.” It begins:

The last one hundred years have witnessed a revolution in sexual behaviour. In 1900, only 6% of US women would have engaged in premarital sex by the age of 19, compared to 75% today . . . Public acceptance of premarital sex has reacted with a lag.

Culture is still catching up. Pickup, game, and the Redpill, regardless of what you personally think of them, are part of the cultural catchup. They’re responses from guys frustrated by the way their own efforts fail while some of their peers’ efforts succeed. A lot of women appear less interested in an okay guy with an okay job and an okay but not that exciting or fun life, relative to guys with a different set of qualities. Men invest in what they think women want and women invest in what they think men want, and relative wants have changed over time.

Almost every guy sees or knows at least one guy and often a couple who do spectacularly well with women. Guys who are frustrated or who can’t achieve the romantic life they want start to ask, “What are the successful guys doing that I’m not?” Pickup or game or the Redpill are different strains of systematic answers. All three may have things wrong with them, but all three are better than nothing. Saying “Women are mysterious” or “No one knows what women want” is bullshit, and guys only have to look around to notice it.

Pickup artists and those who read them are responding to a cultural milieu in which most guys get terrible socialization regarding dating and women. Pickup artists are stepping into that gap. They’re trying to answer questions in a concrete way, which most people, including their detractors, aren’t. In a review of Clarisse Thorn’s Confessions of a Pickup Artist Chaser I wrote:

feminism does very little to describe, let alone evaluate, how micro, day-to-day interactions are structured. Pickup artists, or whatever one may want to call guys who are consciously building their skills at going out and getting women, are describing the specific comments, conversations, styles, and venues women respond to. The pickup artists are saying, “This is how you approach a woman in a bar, this is how you strike up a conversation at the grocery store, and so forth.” In other words, they’re looking at how people actually go about the business of getting laid. Their work is often very detailed, and the overall thrust is toward the effectiveness of getting laid rather than how male-female interactions work in theory. Feminism, in Thorn’s view, appears to be silent, or mostly silent, on the day-to-day interactions.

Who else is doing that? Almost no one. As with virtually any other topic, one can muddle along through trial and error (and mostly error) or one can try to systematically learn about it and apply that learning to the problem domain, along with the learning others have done.

To be sure, the worst of the group is just trying to sell shit, and to sell as much of it as possible to fools. The best of the group is saying things that almost no one else is saying.

Max, Miller, and Nils Parker wrote Mate: The Young Man’s Guide To Sex And Dating, which is, among other things, a description of modern dating and a description of why so many guys do it so badly for so long. Confusion reigns, and the book promises to be the sort of fun-but-comprehensive read that can be given to unhappy, puzzled guys who understand something is wrong but don’t know how to fix it.

One strategy in response to new social circumstances is to figure out what you should do to be reasonably successful and what you can do to make yourself more appealing. This is not a male-only question: virtually every issue of Cosmo is about how to attract men, retain men, and deal with female friends and rivals. Another is to blame women, or withdraw from dating, or kill innocents because of your own frustration.

If you think half the population isn’t into you, the problem is with you, not the population. There’s an important similarity to business here: If you start a business and no one wants to buy your products or services, you can blame the market or you can realize that you’re not doing what people want.

It’s easier to blame women than it is to make real changes, and there is a tendency among some of the self-proclaimed “Redpill”-types to do that. Paul Graham says the real secret to making wealth is to “Make something people want.” In dating the real “secret” (which isn’t a secret) is to be a person people like. How to do that can be a whole book’s worth of material.

Blame is easy and improvement is hard. Short guys do have it harder than tall guys—but so what? Go ask a fat girl, or a flat-chested one, how much fun dating is for her, compared to her slenderer or better-endowed competitors. Honesty in those conversations is probably rare, but it is out there: usually in late-night conversations after a couple drinks.

I don’t hate “pickup artists” as a group, though I dislike the term and wish there were something better. Many of the criticisms are accurate. But so what? Criticizing without recognizing the impetus for the development in the first place is attacking the plant while ignoring the roots. This post, like so many of the posts I write, is looking at, or attempting to look at, the root.

Feminism didn’t come from nowhere. Neither has pickup.


* Which is not to say that all small ideas will automatically become big. Most don’t. But ideas, technologies, practices, and cultures spread much more slowly than is sometimes assumed, especially among the rah-rah tech press.

The modern art (and photography) problem

In “Modern art: I could have done that… so I did: After years of going to photography exhibitions and thinking he could do better, Julian Baggini gave it a go. But could he convince The Royal West of England Academy with his work?”, Baggini writes:

there are times when we come across something so simple, so unimpressive, and so devoid of technical merit that we just can’t help believing we could have done as well or better ourselves.

He’s right—except that this happens entirely too often and helps explain much of modern art’s bogosity. I’m not the only person to have noticed—in Glittering Images, Camille Paglia writes:

the big draws [for museums] remain Old Master or Impressionist painting, not contemporary art. No galvanizing new style has emerged since Pop Art, which killed the avant-garde by embracing commercial culture. Art makes news today only when a painting is stolen or auctioned at a record price.

She’s right too; many people have noticed this but few apparently have in the art world itself, which seems to have become more interested in marketing than making (a problem afflicting the humanities in academia too). But there are enough people invested in and profiting from propagating bogosity that they can remain indifferent to countervailing indifference.

Years ago I was at the Seattle Art Museum looking at various pieces of modern supposed “art” that consisted mostly of a couple of lines or splotches and whatnot, and they made me think: “there’s a hilarious novel in here about a director who surreptitiously hangs her own work—and no one notices.” Unfortunately, I’ve since realized that people have already done this, or things like it, in the real world—and no one cared. It’s barely possible to generate scandal in the art world anymore; conservatives have mostly learned about the Streisand effect and thus don’t react to the latest faux provocation. The artists themselves often lack both anything to say and any coherent way of saying it.

To the extent people respond to art, they respond to the art that people made when it took skill to be an artist.

Photography has a somewhat similar problem, except that it’s been created by technology. Up until relatively recently it took a lot of time, money, and patience to become a reasonably skilled photographer. Now it doesn’t take nearly as much of any of those things: last year’s cameras and lenses still work incredibly well; improvements in autofocus, auto-exposure, and related technologies make photos look much better; and it’s possible to take, review, and edit hundreds or thousands of photos at a time, reducing the time necessary to go from “I took a picture” to expert.

The results are obvious for anyone who pays attention. Look through Flickr, or 500px, or any number of other sites and you’ll see thousands of brilliant, beautiful photos. I won’t say “anyone can do it,” but many people can. It’s also possible to take great photos by accident, with the machine doing almost all the work apart from the pointing and clicking. Adding a little bit of knowledge to the process is only likely to increase the keeper rate. Marketing seems to be one of the primary differentiators among professional photographers; tools like Lightroom expand the range of possibility for recovering from error.

One of the all-time top posts on Reddit’s photography section is “I am a professional photographer. I’d like to share some uncomfortable truths about photography,” where the author writes that “It’s more about equipment than we’d like to admit” and “Photography is easier than we’d like to admit.”

The profession is dying, for reasons not identical to painting but adjacent to it. In photography, we’re drowning in quality. In fine art, we’re drowning in bogosity, and few people appear to be interested in rescuing the victim.

Journalism, physics and other glamor professions as hobbies

The short version of this Atlantic post by Alexis C. Madrigal is “Don’t be a journalist,” and, by the way, “The Atlantic.com thinks it can get writers to work for free” (I’m not quoting directly because the article isn’t worth quoting). Apparently The Atlantic is getting writers to work for free, because many writers are capable of producing decent-quality work, and the number of paying outlets is shrinking. Anyone reading this and contemplating journalism as a profession should know that they need to seek another way of making money.

The basic problems journalism faces, however, are obvious and have been for a long time. In 2001, I was the co-editor-in-chief of my high school newspaper and thought about going into journalism. But it was clear that the Internet was going to destroy a lot of careers in journalism. It has. The only thing I still find puzzling is that some people want to major in journalism in college, or attempt to be “freelance writers.”

Friends who know about my background ask why I don’t do freelance writing. When I tell them that there’s less money in it than getting a job at Wal-Mart they look at me like I’m a little crazy—they don’t really believe that’s true, even when I ask them how many newspapers they subscribe to (median and mode answer: zero). Many, however, spend hours reading stuff for free online.

In important ways I’m part of the problem, because on this blog I’m doing something that used to be paid most of the time: reviewing books. Granted, I write erratically and idiosyncratically, usually eschewing the standard practices of book reviews (dull, two-paragraph plot summaries are stupid in my view, for instance), but I nonetheless do it and often do it better than actual newspapers or magazines, which I can say with confidence because I’ve read so many dry little book reports in major or once-major newspapers. Not every review I write is a critical gem, but I like doing it and thus do it. Many of my posts also start life as e-mails to friends (as this one did). I also commit far more typos than a decently edited newspaper or magazine. Which I do correct when you point them out.

The trajectory of journalism is indicative of other trends in American society and indeed the industrialized world. For example, a friend debating whether he should consider physics grad school wrote this to me recently: “I think physics is something that is fun to study for fun, but to try to become a professional physicist is almost like too much of a good thing.” He’s right. Doing physics for fun, rather than trying to get a tenure-track job, makes more sense from a lifestyle standpoint.

A growing number of what used to be occupations seem to be moving in this direction. Artists got here first, but others are making their way here. I’m actually going to write a post about how journalism increasingly looks like this too. The obvious question is how far this trend will go—what happens when many jobs that used to be paid become unpaid?

Tyler Cowen thinks we might be headed towards a guaranteed annual income, an idea that was last popular in the 60s and 70s. When I asked Cowen his opinions about guaranteed annual incomes, he wrote back to say that he’d address the issue in a forthcoming book. The book hasn’t arrived yet, but I look forward to reading it. As a side note, apparently Britain has, or had, a concept called the “Dole,” which many people went on, especially poor artists. Geoff Dyer wrote about this some in Otherwise Known as the Human Condition. The Dole subsidized a lot of people who didn’t do much, but it also subsidized a lot of artists, which is pretty sweet; one can see student loans and grad school serving analogous roles in the U.S. today.

Even in programming, which is now the canonical “Thar be jobs!” (pirate voice intentional) profession, some parts of programming—like languages and language development—basically aren’t remunerative. Too many people will do it free because it’s fun, like amateur porn. In the 80s there were many language and library vendors, but nearly all have died, and libraries have become either open source or rolled into a few large companies like Apple and Microsoft. Some aspects of language development are cross-subsidized in various ways, like professors doing research, or companies paying for specific components or maintenance, but it’s one field that has, in some ways, become like photography, or writing, or physics, even though programming jobs as a whole are still pretty good.

I’m not convinced that the artist lifestyle of living cheap and being poor in the pursuit of some larger goal or glamor profession is good or bad, but I do think it’s real (that we have a lot of good cheap stuff out there, and especially cheap stuff in the form of consumer electronics, may help: it’s possible to buy or acquire a nearly free, five-year-old computer that works perfectly well as a writing box).* Of course, many starving artists adopt that as a pose—they think it’s cool to say they’re working on a novel or photography project or “a series of shorts” or whatever, but don’t actually do anything, while many people with jobs put out astonishing work. Or at least work, which is usually a precursor to astonishing work.

For some people, the growing ability to disseminate ideas and art forms even without being paid is a real win. In the old days, if you wanted to write something and get it out there, you needed an editor or editors to agree with you. Now we have a direct way of resolving questions about what people actually want to read. Of course, the downside is that whole payment thing, but that’s the general downside of the new world in which we live, and, frankly, it’s one that I don’t have a society-wide solution for.

In writing, my best guess is that more people are going to book-ify blogs, and try to sell the book for $1 – $5, under the (probably correct) assumption that very few people want to go back and read a blog’s entire archives, but an ebook could collect and organize the material of those archives. If I read a powerful post by someone who seemed interesting, I’d buy a $4 ebook that covers their greatest hits or introduced me to their broader thinking.

This is tied into other issues around what people spend their time doing. My friend also wrote that he read “a couple of articles on Keynes’ predictions of utopia and declining work hours,” but he noted that work still takes up a huge amount of most people’s lives. He’s right, but most reports show that median hours worked in the U.S. have declined, and male labor force participation has declined precipitously. Labor force participation in general is surprisingly low. Ross Douthat has been discussing this issue in The New York Times (a paid gig, I might add), and, like most reasonable people, he has a nuanced take on what’s happening. See also this Wikipedia link on working time for some arguments that working time has declined overall.

Working time, however, probably hasn’t decreased for everyone. My guess is that working time has increased for some smallish number of people at the top of their professions (think lawyers, doctors, programmers, writers, business founders), with people at the bottom often relying more on government or gray-market income sources. Douthat starts his essay by saying that we might expect working hours among the rich to decline first, so they can pursue more leisure, but he points out that the rich are working more than ever.

Though I am tempted to put “working” in scare quotes, because it seems like many of the rich are doing things they would enjoy doing on some level anyway; certainly a lot of programmers say they would keep programming even if they were millionaires, and many of them become millionaires and keep programming. The same is true of writers (though fewer become millionaires). Is writing a leisure or work activity for me? Both, depending. If I self-publish Asking Anna tomorrow and make a zillion dollars, the day after I’ll still be writing something. I would like to get paid but some of the work I do for fun isn’t contingent on me getting paid.

Turning blogs into books and self-publishing probably won’t replace the salaries that news organizations used to pay, but it’s one means for writers or would-be writers to get some traction.

Incidentally, the hobby-ification of many professions makes me feel pretty good about working as a grant writing consultant. No one thinks when they’re 14, “I want to be a grant writer like Isaac and Jake Seliger!”, while lots of people want to be like famous actors, musicians, or journalists. There is no glamor, and grant writing is an example of the classic aphorism, “Where there’s shit, there’s gold,” at work.

Grant writing is also challenging. Very few people have the weird intersection of skills necessary to be good, and it’s a decade-long process to build those skills—especially for people who aren’t good writers already. The field is perpetually mutating, with new RFPs appearing and old ones disappearing, so that we’re not competing with proposals written two years ago (where many novelists, for example, are in effect still competing with their peers from the 20s or 60s or 90s).

To return to journalism as a specific example, I can think of one situation in which I’d want The Atlantic or another big publisher to publish my work: if I was worried about being sued. Journalism is replete with stories about heroic reporters being threatened by entrenched interests; Watergate and the Pentagon Papers are the best-known examples, but even small-town papers turn up corruption in city hall and so forth. As centralized organizations decline, individuals are to some extent picking up the slack, but individuals are also more susceptible to legal and other threats. If you discovered something nasty about a major corporation and knew they’d tie up your life in legal bullshit for the next ten years, would you publish, or would you listen to your wife telling you to think of the kids, or your parents telling you to think about your career and future? Most of us are not martyrs. But it’s much harder for Mega Corp or Mega Individual to threaten The Atlantic and similar outlets.

The power and wealth of a big media company has its uses.

But such a use is definitely a niche case. I could imagine some of the bigger foundations, like ProPublica, offering a legal umbrella to bloggers and other muckrakers to mitigate such risks.

I have intentionally elided the question of what people are going to do if their industries turn towards hobbies. That’s for a couple reasons: as I said above, I don’t have a good solution. In addition, the parts of the economy I’m discussing here are pretty small, and small problems don’t necessarily need “solutions,” per se. People who want to turn their hours into a lot of income should try to find ways and skills to do that, and people who want to turn their hours into fun products like writing or movies should try to find ways to do that too. Crying over industry loss or change isn’t going to turn back the clock, and just because someone could make a career as a journalist doesn’t mean they can today.


* To some extent I’ve subsidized other people’s computers, because Macs hold their value surprisingly well and can be sold for a quarter to half of their original purchase price three to five years after they’ve been bought. Every computer replaced by my family or our business has been sold on Craigslist. It’s also possible, with a little knowledge and some online guides, to add RAM and an SSD to most computers made in the last couple of years, which will make them feel much more responsive.

Why little black books instead of phones and computers

“Despite being a denizen of the digital world, or maybe because he knew too well its isolating potential, Jobs was a strong believer in face-to-face meetings.” That’s from Walter Isaacson’s biography of Steve Jobs. It’s a strange way to begin a post about notebooks, but Jobs’ views on the power of a potentially anachronistic practice apply to other seemingly anachronistic practices. I’m a believer in notebooks, though I’m hardly a luddite and use a computer too much.

Notebooks have an immediate tactile advantage over phones: they aren’t connected to the Internet. A notebook is intimate in a way computers aren’t. A notebook has never interrupted me with a screen that says, “Wuz up?” Notebooks are easy to use without thinking. I know where I have everything I’ve written on-the-go over the last eight years: in the same stack. It’s easy to draw on paper. I don’t have to manage files and have yet to delete something important. The only way to “accidentally delete” something is to leave the notebook submerged in water.

A notebook is the written equivalent of a face-to-face meeting. It has no distractions, no pop-up icons, and no software upgrades. For a notebook, fewer features are better and fewer options are more. If you take a notebook out of your pocket to record an idea, you won’t see nude photos of your significant other. You’re going to see the page where you left off. Maybe you’ll see another idea that reminds you of the one you’re working on, and you’ll combine the two in a novel way. If you want to flip back to an earlier page, it’s easy.

The lack of editability is a feature, not a bug, and the notebook is an enigma of stopped time. Similar writing in a computer can function this way but doesn’t for me: the text is too open and too malleable. Which is wonderful in its own way, and that way opens many new possibilities. But those possibilities are different from the notebook’s. It’s become a cliche to argue that the technologies we use affect the thoughts we have and the way we express those thoughts, but despite being cliche the basic power of that observation remains. I have complete confidence that, unless I misplace them, I’ll still be able to read my notebooks in 20 years, regardless of changes in technology.

In Distrust That Particular Flavor, William Gibson says, “Once perfected, communication technologies rarely die out entirely; rather, they shrink to fit particular niches in the global info-structure.” The notebook’s niche is perfect. I don’t think it’s a coincidence that Moleskine racks have proliferated in stores at the same time everyone has acquired cell phones, laptops, and now tablets.

In The Shallows, Nicholas Carr says: “The intellectual ethic is the message that a medium or other tool transmits into the minds and culture of its users.” Cell phones subtly change our relationship with time. Notebooks subtly change our relationship with words and drawings. I’m not entirely sure how, and if I were struggling for tenure in industrial design or psychology I might start examining the relationship. For now, it’s enough to feel the relationship. Farhad Manjoo even cites someone who studies these things:

“The research shows that the type of content you produce is different whether you handwrite or type,” says Ken Hinckley, an interface expert at Microsoft Research who’s long studied pen-based electronic devices. “Typing tends to be for complete sentences and thoughts—you go deeper into each line of thought. Handwriting is for short phrases, for jotting ideas. It’s a different mode of thought for most people.” This makes intuitive sense: It’s why people like to brainstorm using whiteboards rather than Word documents.

I like to write in notebooks despite carrying around a smartphone. Some of this might be indicative of the technology I grew up with—would someone familiar with smartphone touchscreens from age seven have sufficiently dexterous fingers to be faster than they would be with paper?—but I think the obvious answer to “handwriting or computer?” is “both, depending.” As I write this sentence, I have a printout of a novel called ASKING ANNA in front of me, covered with blue pen, because editing on the printed page feels different to me than editing on the screen. I write long-form on computers, though. The plural of anecdote is not data. Still, I have to notice that using different mediums appears to improve the final work product (insert joke about low quality here).

There’s also a shallow and yet compelling reason to like notebooks: a disproportionate number of writers, artists, scientists, and thinkers like using them too, and I suspect that even contemporary writers, artists, scientists, and thinkers realize that sometimes silence and not being connected is useful, like quiet and solitude.

In “With the decline of the wristwatch, will time become just another app?”, Matthew Battles says:

Westerners have long been keenly interested in horology, as David Landes, an economic historian, points out in Revolution in Time, his landmark study of the development of timekeeping technology. It wasn’t the advent of clocks that forced us to fret over the hours; our obsession with time was fully in force when monks first began to say their matins, keeping track of the hours out of strict religious obligation. By the 18th century, secular time had acquired the pressure of routine that would rule its modern mode. Tristram Shandy’s father, waiting interminably for the birth of his son, bemoans the “computations of time” that segment life into “minutes, hours, weeks, and months” and despairs “of clocks (I wish there were not a clock in the kingdom).” Shandy’s father fretted that, by their constant tolling of the hours, clocks would overshadow the personal, innate sense of time—ever flexible, ever dependent upon mood and sociability.

The revolution in electronic technology is wonderful in many ways, but its downsides—distraction, most obviously—are present too. The notebook combats them. Notebooks are an organizing or disorganizing principle: organizing because one keeps one’s thoughts, but disorganizing because one cannot rearrange, tag, and structure thoughts in a notebook as one can on a screen (Devonthink Pro is impossible in the real world, and Scrivener can be done but only with a great deal of friction).

Once you try a notebook, you may realize that you’re a notebook person. You might realize it without trying. If you’re obsessed with this sort of thing, see Michael Loper / Rands’ Sweet Decay, which is better on validating why a notebook is important than evaluating the notebooks at hand. It was also written in 2008, before Rhodia updated its Webbie.

Like Rands, I’ve never had a sewn binding catastrophically fail. As a result, notebooks without sewn bindings are invisible to me. I find it telling that so many people are willing to write at length about their notebooks and use a nominally obsolete technology.

Once you decide that you like notebooks, you have to decide which one you want. I used to like Moleskines, until one broke, and I began reading other stories online about the highly variable quality level.

So I’ve begun ranging further afield.

I’ve tested about a dozen notebooks. Most haven’t been worth writing about. But by now I’ve found the best reasonably available notebooks, and I can say this: you probably don’t actually want a Guildhall Pocket Notebook, which is number two. You want a Rhodia Webnotebook.

Like many notebooks, the Guildhall starts off with promise: the pages do lie flat more easily than alternatives. Lines are closely spaced, maximizing writable area, which is important in an expensive notebook that shouldn’t be replaced frequently.

IMG_3900I like the Guildhall, but it’s too flimsy and has a binding that appears unlikely to withstand daily carry. Mine is already bending, and I haven’t even hauled it around that much. The Rhodia is somewhat stiffer. Its pages don’t lie flat quite as easily. The lines should go to the end of each page. But its great paper quality and durability advantage make it better than the alternatives.

The Rhodia is not perfect. The A7 version, which I like better than the 3.5 x 5.5 American version, is only available in Europe and Australia, which entails high shipping costs. The Webbie’s lines should stretch to the bottom of the page and be spaced slightly closer together. The name is stupid; perhaps it sounds better in French. The notebook’s cover extends slightly over its paper instead of aligning perfectly. Steve Jobs would demand perfect alignment. To return to Isaacson’s biography:

The connection between the design of a product, its essence, and its manufacturing was illustrated for Jobs and Ive when they were traveling in France and went into a kitchen supply store. Ive picked up a knife he admired, but then put it down in disappointment. Jobs did the same. ‘We both noticed the tiny bit of glue between the handle and the blade,’ Ive recalled. They talked about how the knife’s good design had been ruined by the way it was being manufactured. ‘We don’t like to think of our knives as being glued together,’ Ive said. ‘Steve and I care about things like that, which ruin the purity and detract from the essence of something like a utensil, and we think alike about how products should be made to look pure and seamless.’

I wish the Rhodia were that good. But the Rhodia’s virtues are more important than its flaws: the paper quality is the highest I’ve seen, and none of the Rhodias I’ve bought have broken. If anyone knows of a notebook that combines the Rhodia’s durability with the qualities it lacks, by all means send me an e-mail.


More on the subject: The Pocket Notebooks of 20 Famous Men.

EDIT: See also Keith Devlin’s The Death of Mathematics, which is about the allure of math by hand rather than by computer; though I don’t endorse what he says, in part because it reminds me so much of Socrates decrying the advent of written over oral culture, I find it stimulating.

Why you should become a nurse or physician assistant instead of a doctor: the underrated perils of medical school

Many if not most people who go to medical school are making a huge mistake—one they won’t realize they’ve made until it’s too late to undo.

So many medical students, residents, and doctors say they wish they could go back in time and tell themselves to do something—anything—else. Their stories are so similar that they’ve inspired me to explain, in detail, the underappreciated yet essential problems with medical school and residency. Potential doctors also don’t realize that becoming a nurse or physician assistant (PA) provides many of the job-security advantages of medical school without binding those who start it to at least a decade, and probably a lifetime, of finance-induced servitude.

The big reasons to be a doctor are a) lifetime earning potential, b) the limited number of doctors who are credentialed annually, which implies that doctors can restrict supply and thus will always have jobs available, c) higher perceived social status, and d) a desire to “help people” (there will be much more on the dubious value of that last one below).

These reasons come with numerous problems: a) it takes a long time for doctors to make that money, b) it’s almost impossible to gauge whether you’ll actually like a profession or the process of joining that profession until you’re already done, c) most people underestimate opportunity costs, and d) you have to be able to help yourself before you can help other people (and the culture of medicine and medical education is toxic).

Straight talk about doctors and money.

You’re reading this because you tell your friends and maybe yourself that you “want to help people,” but let’s start with the cash. Although many doctors will eventually make a lot of money, they take a long time to get there. Nurses can start making real salaries of around $50,000 when they’re 22. Doctors can’t start making real money until they’re at least 29, and often not until they’re much older.

Keep that in mind when you read the following numbers.

Student Doctor reports that family docs make about $130–$200K on average, which sounds high compared to what I’ve heard on the street (Student Doctor’s numbers also don’t discuss hours worked). The Bureau of Labor Statistics—a more reliable source—reports that primary care physicians make an average of $186,044 per year. Notice, however, that’s an average, and it doesn’t take overhead into account. Notice too that the table showing the BLS data indicates that more than 40% of doctors are in primary care specialties. Family and general practice doctors earn a median annual wage of $163,510.

Nurses, by contrast, make about $70K a year. They also have a lot of market power—especially skilled nurses who might otherwise be doctors. Christine Mackey-Ross describes these economic dynamics in “The New Face of Health Care: Why Nurses Are in Such High Demand.” Nurses are gaining market power because medical costs are rising and residency programs have a stranglehold on the doctor supply. More providers must come from somewhere. As we know from econ 101, when you limit supply in the face of rising demand, prices rise.

The limit on the number of doctors is pretty sweet if you’re already a doctor, because it means you have very little competition and, if you choose a sufficiently demanding specialty, you can make a lot of money. But it’s bad for the healthcare system as a whole because too many patients chase too few doctors. Consequently, the system is lurching in the direction of finding ways to provide healthcare at lower costs. Like, say, through nurses and PAs.

Those nurses and PAs are going to end up competing with primary care docs. Look at one example, from the New York Times’s “U.S. Moves to Cut Back Regulations on Hospitals”:

Under the proposals, issued with a view to “impending physician shortages,” it would be easier for hospitals to use “advanced practice nurse practitioners and physician assistants in lieu of higher-paid physicians.” This change alone “could provide immediate savings to hospitals,” the administration said.

Primary care docs are increasingly going to see pressure on their wages from nurse practitioners for as long as health care costs outstrip inflation. Consider “Yes, the P.A. Will See You Now”:

Ever since he was a hospital volunteer in high school, Adam Kelly was interested in a medical career. What he wasn’t interested in was the lifestyle attached to the M.D. degree. “I wanted to treat patients, but I wanted free time for myself, too,” he said. “I didn’t want to be 30 or 35 before I got on my feet — and then still have a lot of loans to pay back.”

To recap: nurses start making money when they’re 22, not 29, and they are eating into the market for primary care docs. Quality of care is a concern, but the evidence thus far shows no difference between nurse practitioners who act as primary-care providers and MDs who do.

Calls to lower doctor pay, like the one found in Matt Yglesias’s “We pay our doctors way too much,” are likely to grow louder. Note that I’m not taking a moral or economic stance on whether physician pay should be higher or lower: I’m arguing that the pressure on doctors’ pay is likely to increase because of fundamental forces in healthcare.

To belabor the point about money, The Atlantic recently published “The average female primary-care physician would have been financially better off becoming a physician assistant.” Notice: “Interestingly, while the PA field started out all male, the majority of graduates today are female. The PA training program is generally 2 years, shorter than that for doctors. Unsurprisingly, subsequent hourly earnings of PAs are lower than subsequent hourly earnings of doctors.”

Although the following sentence doesn’t use the word “opportunity costs,” it should: “Even though both male and female doctors both earn higher wages than their PA counterparts, most female doctors don’t work enough hours at those wages to financially justify the costs of becoming a doctor.” I’m not arguing that women shouldn’t become doctors. But I am arguing that women and men both underestimate the opportunity costs of med school. If they understood those costs, fewer would go.

Plus, if you get a nursing degree, you can still go to medical school (as long as you have the prerequisite courses; hell, you can major in English and go to med school as long as you take the biology, math, physics, and chemistry courses that med schools require). Apparently some medical schools will sniff at nurses who want to become doctors because of the nursing shortage and, I suspect, because med schools want to maintain a clear class / status hierarchy with doctors at top. Med schools are run by doctors invested in the doctor mystique. But the reality is simpler: medical schools want people with good MCAT scores and GPAs. Got a 4.0 and whatever a high MCAT score is? A med school will defect and take you.

One medical resident friend read a draft of this essay and simply said that she “didn’t realize that I was looking for nursing.” Or being a PA. She hated her third year of medical school, as most med students do, and got shafted in her residency—which she effectively can’t leave. Adam Kelly is right: more people should realize what “the lifestyle attached to an M.D. degree” means.

They should also understand “The Bullying Culture of Medical School” and residency, which is pervasive and pernicious—and it contributes to the relationship failures that notoriously plague the medical world. Yet med schools and residencies can get away with this because they have students and residents by the loans.

Why would my friend have realized that she wanted to be a nurse? Our culture doesn’t glorify nursing the way it does doctoring (except, maybe, on Halloween and in adult cinema). High academic achievers think being a doctor is the optimal road to success in the medical world. They see eye-popping surgeon salary numbers and rhetoric about helping people without realizing that nurses help people too, or that their desire to help people is likely to be pounded out of them by a cold, uncaring system that uses the rhetoric of helping to sucker undergrads into mortgaging their souls to student loans. Through the magic of student loans, schools are steadily siphoning off more of doctors’ lifetime earnings. Given constraints and barriers to entry into medicine, I suspect med schools and residencies will be able to continue doing so for the foreseeable future. The logical response for individuals is to exit the market, because they have so little control over it.

Sure, $160K/year probably sounds like a lot to a random 21-year-old college student, because it is, but after taking into account the time value of money, student loans for undergrad, student loans for med school, how much nurses make, and residents’ salaries, most doctors’ earnings probably fail to outstrip nurses’ earnings until well after the age of 40. Dollars per hour worked probably don’t outstrip nurses’ earnings until even later.
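To make the opportunity-cost point concrete, here’s a toy break-even calculation. Every number in it (salaries, debt, discount rate, ages) is an illustrative assumption, not data, and it ignores taxes, overhead, and hours worked, all of which push the doctor’s break-even later still:

```python
# Toy comparison: present value of cumulative earnings for a nurse who starts
# working at 22 vs. a doctor who earns nothing in med school (22-25), earns a
# resident's salary (26-28), then an attending's salary, minus school debt.
# All figures below are illustrative assumptions.

def cumulative_earnings(age, start_age, salary_by_age, debt=0, discount=0.03):
    """Present value (discounted back to age 22) of everything earned through `age`."""
    total = -debt
    for a in range(start_age, age + 1):
        total += salary_by_age(a) / (1 + discount) ** (a - 22)
    return total

nurse = lambda a: 70_000                          # flat nurse salary from age 22
doctor = lambda a: 55_000 if a < 29 else 180_000  # resident pay, then attending pay

# First age at which the doctor's cumulative earnings pass the nurse's:
breakeven = next(age for age in range(22, 70)
                 if cumulative_earnings(age, 26, doctor, debt=250_000)
                 > cumulative_earnings(age, 22, nurse))
```

With these toy numbers the break-even lands in the mid-30s on gross dollars alone; plug in per-hour earnings, taxes, and undergrad loans and it drifts much later, which is the essay’s point. Swap in your own assumptions; the shape of the curve matters more than the exact year.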

To some extent, you’re trading happiness, security, dignity, and your sex life in your 20s, and possibly early 30s, for a financial opportunity that might not pay off until your 50s.

Social status is nice, but not nearly as nice when you’re exhausted at 3 a.m. as a third-year, or exhausted at 3 a.m. as a first-year resident, or exhausted at 3 a.m. as a third-year resident and you’re 30 and you just want a quasi-normal life, damnit, and maybe some time to be an artist. Or when you’re exhausted at 3 a.m. as an attending on-call physician because the senior doctors at the HMO know how to stiff the newbies by forcing them to “pay their dues.”

This is where prospective medical students protest, “I’m not going to be a family medicine doc.” Which is okay: maybe you won’t be. Have fun in five or seven years of residency instead of three. But don’t confuse the salaries of superstar specialties like neurosurgery and cardiology with the average experience; more likely than not you’re average. There’s this social ideal of doctors being rich. Not all are, even with barriers to entry in place.

The underrated miseries of residency

As one resident friend said, “You can see why doctors turn into the kind of people they do.” He meant that the system itself lets patients abuse doctors, doctors abuse residents, and for people to generally treat each other not like people, but like cogs. At least nurses who discover they hate nursing can quit, since they will have a portable undergrad degree and won’t have obscene graduate school student loans. They can probably go back to school and get a second degree in twelve to twenty-four months. (Someone with a standard bachelor’s degree can probably enter nursing in the same time period.)

In normal jobs, a worker who learns about a better opportunity in another company or industry can pursue it. Students sufficiently dissatisfied with their university can transfer.[1] Many academic grad schools make quitting easy. Residencies don’t. The residency market is tightly controlled by residency programs that want to restrict residents’ autonomy—and thus their wages and bargaining power. Once you’re in a residency, it’s very hard to leave, and you can only do so at particular times, in the gap between residency years.

This is a recipe for exploitation; many of the labor battles during the first half of the twentieth century were fought to prevent employers from wielding this kind of power. For medical residents, however, employers have absolute power enshrined in law—though employers cloak their power in the specious word “education.”

Once a residency program has you, they can do almost anything they want to you, and you have little leverage. You don’t want to be in situations where you have no leverage, yet that’s precisely what happens the moment you enter the “match.”

Let’s explain the match, since almost no potential med students understand it. The match occurs in the second half of the fourth year of medical school. Students apply to residencies in the first half of their fourth year, interview at potential hospitals, and then list the residencies they’re interested in. Residency program directors then rank the students, and the National Residency Match Program “matches” students to programs using a hazily described algorithm.
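For what it’s worth, the core of that algorithm is public: it’s an applicant-proposing deferred-acceptance (Gale-Shapley) match, modified by Roth and Peranson to handle couples, quotas, and other constraints. A minimal sketch of the basic idea, with invented applicants and programs:

```python
# Sketch of applicant-proposing deferred acceptance, the core of the match.
# (The real NRMP algorithm, due to Roth and Peranson, also handles couples
# and other constraints.) All names and preference lists here are invented.

def match(applicant_prefs, program_prefs, quotas):
    # rank[p][a]: how highly program p ranks applicant a (lower is better)
    rank = {p: {a: i for i, a in enumerate(prefs)}
            for p, prefs in program_prefs.items()}
    next_choice = {a: 0 for a in applicant_prefs}   # next program to propose to
    held = {p: [] for p in program_prefs}           # tentative acceptances
    free = list(applicant_prefs)                    # applicants still unmatched

    while free:
        a = free.pop()
        if next_choice[a] >= len(applicant_prefs[a]):
            continue                                # list exhausted: unmatched
        p = applicant_prefs[a][next_choice[a]]
        next_choice[a] += 1
        if a not in rank[p]:                        # program didn't rank them
            free.append(a)
            continue
        held[p].append(a)
        if len(held[p]) > quotas[p]:                # over quota: bump the
            held[p].sort(key=lambda x: rank[p][x])  # least-preferred applicant
            free.append(held[p].pop())

    return {p: sorted(a_s, key=lambda x: rank[p][x]) for p, a_s in held.items()}

residents = match(
    {"ann": ["mercy", "city"], "bob": ["city", "mercy"], "cal": ["city", "mercy"]},
    {"mercy": ["cal", "bob", "ann"], "city": ["ann", "cal", "bob"]},
    {"mercy": 1, "city": 1},
)
# residents: mercy gets cal, city gets ann; bob goes unmatched
```

The applicant-proposing direction is the NRMP’s main defense of the system: among stable matches, it’s the best one for applicants. None of which, as we’ll see, gives an individual resident any power to negotiate once matched.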

Students are then obligated to attend that residency program. They can’t privately negotiate with other programs, as students can for, say, undergrad admissions, or med school admissions—or almost any other normal employment situation. Let me repeat and bold: Residents can’t negotiate. They can’t say, “How about another five grand?” or “Can I modify my contract to give me fewer days?” If a resident refuses to accept her “match,” then she’s blackballed from re-entering for the next three years.

Residency programs have formed a cartel designed to control cost and reduce employee autonomy, and hence salaries. I only went to law school for a year, by accident, but even I know enough law and history to recognize a very clear situation of the sort that anti-trust laws are supposed to address in order to protect workers. When my friend entered the match process like a mouse into a snake’s mouth, I became curious, because the system’s cruelty, exploitation, and unfairness to residents is an obvious example of employers banding together to harm employees. Lawyers often get a bad rap—sometimes for good reasons—but the match looked ripe for lawyers to me.

It turns out that I’m not a legal genius and that real lawyers have noticed this obvious anti-trust violation; an anti-trust lawsuit was filed in the early 2000s. Read about it in the NYTimes, including a grimly hilarious line about how “The defendants say the Match is intended to help students and performs a valuable service.” Ha! A valuable service to employers, since employees effectively can’t quit or negotiate with individual employers. Curtailing employee power by distorting markets is a valuable service. The article also notes regulatory capture:

Meanwhile, the medical establishment, growing increasingly concerned about the legal fees and the potential liability for hundreds of millions of dollars in damages, turned to Congress for help. They hired lobbyists to request legislation that would exempt the residency program from the accusations. A rider, sponsored by Senators Edward M. Kennedy, Democrat of Massachusetts, and Judd Gregg, Republican of New Hampshire, was attached to a pension act, which President Bush signed into law in April.

In other words, employers bought Congress and President Bush in order to screw residents.[2] If you attend med school, you’re agreeing to be screwed for three to eight years after you’ve incurred hundreds of thousands of dollars of debt, and you have few if any legal rights to attack the exploitative system you’ve entered.

(One question I have for knowledgeable readers: do you know of any comprehensive discussion of residents and unions? Residents can apparently unionize—which, if I were a medical resident, would be my first order of business—but the only extended treatment of the issue I’ve found so far is here, which deals with a single institution. Given how poorly residents are treated, I’m surprised there haven’t been more unionization efforts, especially in union-friendly, resident-heavy states like California and New York. One reason might be simple: people fear being blackballed at their ultimate jobs, and a lot of residents seem to have Stockholm Syndrome.)

Self-interested residency program directors will no doubt argue that residency is set up the way it is because the residency experience is educational. So will doctors. Doctors argue for residency being essential because they have a stake in the process. Residency directors and other administrators make money off residents who work longer hours and don’t have alternatives. We shouldn’t be surprised that they seek other legal means of restricting competition—so much of the fight around medicine isn’t about patient care; it’s about regulatory environments and legislative initiatives. For one recent but very small example of the problems, see “When the Nurse Wants to Be Called ‘Doctor’,” concerning nursing doctorates.

I don’t buy their arguments, and not only for ad hominem reasons. The education at many residency programs is tenuous at best. One friend, for example, is in a program that requires residents to attend “conference,” where residents are supposed to learn. But “conference” usually degenerates into someone nattering while most of the residents read or check their phones. Conference is mandatory, regardless of its utility. Residents aren’t 10-year-olds, yet they’re treated as such.

These problems are well-known (“What other profession routinely kicks out a third of its seasoned work force and replaces it with brand new interns every year?”). But there’s no political impetus to act: doctors like limiting their competition, and people are still fighting to get into medical school.

Soldiers usually make four-year commitments to the military. Even ROTC only demands a four- to five-year commitment after college graduation—at which point officers can choose to quit and do something else. Medicine is, in effect, at least a ten-year commitment: four of medical school, at least three of residency, and at least another three to pay off med school loans. At which point a smiling twenty-two-year-old graduate will be a glum thirty-two-year-old doctor who doesn’t entirely get how she got to be a doctor anyway, and might tell her earlier self the things that earlier self didn’t know.

Contrast this experience with nursing, which requires only a four-year degree, or PAs, who have two to three years of additional school. As John Goodman points out in “Why Not A Nurse?”, nursing is much less heavily or uniformly regulated than doctoring. Nurses can move to Oregon:

Take JoEllen Wynne. When she lived in Oregon, she had her own practice. As a nurse practitioner, she could draw blood, prescribe medication (including narcotics) and even admit patients to the hospital. She operated like a primary care physician and without any supervision from a doctor. But, JoEllen moved to Texas to be closer to family in 2006. She says, “I would have loved to open a practice here, but due to the restrictions, it is difficult to even volunteer.” She now works as an advocate at the American Academy of Nurse Practitioners.

and, based on the article, avoid Texas. Over time, we’ll see more articles like “Why Nurses Need More Authority: Allowing nurses to act as primary-care providers will increase coverage and lower health-care costs. So why is there so much opposition from physicians?” Doctors will oppose this, because it’s in their economic self-interest to avoid more competition.

The next problem with becoming a doctor involves what economists call “information asymmetry.” Most undergraduates making life choices don’t realize the economic problems I’ve described above, let alone some of the other problems I’m going to describe here. When I lay out the facts about becoming a doctor to my freshman writing students, many of those who want to be doctors look at me suspiciously, like I’m offering them a miracle weight-loss drug or have grown horns and a tail.

“No,” I can see them thinking, “this can’t be true because it contradicts so much of what I’ve been implicitly told by society.” They don’t want to believe. Which is great—right up to the point they have to live their lives and see how those lives are being shaped by forces that no one told them about. Just like no one told them about opportunity costs or what residencies are really like.

Medical students and doctors have complained to me about how no one told them how bad it is. No one really told them, that is. I’m not sure how much of this I should believe, but, at the very least, if you’re reading this essay you’ve been told. I suspect a lot of now-doctors were told or had an inkling of what it’s really like, but they failed to imagine the nasty reality of 24- or 30-hour call.

They, like most people, ignore information that conflicts with their current belief system about the glamor of medicine to avoid cognitive dissonance (as we all do: this is part of what Jonathan Haidt points out in The Righteous Mind, as does Daniel Kahneman in Thinking, Fast and Slow). Many now-doctors, even if they were aware, probably ignored that awareness and now complain—in other words, even if they had better information, they’d have ignored it and continued on their current path. They pay attention to status and money instead of happiness.

For example, Penelope Trunk cites Daniel Gilbert’s Stumbling on Happiness and says:

Unfortunately, people are not good at picking a job that will make them happy. Gilbert found that people are ill equipped to imagine what their life would be like in a given job, and the advice they get from other people is bad, (typified by some version of “You should do what I did.”)

Let’s examine some other vital takeaways from Stumbling on Happiness: [3]

* Making more than about $40,000/year does little to improve happiness (this should probably be greater in, say, NYC, but the main point stands: people think money and happiness show a linear correlation when they really don’t).

* Most people value friends, family, and social connections more than additional money, at least once their income reaches about $40K/year. If you’re trading time with friends and family for money, or, worse, for commuting, you’re making a tremendous, doctor-like mistake.

* Your sex life probably matters more than your job, and many people mis-optimize in this area. I’ve heard many residents and med students say they’re too busy to develop relationships or have sex with their significant others, if they manage to retain one or more, and this probably makes them really miserable.

* Making your work meaningful is important.

Attend med school without reading Gilbert at your own peril. No one in high school or college warns you of the dangers of seeking jobs that harm your sex life, because high schools are too busy trying to convince you not to have one. So I’m going to issue the warning: if you take a job that makes you too tired to have sex or too tired to engage in contemporary mate-seeking behaviors, you’re probably making a mistake.

The sex-life issue might be overblown, because people who really want to have one find a way to have one; some med students and residents are just offering the kinds of generic romantic complaints that everyone stupidly offers, and which mean nothing more than discussion about the weather. You can tell what a person really wants by observing what they do, rather than what they say.

But med students and residents have shown enough agony over trade-offs and time costs to make me believe that med school does generate a genuine pall over romantic lives. There is a correlation-is-not-causation problem—maybe med school attracts the romantically inept—but I’m willing to assume for now that it doesn’t.

The title of Trunk’s post is “How much money do you need to be happy? Hint: Your sex life matters more.” If you’re in an industry that consistently makes you too tired for sex, you’re doing things wrong and need to re-prioritize. Nurses can work three twelves a week, or thirty-six total hours, and be okay. But, as described above, being a doctor doesn’t let employees re-prioritize.

Proto-doctors screw up their 20s and 30s, sexually speaking, because they’ve committed to a job that’s so cruel to its occupants that, if doctors were equally cruel to patients, those doctors would be sued for malpractice. And the student loans mean that med students effectively can’t quit. They’ve traded sex for money and gotten a raw deal. They’ll be surrounded by people who are miserable and uptight—and who have also mis-prioritized.

You probably also don’t realize how ill-equipped you are to imagine what your life would be like as a doctor, because a lot of doctors sugarcoat their jobs, or because you don’t know any actual doctors. So you extrapolate from people who say, “That’s great” when you say you want to be a doctor. If you say you’re going to stay upwind and see what happens, they don’t say, “That’s great,” because they simply think you’re another flaky college student. But saying “I want to go to med school” or “I want to go to law school” isn’t a good way to seem level-headed (though I took the latter route; fortunately, I had the foresight to quit). Those routes, if they once led to relative success and happiness, don’t any more, at least for most people, who can’t imagine what life is like on the other end of the process. With law, at least the process is three years, not seven or more.

No one tells you this because there’s still a social and cultural meme about how smart doctors are. Some are. Lots more are very good memorizers and otherwise a bit dull. And you know what? That’s okay. Average doctors seeing average patients for average complaints are fixing routine problems. They’re directing traffic when it comes to problems they can’t solve. Medicine doesn’t select for being well-rounded, innovative, or interesting; if anything, it selects against those traits through its relentless focus on test scores, which don’t appear to correlate strongly with being interesting or intellectual.

Doctors aren’t necessarily associating with the great minds of your generation by going to medical school. Doctors may not even really be associating with great minds. They might just be associating with excellent memorizers. I didn’t realize this until I met lots of doctors, took repeated stabs at real conversations with them, and eventually realized that many aren’t intellectually curious and imaginative. There are, of course, plenty of smart, intellectually curious doctors, but given the meme about the intelligence of doctors, there are fewer than imagined and plenty who see themselves as skilled technicians and little more.

A lot of doctors are the smartest stupid people you’ve met. Smart, because they’ve survived the academic grind. Stupid, because they signed up for med school, which is effectively signing away extraordinarily valuable options. Life isn’t a videogame. There is no reset button, no do-over. Once your 20s are gone, they’re gone forever.

Maybe your 20s are supposed to be confusing. Although I’m still in that decade, I’m inclined to believe this idea. Medical school offers a trade-off: your professional life isn’t confusing and you have a clear path to a job and paycheck. If you take that path, your main job is to jump through hoops. But the path and the hoops offer clarity of professional purpose at great cost in terms of hours worked, debt assumed, and, perhaps worst of all, flexibility. Many doctors would be better off with the standard confusion, but those doctors take the clear, well-lit path out of fear—which is the same thing that drives so many bright but unfocused liberal-arts grads into law schools.

I’ve already mentioned prestige and money as two big reasons people go to med school. Here’s another: fear of the unknown. Bright students start med school because it’s a clearly defined, well-lit path. Such paths are becoming increasingly crowded. Uncertainty is scary. You can fight the crowd, or you can find another way. Most people are scared of the other way. They shouldn’t be, and they wouldn’t be if they knew what graduate school paths are like.

For yet another perspective on the issue of not going to med school, see Ali Binazir’s “Why you should not go to medical school — a gleefully biased rant,” which has more than 200 comments as of this writing. Binazir correctly says there’s only one thing that should drive you to med school: “You have only ever envisioned yourself as a doctor and can only derive professional fulfillment in life by taking care of sick people.”

If you can only derive professional fulfillment in life by taking care of sick people, however, you should remember that you can do so by being a nurse or a physician assistant. And notice the words Binazir chooses: he doesn’t say, “help people”—he says “taking care of sick people.” The path from this feeling to actually taking care of sick people is a long, miserable one. And you should work hard at envisioning yourself as something else before you sign up for med school.

You can help people in all kinds of ways; the most obvious ones are by having specialized, very unusual skills that lots of people value. Alternately, think of a scientist like Norman Borlaug (I only know about him through Tyler Cowen’s book The Great Stagnation; in it, Cowen also observes that “When it comes to motivating human beings, status often matters at least as much as money.” I suspect that a lot of people going to medical school are really doing it for the status).

Borlaug saved millions of lives through developing hardier seeds and through other work as an agronomist. I don’t want to say something overwrought and possibly wrong like, “Borlaug has done more to help people than the vast majority of doctors,” since that raises all kinds of questions about what “more” and “help” and “vast majority” mean, but it’s fair to use him as an example of how to help people outside of being a doctor. Programmers, too, write software that can be instantly disseminated to billions of people, and yet those who want to “help” seldom think of programming as a helping profession, even though it is.

For a lot of the people who say they want to be a doctor so they can help people, greater intellectual honesty would lead them to acknowledge mixed motives, of which helping people is only one, and perhaps not the most powerful. On the other hand, if you really want to spend your professional life taking care of sick people, Binazir is right. But I’m not sure you can really know that before making the decision to go to medical school, and, worse, even if all you want to do is take care of sick people, you’re going to find a system stacked against you in that respect.

You’re not taking the best care of people at 3 a.m. on a 12- to 24-hour shift in which your supervisors have been screaming at you and your program has been jerking your schedule around like a marionette all month, leaving your sleep schedule out of whack. Yeah, someone has to do it, but it doesn’t have to be you, and if fewer people were struggling to become doctors, the system itself would have to change to entice more people into medical school.

One other, minor point: you should get an MD and maybe a PhD if you really, really want to do medical research. But that’s a really hard thing for an 18 – 22 year old to know, and most doctors aren’t researchers. Nonetheless, nurses (usually) aren’t involved in the same kind of research as research MDs. I don’t think this point changes the main thrust of my argument. Superstar researchers are tremendously valuable. If you think you’ve got the tenacity and curiosity and skills to be a superstar researcher, this essay doesn’t apply to you.

Very few people will tell you this, or tell you even if you ask; Paul Graham even writes about a doctor friend in his essay “How to Do What You Love”:

A friend of mine who is a quite successful doctor complains constantly about her job. When people applying to medical school ask her for advice, she wants to shake them and yell “Don’t do it!” (But she never does.) How did she get into this fix? In high school she already wanted to be a doctor. And she is so ambitious and determined that she overcame every obstacle along the way—including, unfortunately, not liking it.

Now she has a life chosen for her by a high-school kid.

When you’re young, you’re given the impression that you’ll get enough information to make each choice before you need to make it. But this is certainly not so with work. When you’re deciding what to do, you have to operate on ridiculously incomplete information. Even in college you get little idea what various types of work are like. At best you may have a couple internships, but not all jobs offer internships, and those that do don’t teach you much more about the work than being a batboy teaches you about playing baseball.

Having a life chosen for you by a 19-year-old college student or 23-year-old wondering what to do is only marginally better.

I’m not the first person to notice that people don’t always understand what they’ll be like when they’re older; in “Aged Wisdom,” Robin Hanson says:

You might look inside yourself and think you know yourself, but over many decades you can change in ways you won’t see ahead of time. Don’t assume you know who you will become. This applies all the more to folks around you. You may know who they are now, but not who they will become.

This doesn’t surprise me anymore. Now I acknowledge that I’m very unlikely to be able to gauge what I’ll want in the future.

Contemplate, too, the psychological makeup of many med students. They’re good rule-followers and test-takers; they tend to be very good on tracks but perhaps not so good off them. Prestige matters a great deal to them, as does listening to one’s elders (who may or may not understand the fundamental ways the world is changing). They may find the real world large and scary, while the academic world is small, highly directed, and sufficiently confined to prevent intellectual or monetary agoraphobia.

These issues are addressed well in two books: Excellent Sheep by William Deresiewicz and Zero to One by Peter Thiel and Blake Masters. I won’t endorse everything in either book, but pay special attention to their discussions of the psychology of elite students and especially the weaknesses that tend to appear in that psychology.

It is not easy for anyone to accept criticism, but that may be particularly true of potential med students, who have been endlessly told how “smart” they are, or supposedly are. Being smart in the sense of passing classes and acing tests won’t necessarily lead you toward the right life, and, moreover, graduate schools and consulting firms have evolved to prey on your need for accomplishment, positive feedback, and clear metrics. You are the food they need to swallow and digest. Think long and hard about that.

If you don’t want to read Excellent Sheep and Zero to One, or think you’re “too busy,” I’m going to marvel: you’re willing to commit hundreds of thousands of dollars and years of your life to a field that you’re not willing to spend $30 and half a day to understand better? That’s a dangerous yet astonishingly common level of willful ignorance.

Another friend asked what I wanted to accomplish with this essay. The small answer: help people understand things they didn’t understand before. The larger answer—something like “change medical education”—isn’t very plausible because the forces encouraging people to be doctors are so much larger than me. The power of delusion and prestige is so vast that I doubt I can make a difference through writing alone. Almost no writer can: the best one can hope for is changes at the margin over time.

Some med school stakeholders are starting to recognize the issues discussed in this essay: for example, The New York Times has reported that New York University’s med school may be able to shorten its duration from four years to three, and “Administrators at N.Y.U. say they can make the change without compromising quality, by eliminating redundancies in their science curriculum, getting students into clinical training more quickly and adding some extra class time in the summer.” This may be a short-lived effort. But it may also be an indicator that word about the perils of med school is spreading.

I don’t expect this essay to have much impact. It would require people to a) find it, which most probably won’t do, b) read it, which most probably won’t do, c) understand it, which most of those who read it won’t or can’t do, and d) implement it. Most people don’t seem to give their own futures much real consideration. I know a staggering number of people who go to law or med or b-school because it “seems like a good idea.” Never mind the problem with following obvious paths, or the question of opportunity costs, or the difficulty in knowing what life is like on the other side.

People just don’t think that far ahead. I’m already imagining people on the Internet who are thinking about going to med school, see the length of this essay, and decide it’s not worth reading—as if they’d rather spend a decade of their lives learning the hard way what they could read in an hour. They just don’t understand the low quality of life medicine entails for many if not most doctors.

Despite the above, I will make one positive point about med school: if you go, if you jump through all the hoops, if you make it to the other side, you will have a remunerative job for life, as long as you don’t do anything grossly awful. Job demand and pay are important. Law school doesn’t offer either anymore. Many forms of academic grad schools are cruel pyramid schemes propagated by professors and universities. But medicine does in fact have a robust job market on the far end. That is a real consideration. You’re still probably better off being a nurse or PA—nurses are so in-demand that nursing schools can’t grow fast enough, at least as of 2015—but I don’t want to pretend that the job security of being a doctor doesn’t exist.

I’m not telling you what to do. I rarely tell anyone what to do. I’m describing trade-offs and asking if you understand them. It appears that few people do. Have you read this essay carefully? If not, read it again. Then at least you won’t be one of the many doctors who hate what you do, warn others about how doctors are sick of their profession, and wish you’d been wise enough to take a different course.

If you enjoyed this essay, you should also read my novel, Asking Anna. It’s a lot of fun for not a lot of money!


[0] Here’s another anti-doctor article: “Why I Gave Up Practicing Medicine.” Scott Alexander’s “Medicine As Not Seen On TV” is also good. The anti-med-school lit is available to those who seek it, though most potential med students don’t seem to. Read the literature and understand the perils. If, after reading, you still want to go, great.

See too the intelligent commenter ktswan, who qualifies the rest of the essay. She went from nursing to med school and writes, “I am much happier in medicine than lots of my colleagues, I think in many ways because I knew exactly what I was getting into, what I was sacrificing, and what I wanted to gain from it.”

[1] One could argue that many of the problems in American K – 12 education stem from a captive audience whose presence or absence in a school is based on geography and geographical accidents rather than the school’s merit.

[2] You can read more about the match lawsuit here. Europe doesn’t have a match-like system; there, the equivalent of medical residency is much more like a job.

[3] Stumbling on Happiness did more to change my life and perspective than almost any other book. I’ve read thousands of books. Maybe tens of thousands. Yet this one influences my day-to-day decisions and practices by clarifying how a lot of what people say they value they don’t, and how a lot of us make poor life choices based on perceived status that end up screwing us. Which is another way of saying we end up screwing ourselves. Which is what a lot of medical students, doctors, and residents have done. No one holds the proverbial gun to your head and orders you into med school (unless you have exceptionally fanatical parents). When you’re living your life, at least in industrialized Western countries, you mostly have yourself to blame for poor choices made without enough knowledge to choose rightly.

Thanks to Derek Huang, Catherine Fiona MacPherson, and Bess Stillman for reading this essay.

What you should know BEFORE you start grad school / PhD programs in English Literature: The economic, financial, and opportunity costs

This post started life as an e-mail to a high school teacher who is thinking about grad school in English Lit. I expanded and cleaned it up slightly for the blog, but the substance remains.

Pleasure meeting you the other day. I’m all too well-versed in the anti-grad-school lit, and the short version of this e-mail is “don’t go to grad school in the humanities.” If you go anyway, make sure you have an obvious fallback career; don’t assume that you’ll figure it out after five to ten years. Grad school is not a good place to pointlessly delay adulthood (a phrase we’ll come back to later).

Let me start with Thomas Benton’s articles, like “The Big Lie About the ‘Life of the Mind’” and “Graduate School in the Humanities: Just Don’t Go” in the Chronicle of Higher Education. Read both. Read both twice. Then read Louis Menand’s The Marketplace of Ideas, and pay special attention to the sections where he discusses supply and demand: I get the sense that a lot of people spend more time deeply, critically thinking about fun restaurants for dinner tonight than whether grad school is really a good idea. I’m not saying you’re one of those people, but the number of would-be researchers who do almost no research in evaluating their grad school decisions is astounding. Menand’s basic point is simple: most people in English PhD programs are not going to be researchers and tenure-track professors at universities. [1] Some number will, but that number is tiny.

Don’t put too much stock in stories like “From Graduate School to Welfare: The Ph.D. Now Comes With Food Stamps,” but they’re being told and repeated for reasons. People like the woman featured have made spectacularly bad life choices, and, while she’s an extreme example, many would-be professors eventually curse themselves for starting grad school. If I didn’t have a second job working for a real business for real money, I’d probably be close to qualifying for food stamps (without that real job, however, I wouldn’t have made it this far in grad school, because it’s almost impossible to live a reasonably normal life on $13,000 – $16,000 per year).

I know grad students who can’t get a $7 sandwich at Paradise Bakery because it’ll blow their food budget for the month. They have to bring lunch to campus every day because they can’t afford not to. Tired in the morning? Tough: make your bean-sprout sandwich or your lentil curry. Personally I like bean-sprout sandwiches and lentil curry, but I also like the option of buying lunch on a whim. Not having any money also sucks if you need or want a book, can’t get it easily or expeditiously from the library, and find yourself unable to buy it for $30. Someone who has four years of undergrad and two or more years of grad school behind them should be able to buy a sandwich without carefully weighing the financial repercussions.

Consider what you’ve got right now, today. You’re a teacher, so I’ll guess you make ~$30,000 – $40,000 a year. Call it $35,000. If you spend five years getting a PhD on a ~$15,000 stipend, you’ll come out at least $100,000 short of what you’d have made teaching high school ($35,000 × 5 = $175,000 in foregone salary, minus $15,000 × 5 = $75,000 in stipends). And that’s not taking into account the raises you might get as a teacher, or the benefits, which can be substantial (especially if you’re on a 30-year retirement track). If you take 10 years, like the median PhD student, you’ll be giving up $200,000, again not counting benefits, which are far better for a teacher than for a grad student. Accounting for retirement benefits, you might be giving up more like $300,000. A lot of money, no?
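The arithmetic can be sketched in a few lines; all figures are the essay’s rough assumptions, not exact data:

```python
# Back-of-the-envelope opportunity cost of a humanities PhD,
# using the essay's assumed figures (illustrative only).
TEACHER_SALARY = 35_000  # assumed high-school teaching salary per year
GRAD_STIPEND = 15_000    # assumed grad-student stipend per year

def foregone_income(years):
    """Income given up by taking the stipend instead of the salary."""
    return (TEACHER_SALARY - GRAD_STIPEND) * years

print(foregone_income(5))   # 100000 for a five-year PhD
print(foregone_income(10))  # 200000 for a ten-year PhD
```

Raises, benefits, and retirement contributions would all widen the gap further, which is why the essay’s headline numbers are conservative.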

If you get a tenure-track job, you could conceivably make up that amount over the course of your lifetime, but, remember, you’re not even likely to get such a job. I’ve asked the University of Arizona’s tenure-track but not-yet-tenured faculty gauche money questions, and they report making about $50,000 a year—and U of A is a plum, super-competitive job straight out of grad school. It’s certainly possible to make less and work more. You can do the math on how long you’ll have to work to make up the income foregone during grad school. It’s ugly.
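That “ugly” math can be made concrete with a minimal break-even sketch, using the essay’s figures and ignoring raises, benefits, and the time value of money:

```python
# Years of tenure-track pay needed to recoup income foregone in grad
# school; all figures are the essay's rough assumptions.
TT_SALARY = 50_000       # reported early-career tenure-track salary
TEACHER_SALARY = 35_000  # assumed high-school teaching salary
FOREGONE = 100_000       # five-year PhD shortfall, per the essay

annual_advantage = TT_SALARY - TEACHER_SALARY   # $15,000 per year
years_to_break_even = FOREGONE / annual_advantage
print(round(years_to_break_even, 1))  # 6.7 years in the best case
```

And that best case assumes you land a competitive tenure-track job at all; an adjunct or lecturer salary can make the break-even point never arrive.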

If you don’t get a tenure-track job, you may wish very deeply for those couple hundred thousand extra dollars. These are loose numbers, but no one I’ve floated them to has disputed them; I’d guess that making them more precise by counting opportunity and investment costs would only tilt them further toward teaching, given how much of a teacher’s lifetime income is backloaded into retirement pay.

So who’s grad school good for? Again, let’s follow the money, and I’ll use the University of Arizona as an example because that’s where I am. The out-of-state credit-hour fee for undergrads in Spring 2012 was $1,024; for in-state students it was $651. About a quarter of Arizona undergrads come from out of state. Grad students teach about 50 freshmen per semester, or about 100 per year. That’s $48,825 in in-state tuition collected and $25,600 in out-of-state tuition—per credit hour, and each course is three credit hours. Triple those numbers: $76,800 from out-of-state students and $146,475 from in-state students, for a total of $223,275. Some of that money goes to profs who run grad seminars, to facilities, to various other administrative functions, and so on, and grad students also get a couple of one-semester, one-class tuition waivers. But the basic calculation shows why the university as a whole likes grad students, a lot.
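The revenue arithmetic above, as a sketch (the per-credit-hour fees are the quoted Spring 2012 figures; the class sizes and out-of-state share are the essay’s rough assumptions):

```python
# Rough tuition revenue attributable to one grad student's teaching,
# using the University of Arizona Spring 2012 figures from the essay.
IN_STATE_FEE = 651         # dollars per credit hour, in-state
OUT_OF_STATE_FEE = 1_024   # dollars per credit hour, out-of-state
STUDENTS_PER_YEAR = 100    # freshmen taught per grad student per year
OUT_OF_STATE_SHARE = 0.25  # roughly a quarter come from out of state
CREDIT_HOURS = 3           # each course is three credit hours

out_of_state = STUDENTS_PER_YEAR * OUT_OF_STATE_SHARE * OUT_OF_STATE_FEE * CREDIT_HOURS
in_state = STUDENTS_PER_YEAR * (1 - OUT_OF_STATE_SHARE) * IN_STATE_FEE * CREDIT_HOURS
print(out_of_state, in_state, out_of_state + in_state)
# 76800.0 146475.0 223275.0
```

Set the total against a ~$15,000 stipend and the institutional incentive is obvious.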

Most universities love ABDs, who consume minimal university resources. Menand says:

One pressure on universities to reduce radically the time to degree is simple humanitarianism. Lives are warped because of the length and uncertainty of the doctoral education process. Many people drop in and drop out and then drop in again; a large proportion of students never finish; and some people have to retool at relatively advanced ages. Put in less personal terms, there is a huge social inefficiency in taking people of high intelligence and devoting resources to training them in programs that half will never complete for jobs that most will not get. Unfortunately, there is an institutional efficiency, which is that graduate students constitute a cheap labor force. There are not even search costs involved in appointing a graduate student to teach. The system works well from the institutional point of view not when it is producing PhDs, but when it is producing ABDs […] The longer students remain in graduate school, the more people are available to staff undergraduate classes. Of course, the overproduction of PhDs also creates a buyer’s advantage in the market for academic labor.

There’s little incentive for universities to speed up the grad school process. If anything, their financial incentive is to slow it further, and this is what we see. Regardless of their marketing, remember that universities are businesses, and businesses prefer to pay less for labor, not more, just as you probably prefer to pay less for goods and services, not more. Many articles decry the state of the adjunct labor force, but universities treat adjuncts like they do because they can. Supply and demand exist and they matter.

Most people I know who aren’t in grad school and talk about going discuss the life of the mind, the transformative power of education, how they want to be a professor, their interest in teaching, their love of research and so forth. Most people I know who are in grad school talk about finances, economics, and the job market. Not all the time, to be sure, and I’ve had some lovely conversations about The Professor of Desire and Billy Collins and Heart of Darkness. But jobs and money are on almost everyone’s mind, especially as peers from high school or college are getting jobs at Google, or finishing their residencies, or getting promoted enough to discuss their “401(k),” which is a sure sign of aging, along with in-depth real estate analysis—remember back when we only talked about sex and art? Neither do I.

Many grad students remain in a state of financial adolescence for a decade of their prime career-building years. Don’t do that. Become an adult: you’ll have to eventually, and the skills you build outside the academy are often more valuable than those you might build in humanities grad schools.

Some grad students complain about being financially exploited by universities, but it’s hard to exploit highly educated people who have terrific reading and writing skills and who should know better, or at least do some cursory research before they spend as long as a decade getting a degree. The anti-grad school literature is vast—and highly accessible: type “Why shouldn’t I go to grad school?” into a search engine. Spend a few hours with the results.

People who aren’t in grad school, along with people who are professors and have jobs, also talk about wanting to be involved with “the Conversation” (I capitalize “Conversation” in my head), which means the book chat that happens in peer-reviewed journals and books about writers and ideas. But if you want to contribute to the Conversation, get a blog from http://www.wordpress.com or http://www.substack.com and start producing valuable work. Comment on the work of other book people. Write about what you notice. This won’t get you tenure, and it will probably not get you read by other professors, but, if you’re any good, you will probably have more readers than the average literary journal. See “No One Really Reads Academic Papers” and “The Research Bust.” In writing a blog no one has heard of, I’ve had greater impact and reach than the published work of 98% of tenured humanities professors. The paucity of most humanities professors’ intellectual ambition is astounding, when you really think about it.

To be sure, some people succeed in grad school. Maybe I’ll be one, although this looks increasingly less likely. A PhD is not a lottery ticket, but it can start to feel like one. If you do go, you better know the odds and know the costs, financial and otherwise. You better know that there are very, very few tenure track jobs, though there are a lot of one-year gigs at random places that are happy to offer you not very much money for not very good job security. The system is rigged against you. Humanities academics are often very interested in talking about all kinds of exploitation, but they very rarely want to talk about the exploitation that happens in grad school itself. Play games you’re likely to win, not games you’re likely to lose. Choose status ladders to climb that matter, not ones that mattered 50 years ago.

Too many people—maybe most—enter grad school so they can pointlessly delay adulthood. Adulthood, however, arrives sooner or later anyway. Too many people enter grad school because they’ve succeeded by conventional academic metrics and hoop-jumping through most of their lives and find the big, amorphous real world terrifying. But grad school, if it was ever a good way of avoiding the real world, surely isn’t now, because the real world is a far harsher place when you’re 32 and have a degree of dubious value and are trying to cobble gigs together to pay rent. See again the link above concerning PhDs on food stamps.

There are also dangers that are rarely discussed. In humanities PhD programs, dissertation advisors and committee members may be distant or unhelpful. Outright theft of work is rare, but indifference is common. A single person can block or stall an individual’s progress in a way that’s rare in normal jobs. A committee can offer no feedback, or only positive feedback, and then reject a dissertation anyway. A sudden retirement, departure, or sabbatical can imperil years of a candidate’s work. You don’t want to be in a situation where a single person can annihilate your career, but that’s what grad school in the humanities often means.

I don’t know anyone in the business who is really gung-ho about encouraging smart, motivated undergrads and recent graduates to go to humanities grad programs.

In addition, if you don’t thoroughly read everything I’ve linked to in this post, you shouldn’t go to grad school because you haven’t invested enough time in thinking about and learning about what you’re getting into.

Some of the problems above could be ameliorated, if it were in the system’s interest to do so (it’s not: universities’ finances are enabled by the cruel student-loan system, while professors like the system as it is, along with the status and modest amounts of power it grants them). Eliminating tenure would help, because few schools want to make what might be 40+ year commitments to salary and benefits if they don’t have to. A shift to long-term contracts would be an improvement at the margins.

I’ve seen some proposals that universities offer a four-year “teaching PhD” that is awarded primarily on the basis of coursework; since most PhD students are at most going to become adjuncts or lecturers anyway, one might as well quit the facade that currently exists. The teaching dissertation would be a collection of coursework and/or experiment descriptions, depending on the field. Something like this paragraph could have been written any time in the last 15 or 20 years, and the system trundles along because it works well enough and a sufficient number of people are willing to chase the tenure dream to keep it going.

EDIT 2016: When I first wrote this in 2012 I was still in grad school. I’m updating it in January of 2016. Let me be blunter: going to grad school in the humanities is an idiotic life choice that will likely fuck up your life. Of the people I know who were my approximate grad school peers, two live at home; one works at an Apple Store; another works in a preschool; another is teaching the SAT, LSAT, and the like for one of the big companies that pay $15 – $20 an hour for such work; and a couple are adjuncts. A few have short-term contracts. Only one or two have the tenure-track positions they were training for.

If you must, must, must go to grad school despite knowing how dumb doing so is, quit after two years with an M.A. Don’t waste years of your life. A false dichotomy is often presented between the “life of the mind” and pursuing lots of filthy money, but it’s reasonable to seek reasonable material conditions while pursuing the life of the mind. If you can’t achieve reasonable material conditions, do something else; that something else may enable the true life of the mind, not the Potemkin life of the mind offered by most humanities graduate degree programs.

Further reading:

* Most universities hire exclusively from elite universities. If you don’t attend an elite university, you’re unlikely to get a job regardless of your publishing record.

* Robert Nagel’s “Straight Talk about Graduate School.”

* “Open Letter to My Students: No, You Cannot Be a Professor.”

* Penelope Trunk’s “Don’t try to dodge the recession with grad school,” as well as “Best alternative to grad school” and “Voices of the defenders of grad school. And me crushing them.”

* As of 2015, “The Job Market for Academics Is Still Terrifying.” Fewer than half of humanities PhDs are “employed” (using whatever metric they use) and about 35% are unemployed altogether—which is at least three times the national unemployment rate, which also counts high-school dropouts.

* If you are male, see “Insanity in academia, or, reason #1,103 why you should stay out of grad school: Kangaroo courts” to better understand the culture you seek to join. You’re an accusation away from having your career destroyed.

* “The New Intellectuals: Is the academic jobs crisis a boon to public culture?” (Note the sections about the bogosity of peer review and the economic precariousness of the “new intellectuals”).


[1] Menand also writes:

Between 1945 and 1975, the number of American undergraduates increased 500 percent, but the number of graduate students increased by nearly 900 percent. On the one hand, a doctorate was harder to get; on the other, it became less valuable because the market began to be flooded with PhDs.

This fact registered after 1970, when the rapid expansion of American higher education abruptly slowed to a crawl, depositing on generational shores a huge tenured faculty and too many doctoral programs churning out PhDs. The year 1970 is also the point from which we can trace the decline in the proportion of students majoring in liberal arts fields, and, within the decline, a proportionally larger decline in undergraduates majoring in the humanities. In 1970–71, English departments awarded 64,342 bachelor’s degrees; that represented 7.6 percent of all bachelor’s degrees, including those awarded in non-liberal arts fields, such as business. The only liberal arts category that awarded more degrees than English was history and social science, a category that combines several disciplines. Thirty years later, in 2000–01, the number of bachelor’s degrees awarded in all fields was 50 percent higher than in 1970–71, but the number of degrees in English was down both in absolute numbers—from 64,342 to 51,419—and as a percentage of all bachelor’s degrees, from 7.6 percent to around 4 percent.

Fewer students major in English. This means that the demand for English literature specialists has declined.

The number of undergrads in English Lit has declined while the number of people getting PhDs has remained constant or risen. There is basically no industry for English PhDs to enter. You do not have to be an economist to understand the result.

How to think about science and becoming a scientist

A lot of students want to know whether they should major in the humanities, business, or science, which is a hard choice because most of them have no idea whatsoever about what real science (or being a scientist) is like, and they won’t learn it from introductory lectures and lab classes. So freshmen and sophomores who are picking majors don’t, and can’t, really understand what they’re selecting—or so I’ve been told by a lot of grad students and youngish professors who are scientists.

One former student recently wrote me to say, “I was a biochemistry major with a dream of being a publisher and long story short, I am no longer a biochem major and I am going full force in getting into the publishing field right now” (emphasis added). I discouraged her from going “into” publishing, given that I’m not even convinced there is going to be a conventional publishing industry in five years, and forwarded her e-mail to a friend who was a biochemistry major. My friend’s response started as a letter about how to decide if you want to become a scientist but turned into a meditation on how to face the time in your life when you feel like you have to decide what, if anything, you want to become.


The thing about being “interested” in science is that undergraduate survey classes rarely confirm whether you really are. They’re boring. Rote. Dull. I credit my Bio 101 teacher with making the delicate, complicated mysteries of carbon-based life as engaging as listening to my Cousin “M” discuss the subtle differences among protein powder supplements. I spent most of class surfing Wikipedia on my laptop. The next semester I weaseled my way into an advanced cell bio class that was fast and deep and intellectually stimulating, thanks to an eccentric teacher with a keen mind and a weird tendency to act out enzymatic reactions in a sort of bastardized interpretive dance. I dropped Bio 102, which didn’t cripple my ability to keep up with advanced cell bio in any way (showing that survey classes can be unnecessary, boring, and confusing—confusing primarily because they leave out the details that are supposed to be too “advanced” but in fact clarify what the hell is going on), and got an unpaid research position in a faculty lab that eventually turned into a paid gig. By the way: there is significant pressure to dumb survey courses down and virtually no pressure on professors to teach them well. There are still good ones, but don’t let the bad ones dissuade you.

If any field of scientific inquiry interests you, if you have the impulse to ask your own questions and are excited by the idea that you can go find the answers yourself and use what you’ve discovered to tinker and build and ask new questions—which is to say, if you like the idea of research—you’ve got a much better chance of figuring out if you want to be a scientist. How? Go and be one. Or, at least, play at being a scientist by finding a lab that will train you at doing the work until you wake up one day and realize that you are teaching a new undergrad how to program the PCR machine and your input is being used to develop experiments.

I was a biochemistry undergrad major, and I absolutely deplored the labs that were required by classes, but it turned out I loved the actual work of being in a lab. Classes lacked the creativity that makes science so appealing; they feel designed to discourage interest in science. In class, we had 50 minutes to purify a protein and learn to use the mass spectrometer. Big deal. Can I go now? But put me in front of the PCR machine with a purpose? I’d learn how to use it in an afternoon because doing so meant that I was one step closer to solving a problem no one had solved before. You don’t find that kind of motivation in most classrooms. And you don’t need a Ph.D. to contribute to the field. All you need is intellectual appetite. (For an exception to the “class is boring” rule, check out Richard Feynman’s intro to physics lectures.)

So: I didn’t like glossing over information, memorizing for tests, and being told I had till the end of class to identify how many hydrogen ions were in compound X. I wasn’t excited by my major, but I was excited by my subject—and the longer I spent working in a real lab with a professor who saw that I was there every day and willing to learn (he eventually gave me a pet project), the more engaged I became with biochemistry. Sure, the day-to-day involved a lot of pipetting and making nutrient-agar plates to grow bacteria on, but I was working towards something larger than a grade.

I was splicing the DNA of glucose galactose binding protein and green fluorescent protein to try to make a unique gene that could express a protein that fluoresced when binding to glucose. In essence, a protein flare. Then I built it into an E. coli plasmid so it would self-replicate, while a lab in Japan was trying to get the gene expressed in what effectively turned into glow-under-blacklight-just-add-sugarwater mice. The goal was to get the gene expressed in diabetic people, who could wear a fluorimeter watch and check how brightly the genetically engineered freckle on their wrist glowed, in lieu of pricking their finger to check their blood glucose.

Do you have any idea how awesome it was to describe my research at parties? I left out the parts where I had to evacuate the whole lab for four hours after accidentally creating a chlorine cloud, and especially the parts where I spent an entire day making 250 yeast-agar plates and went home with “nutrient powder” in my hair and lungs. But even with the accidents and drudgery, the bigger goal was motivating. Being part of the real scientific conversation gave my work purpose that a grade did not. I dreamed of building nanobots and splicing DNA together to build biological machines. It sure as hell beat memorizing the Krebs cycle in return for a 4.0 GPA and med school.

That is what I love about science: you get to build something, you get to dream it up and ask questions and see if it works and even if you fail you learn something. What I loved was a long way from the dreary crap that characterizes so many undergrad classes. To be fair, the day-to-day isn’t all that whiz bang, but it’s rarely Dilbert-esque and I really liked the day-to-day challenges. There was something zen about turning on music and pipetting for three hours. That was right for me. It might not be for you; if you’re trying to be a scientist or get a feel for what science is like (more on that below), don’t be afraid to expose yourself to multiple labs if the first doesn’t work out for you.

My own heart will always be that of a splice-n-dicer. I’ll always love fiddling with DNA more than purifying compounds over a bunsen burner. But you don’t know what day-to-day tasks will give you the most pleasure. You don’t yet know that you might find zen at 3 a.m. setting up DNA assays, your mind clear, the fluid motion of your hand pulling you into a state of flow. You find out by doing, and you might be surprised—especially because classes don’t give you a good sense of what the life of a scientist is like. They also don’t introduce you to the graduate students, the post-doctorates and the assistant professors who show you what kind of struggle comes from love, which in turn generates internal motivation. They don’t take you away from your university into summer programs that show you how amazing it is to be in a lab with real money and the resources to make your crazy ideas possible.

Which brings me to choosing a field: If you like science, but don’t know what kind, pick the most substantive one that interests you, with as much math as you’re willing to handle, and just get started (math is handy because it applies virtually everywhere in the sciences). Chemistry, biochem and biology overlap to such a degree that I was working in a biochem lab on a genetics project with the goal of creating a protein, and biology labs worked with us in tandem. When you get into the real work, the lines between fields blur. You can major in biochem and get a Ph.D. in neuroscience, or study organic chemistry and work for a physical-chemistry research firm. Other scientists don’t care about what classes you took or what your degree says—they care about what you know, what you can do, and whether what you can do can be applied in a useful way. When in doubt, focus on developing technical skills more than the words on your degree.

One summer I applied to the Mayo Clinic Summer Undergraduate Research Fellowship (something I recommend anyone interested in science do—there are “SURF” programs at almost every major university and research center and they will give you a stipend, housing and exposure to a new lab. It can do amazing things for your CV, your career and your relationship to the larger scientific community. In math and some other fields, your best bet is the NSF’s Research Experiences for Undergraduates (REU) Program). But I didn’t get the job. I had six months in a lab at that point. I had a 3.96 GPA. I had a pretty great “why me” essay. Still, nothing.

A year later I applied again. By that time I’d been in the lab for a year and a half. I knew how to handle most of our major equipment. My CV described the tasks I could perform unsupervised, the problems I tackled by myself, and the solutions I’d found. My advisor explained my role and the amount of autonomy I had been given. This time I got the job. When I met with the director of my summer lab in person, he made it clear that there were many fine applicants with stellar GPAs. I’d never even worked with radioactive-iodine-tagged proteins. They picked me because they knew undergrads only had three months to get substantive research done, and they simply didn’t have time to train someone (especially someone who might turn out to lack tenacity). They needed someone who knew how to work in a lab and could adapt quickly. They needed someone who knew how to work the machines my college lab used, and someone who knew how to work with E. coli plasmids. I could do that.

So pick whatever you think you like best, start with that, find a lab, and learn how to be adept at as many basic lab skills as possible. Delve more deeply into the ones associated with your research. Be ready to work when the right opportunity and research lab come along. The new lab will always ask what skills you have and whether they can be applied to the questions their lab is trying to solve, even if you’ve never asked similar questions. A chemistry major could therefore be what a certain biology lab needs at a given time.

A lot of what is frustrating and off-putting about science at first, including working in the research lab, is the same thing that’s frustrating and off-putting about math: to really enter the conversation you have to have the vocabulary, so there’s a lot of memorizing when you start. Which is just obnoxious. But it doesn’t take too long, and if you start interning in a lab early, then the memorizing feels justifiable and pertinent, even if you feel initially more frustrated at a) not knowing the information and b) not knowing how to apply it. If you don’t get into a lab, however, the memorizing is just hard and feels pointless (even though it isn’t).

(Virtually all fields have this learning curve, whether you realize it or not; one of Jake’s pet books is Daniel T. Willingham’s Why Don’t Students Like School: A Cognitive Scientist Answers Questions About How the Mind Works and What It Means for the Classroom, which describes how people move from no knowledge to shallow knowledge to deep knowledge. It’s bewildering and disorienting to start with no knowledge on a subject, but you have to endure and transcend that state if you’re going to move to deep knowledge. He says that he’s climbed that mountain with regard to writing, which makes writing way more rewarding than it used to be.)

Once you have the language and are able to think about, say, protein folding the way you would a paragraph of prose, or the rhythm in a popular song, science takes on a whole new life, like Frankenstein’s Monster but without the self-loathing or murder. You start to think about what questions you can ask, what you can build, and what you can do—as opposed to what you can regurgitate. The questions you pose to people in your lab will lead to larger conversations. Feeling like an insider is nice, not only because it’s nice to belong, but because you’ll realize that even a small contribution makes you part of the larger conversation.

Science is exciting, but not until you find a way to break through the barriers and into the real thing, so don’t give up prematurely. Like most things, however, your experience depends on whether you have or make the right opportunities. I went to medical school after a grad school detour. How I feel about that decision is an entirely different essay, and one I’ll post later. I ended up specializing in Emergency Medicine because I had enthusiastic ER docs teaching me. Before residency, I thought I’d do anesthesia, but the profs were boring and it seemed like awful work. I’m on a fabulous anesthesia rotation right now, the medical equivalent of a Riviera cruise, and am thinking, “Hey! Maybe I should have done this.” Same with rehab medicine. It’s a perfect fit for me, but I had two boring weeks of it in a non-representative place and so wasn’t about to sign myself over to a whole career without having any more to base my opinion on.

Some days I think that if I’d had a different lab, which exposed me to different things, or if my Mayo summer had given me different connections, I would be pipetting merrily away at Cold Spring Harbor Laboratory, building a nanobot that would deliver the next big cancer treatment on a cellular level. Or maybe I would be a disgruntled post-doc, wishing that I could finally have a lab of my own. Or working for Pfizer. Anything could have changed my path. And just because you choose to study something you love doesn’t mean you’ll succeed.

But not choosing to study something you love is even worse. Point is, most choices in life are luck and chance, but you shouldn’t discard viable options—especially ones in science—based on a couple of survey courses designed to move the meat. Universities do view freshmen as piggybanks whose tuition dollars fund professors’ salaries and research, which is why they cram 1,000 of you into lecture halls and deliver an industrial-grade product that’s every bit as pleasant as industrial-grade mystery meat. Unfortunately, those classes are often the only real way to know if you like something and to be exposed to it, unless you seek out opportunities more representative of the real world. Most universities won’t go out of their way to shunt you into those opportunities. You have to want them and seek them out. So if you think you like biology? Or physics? Read The Elegant Universe.** Or The Greatest Show on Earth. Or a history of the polio vaccine. See if it stirs you.

That being said, if you don’t like science, you don’t like it; I’m just warning you that what you think you don’t like might simply be due to not quite knowing enough or having negative exposures. Still, you can have all the best intentions, follow my advice, find a great lab, try out different opportunities if the first or second don’t work out, and decide it’s just not for you. You probably can’t force it to be your passion, but you probably also underestimate the extent to which you, like most people, have a range of possible passions. I only caution you to make sure that you aren’t basing your choice on one bad class or a crappy lab advisor. This is good advice in any field.

Here’s an example of possible premature optimization: I received an email from Jake’s former student, saying she was thinking about being a judge as a “backup,” in case a career in publishing didn’t work out. Being a judge, that’s a life goal. A big one. And law does not make good on its promise of a comfortable income the way it once did. For more on that, see the Slate.com article “A Case of Supply v. Demand: Law schools are manufacturing more lawyers than America needs, and law students aren’t happy about it,” which points out that there are too many certified and credentialed “lawyers” for the amount of legal work available. Plus, while society needs a certain number of lawyers to function well, too many lawyers leads to diminishing returns as lawyers waste time ginning up work by suing each other over trivialities or chasing ambulances.

By contrast, an excess of scientists and engineers means more people who will build the stuff that lawyers then litigate over. Scientists and engineers expand the size of the economic pie; lawyers mostly work to divide it up differently. Whenever possible, work to be a person who creates things, instead of a person who tries to take stuff created by someone else. There is an effectively infinite amount of work in science because the universe is big and we don’t really understand it and we probably never will.* New answers to questions in science yield more questions. More lawsuits launched by lawyers just yield fights over scraps.

Another personal example: I wasn’t just queen of the lab nerds. Sure, I tie-dyed my lab coat and dated a man who liked to hear me read aloud from my organic chemistry textbook, but I also wanted to write: not academic papers and book chapters, but novels and essays. I’d always been dual-minded and never bought the “Two Cultures” idea, one culture scientific and one humanistic, described in C.P. Snow’s eponymous book. This bifurcation is, to speak bluntly, bullshit. As a kid I spent as much time trying to win the science fair as I did submitting poetry to Highlights. May 1994’s Grumpy Dog issue was my first publication. You may have read it and enjoyed the story of “Sarah, the new puppy.” Or, you may not have been born yet. That was me as a kid. As an adult, I’m not confined to science either—and neither is any other scientist.

I imagine many of you reading this post who are struggling with whether or not to be a scientist are, fundamentally, not struggling with what you want to major in, but with what you want to be and how your decisions in college influence your options. Many of you are likely creatively minded, as scientific types often are, despite how “poindexter” characters are portrayed on popular T.V. Staying close to your passions outside the test tube gives you the creative spark that makes your scientific thinking unique and fresh. So you don’t have to pick science and say, “That’s it, I’m a scientist and only a scientist.” You become a scientist and say: Now what do I want to build/ask/figure out?


Jake again:

So what should you do now to get into science? Here’s a list that I, Jake Seliger the non-scientist, wrote, based on the experiences described by friends in the sciences:

0) Look for profs in your department. Look for ones who are doing research in an area in or adjacent to what you might be interested in doing.

1) Read a couple of their recent papers. You probably won’t understand them fully, but you should try to at least get a vague sense of what they’re doing. You may want to prepare a couple of questions you can ask in advance; some profs will try to weed out people who are merely firing off random e-mails or appearing in the office hours to beg.

2) Look for a website related to their lab or work, and try to get a sense of whether you might be interested in their work. Chances are you won’t be able to tell in advance. You should also figure out who their grad students are—most science profs will have somewhere between one and dozens of students working under them.

3) Go meet with said prof (or grad students) and say, “I’m interested in X, I’ve read papers W, Y, and Z, and I’d like to work in your lab.” Volunteer, since you probably won’t get paid at first.

4) They might say no. It’s probably not personal (rejection is rarely personal in dating, either, but it takes many people years or decades to figure this out). If the prof says no, go work on the grad students some, or generally make yourself a pest.

5) Try other labs.

6) Don’t give up. This is a persistent theme in this essay for good reason.

7) Keep reading papers in the area you’re interested in, even if you don’t understand them. Papers aren’t a substitute for research, but you’ll at least show that you’re interested and learn some of the lingo. Don’t underestimate the value of knowing a field’s jargon. Knowing the jargon can also be satisfying in its own right.

8) Take a computer science course or, even better, computer science courses. Almost all science labs have programming tasks no one wants to do, and your willingness to do scutwork will make you much more valuable. Simple but tedious programming tasks are the modern lab equivalent of sweeping the floor.
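To make the scutwork point concrete, here is a minimal sketch of the kind of small script a lab might need but no one wants to write: summarizing plate-reader output. The file format, sample names, and numbers are all invented for illustration; a real lab's instrument output will differ.

```python
# Hypothetical example of lab "scutwork" programming: averaging replicate
# absorbance readings from a plate reader's CSV export. All names and
# values below are made up for illustration.
import csv
import io
from collections import defaultdict
from statistics import mean

# Stand-in for a real instrument's exported file.
RAW = """sample,absorbance
control,0.12
control,0.15
gfp_fusion,0.87
gfp_fusion,0.91
gfp_fusion,0.89
"""

def mean_readings(raw_csv):
    """Return {sample_name: mean absorbance} from plate-reader CSV text."""
    readings = defaultdict(list)
    for row in csv.DictReader(io.StringIO(raw_csv)):
        readings[row["sample"]].append(float(row["absorbance"]))
    return {name: mean(values) for name, values in readings.items()}

if __name__ == "__main__":
    for name, avg in sorted(mean_readings(RAW).items()):
        print(f"{name}: {avg:.3f}")
```

Nothing here is sophisticated, which is exactly the point: an undergrad who can reliably turn instrument exports into clean summaries saves everyone in the lab an afternoon, repeatedly.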

If you don’t have bench research experience, you probably won’t get into grad school, or into a good grad school. You might have to pay for an MA or something like that to get in, which is bad. If you’re thinking about grad school, read Louis Menand’s The Marketplace of Ideas as soon as possible. See also Penelope Trunk’s Don’t Try to Dodge the Recession with Grad School and Philip Greenspun’s Women in Science. Ignore the questionable gender comments Greenspun makes and attend to his discussion of what grad school in the sciences is like, especially this, his main point: “Adjusted for IQ, quantitative skills, and working hours, jobs in science are the lowest paid in the United States.”

Another: Alex Tabarrok points out in his book Launching The Innovation Renaissance: A New Path to Bring Smart Ideas to Market Fast that we appear to have too few people working in technical fields and too many majoring in business and dubious arts majors (notice that he doesn’t deal with graduate school, which is where he diverges from Greenspun). In his blog post “College has been oversold,” Tabarrok points out that student participation in fields that pay well and are likely “to create the kinds of innovations that drive economic growth” is flat. On an anecdotal level, virtually everyone I know who majored in the hard sciences and engineering is employed. Many of those who, like me, majored in English, aren’t.

According to a study discussed in the New York Times, people apparently leave engineering because it’s hard: “The typical engineering major today spends 18.5 hours per week studying. The typical social sciences major, by contrast, spends about 14.6 hours.” And:

So maybe students intending to major in STEM fields are changing their minds because those curriculums require more work, or because they’re scared off by the lower grades, or a combination of the two. Either way, it’s sort of discouraging when you consider that these requirements are intimidating enough to persuade students to forgo the additional earnings they are likely to get as engineers.

There’s another way to read these findings, though. Perhaps the higher wages earned by engineers reflect not only what they learn but also which students are likely to choose those majors in the first place and stay with them.

Don’t be scared by low grades. Yes, it’s discouraging to take classes where the exam average is 60, but keep taking them anyway. Low grades might be an indication that the field is more intellectually honest than one with easy, high grades.

In the process of writing and editing this essay, the usual panoply of articles about topics like “science majors are more likely to get jobs” has been published. You’ve probably read these articles. They’re mostly correct. The smattering linked to here are just ones that happened to catch my attention.

Science grads may not get jobs just because science inherently makes you more employable—it may be that more tenacious, hard-working, and thus employable people are inclined to major in the sciences. But that means you should want to signal that you’re one of them. And healthier countries in general tend to focus on science, respect science, and produce scientists; hence the story about the opposite in “Why the Arabic World Turned Away from Science.”

If you’re leaving science because the intro courses are too hard and your friends majoring in business are having more fun at parties, you’re probably doing yourself a tremendous disservice that you won’t even realize until years later. If you’re leaving science because of a genuine, passionate interest in some other field, you might have a better reason, but it still seems like you’d be better off double majoring or minoring in that other field.


My friend again, adding to what I said above:

As someone who was going to do the science PhD thing before deciding on medical school I agree with most of what Jake says. Let me emphasize: you will have to volunteer at first because you don’t have the skills to be hired in a lab for a job that will teach you something. Being hired without previous experience usually means the job doesn’t require the skills you want to learn, and so you won’t learn them. So you don’t want that job.

I had a paying job in a lab, so you can get them eventually—but I only started getting paid after I’d worked in it for a year, and even then the pay was more like a nice boost: the money just happened to show up and they thought, “What the heck, she’s been helpful.” Think of this time as paying your way into graduate school, because if you don’t have lab work, no matter how good your grades are, you will not get into a good graduate school with funding.

Here’s why: You have a limited amount of time in graduate school and are not just there to do independent research and learn. You’re there to do research with the department, and they need you to start immediately. If you already have years of bench research experience, the department and its professors know you can hit the ground running—and there is no substitute for experience.

The place where you really learn how to work in a lab and develop your skills is in one, not in the lab classes where you learn, at best, some rote things (plus, you need to know if you like the basic, day-to-day experience of working in a lab and the kind of culture you’ll find yourself in; not everyone does). Even if you do learn the tools you need for a certain lab, it doesn’t demonstrate that you’re actually interested in research.

The only thing that demonstrates an interest in research, which is all graduate school really cares about, is working in a lab and doing real research. I can’t stress that enough, which is why I’ve repeated it several times in this paragraph. A 4.0 means you can study. It doesn’t mean you can do research. People on grad school committees get an endless number of recommendation letters that say, “This candidate did well in class and got an ‘A.'” Those count for almost nothing. People on grad school committees want letters that say, “This candidate did X, Y, and Z in my lab.”

I recommend starting with your professors—the ones whose classes you’ve liked and who know you from office hours. Hit them up first. Tell them your goal is to be a scientist and that, while academics are nice, you want to start being a scientist now. If they don’t have space for you, tell them to point you in the direction of someone who does. Keep an open mind. Ask everybody. I was interested in nanobots, gene work, molecular what-nots, etc.

I started by asking my orgo [“organic chemistry” to laymen] teacher. Nothing. I asked my Biochem [“biological chemistry” or “biochemistry”] professor and was welcomed with open arms. Point is, if the labs you want have no space, go to another. Don’t give up. Be relentlessly resourceful. Be tenacious—and these aren’t qualities uniquely useful to scientists.

The skills I ended up with in the biochem lab turned out to be almost 100% on point with what I wanted to do later, even though the research was different. The kind of research you end up doing usually springs from the lab skills you have, and it’s much harder to decide what you want and try to find a lab that will give you those skills. So instead of trying to anticipate what research you’ll want to do from a position where you can’t know, just learn some research skills. Any skills are better than none. Then you have something to offer the lab you want when space / funding becomes available. I took what I learned in that biochem lab and spent a summer doing research on protein folding—it wasn’t like my initial research, but the prof needed someone who knew how to do X, Y and Z, which I did, and he was willing to train me on the rest.

You’ll face other decisions. For example, in many fields you’ll have to decide: do you want wet-lab research (this does not refer to adult entertainment) or do you want more clinical research? “Wet lab” means that you’re mucking with chemicals, or big machines, and stuff like that. Clinical research means you’re dealing more with humans, or asking people questions, or something along those lines. I would suggest the wet lab if you think you may be even slightly interested (sort of like how you should experiment with lovers when you’re in college if you think you’re even slightly interested in different sorts of things). In fact, I’d suggest wet-lab work or some sort of computational lab in general, because clinical research skills can be extrapolated from wet lab—but not vice versa.

In a clinical lab you can show that you can think, but in a wet lab you need to be able to think and use a pipette—or use modeling software, if you’re interested in the computer side of things. That’s where programming comes in handy if you’re capable of doing it; if not, then I feel less strongly than Jake about programming, because labs whose research revolves around modeling often need someone with serious computer training, like a post-doc. But it could come in handy for you anyway, and certainly couldn’t hurt, so if you’re interested it could be an extra perk.

Once you’re in the lab, if you want to learn skills outside what you’re working with, ask. Ask everyone. Ask the computer guy, ask the woman on the other project. Get whatever experience you can, get good at it, then put it on your C.V. and make sure you can explain it clearly when someone asks. Even if you’re not an expert, be able to play one on T.V.

As for #3, about figuring out who their grad students are: I also find that less important. You need to talk to the primary investigator, the guy who runs the lab. If he’s not interested in you, it’s not worth going through grad student channels to convince him to take you. Someone is going to want you, and it’s best to go there in both science and love. Don’t fall for the false lead of the pretty girl in the alluring dress who disappears as soon as you get close. You can always try alternate channels later if you really want to get back into lab #1.

Think of it this way: if you’re struggling just to get a foot in the door, you’re going to struggle to get any research done. Not that the research will feel meaningful at first: you’ll be doing tasks assigned to you. But you should feel like this gets better, that you get more independence. And if that’s not the ethos of the lab to start with, it never will be. As I mentioned before, if I’d gotten into that orgo lab, I’d have been a scut monkey for years.

As Jake said: read your professors’ papers. You probably won’t have any idea what’s going on. I still have no idea what’s going on half the time, but read ’em anyway. It shows you’re worth the effort, especially when you ask for that lab spot. Jake’s 100% right about ways to get your professors’ attention.

Don’t give up. Just don’t give up. Take “no” for an answer and you can kiss grad school (at least a good Ph.D. program with full funding, which is what you want: don’t pay for this under any circumstances) goodbye. Scientists are distinguished by their tenacity, whether they’re in grad school or not. And make sure you know what you’re giving up before you do.

What kind of research are you interested in? What gets you going? Even if you’re not sure, there are a certain number of fundamental skills that, I believe, will get you into whatever lab you want if you’re familiar with them, because they are used in most labs and show you’re trainable for the other stuff. And you’ll know what science is like, which you simply don’t right now. Giving up on it based on some bad grades as a freshman or sophomore is folly.***


* One snarky but accurate commenter said, “There may be an infinite amount of work in science, but there is a finite (and very unevenly distributed) number of grants.”

** Although a different friend observed, “Books are a step above classes, but in my experience, many aspiring theoretical physicists are really people who like reading popular science books more than they like doing math.”

*** You can download this essay as a .pdf.