Why you should become a nurse or physician assistant instead of a doctor: the underrated perils of medical school

Many if not most people who go to medical school are making a huge mistake—one they won’t realize they’ve made until it’s too late to undo.

So many medical students, residents, and doctors say they wish they could go back in time and tell themselves to do something—anything—else. Their stories are so similar that they’ve inspired me to explain, in detail, the underappreciated yet essential problems with medical school and residency. Potential doctors also don’t realize that becoming a nurse or physician assistant (PA) provides many of the job-security advantages of medical school without binding those who start to at least a decade, and probably a lifetime, of finance-induced servitude.

The big reasons to be a doctor are a) lifetime earning potential, b) the limited number of doctors who are credentialed annually, which implies that doctors can restrict supply and thus will always have jobs available, c) higher perceived social status, and d) a desire to “help people” (there will be much more on the dubious value of that last one below).

These reasons come with numerous problems: a) it takes a long time for doctors to make that money, b) it’s almost impossible to gauge whether you’ll actually like a profession or the process of joining that profession until you’re already done, c) most people underestimate opportunity costs, and d) you have to be able to help yourself before you can help other people (and the culture of medicine and medical education is toxic).

Straight talk about doctors and money.

You’re reading this because you tell your friends and maybe yourself that you “want to help people,” but let’s start with the cash. Although many doctors will eventually make a lot of money, they take a long time to get there. Nurses can start making real salaries of around $50,000 when they’re 22. Doctors can’t start making real money until they’re at least 29, and often not until they’re much older.

Keep that in mind when you read the following numbers.

Student Doctor reports that family docs make about $130–200K on average, which sounds high compared to what I’ve heard on the street (Student Doctor’s numbers also don’t discuss hours worked). The Bureau of Labor Statistics—a more reliable source—reports that primary care physicians make an average of $186,044 per year. Notice, however, that’s an average, and it also doesn’t take into account overhead. Notice too that the table showing that BLS data indicates more than 40% of doctors are in primary care specialties. Family and general practice doctors make a median annual wage of $163,510 over a career.

Nurses, by contrast, make about $70K a year. They also have a lot of market power—especially skilled nurses who might otherwise be doctors. Christine Mackey-Ross describes these economic dynamics in “The New Face of Health Care: Why Nurses Are in Such High Demand.” Nurses are gaining market power because medical costs are rising and residency programs have a stranglehold on the doctor supply. More providers must come from somewhere. As we know from econ 101, when you limit supply in the face of rising demand, prices rise.

The limit on the number of doctors is pretty sweet if you’re already a doctor, because it means you have very little competition and, if you choose a sufficiently demanding specialty, you can make a lot of money. But it’s bad for the healthcare system as a whole because too many patients chase too few doctors. Consequently, the system is lurching in the direction of finding ways to provide healthcare at lower costs. Like, say, through nurses and PAs.

Those nurses and PAs are going to end up competing with primary care docs. Look at one example, from the New York Times’s “U.S. Moves to Cut Back Regulations on Hospitals”:

Under the proposals, issued with a view to “impending physician shortages,” it would be easier for hospitals to use “advanced practice nurse practitioners and physician assistants in lieu of higher-paid physicians.” This change alone “could provide immediate savings to hospitals,” the administration said.

Primary care docs are increasingly going to see pressure on their wages from nurse practitioners for as long as health care costs outstrip inflation. Consider “Yes, the P.A. Will See You Now”:

Ever since he was a hospital volunteer in high school, Adam Kelly was interested in a medical career. What he wasn’t interested in was the lifestyle attached to the M.D. degree. “I wanted to treat patients, but I wanted free time for myself, too,” he said. “I didn’t want to be 30 or 35 before I got on my feet — and then still have a lot of loans to pay back.”

To recap: nurses start making money when they’re 22, not 29, and they are eating into the market for primary care docs. Quality of care is a concern, but the evidence thus far shows no difference between nurse practitioners who act as primary-care providers and MDs who do.

Calls to lower doctor pay, like the one found in Matt Yglesias’s “We pay our doctors way too much,” are likely to grow louder. Note that I’m not taking a moral or economic stance on whether physician pay should be higher or lower: I’m arguing that the pressure on doctors’ pay is likely to increase because of fundamental forces on healthcare.

To belabor the point about money, The Atlantic recently published “The average female primary-care physician would have been financially better off becoming a physician assistant.” Notice: “Interestingly, while the PA field started out all male, the majority of graduates today are female. The PA training program is generally 2 years, shorter than that for doctors. Unsurprisingly, subsequent hourly earnings of PAs are lower than subsequent hourly earnings of doctors.”

Although the following sentence doesn’t use the word “opportunity costs,” it should: “Even though both male and female doctors both earn higher wages than their PA counterparts, most female doctors don’t work enough hours at those wages to financially justify the costs of becoming a doctor.” I’m not arguing that women shouldn’t become doctors. But I am arguing that women and men both underestimate the opportunity costs of med school. If they understood those costs, fewer would go.

Plus, if you get a nursing degree, you can still go to medical school (as long as you have the prerequisite courses; hell, you can major in English and go to med school as long as you take the biology, math, physics, and chemistry courses that med schools require). Apparently some medical schools will sniff at nurses who want to become doctors because of the nursing shortage and, I suspect, because med schools want to maintain a clear class / status hierarchy with doctors at top. Med schools are run by doctors invested in the doctor mystique. But the reality is simpler: medical schools want people with good MCAT scores and GPAs. Got a 4.0 and whatever a high MCAT score is? A med school will defect and take you.

One medical resident friend read a draft of this essay and simply said that she “didn’t realize that I was looking for nursing.” Or being a PA. She hated her third year of medical school, as most med students do, and got shafted in her residency—which she effectively can’t leave. Adam Kelly is right: more people should realize what “the lifestyle attached to an M.D. degree” means.

They should also understand “The Bullying Culture of Medical School” and residency, which is pervasive and pernicious—and it contributes to the relationship failures that notoriously plague the medical world. Yet med schools and residencies can get away with this because they have students and residents by the loans.

Why would my friend have realized that she wanted to be a nurse? Our culture doesn’t glorify nursing the way it does doctoring (except, maybe, on Halloween and in adult cinema). High academic achievers think being a doctor is the optimal road to success in the medical world. They see eye-popping surgeon salary numbers and rhetoric about helping people without realizing that nurses help people too, or that their desire to help people is likely to be pounded out of them by a cold, uncaring system that uses the rhetoric of helping to sucker undergrads into mortgaging their souls to student loans. Through the magic of student loans, schools are steadily siphoning off more of doctors’ lifetime earnings. Given constraints and barriers to entry into medicine, I suspect med schools and residencies will be able to continue doing so for the foreseeable future. The logical response for individuals is to exit the market, because they have so little control over it.

Sure, $160K/year probably sounds like a lot to a random 21-year-old college student, because it is, but after taking into account the investment value of money, student loans for undergrad, student loans for med school, how much nurses make, and residents’ salaries, most doctors’ earnings probably fail to outstrip nurses’ earnings until well after the age of 40. Dollars per hour worked probably don’t outstrip nurses’ earnings until even later.
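To make that arithmetic concrete, here is a rough back-of-the-envelope sketch in Python. Every number in it is an illustrative assumption (nurse salary trajectory, tuition, residency pay, attending salary), not data from the sources above, and it ignores loan interest, taxes, hours worked, and the time value of money. Under these assumptions, which flatter the doctor, the nominal crossover lands in the mid-30s; adding interest and discounting pushes it well past 40, as argued above.

```python
# Back-of-the-envelope career earnings comparison.
# All figures are illustrative assumptions, not data from the essay's sources.

def cumulative_earnings(yearly):
    """Running total of earnings, year by year."""
    total, out = 0, []
    for y in yearly:
        total += y
        out.append(total)
    return out

AGES = range(22, 61)  # ages 22 through 60

# Assumption: a nurse earns $50K at 22, rising $2K/year to a $70K plateau.
nurse = [min(50_000 + 2_000 * (age - 22), 70_000) for age in AGES]

# Assumptions: four years of med school at -$60K/year (loan-financed tuition
# plus living costs), three years of residency at $50K, then $160K as an
# attending. Loan interest is ignored, which flatters the doctor.
def doctor_year(age):
    if age < 26:
        return -60_000   # med school
    if age < 29:
        return 50_000    # residency
    return 160_000       # attending

doctor = [doctor_year(age) for age in AGES]

nurse_cum = cumulative_earnings(nurse)
doctor_cum = cumulative_earnings(doctor)

# First age at which the doctor's cumulative earnings exceed the nurse's.
crossover = next(age for age, n, d in zip(AGES, nurse_cum, doctor_cum) if d > n)
print(f"Doctor's cumulative earnings pass the nurse's at age {crossover}")
```

Tweak the assumptions and the crossover moves: model loan interest at 6–7% or discount future dollars and it slides toward the essay’s “well after 40.”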

To some extent, you’re trading happiness, security, dignity, and your sex life in your 20s, and possibly early 30s, for a financial opportunity that might not pay off until your 50s.

Social status is nice, but not nearly as nice when you’re exhausted at 3 a.m. as a third-year, or exhausted at 3 a.m. as a first-year resident, or exhausted at 3 a.m. as a third-year resident and you’re 30 and you just want a quasi-normal life, damnit, and maybe some time to be an artist. Or when you’re exhausted at 3 a.m. as an attending on-call physician because the senior doctors at the HMO know how to stiff the newbies by forcing them to “pay their dues.”

This is where prospective medical students protest, “I’m not going to be a family medicine doc.” Which is okay: maybe you won’t be. Have fun in five or seven years of residency instead of three. But don’t confuse the salaries of superstar specialties like neurosurgery and cardiology with the average experience; more likely than not you’re average. There’s this social ideal of doctors being rich. Not all are, even with barriers to entry in place.

The underrated miseries of residency

As one resident friend said, “You can see why doctors turn into the kind of people they do.” He meant that the system itself lets patients abuse doctors, doctors abuse residents, and for people to generally treat each other not like people, but like cogs. At least nurses who discover they hate nursing can quit, since they will have a portable undergrad degree and won’t have obscene graduate school student loans. They can probably go back to school and get a second degree in twelve to twenty-four months. (Someone with a standard bachelor’s degree can probably enter nursing in the same time period.)

In normal jobs, a worker who learns about a better opportunity in another company or industry can pursue it. Students sufficiently dissatisfied with their university can transfer.[1] Many academic grad schools make quitting easy. Residencies don’t. The residency market is tightly controlled by residency programs that want to restrict residents’ autonomy—and thus their wages and bargaining power. Once you’re in a residency, it’s very hard to leave, and you can only do so at particular points: in the gap between residency years.

This is a recipe for exploitation; many of the labor battles during the first half of the twentieth century were fought to prevent employers from wielding this kind of power. For medical residents, however, employers have absolute power enshrined in law—though employers cloak their power in the specious word “education.”

Once a residency program has you, they can do almost anything they want to you, and you have little leverage. You don’t want to be in situations where you have no leverage, yet that’s precisely what happens the moment you enter the “match.”

Let’s explain the match, since almost no potential med students understand it. The match occurs in the second half of the fourth year of medical school. Students apply to residencies in the first half of their fourth year, interview at potential hospitals, and then list the residencies they’re interested in. Residency program directors then rank the students, and the National Resident Matching Program “matches” students to programs using a hazily described algorithm.

Students are then obligated to attend that residency program. They can’t privately negotiate with other programs, as students can for, say, undergrad admissions, or med school admissions—or almost any other normal employment situation. Let me repeat and bold: Residents can’t negotiate. They can’t say, “How about another five grand?” or “Can I modify my contract to give me fewer days?” If a resident refuses to accept her “match,” then she’s blackballed from re-entering for the next three years.
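For the curious, the core of the match is widely reported to be an applicant-proposing deferred-acceptance (“stable marriage”) algorithm; the production Roth–Peranson version adds couples, program quotas, and supplemental lists. Here is a minimal Python sketch with made-up applicants and programs, just to show the mechanic: applicants propose down their rank lists, and programs tentatively hold their favorites, bumping less-preferred applicants as better ones arrive.

```python
# Minimal applicant-proposing deferred acceptance. The real NRMP
# (Roth-Peranson) algorithm is more elaborate; names here are invented.

def match(applicant_prefs, program_prefs, capacity):
    """applicant_prefs: applicant -> ranked list of programs.
    program_prefs: program -> ranked list of applicants.
    capacity: program -> number of slots."""
    # Precompute each program's ranking of applicants for fast lookups.
    rank = {p: {a: i for i, a in enumerate(prefs)}
            for p, prefs in program_prefs.items()}
    tentative = {p: [] for p in program_prefs}     # program -> held applicants
    next_choice = {a: 0 for a in applicant_prefs}  # index into each rank list
    free = list(applicant_prefs)

    while free:
        a = free.pop()
        prefs = applicant_prefs[a]
        if next_choice[a] >= len(prefs):
            continue  # applicant exhausted their list: unmatched
        p = prefs[next_choice[a]]
        next_choice[a] += 1
        if a not in rank[p]:
            free.append(a)      # program didn't rank this applicant
            continue
        held = tentative[p]
        held.append(a)
        held.sort(key=rank[p].get)       # program's preference order
        if len(held) > capacity[p]:
            bumped = held.pop()          # displace least-preferred applicant
            free.append(bumped)          # bumped applicant tries next choice

    return {a: p for p, held in tentative.items() for a in held}

# Made-up example: three applicants, two programs, three total slots.
applicants = {"Ann": ["City", "County"],
              "Bob": ["City", "County"],
              "Cho": ["County", "City"]}
programs = {"City": ["Bob", "Ann", "Cho"],
            "County": ["Ann", "Cho", "Bob"]}
slots = {"City": 1, "County": 2}
result = match(applicants, programs, slots)
print(result)
```

The output is stable: no applicant and program both prefer each other to their assigned match. Note what the sketch makes vivid: the applicant never negotiates; she proposes, and programs silently hold or bump her.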

Residency programs have formed a cartel designed to control costs and reduce employee autonomy, and hence salaries. I only went to law school for a year, by accident, but even I know enough law and history to recognize a very clear situation of the sort that anti-trust laws are supposed to address in order to protect workers. When my friend entered the match process like a mouse into a snake’s mouth, I became curious, because the system’s cruelty, exploitation, and unfairness to residents is an obvious example of employers banding together to harm employees. Lawyers often get a bad rap—sometimes for good reasons—but the match looked ripe for lawyers to me.

It turns out that I’m not a legal genius and that real lawyers have noticed this obvious anti-trust violation; an anti-trust lawsuit was filed in the early 2000s. Read about it in the NYTimes, including a grimly hilarious line about how “The defendants say the Match is intended to help students and performs a valuable service.” Ha! A valuable service to employers, since employees effectively can’t quit or negotiate with individual employers. Curtailing employee power by distorting markets is a valuable service. The article also notes regulatory capture:

Meanwhile, the medical establishment, growing increasingly concerned about the legal fees and the potential liability for hundreds of millions of dollars in damages, turned to Congress for help. They hired lobbyists to request legislation that would exempt the residency program from the accusations. A rider, sponsored by Senators Edward M. Kennedy, Democrat of Massachusetts, and Judd Gregg, Republican of New Hampshire, was attached to a pension act, which President Bush signed into law in April.

In other words, employers bought Congress and President Bush in order to screw residents.[2] If you attend med school, you’re agreeing to be screwed for three to eight years after you’ve incurred hundreds of thousands of dollars of debt, and you have few if any legal rights to attack the exploitative system you’ve entered.

(One question I have for knowledgeable readers: do you know of any comprehensive discussion of residents and unions? Residents can apparently unionize—which, if I were a medical resident, would be my first order of business—but the only extended treatment of the issue I’ve found so far is here, which deals with a single institution. Given how poorly residents are treated, I’m surprised there haven’t been more unionization efforts, especially in union-friendly, resident-heavy states like California and New York. One reason might be simple: people fear being blackballed at their ultimate jobs, and a lot of residents seem to have Stockholm Syndrome.)

Self-interested residency program directors will no doubt argue that residency is set up the way it is because the residency experience is educational. So will doctors. Doctors argue for residency being essential because they have a stake in the process. Residency directors and other administrators make money off residents who work longer hours and don’t have alternatives. We shouldn’t be surprised that they seek other legal means of restricting competition—so much of the fight around medicine isn’t about patient care; it’s about regulatory environments and legislative initiatives. For one recent but very small example of the problems, see “When the Nurse Wants to Be Called ‘Doctor’,” concerning nursing doctorates.

I don’t buy their arguments for more than ad hominem reasons. The education at many residency programs is tenuous at best. One friend, for example, is in a program that requires residents to attend “conference,” where residents are supposed to learn. But “conference” usually degenerates into someone nattering and most of the residents reading or checking their phones. Conference is mandatory, regardless of its utility. Residents aren’t 10-year-olds, yet they’re treated as such.

These problems are well-known (“What other profession routinely kicks out a third of its seasoned work force and replaces it with brand new interns every year?”). But there’s no political impetus to act: doctors like limiting their competition, and people are still fighting to get into medical school.

Soldiers usually make four-year commitments to the military. Even ROTC only demands a four- to five-year commitment after college graduation—at which point officers can choose to quit and do something else. Medicine is, in effect, at least a ten-year commitment: four of medical school, at least three of residency, and at least another three to pay off med school loans. At which point a smiling twenty-two-year-old graduate will be a glum thirty-two-year-old doctor who doesn’t entirely get how she got to be a doctor anyway, and might tell her earlier self the things that earlier self didn’t know.

Contrast this experience with nursing, which requires only a four-year degree, or PAs, who have two to three years of additional school. As John Goodman points out in “Why Not A Nurse?“, nursing is much less heavily or uniformly regulated than doctoring. Nurses can move to Oregon:

Take JoEllen Wynne. When she lived in Oregon, she had her own practice. As a nurse practitioner, she could draw blood, prescribe medication (including narcotics) and even admit patients to the hospital. She operated like a primary care physician and without any supervision from a doctor. But, JoEllen moved to Texas to be closer to family in 2006. She says, “I would have loved to open a practice here, but due to the restrictions, it is difficult to even volunteer.” She now works as an advocate at the American Academy of Nurse Practitioners.

and, based on the article, avoid Texas. Over time, we’ll see more articles like “Why Nurses Need More Authority: Allowing nurses to act as primary-care providers will increase coverage and lower health-care costs. So why is there so much opposition from physicians?” Doctors will oppose this, because it’s in their economic self-interest to avoid more competition.

The next problem with becoming a doctor involves what economists call “information asymmetry.” Most undergraduates making life choices don’t realize the economic problems I’ve described above, let alone some of the other problems I’m going to describe here. When I lay out the facts about becoming a doctor to my freshman writing students, many of those who want to be doctors look at me suspiciously, like I’m offering them a miracle weight-loss drug or have grown horns and a tail.

“No,” I can see them thinking, “this can’t be true because it contradicts so much of what I’ve been implicitly told by society.” They don’t want to believe. Which is great—right up to the point they have to live their lives, and see how their lives are being shaped by forces that no one told them about. Just like no one told them about opportunity costs or what residencies are really like.

Medical students and doctors have complained to me about how no one told them how bad it is. No one really told them, that is. I’m not sure how much of this I should believe, but, at the very least, if you’re reading this essay you’ve been told. I suspect a lot of now-doctors were told or had an inkling of what it’s really like, but they failed to imagine the nasty reality of 24- or 30-hour call.

They, like most people, ignore information that conflicts with their current belief system about the glamor of medicine to avoid cognitive dissonance (as we all do: this is part of what Jonathan Haidt points out in The Righteous Mind, as does Daniel Kahneman in Thinking, Fast and Slow). Many now-doctors, even if they were aware, probably ignored that awareness and now complain—in other words, even if they had better information, they’d have ignored it and continued on their current path. They pay attention to status and money instead of happiness.

For example, Penelope Trunk cites Daniel Gilbert’s Stumbling on Happiness and says:

Unfortunately, people are not good at picking a job that will make them happy. Gilbert found that people are ill equipped to imagine what their life would be like in a given job, and the advice they get from other people is bad, (typified by some version of “You should do what I did.”)

Let’s examine some other vital takeaways from Stumbling on Happiness: [3]

* Making more than about $40,000/year does little to improve happiness (this should probably be greater in, say, NYC, but the main point stands: people think money and happiness show a linear correlation when they really don’t).

* Most people value friends, family, and social connections more than additional money, at least once their income reaches about $40K/year. If you’re trading time with friends and family for money, or, worse, for commuting, you’re making a tremendous, doctor-like mistake.

* Your sex life probably matters more than your job, and many people mis-optimize in this area. I’ve heard many residents and med students say they’re too busy to develop relationships or have sex with their significant others, if they manage to retain one or more, and this probably makes them really miserable.

* Making your work meaningful is important.

Attend med school without reading Gilbert at your own peril. No one in high school or college warns you of the dangers of seeking jobs that harm your sex life, because high schools are too busy trying to convince you not to have one. So I’m going to issue the warning: if you take a job that makes you too tired to have sex or too tired to engage in contemporary mate-seeking behaviors, you’re probably making a mistake.

The sex-life issue might be overblown, because people who really want to have one find a way to have one; some med students and residents are just offering the kinds of generic romantic complaints that everyone stupidly offers, and which mean nothing more than discussion about the weather. You can tell what a person really wants by observing what they do, rather than what they say.

But med students and residents have shown enough agony over trade-offs and time costs to make me believe that med school does generate a genuine pall over romantic lives. There is a correlation-is-not-causation problem—maybe med school attracts the romantically inept—but I’m willing to assume for now that it doesn’t.

The title of Trunk’s post is “How much money do you need to be happy? Hint: Your sex life matters more.” If you’re in an industry that consistently makes you too tired for sex, you’re doing things wrong and need to re-prioritize. Nurses can work three twelves a week, or thirty-six total hours, and be okay. But, as described above, being a doctor doesn’t let employees re-prioritize.

Proto-doctors screw up their 20s and 30s, sexually speaking, because they’ve committed to a job that’s so cruel to its occupants that, if doctors were equally cruel to patients, those doctors would be sued for malpractice. And the student loans mean that med students effectively can’t quit. They’ve traded sex for money and gotten a raw deal. They’ll be surrounded by people who are miserable and uptight—and who have also mis-prioritized.

You probably also don’t realize how ill-equipped you are to imagine what your life would be like as a doctor, because a lot of doctors sugarcoat their jobs, or because you don’t know any actual doctors. So you extrapolate from people who say, “That’s great” when you say you want to be a doctor. If you say you’re going to stay upwind and see what happens, they don’t say, “That’s great,” because they simply think you’re another flaky college student. But saying “I want to go to med school” or “I want to go to law school” isn’t a good way to seem level-headed (though I took the latter route; fortunately, I had the foresight to quit). Those routes, if they once led to relative success and happiness, don’t any more, at least for most people, who can’t imagine what life is like on the other end of the process. With law, at least the process is three years, not seven or more.

No one tells you this because there’s still a social and cultural meme about how smart doctors are. Some are. Lots more are very good memorizers and otherwise a bit dull. And you know what? That’s okay. Average doctors seeing average patients for average complaints are fixing routine problems. They’re directing traffic when it comes to problems they can’t solve. Medicine doesn’t select for being well-rounded, innovative, or interesting; if anything, it selects against those traits through its relentless focus on test scores, which don’t appear to correlate strongly with being interesting or intellectual.

Doctors aren’t necessarily associating with the great minds of your generation by going to medical school. Doctors may not even really be associating with great minds. They might just be associating with excellent memorizers. I didn’t realize this until I met lots of doctors, took repeated stabs at real conversations with them, and eventually realized that many aren’t intellectually curious and imaginative. There are, of course, plenty of smart, intellectually curious doctors, but given the meme about the intelligence of doctors, there are fewer than imagined and plenty who see themselves as skilled technicians and little more.

A lot of doctors are the smartest stupid people you’ve met. Smart, because they’ve survived the academic grind. Stupid, because they signed up for med school, which is effectively signing away extraordinarily valuable options. Life isn’t a videogame. There is no reset button, no do-over. Once your 20s are gone, they’re gone forever.

Maybe your 20s are supposed to be confusing. Although I’m still in that decade, I’m inclined to believe this idea. Medical school offers a trade-off: your professional life isn’t confusing and you have a clear path to a job and paycheck. If you take that path, your main job is to jump through hoops. But the path and the hoops offer clarity of professional purpose at great cost in terms of hours worked, debt assumed, and, perhaps worst of all, flexibility. Many doctors would be better off with the standard confusion, but those doctors take the clear, well-lit path out of fear—which is the same thing that drives so many bright but unfocused liberal arts grads into law schools.

I’ve already mentioned prestige and money as two big reasons people go to med school. Here’s another: fear of the unknown. Bright students start med school because it’s a clearly defined, well-lit path. Such paths are becoming increasingly crowded. Uncertainty is scary. You can fight the crowd, or you can find another way. Most people are scared of the other way. They shouldn’t be, and they wouldn’t be if they knew what graduate school paths are like.

For yet another perspective on the issue of not going to med school, see Ali Binazir’s “Why you should not go to medical school — a gleefully biased rant,” which has more than 200 comments as of this writing. Binazir correctly says there’s only one thing that should drive you to med school: “You have only ever envisioned yourself as a doctor and can only derive professional fulfillment in life by taking care of sick people.”

If you can only derive professional fulfillment in life by taking care of sick people, however, you should remember that you can do so by being a nurse or a physician assistant. And notice the words Binazir chooses: he doesn’t say, “help people”—he says “taking care of sick people.” The path from this feeling to actually taking care of sick people is a long, miserable one. And you should work hard at envisioning yourself as something else before you sign up for med school.

You can help people in all kinds of ways; the most obvious ones are by having specialized, very unusual skills that lots of people value. Alternately, think of a scientist like Norman Borlaug (I only know about him through Tyler Cowen’s book The Great Stagnation; in it, Cowen also observes that “When it comes to motivating human beings, status often matters at least as much as money.” I suspect that a lot of people going to medical school are really doing it for the status).

Borlaug saved millions of lives through developing hardier seeds and through other work as an agronomist. I don’t want to say something overwrought and possibly wrong like, “Borlaug has done more to help people than the vast majority of doctors,” since that raises all kinds of questions about what “more” and “help” and “vast majority” mean, but it’s fair to use him as an example of how to help people outside of being a doctor. Programmers, too, write software that can be instantly disseminated to billions of people, and yet those who want to “help” seldom think of it as a helping profession, even though it is.

For a lot of the people who say they want to be a doctor so they can help people, greater intellectual honesty would lead them to acknowledge mixed motives in which helping people is only one and perhaps not the most powerful. On the other hand, if you really want to spend your professional life taking care of sick people, Binazir is right. But I’m not sure you can really know that before making the decision to go to medical school, and, worse, even if all you want to do is take care of sick people, you’re going to find a system stacked against you in that respect.

You’re not taking the best care of people at 3 a.m. on a 12- to 24-hour shift in which your supervisors have been screaming at you and your program has been jerking your schedule around like a marionette all month, leaving your sleep schedule out of whack. Yeah, someone has to do it, but it doesn’t have to be you, and if fewer people were struggling to become doctors, the system itself would have to change to entice more people into medical school.

One other, minor point: you should get an MD and maybe a PhD if you really, really want to do medical research. But that’s a really hard thing for an 18 – 22 year old to know, and most doctors aren’t researchers. Nonetheless, nurses (usually) aren’t involved in the same kind of research as research MDs. I don’t think this point changes the main thrust of my argument. Superstar researchers are tremendously valuable. If you think you’ve got the tenacity and curiosity and skills to be a superstar researcher, this essay doesn’t apply to you.

Very few people will tell you this, or tell you even if you ask; Paul Graham even writes about a doctor friend in his essay “How to Do What You Love”:

A friend of mine who is a quite successful doctor complains constantly about her job. When people applying to medical school ask her for advice, she wants to shake them and yell “Don’t do it!” (But she never does.) How did she get into this fix? In high school she already wanted to be a doctor. And she is so ambitious and determined that she overcame every obstacle along the way—including, unfortunately, not liking it.

Now she has a life chosen for her by a high-school kid.

When you’re young, you’re given the impression that you’ll get enough information to make each choice before you need to make it. But this is certainly not so with work. When you’re deciding what to do, you have to operate on ridiculously incomplete information. Even in college you get little idea what various types of work are like. At best you may have a couple internships, but not all jobs offer internships, and those that do don’t teach you much more about the work than being a batboy teaches you about playing baseball.

Having a life chosen for you by a 19-year-old college student or 23-year-old wondering what to do is only marginally better.

I’m not the first person to notice that people don’t always understand what they’ll be like when they’re older; in “Aged Wisdom,” Robin Hanson says:

You might look inside yourself and think you know yourself, but over many decades you can change in ways you won’t see ahead of time. Don’t assume you know who you will become. This applies all the more to folks around you. You may know who they are now, but not who they will become.

This doesn’t surprise me anymore. Now I acknowledge that I’m very unlikely to be able to gauge what I’ll want in the future.

Contemplate too the psychological makeup of many med students. They’re good rule-followers and test-takers; they tend to be very good on tracks but perhaps not so good off them. Prestige is very important to them, as is listening to one’s elders (who may or may not understand the ways the world is fundamentally changing). They may find the real world large and scary, while the academic world is small, highly directed, and sufficiently confined to prevent intellectual or monetary agoraphobia.

These issues are addressed well in two books: Excellent Sheep by William Deresiewicz and Zero to One by Peter Thiel and Blake Masters. I won’t endorse everything in either book, but pay special attention to their discussions of the psychology of elite students and especially the weaknesses that tend to appear in that psychology.

It is not easy for anyone to accept criticism, but that may be particularly true of potential med students, who have been endlessly told how “smart” they are, or supposedly are. Being smart in the sense of passing classes and acing tests may not necessarily lead you towards the right life, and, moreover, graduate schools and consulting have evolved to prey on your need for accomplishment, positive feedback, and clear metrics. You are the food they need to swallow and digest. Think long and hard about that.

If you don’t want to read Excellent Sheep and Zero to One, or think you’re “too busy,” I’m going to marvel: you’re willing to commit hundreds of thousands of dollars and years of your life to a field that you’re not willing to spend $30 and half a day to understand better? That’s a dangerous yet astonishingly common level of willful ignorance.

Another friend asked what I wanted to accomplish with this essay. The small answer: help people understand things they didn’t understand before. The larger answer—something like “change medical education”—isn’t very plausible because the forces encouraging people to be doctors are so much larger than me. The power of delusion and prestige is so vast that I doubt I can make a difference through writing alone. Almost no writer can: the best one can hope for is changes at the margin over time.

Some med school stakeholders are starting to recognize the issues discussed in this essay: for example, The New York Times has reported that New York University’s med school may be able to shorten its duration from four years to three, and “Administrators at N.Y.U. say they can make the change without compromising quality, by eliminating redundancies in their science curriculum, getting students into clinical training more quickly and adding some extra class time in the summer.” This may be a short-lived effort. But it may also be an indicator that word about the perils of med school is spreading.

I don’t expect this essay to have much impact. It would require people to a) find it, which most probably won’t do, b) read it, which most probably won’t do, c) understand it, which most of those who read it won’t or can’t do, and d) implement it. Most people don’t seem to give their own futures much real consideration. I know a staggering number of people who go to law or med or b-school because it “seems like a good idea.” Never mind the problem with following obvious paths, or the question of opportunity costs, or the difficulty in knowing what life is like on the other side.

People just don’t think that far ahead. I’m already imagining people on the Internet who are thinking about going to med school and who see the length of this essay and decide it’s not worth it—as if they’d rather spend a decade of their lives gathering the knowledge they could read in an hour. They just don’t understand the low quality of life medicine entails for many if not most doctors.

Despite the above, I will make one positive point about med school: if you go, if you jump through all the hoops, if you make it to the other side, you will have a remunerative job for life, as long as you don’t do anything grossly awful. Job demand and pay are important. Law school doesn’t offer either anymore. Many forms of academic grad schools are cruel pyramid schemes propagated by professors and universities. But medicine does in fact have a robust job market on the far end. That is a real consideration. You’re still probably better off being a nurse or PA—nurses are so in-demand that nursing schools can’t grow fast enough, at least as of 2015—but I don’t want to pretend that the job security of being a doctor doesn’t exist.

I’m not telling you what to do. I rarely tell anyone what to do. I’m describing trade-offs and asking if you understand them. It appears that few people do. Have you read this essay carefully? If not, read it again. Then at least you won’t be one of the many doctors who hate what you do, warn others about how doctors are sick of their profession, and wish you’d been wise enough to take a different course.

If you enjoyed this essay, you should also read my novel, Asking Anna. It’s a lot of fun for not a lot of money!


[0] Here’s another anti-doctor article: “Why I Gave Up Practicing Medicine.” Scott Alexander’s “Medicine As Not Seen On TV” is also good. The anti-med-school lit is available to those who seek it. Most potential med students don’t seem to. Read the literature and understand the perils. If after learning you still want to go anyway, great.

Here too is intelligent commenter ktswan, who qualifies the rest of the article. She went from nursing to med school and writes, “I am much happier in medicine than lots of my colleagues, I think in many ways because I knew exactly what I was getting into, what I was sacrificing, and what I wanted to gain from it.”

[1] One could argue that many of the problems in American K – 12 education stem from a captive audience whose presence or absence in a school is based on geography and geographical accidents rather than the school’s merit.

[2] You can read more about the match lawsuit here. Europe doesn’t have a match-like system; there, the equivalent of medical residency is much more like a job.

[3] Stumbling on Happiness did more to change my life and perspective than almost any other book. I’ve read thousands of books. Maybe tens of thousands. Yet this one influences my day-to-day decisions and practices by clarifying how a lot of what people say they value they don’t, and how a lot of us make poor life choices based on perceived status that end up screwing us. Which is another way of saying we end up screwing ourselves. Which is what a lot of medical students, doctors, and residents have done. No one holds the proverbial gun to your head and orders you into med school (unless you have exceptionally fanatical parents). When you’re living your life, at least in industrialized Western countries, you mostly have yourself to blame for poor choices made without enough knowledge to know the right ones.

Thanks to Derek Huang, Catherine Fiona MacPherson, and Bess Stillman for reading this essay.

Why “Man’s Search for Meaning” and Viktor Frankl

I recommend Viktor Frankl’s Man’s Search for Meaning to a fair number of people in a wide array of contexts, and one of my students asked why I included him in a short list of books at the back of the syllabus. Though I’ve mentioned him on this blog a number of times (see here for one example), I hadn’t really considered why I admire his book, and so wanted to take a shot at doing so.

As Frankl says, we’re suffering from a bizarre dearth of meaning in our everyday lives. One can see this in the emptiness that a lot of people report feeling and, more seriously, in suicide rates. In material terms, people in Western societies have never been as well off as we are today—and most of Asia and Latin America, along with much of Africa, are catching up with surprising speed. Yet in “spiritual” terms (I hate that much-abused word but can think of no better one—metaphysical, perhaps?) many of us aren’t doing so well, which is odd, given the cornucopia of goods and opportunities around us. I think Frankl tries to teach us how to better actualize our lives—we truly don’t live by bread alone—and I think he has a keen sense of the malaise many of us feel. I’ve struggled with these issues too and think Frankl’s treatment of them is a good one.

One can see another version or statement of this general problem in Louis CK’s much-linked bit “Everything is amazing right now and nobody is happy.” It has 7 million views, and while YouTube views are hardly a good metric for importance or content, I think CK’s bit has gone viral because he’s touching a profound problem that many people feel, even if they don’t articulate it, or usually won’t articulate to themselves or others.

Many people also seem to feel isolated (see Putnam’s possibly flawed Bowling Alone for one account). Yet because they feel isolated, they have no one to talk to about feeling isolated! The paradox worsens isolation, and there isn’t an obvious outlet for these kinds of feelings or problems. Plus, technology seems to enable crappier and more tenuous relationships, when many of us really want the opposite. That’s partly a problem of the person using the technology—we can talk to anyone, anywhere, despite many of us having nothing to say—but technology also pushes us to use it in particular ways, which is one of my points about how Facebook is bad for relationships.

And people are mostly on their own in dealing with this. Schools, as they’re widely conceived right now, are largely job-training centers rather than places to figure out how you should live your life. So they’re not very helpful. Religion or religious feeling is one answer for some people, but religious thinking or feeling isn’t very satisfying for me and a growing number of people.

I don’t know what is helpful—problems are often easier to see than solutions—but Frankl offers a framework for thinking about leading a meaningful existence through attempting to do the best with what you’ve got and choosing an aim for your life, however small or absurd (Hence: “Nietzsche’s words, ‘He who has a why to live for can bear with almost any how,’ could be the guiding motto for all psychotherapeutic and psychohygienic efforts regarding prisoners. Whenever there was an opportunity for it, one had to give them a why—an aim—for their lives, in order to strengthen them to bear the terrible how of their existence”).

Frankl and Louis CK are hardly the only people to notice this—All Things Shining: Reading the Western Classics to Find Meaning in a Secular Age is a contemporary example of a book tackling similar basic concepts from a different angle. Stumbling on Happiness and The Happiness Hypothesis are others. The fact that this problem persists across decades and arguably becomes more urgent means that I don’t think these books will be the last. As Frankl says in a preface:

I do not at all see in the bestseller status of my book so much an achievement and accomplishment on my part as an expression of the misery of our time: if hundreds of thousands of people reach out for a book whose very title promises to deal with the question of a meaning to life, it must be a question that burns under the fingernails.

Universities for artists: Know your purpose, know what you’re getting

A friend is in his 20s and wants to be a writer. He’s mucked around in college some without amassing enough credits to count towards anything, and he thinks he might want to start at a university again in order to become a better writer. I’ve been discouraging him, because of his age and his stated goals. He started classes again this semester but seems disenchanted with them, and after talking for a while the other night, I wrote a long e-mail that summarized my views and why college is probably the wrong route for him:

If you said to me that you’re tired of working in coffee shops and want an office job in a corporation or government, a degree should be your number one priority. Not only is that not your goal, but your goal is to be a better writer. To accomplish that, school is at best a mixed bag.

At anything below the most elite schools, most students in intro-level writing courses are not particularly good writers or interested in becoming good writers (and even in elite schools, bad writers but good hoop-jumpers abound). Intro courses won’t necessarily be of much help to you. Most intro-level non-writing courses (like “Rocks for Jocks,” AKA geology) are likely to be even worse. My honors students say their classmates in classes like “Love and Romance in the Middle Ages” and “Intro to Art History” are barely literate; the honors students turn in bullshit they’ve slammed out the night before and get 100% because they are, most of them, functionally literate. They complain about not learning anything about writing in their other humanities classes. You will probably have to wade through at least a year or two of courses that provide almost no value to your stated goal—becoming a better writer—before you get a real shot at, say, English classes.

Once you are there, however, many professors aren’t especially interested in teaching, even in English classes, and the effect of many English classes on your writing skills might be small. Does reading Paradise Lost and Gulliver’s Travels and the Romantic poets in a Brit Lit I survey make you a better writer of contemporary fiction, essays, and criticism, if your professor / TA spends no time covering the basics of writing? Will sitting through a lecture on Beckett’s role in the Modernism / Postmodernism divide help you understand better metaphors in your writing, or help you construct a plot that has any actual motion?

The questions suggest the answers. I’m not saying these English classes will hurt you. But I’ve sat through a lot of those classes, and few have anything to do with writing, which is one of my many beefs with English departments and classes; too little time is spent building concrete writing and reading skills, and too much time is spent discussing works of some historical value and very little contemporary value (I’m not convinced Sister Carrie, which is one massive violation of the cliché “Show, don’t tell,” will make you a better novelist today, any more than studying the math of the 1850s in its original context will make you a better mathematician).

Some professors teach close reading and will really work with you to develop your writing skills, especially if you follow the advice I offer. But those experiences are at best hit-and-miss, and more often than not misses. They depend on the professor, and you won’t know if a class might be useful until you’re already in it.

Plus, getting to those classes will probably take a long time and a lot of money and hoop jumping. The more direct route for you is through a writers’ workshop, which almost all communities of any size have.

That’s the learning part of the equation. From the job/status/credential part of the equation, and as I’ve said before, the effect of school on labor market outcomes is quite binary: you have a degree and make a lot more money in the aggregate, or you don’t and you make a lot less money. Starting a degree without finishing it is one of the worst things you can do, speaking financially and in terms of opportunity cost. That’s why it’s so vital for you to either start and finish or not start.

If you were 18 and didn’t know what the hell else to do, I would tell you to go to college because your peers are doing it and most 18-year-olds don’t know anything and waste most of their time anyway. You could noodle around in a lot of classes and maybe learn something and at least you’ll finish with a degree. Beyond that, a lot of college happens between the lines, through living in dorms and developing a peer network. But you’re not 18, you already know something (you do), and you have a (presumed) goal that you don’t necessarily have to go through school to accomplish. If your goal changes—i.e. you decide you don’t want to work in retail or coffee or unskilled labor and you want to get some other kind of job—then my advice will change.

A distressingly small amount of actual learning goes on in college classrooms. You can see this in Arum and Roksa’s Academically Adrift: Limited Learning on College Campuses. You can read a different take by searching for “The Case Against Education,” which is the title of Bryan Caplan’s book concerning signaling / credentialism in education. Or you can look at the people around you, who might be the most compelling argument. People who are really determined to get an education do get it, but outside of the hard sciences, there’s a LOT of bullshit. The stuff that isn’t bullshit will be hard for you to find. Not impossible, but hard. And you don’t get the monetary benefits without finishing.

The college wage premium is still real, but it only applies to people who actually want to work at jobs that require college degrees. If you want to be an engineer, go to college. In “How Liberal Arts Colleges Are Failing America,” Scott Gerber points out that “A degree does not guarantee you or your children a good job anymore. In fact, it doesn’t guarantee you a job: last year, 1 out of 2 bachelor’s degree holders under 25 were jobless or underemployed.” I look around the University of Arizona, and it’s clear to me that a variety of majors—comm and sociology are the most obvious—provide almost no real intellectual challenges and hence no real skills whatsoever. The business school at the U of A seems better, but it’s still hard for me to ascertain, from the outside, if what goes on there really matters.

To recap: I don’t think going to school is bad or will hurt you. But I’m also not convinced that going to school is an optimal use of time / money for you.

I still think that, if you really want to be a writer, the absolute number one thing you have to do is write a lot—and want to write a lot, because the writing itself comes from the desire. In Malcolm Gladwell’s book Outliers, he discusses the research on the “10,000-hour rule,” or the idea that it takes 10,000 hours of deliberate practice to achieve mastery of a skill. I’m not totally convinced that 10,000 hours is the magic number, or that anyone can deliberately practice for 10,000 hours in a given field and master it, but the basic idea—that you have to spend a LOT of time practicing in order to achieve mastery—is sound. To the extent you want to be a writer and that you spend time in classes that are at best tangentially involved with being a writer, I think you are making a mistake in the way you’re allocating your limited time and resources. You might be better off, say, going to the library and reading every Paris Review interview, going back to the beginning, and writing down every quote that speaks to you.

All of us have 24 hours in a day. Any time you spend doing one thing can’t be spent doing another. If you want to become a writer, I think you should allocate most of your time to writing, not to classes, unless you want to be a writer in some officially sanctioned organ, like a newspaper.

Finally, if you want to be a better writer, write stuff (blog posts, novels, essays, whatever) and send them to me. I will give you more detailed feedback than 99% of your professors. With me, the price is also right.

Beyond that, I want to emphasize just how hit-and-miss my education was, especially now that I look back on it. This was clearest to me in high school: as a freshman and sophomore, I had three really good English teachers from whom I learned a lot: Thor Sigmar, Mindy Leffler, and Jack someone, who taught journalism but whose last name now escapes me, though he was very good at what he did and had a very dry, hilarious sense of humor. He also drove a black Miata and was clearly in the closet, at least from the perspective of his students. Then I had two terrible teachers: one named Rich Glowacki, who, distressingly, appears to still be teaching (at least based on a cursory examination of Google), and another named Nancy Potter. The former did an excellent impression of an animatronic corpse and was fond of tests like “What color was the character’s shoe in Chapter 6?” Moreover, one time I came in to talk to him about the “literary terms” he wanted us to memorize for a test. He couldn’t define many of the terms himself; in other words, he was testing us on material that he himself didn’t know.

That moment of disillusionment has stayed with me for a very long time.

The other, Nancy Potter, was so scattered that I don’t think anything was accomplished in her class. She also wrote a college letter of recommendation for me that was so screwed up, and so strewn with typos and non sequiturs, that my Dad and I had to rewrite it for her. When your 18-year-old student is a better and more competent writer than you, the teacher, something is seriously amiss.

In college, I went to Clark University, where pretty much all the professors in all the departments are selected for their interest and skill in teaching. I ran into few exceptions; one was a guy who appeared to be about a thousand years old and taught astronomy. He had trouble speaking and didn’t appear to know what he wanted to speak about on any given day.

Now that I know more about universities, I can only assume he was on the verge of retirement, or was already emeritus, and had been given our class of non-majors because a) he couldn’t do much damage there and b) the department knew it was filled with students who were taking the class solely to fulfill the somewhat bogus science requirement. He didn’t do much damage, except for some infinitesimally small amount to Clark’s reputation, and I assume the other people in the department were happy to avoid babysitting duty.

He, however, was very much the exception at Clark.

Most public colleges and universities are quite different than Clark, and the teaching experience is closer to public high schools, with some good moments and some bad. If your goal is to be an artist, or to learn any kind of skill in depth, you could spend years paying tuition, taking prerequisites of dubious utility, and struggling to find the right teacher or teachers, all without actually accomplishing your goal: learning some kind of skill in-depth.

I don’t think this applies solely to writers, either. If you’re a programmer, there are hacker collectives, or user groups, or equivalents, in many places. Online communities are even more prevalent. I have no idea how good or useful such places and people are. But the price is right and the cost of entry is low. Determined people will find each other. If you’ve got the right attitude towards receiving and processing criticism, you should be ready to take advantage. Knowledgeable people should be able to point you in the direction of good books, which are hard to find. You should signal that you’re ready to learn. If you do those things right, you can get most if not all of what you would normally get out of school. But you also have to be unusually driven, and you have to be able to function without the syllabus/exam/paper structure imposed by school. If you can’t function without the external imposition of those constraints, however, you’re probably not going to make it as an artist anyway. The first thing you need is want. The second thing you need is tenacity. The first is useless without the second.

Stories like “Minimum Viable Movie: How I Made a Feature-Length Film for $0” should inspire you, especially because you need even less money to be a writer than you do to make a movie. Arguably you also need less money to be a musician than to make a movie, although I’m less knowledgeable on that subject and won’t make absolute pronouncements on it.

Again, I am not anti-school, per se, but it is important to understand how much of school is about signaling and credentialing, and how easy a lot of school is if you’re willing to stay quiet, keep your ducks in a row, and jump through the hoops presented. It’s also important to understand the people who benefit most from offering arts training: the instructors. They get a (relatively) light teaching load, the possibility of tenure, a cut of your tuition, and time and space to pursue their passion, while you pay for their advice. Getting a gig as a creative writing professor is pretty damn sweet, regardless of the outcomes for students. That doesn’t mean creative writing professors can’t be very good, or very helpful, or dedicated to teaching, or able to improve your work, but it does mean that you should be cognizant of what benefits are being derived in any particular economic transaction. When small amounts of money are involved, it’s easy to ignore the economic-transaction part of school, but now that tuition is so high, it’s impossible for anyone but the stupendously rich to ignore financial reality, like who gains the most when you enroll in a creative writing seminar.

As a side note, I think we’re already starting to see a shift away from the college-for-everyone mentality (that’s what the posts by Gerber and others are doing). Ironically enough, the universities themselves are involved in a perverse loan-based system whose present incentives are eventually going to drive their customer base away through price hikes. Universities are still going to be good deals and useful for some people, but those people will probably turn out to be more intellectual and analytical—the kinds of people who will benefit from knowledge dissemination and who will ultimately feel the need to create new knowledge. I also suspect a lot of non-elite private schools are going to have even larger problems than public schools. This isn’t a novel argument, but that doesn’t make it any less real, or any less likely to happen.

Anyway, I’m broadening the view too far here. The important thing is that you understand yourself and understand the system that you’re entering and how it incentivizes its participants. If you understand that, I think you’ll increasingly understand my skepticism about the utility of college classes for someone in your situation.

The stupidity of what I’m doing and the meaning of real work: Reading for PhD comprehensive exams

Last weekend, I wrote a flurry of posts after months of relative silence because I needed to do real work.

This might sound strange: I am doing a lot of things, especially reading, but all of it is make-believe, pretend work. That’s because the primary thing I’m doing is studying for PhD comprehensive exams in English lit. The exam set is structured in four parts: three four-hour written segments and a single oral exam, on topics related to stuff that’s not very important to me and probably not very important to most people. The exams also aren’t very relevant to being an English professor, because the key skill that English professors possess and practice is writing long-form essays/articles that are published in peer-reviewed journals. The tests I’m taking don’t, as far as I can tell, map very effectively to that skill.

As a consequence, the tests, although very time consuming, aren’t very good proxies for what the job market actually wants me to do.*

Consequently, PhD exams—at least in English—aren’t real work. They’re pretend work—another hoop to be jumped through on the way to getting a union card. Paul Graham makes a useful distinction in “Good and Bad Procrastination,” when he says that “Good procrastination is avoiding errands to do real work.” That’s what I’ve done through most of grad school, and that’s part of the reason why I have a fairly large body of work on this blog, which you can obviously read, and a fairly large body of fiction, which you can’t (at the moment, but that’s going to change in the coming months). To Graham, the kind of small stuff that represents bad procrastination is “Roughly, work that has zero chance of being mentioned in your obituary.” Passing exams has zero chance of being mentioned in my obituary. Writing books or articles does.** PhD exams feel like bad procrastination because they’re not really examining anything useful.

They’re also hard, but hard in the wrong way, like picking patterns out of noise. Being hard in the right way means the soreness you get after working out, or when a challenging math problem suddenly clicks. The quasi-work I’m doing is intellectually unsatisfying—the mental equivalent of eating ice cream and candy all day, every day. Sure, they’re technically food, but you’re going to develop some serious problems if you persist in the ice cream and candy diet. The same is true of grad school, which might be why so many people emerge from it with a lugubrious, unpalatable writing style. Grad school doesn’t select or train for style; it selects and trains for a kind of strange anti-style, in which saying less in more words is rewarded. It’s the kind of style I’m consciously trying to un-cultivate, however hard the process might be, and this blog is one outlet for keeping the real writer alive in the face of excessive doses of tedious but canonized work and literary theory. Exams, if anything, reinforce this bogus hardness. If I’m ever in a position of power in an English department with a grad program, I’m going to try to offer an alternative to conventional exams, and say that four to six publishable, high-quality papers can or should take their place. That, at least, mirrors the skills valued by the job market.

The bogosity of exams relates to a separate problem in English academia, which I started noticing when I was an undergrad and have really noticed lately: the English curriculum is focused on the wrong thing. The problem can be stated concisely: Should English departments teach content (like, say, Medieval poetry, or Modernist writers), or skills (like writing coherently and close reading)? Louis Menand describes the issue in The Marketplace of Ideas:

[C]ompare the English departments at two otherwise quite similar schools, Amherst and Wellesley. English majors at Wellesley are required to take ten English department courses [. . .] All English majors must take a core course called ‘Critical Interpretations’; one course on Shakespeare; and at least two courses on literature written before 1900 [. . .] The course listing reflects attention to every traditional historical period in English and American literature. Down the turnpike at Amherst, on the other hand, majors have only to take ten courses ‘offered or approved by the department’—in other words, apparently, they may be courses in any department. Majors have no core requirement and no period requirements. (Menand 89-90)

Most departments right now appear to answer “content.” Mine does. But I increasingly think that’s the wrong answer. I’m not convinced that it’s insanely important for undergrads to know Chaucer, or to have read Sister Carrie and Maggie: Girl of the Streets, or to have read any particular body of work. I do think it’s insanely important for them to have very strong close reading skills and exceptional writing skills. Unfortunately, I appear to be in the minority of professional Englishers in this respect. And I’m in grad school, where the answer mostly appears to be “content,” and relatively few people appear to be focusing on skills; those are mostly left to individuals to develop on their own. I don’t think I’ve heard anyone discuss what makes good writing at conferences, in seminars, or in peer-reviewed papers (MFA programs appear to be very interested in this subject, however, which might explain some of their rise since 1945).

As Menand points out, no one is sure what an “‘English’ department or degree is supposed to be.” That’s part of the field’s problem. I think it’s also part of the reason many students are drawn to creative writing classes: in those, at least the better ones, writing gets taught; the reading is more contemporary; and I think many people are doing things that matter. When I read the Romantic Poets, I mostly want to do anything but read the Romantic Poets. Again, I have nothing against the Romantic Poets or against other people reading the Romantic Poets—I just don’t want to do it. Yet English undergrad and grad school forces the reading of them. Maybe it should. But if so, it should temper the reading of them with a stronger focus on writing, and what makes good writing.

Then again, if English departments really wanted to reward the production of real content, they’d probably structure the publishing of peer-reviewed articles better. Contrary to what some readers have said in e-mails to me, or inferred from what I’ve written, I’m actually not at all opposed to peer review or peer-reviewed publications. But the important thing these days isn’t a medium for publishing—pretty much anyone with an Internet connection can get that for free—but the imprimatur of peer review, which says, “This guy [or gal] knows what he’s talking about.” A more intellectually honest way to go about peer review would be for every academic to have a blog or website. When they have an article ready to go, they post it, send a link to an editor, and ask the editor to kick it out to a peer reviewer. The reviewer’s comments, whether anonymous or not, get appended to the article. If the article is accepted, it gets a link, and perhaps the full text, on the “journal’s” main page. If it doesn’t, readers can judge its merits or lack thereof for themselves.
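The workflow sketched above is easy to make concrete. Here is a minimal, hypothetical sketch in Python (the class and field names are mine, not any real system’s): the article lives at a permanent URL on the author’s own site, reviewer comments get appended whether or not the piece is accepted, and the “journal” is nothing more than a list of links to accepted work.

```python
from dataclasses import dataclass, field

@dataclass
class Review:
    reviewer: str   # may simply be "anonymous"
    comments: str

@dataclass
class Article:
    author: str
    url: str        # permanent home on the author's own blog or site
    reviews: list = field(default_factory=list)
    accepted: bool = False

@dataclass
class OverlayJournal:
    name: str
    accepted_links: list = field(default_factory=list)

    def review(self, article, reviewer, comments, accept):
        # Comments are appended to the article either way, so readers
        # can judge rejected work for themselves.
        article.reviews.append(Review(reviewer, comments))
        article.accepted = accept
        if accept:
            self.accepted_links.append(article.url)

journal = OverlayJournal("Hypothetical Studies Quarterly")
paper = Article("A. Scholar", "https://example.com/omniscience-essay")
journal.review(paper, "anonymous", "Sound argument; accept.", accept=True)
```

The point of the sketch is how little infrastructure the imprimatur actually requires: the scarce thing is the reviewer’s judgment, not the publishing medium.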

The sciences arguably already have this, because important papers appear on arXiv.org before they’re officially “published.” But papers in the sciences appear to be less status-based and more content-based than papers in the humanities.

I think this change could happen in the humanities, very slowly, over time; it won’t be fast because there’s no reason for it to be fast, and the profession’s gatekeepers are entrenched and have zero incentive to change. If anything, they have a strong incentive to maintain the system, because doing so raises their own status and increases their own power within the profession. So I don’t foresee it happening soon, even though it would be an improvement. Then again, academics are almost always behind the important thing: the important thing is happening in some marginal, liminal space, and academics inhabit a much more central area, where it’s easy to ignore stuff at the margins. I don’t see that changing either, especially in a world where many people compete for few academic slots. In that world, pointless hoop-jumping is going to remain.


* There’s a vast literature in industrial organization on the subject of hiring practices, and most of that literature finds that the most effective ways to hire workers are to give them an IQ test and a work-skills or work-sample test. The former is effectively illegal in the U.S., so the best bet is to give workers a test of the thing they’ll actually be called on to do.

** I also consciously ask myself this question set:

In his famous essay “You and Your Research” (which I recommend to anyone ambitious, no matter what they’re working on), Richard Hamming suggests that you ask yourself three questions:

1. What are the most important problems in your field?

2. Are you working on one of them?

3. Why not?

I have an answer to number three, but it doesn’t seem like a very good one.

What you should know BEFORE you start grad school / PhD programs in English Literature: The economic, financial, and opportunity costs

This post started life as an e-mail to a high school teacher who is thinking about grad school in English Lit. I expanded and cleaned it up slightly for the blog, but the substance remains.

Pleasure meeting you the other day. I’m all too well-versed in the anti-grad-school lit, and the short version of this e-mail is “don’t go to grad school in the humanities.” If you go anyway, make sure you have an obvious fallback career; don’t assume that you’ll figure it out after five to ten years. Grad school is not a good place to pointlessly delay adulthood (a phrase we’ll come back to later).

Let me start with Thomas Benton’s articles, like “The Big Lie About the ‘Life of the Mind’” and “Graduate School in the Humanities: Just Don’t Go” in the Chronicle of Higher Education. Read both. Read both twice. Then read Louis Menand’s The Marketplace of Ideas, and pay special attention to the sections where he discusses supply and demand: I get the sense that a lot of people spend more time deeply, critically thinking about fun restaurants for dinner tonight than whether grad school is really a good idea. I’m not saying you’re one of those people, but the number of would-be researchers who do almost no research in evaluating their grad school decisions is astounding. Menand’s basic point is simple: most people in English PhD programs are not going to be researchers and tenure-track professors at universities. [1] Some number will, but that number is tiny.

Don’t put too much stock in stories like “From Graduate School to Welfare: The Ph.D. Now Comes With Food Stamps,” but they’re being told and repeated for reasons. People like the woman featured have made spectacularly bad life choices, and, while she’s an extreme example, many would-be professors eventually curse themselves for starting grad school. If I didn’t have a second job working for a real business for real money, I’d probably be close to qualifying for food stamps (without that real job, however, I wouldn’t have made it this far in grad school, because it’s almost impossible to live a reasonably normal life on $13,000 – $16,000 per year).

I know grad students who can’t get a $7 sandwich at Paradise Bakery because it’ll blow their food budget for the month. They have to bring lunch to campus every day because they can’t afford not to. Tired in the morning? Tough: make your bean-sprout sandwich or your lentil curry. Personally I like bean-sprout sandwiches and lentil curry, but I also like the option of buying lunch on a whim. Not having any money also sucks when you need or want a book, can’t get it easily or expeditiously from the library, and find yourself unable to buy it for $30. Someone who has four years of undergrad and two or more years of grad school should be able to buy a sandwich without carefully thinking about the financial repercussions.

Consider what you’ve got right now, today. You’re a teacher, so I’ll guess you make ~$30,000 – $40,000 a year. Call it $35,000. If you spend five years getting a PhD on a ~$15,000 stipend, you’ll be giving up at least $100,000 ($35,000*5 = $175,000, versus $15,000*5 = $75,000) relative to what you’d make teaching high school. And that’s not taking into account the raises you might get as a teacher, or the benefits, which can be substantial (especially if you’re on a 30-year retirement track). If you take 10 years, like the median PhD student, you’ll be giving up $225,000, again not counting benefits, which are far better for a teacher than they are for a grad student. Accounting for retirement benefits, you might be giving up more like $300,000. A lot of money, no?
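The arithmetic above reduces to a few lines. This is only a back-of-the-envelope sketch using the round numbers from the paragraph: flat salaries, no raises, no benefits, which is why the ten-year figure comes out below the $225,000 the text reaches once raises are factored in.

```python
teacher_salary = 35_000   # rough high-school teaching salary from the text
grad_stipend = 15_000     # rough humanities grad stipend

def foregone_income(years):
    """Income given up by taking the stipend instead of the salary."""
    return (teacher_salary - grad_stipend) * years

print(foregone_income(5))   # 100000: the five-year case
print(foregone_income(10))  # 200000: ten flat years; raises and benefits widen the gap
```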

If you get a tenure-track job, you could conceivably make up that amount over the course of your lifetime, but, remember, you’re not even likely to make that much as a TT prof; I’ve asked the University of Arizona’s tenure-track but not-yet-tenured faculty gauche money questions, and they report making about $50,000 a year—and U of A is a plum, super-competitive job straight out of grad school. It’s certainly possible to make less and work more. You can do the math on how long you’ll have to work to financially make up for income foregone during grad school. It’s ugly.

If you don’t get a tenure-track job, you may wish very deeply for a couple extra hundred thousand dollars. These are loose numbers, but no one I’ve floated them to has disputed them. I’d guess that making them more precise by counting opportunity and investment costs would only tilt them further toward being a teacher, given how much of one’s lifetime income from teaching is backloaded into retirement pay.

So who’s grad school good for? Again, let’s follow the money, and I’ll use the University of Arizona as an example because that’s where I am. The out-of-state credit-hour fee for undergrads for Spring 2012 was $1,024. For in-state students it was $651. About a quarter of Arizona undergrads come from out-of-state. Grad students teach about 50 freshmen per semester, or about 100 per year. That’s $48,825 in in-state tuition collected and $25,600 in out-of-state tuition—but each grad student teaches three credit hours, so triple those numbers. They’re $76,800 for out-of-state students and roughly $146,000 for in-state students, for a total of about $222,800. Some of that money goes to profs who run grad seminars, to facilities, to various other administrative functions, and so on. (Grad students also get a couple of one-semester, one-class waivers.) But the basic calculation shows why the university as a whole likes grad students, a lot.
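A quick sketch of the tuition arithmetic, using only the figures quoted above (the ~$146,000 and ~$222,800 in the text are these numbers, rounded):

```python
# Spring 2012 University of Arizona per-credit-hour fees, from the text.
in_state_fee = 651
out_state_fee = 1_024
credit_hours = 3      # each grad student teaches a three-credit course
students = 100        # ~50 freshmen per semester, two semesters per year

out_state_students = students // 4               # about a quarter from out of state
in_state_students = students - out_state_students

in_state_revenue = in_state_students * in_state_fee * credit_hours     # 146475
out_state_revenue = out_state_students * out_state_fee * credit_hours  # 76800
total = in_state_revenue + out_state_revenue                           # 223275

print(total)  # versus a $13,000 - $16,000 stipend for the instructor
```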

Most universities love ABDs, who consume minimal university resources. Menand says:

One pressure on universities to reduce radically the time to degree is simple humanitarianism. Lives are warped because of the length and uncertainty of the doctoral education process. Many people drop in and drop out and then drop in again; a large proportion of students never finish; and some people have to retool at relatively advanced ages. Put in less personal terms, there is a huge social inefficiency in taking people of high intelligence and devoting resources to training them in programs that half will never complete for jobs that most will not get. Unfortunately, there is an institutional efficiency, which is that graduate students constitute a cheap labor force. There are not even search costs involved in appointing a graduate student to teach. The system works well from the institutional point of view not when it is producing PhDs, but when it is producing ABDs […] The longer students remain in graduate school, the more people are available to staff undergraduate classes. Of course, the overproduction of PhDs also creates a buyer’s advantage in the market for academic labor.

There’s little incentive for universities to speed up the grad school process. If anything, their financial incentive is to slow it further, and this is what we see. Regardless of their marketing, remember that universities are businesses, and businesses prefer to pay less for labor, not more, just as you probably prefer to pay less for goods and services, not more. Many articles decry the state of the adjunct labor force, but universities treat adjuncts like they do because they can. Supply and demand exist and they matter.

Most people I know who aren’t in grad school and talk about going discuss the life of the mind, the transformative power of education, how they want to be a professor, their interest in teaching, their love of research and so forth. Most people I know who are in grad school talk about finances, economics, and the job market. Not all the time, to be sure, and I’ve had some lovely conversations about The Professor of Desire and Billy Collins and Heart of Darkness. But jobs and money are on almost everyone’s mind, especially as peers from high school or college are getting jobs at Google, or finishing their residencies, or getting promoted enough to discuss their “401(k),” which is a sure sign of aging, along with in-depth real estate analysis—remember back when we only talked about sex and art? Neither do I.

Many grad students remain in a state of financial adolescence for a decade of their prime career-building years. Don’t do that. Become an adult: you’ll have to eventually, and the skills you build outside the academy are often more valuable than those you might build in humanities grad schools.

Some grad students complain about being financially exploited by universities, but it’s hard to exploit highly educated people who have terrific reading and writing skills and who should know better, or at least do some cursory research before they spend as long as a decade getting a degree. The anti-grad school literature is vast—and highly accessible: type “Why shouldn’t I go to grad school?” into a search engine. Spend a few hours with the results.

People who aren’t in grad school, along with people who are professors and have jobs, also talk about wanting to be involved with “the Conversation” (I capitalize “Conversation” in my head), which means the book chat that happens in peer-reviewed journals and books about writers and ideas. But if you want to contribute to the Conversation, get a blog from http://www.wordpress.com or http://www.substack.com and start producing valuable work. Comment on the work of other book people. Write about what you notice. This won’t get you tenure, and it will probably not get you read by other professors, but, if you’re any good, you will probably have more readers than the average literary journal. See “No One Really Reads Academic Papers” and “The Research Bust.” In writing a blog no one has heard of, I’ve had greater impact and reach than the published work of 98% of tenured humanities professors. The paucity of most humanities professors’ intellectual ambition is astounding, when you really think about it.

To be sure, some people succeed in grad school. Maybe I’ll be one, although this looks increasingly less likely. A PhD is not a lottery ticket, but it can start to feel like one. If you do go, you better know the odds and know the costs, financial and otherwise. You better know that there are very, very few tenure track jobs, though there are a lot of one-year gigs at random places that are happy to offer you not very much money for not very good job security. The system is rigged against you. Humanities academics are often very interested in talking about all kinds of exploitation, but they very rarely want to talk about the exploitation that happens in grad school itself. Play games you’re likely to win, not games you’re likely to lose. Choose status ladders to climb that matter, not ones that mattered 50 years ago.

Too many people—maybe most—enter grad school so they can pointlessly delay adulthood. Adulthood, however, arrives sooner or later anyway. Too many people enter grad school because they’ve succeeded by conventional academic metrics and hoop-jumping through most of their lives and find the big, amorphous real world terrifying. But grad school, if it was ever a good way of avoiding the real world, surely isn’t now, because the real world is a far harsher place when you’re 32 and have a degree of dubious value and are trying to cobble gigs together to pay rent. See again the link above concerning PhDs on food stamps.

There are also dangers that are rarely discussed. In humanities PhD programs, dissertation advisors and committee members may be distant or unhelpful. Outright theft of work is rare, but indifference is common. It’s possible for a single person to block or retard individual progress in a way that’s rare in normal jobs. A committee can offer no feedback, or only positive feedback, and then outright reject a dissertation. A sudden retirement, departure, or sabbatical can imperil years of a candidate’s work. You don’t want to get into a situation where a single person can annihilate your career. That’s what grad school in the humanities often means.

I don’t know anyone in the business who is really gung-ho about encouraging smart, motivated undergrads and recent graduates to go to humanities grad programs.

In addition, if you don’t thoroughly read everything I’ve linked to in this post, you shouldn’t go to grad school because you haven’t invested enough time in thinking about and learning about what you’re getting into.

Some of the problems above could be ameliorated, if it were in the system’s interest to do so (it’s not: universities’ finances are enabled by the cruel student-loan system, while professors like the system, with the status and modest amounts of power it grants them, as it is). Eliminating tenure would help, because few schools want to make what might be 40+ year commitments to salary + benefits if they don’t have to. A shift to long-term contracts would be an improvement at the margins.

I’ve seen some proposals that universities offer a four-year “teaching PhD” that is awarded primarily on the basis of coursework; since most PhD students are at most going to become adjuncts or lecturers anyway, one might as well drop the facade that currently exists. The teaching dissertation would be a collection of coursework and/or experiment descriptions, depending on the field. Something like this paragraph could have been written any time in the last 15 or 20 years, and the system trundles along because it works well enough and a sufficient number of people are willing to chase the tenure dream to keep it going.

EDIT 2016: When I first wrote this in 2012 I was still in grad school. I’m updating it in January of 2016. Let me be blunter: going to grad school in the humanities is an idiotic life choice that will likely fuck up your life. Of the people I know who were my approximate grad school peers, two live at home; one works at an Apple Store; another works in a preschool; another is teaching the SAT, LSAT, and the like for one of the big companies that pay $15 – $20 an hour for such work; and a couple are adjuncts. A few have short-term contracts. Only one or two have the tenure-track positions they were training for.

If you must, must, must go to grad school despite knowing how dumb doing so is, quit after two years with an M.A. Don’t waste years of your life. There is often a false dichotomy presented between the “life of the mind” and pursuing lots of filthy money, but it’s reasonable to seek reasonable material conditions while pursuing the life of the mind. If you can’t achieve reasonable material conditions you should do something else, and that something else may enable the true life of the mind, not the Potemkin life of the mind offered by most humanities graduate degree programs.

Further reading:

* Most universities hire exclusively from elite universities. If you don’t attend an elite university, you’re unlikely to get a job regardless of your publishing record.

* Robert Nagel’s “Straight Talk about Graduate School.”

* “Open Letter to My Students: No, You Cannot be a Professor.”

* Penelope Trunk’s “Don’t try to dodge the recession with grad school,” as well as “Best alternative to grad school” and “Voices of the defenders of grad school. And me crushing them.”

* As of 2015, “The Job Market for Academics Is Still Terrifying.” Fewer than half of humanities PhDs are “employed” (by whatever metric the survey uses) and about 35% are unemployed altogether—at least three times the national unemployment rate, a rate that also counts high-school dropouts.

* If you are male, see “Insanity in academia, or, reason #1,103 why you should stay out of grad school: Kangaroo courts” to better understand the culture you seek to join. You’re an accusation away from having your career destroyed.

* “The New Intellectuals: Is the academic jobs crisis a boon to public culture?” (Note the sections about the bogosity of peer review and the economic precariousness of the “new intellectuals”).


[1] Menand also writes:

Between 1945 and 1975, the number of American undergraduates increased 500 percent, but the number of graduate students increased by nearly 900 percent. On the one hand, a doctorate was harder to get; on the other, it became less valuable because the market began to be flooded with PhDs.

This fact registered after 1970, when the rapid expansion of American higher education abruptly slowed to a crawl, depositing on generational shores a huge tenured faculty and too many doctoral programs churning out PhDs. The year 1970 is also the point from which we can trace the decline in the proportion of students majoring in liberal arts fields, and, within the decline, a proportionally larger decline in undergraduates majoring in the humanities. In 1970–71, English departments awarded 64,342 bachelor’s degrees; that represented 7.6 percent of all bachelor’s degrees, including those awarded in non-liberal arts fields, such as business. The only liberal arts category that awarded more degrees than English was history and social science, a category that combines several disciplines. Thirty years later, in 2000–01, the number of bachelor’s degrees awarded in all fields was 50 percent higher than in 1970–71, but the number of degrees in English was down both in absolute numbers—from 64,342 to 51,419—and as a percentage of all bachelor’s degrees, from 7.6 percent to around 4 percent.

Fewer students major in English. This means that the demand for English literature specialists has declined.

The number of undergrads in English Lit has declined while the number of people getting PhDs has remained constant or risen. There is basically no industry for English PhDs to enter. You do not have to be an economist to understand the result.

Bad academic writing:

I’m reading for an essay on Tom Perrotta’s Election and Anita Shreve’s Testimony and came across this, from Timothy Aubry’s “Middlebrow Aesthetics and the Therapeutic: The Politics of Interiority in Anita Shreve’s The Pilot’s Wife”: “Although occasionally called upon to perform certain emeritus functions, the omniscient narrator has retired decisively from the scene of contemporary United States fiction.” Translated from academic-ese to English, this roughly means, “Contemporary writers seldom use omniscient narrators.” If absolutely necessary, you could say, “Contemporary American writers seldom use omniscient narrators.”

EDIT: And, for an entertaining counterpoint, Paul Dawson says in “The Return of Omniscience in Contemporary Fiction”:

I want to begin this essay by pointing out what I think has become a salient feature, or at least significant trend, in contemporary British and American literary fiction: namely, a prominent reappearance of the ostensibly outmoded omniscient narrator. In the last two decades, and particularly since the turn of the millennium, a number of important and popular novelists have produced books which exhibit all the formal elements we typically associate with literary omniscience: an all-knowing, heterodiegetic narrator who addresses the reader directly, offers intrusive commentary on the events being narrated, provides access to the consciousness of a range of characters, and generally asserts a palpable presence within the fictional world.

So what’s happening to omniscient narrators? Are they “seldom use[d]” or making “a prominent reappearance”? Or both?

Why I try not to be too hard on students:

I was going through some old boxes from my parents’ house and found papers and stories I’d written as a freshman and sophomore in college; while the papers weren’t bad, the stories were terrible. The kind of terrible that gets justifiably mocked in academic novels like Blue Angel. The kind that would make many instructors throw up their hands in dismay and their lunch in nausea. The kind that makes me wonder what the hell could’ve made me want to keep going. Actually, I don’t wonder, because the answer is probably ignorance—the sort of ignorance I’ve been trying to cure, probably futilely, ever since.

Reading Eileen Pollack’s “Flannery O’Connor and the New Criticism: A Response to Mark McGurl” reminded me of those early experiences (the article is behind a bullshit paywall, by the way):

The careless inclusion of random details or digressions or the unintentional revelation of aspects of one’s own character are precisely what gets beaten out of a student even in the nicest, most tactful workshop (let alone the considerably more venomous workshops that tend to be the norm at Iowa). Even if novice writers are not narcissists in the therapeutic sense, they have rarely had the experience of writing for a disinterested audience (i.e., readers other than their mothers and doting high school English teachers). This means that a story’s prose must be coherent, the plot comprehensible, the characters (and the world they live in) believable and consistent (even if the story isn’t meant to be realistic). Writers soon learn that their classmates do not want to read a 20-page digression about a character’s fight with her parents because they did not buy her a BMW for her sixteenth birthday, or the endless details of a championship high school football game, or an angry fantasy about raping and mutilating a beautiful woman who rejects the main character’s amorous advances, or a sermon against abortion or nuclear war. A student may never receive a lecture on craft or the tenets of the New Criticism, but by handing out drafts of a story and listening to detailed responses from readers of all sorts, he or she will learn how best to convey the ideas and emotions he or she intends to convey and not include anything that reveals aspects of his or her psyche or autobiography that are irrelevant and/or embarrassing.

I gave and got criticism that was designed to beat out the merely random, and the class’s response was certainly a useful, if “venomous” and vitriolic, way of imparting important messages about audience reaction. If I teach fiction writing classes, I might include Pollack’s paragraph in the syllabus, even if the experience of attempting to write about not receiving a BMW or football games will probably still be necessary for students to get the lesson.

Looking at that old work shows me that, if I didn’t have the particular problems Pollack enumerates, I still had one analogous to them. When I’m looking at student work, I should remember my own development at an equivalent age.

The academic papers, fortunately, were much better, and I’d seen a couple of them relatively recently. As a first-year grad student, I split a two-bedroom apartment with another first-year, and one night we found and traded papers from when we were freshmen, mostly out of curiosity: would we have lived up to the standards we imposed on others? The answer, fortunately, was yes, though I wouldn’t rank the papers I wrote as a freshman among the best my students have written over the last three and a half years.

Beyond critiquing it, however, the writing is itself a window onto the person I used to be, who has become a stranger to me over time, kept alive only through the random bits of writing he chose to commit to paper or hard drive and drag around.

College graduate earning and learning: more on student choice

There’s been a lot of talk among economists and others lately about declining wages for college graduates as a group (for example: Arnold Kling, Michael Mandel, and Tyler Cowen) and males in particular. Mandel says:

Real earnings for young male college grads are down 19% since their peak in 2000.
Real earnings for young female college grads are down 16% since their peak in 2003.

See the pretty graphs at the links. These accounts are interesting but don’t emphasize, or don’t emphasize as much as they should, student choice in college majors and how that affects earnings. In “Student choice, employment skills, and grade inflation,” I said that colleges and universities are, to some extent, responding to student demand for easier classes and majors that probably end up imparting fewer skills and paying less. I’ve linked to this Payscale.com salary data chart before, and I’ll do it again; the majors at the top of the income scale are really, really hard and have brutal weed-out classes for freshmen and sophomores, while those at the bottom aren’t that tough.

It appears that students are, on average, opting for majors that don’t require all that much effort.

From what I’ve observed, even naive undergrads “know” somehow that engineering, finance, econ, and a couple other majors produce graduates who earn more, yet many end up majoring in simple business (notice the linked NYT article: “Business majors spend less time preparing for class than do students in any other broad field, according to the most recent National Survey of Student Engagement [. . .]”), comm, and other fields not noted for their rigor. As such, I wonder how much of the earnings picture in those graphs is about declining wages as such and how much of it is really about students choosing majors that don’t impart job skills or knowledge (cf. Academically Adrift, etc.) but do leave plenty of time to hit the bars on Thursday night. Notice too what Philip Babcock and Mindy Marks found in “The Falling Time Cost of College: Evidence from Half a Century of Time Use Data”: “Full-time students allocated 40 hours per week toward class and studying in 1961, whereas by 2004 they were investing about 26 to 28 hours per week. Declines were extremely broad-based, and are not easily accounted for by compositional changes or framing effects.”

If students are studying less, maybe we shouldn’t be surprised that their earnings decline when they graduate. I can imagine a system in which students are told that “college” is the key to financial, economic, and social success, so they go to “college” but don’t want to study very hard or learn much. They want beer and circuses, so they choose majors in which they don’t have to. Schools, in the meantime, like the tuition dollars such students bring—especially when freshmen and sophomores are often crammed into 300 – 1,000-person lecture halls that are extraordinarily cheap to operate, because students are charged the same amount per credit hour for a class of 1,000 as they are for a seminar of 10. Some disciplines increasingly weaken their offerings in response to student demand.

Business appears to be one of those majors. It’s in the broad middle of Payscale.com’s salary data, which is interesting given how business majors presumably go into their discipline in part hoping to make money—but notice too just how many generic business majors there are. The New York Times article says “The family of majors under the business umbrella — including finance, accounting, marketing, management and “general business” — accounts for just over 20 percent [. . .] of all bachelor’s degrees awarded annually in the United States, making it the most popular field of study.” That’s close to what Louis Menand reports in The Marketplace of Ideas: “The biggest undergraduate major by far in the United States is business. Twenty-two percent of all bachelor’s degrees are awarded in that field. Ten percent of all bachelor’s degrees are awarded in education.” If all these business majors graduate without any job skills, maybe we shouldn’t be all that surprised at their inability to command high wages when they graduate.

I’d like to know: has the composition of majors changed over the years Mandel documents? If so, from what to what? Menand has some coarse data:

There are almost twice as many bachelor’s degrees conferred every year in social work as there are in all foreign languages and literatures combined. Only 4 percent of college graduates major in English. Just 2 percent major in history. In fact, the proportion of undergraduate degrees awarded annually in the liberal arts and sciences has been declining for a hundred years, apart from a brief rise between 1955 and 1970, which was a period of rapidly increasing enrollments and national economic growth. Except for those fifteen unusual years, the more American higher education has expanded, the more the liberal arts sector has shrunk in proportion to the whole.

But he’s not trying to answer questions about wages. Note too that my question about composition is a genuine one: I have no idea what the answer is.

One other major point: if Bryan Caplan is right about college being about signaling, then there might also be a larger composition issue than the one I’ve already raised: people who aren’t skilled learners and who don’t have the willingness or capacity to succeed after college may be increasingly attending college. In that case, the signal of a college degree isn’t as valuable because the people going through college aren’t as good—they’re on the margins, and the improvement to their skillset is limited. Furthermore, colleges and universities aren’t doing all that much to improve that skillset—see again Academically Adrift.

I don’t know what, if anything, can be done to improve this dynamic. Information problems about which college majors pay the most don’t seem to be a major issue, at least anecdotally; students know that comm degrees are easy and other, more lucrative degrees are hard. There may be Zimbardo / Boyd-style time preference issues going on, where students want to consume present pleasure in the form of parties and “hanging out” now at the expense of earnings later, and universities are abetting this in the form of easy majors.

This is the part where I’m supposed to posit how the issues described above might be fixed. But I don’t see any solutions, whether top-down or bottom-up, and I don’t see strong incentives on the part of any major actors to find them: I don’t think the information asymmetry is all that great, and consumption preferences mean that, even with better information, students might still choose comm and generic business.

Mandel ends his post by saying, “Finally, if we were going to design some economic policies to help young college grads, what would they be?” The answer might be something like, “make university disciplines harder, so students have to learn something by the end,” but I don’t see that happening. That he asks the question indicates to me he doesn’t have an answer either. If there were one, we wouldn’t have a set of interrelated problems regarding education, earnings, globalization, and economics, which aren’t easy to disentangle.

Although I don’t have solutions, I will say this post is a call to pay more attention to how student choices and preferences affect education and earnings discussions.

EDIT: See also College has been oversold, and pay special attention to the data on arts versus science majors. I say this as someone who majored in English and is now in grad school in the same subject, but from anecdotal observation I would guess that about 75% of people in humanities grad programs are pointlessly delaying real life.

Thoughts on the first 100 pages of Jeffrey Eugenides' The Marriage Plot

1) I would have stopped reading The Marriage Plot if it weren’t also related to some of my academic work. It captures the feel of slogging through a 19th-century novel. As you might imagine, this isn’t a compliment.

2) Until about 100 pages in, no characters have real problems. They have fake, rich-college-student problems. I’m not opposed to such problems for the people experiencing them—I remember having similar ones and thinking they were significant at the time, too—but the first real problem, Leonard’s psychotic breakdown, should arrive closer to page 40 or 50. Madeleine’s minor undergraduate affairs are far less interesting, and far less funny, than Karen Owen’s “An education beyond the classroom: excelling in the realm of horizontal academics” (which is a PowerPoint document). Owen’s work feels more honest.

3) If you want a better but less hyped novel about the undergraduate experience in an Ivy League setting, try Tom Perrotta’s Joe College. Notice that you can also get the hardback for $4, shipped, from Amazon. Notice too how Danny in that novel has real problems: he’s a fish out of water, his father’s business might be falling apart, and his actions have real consequences for him and others around him. He has to master a skill (being a lunch-truck driver) and understand that skill. Failure may result in his ejection from Edenic Yale. So far no one in The Marriage Plot has a real job; they’re like characters in Jane Austen. There may be consequences coming in the latter sections, but based on the dust jacket (a trip to India to find one’s self, a possible stint in grad school), I’m not optimistic.

4) Eugenides’ earlier novels both have major conflicts and problems from the beginning: Middlesex asks how to survive and adapt as a transsexual (who as a group still have major problems in contemporary society, compared to average heterosexuals) and how to flee dictator-encumbered countries, while The Virgin Suicides (probably my favorite of Eugenides’ work) asks what really happened to the Lisbon sisters—and, because of the very clever narrative structure, we can never really find out. It’s teasing yet effective, melancholy and happy, a meditation on how we understand the past, deal with love, grow up, don’t grow up, and much more. That last bit sounds grandiose and stupid, but in the context of the novel it’s not.

5) Given the timeline in the section I’ve read so far—late 1970s, early 1980s—I keep thinking about the most consequential thing happening in the world at that time: the personal computer revolution in Silicon Valley. Jobs, Wozniak, Gates, and millions of other, less famous names were building the future. This is an insanely unfair criticism of a novel, but it’s stuck in my mind anyway, like a background process that occasionally pops an alert into my consciousness: some people are doing real things. I dismiss the alert, but it’s set to go off occasionally anyway, and I don’t have the heart to sudo kill -9 it.

EDIT: I was reading Hacker News this morning and found this:

The offices of Zelnick Media were packed on a recent evening for #DigitalWes, an alumni gathering for the graduates of Wesleyan University who had made their way from jam bands and cultural theory to the warp-speed world of Silicon Alley. Guests nibbled shrimp and steak skewers while taking in a sumptuous view of midtown Manhattan from the roof deck. The hosts were Strauss Zelnick and his partner, Jim Freidlich, both class of ’79, whose Take Two Interactive has produced some of the best-selling and most controversial video games of the past decade.

Same demographic, same timeline; note the mention of “cultural theory.”

6) Reading The Game has ruined my patience for excessive beta-male behavior. Watching Mitchell around the beautiful and distant Madeleine mostly makes me want to tell him what he’s doing wrong. The Game was published in 2005, so saying this about a novel set before The Game’s publication isn’t fair, but the book still crystallized for me a) what not to do, b) how to eliminate certain kinds of obviously unsuccessful mating behavior, and c) how to think systematically about useful principles for men dealing with women. Being a whiny hanger-on to a person with relatively high dating market value is not good for Mitchell or for Madeleine, the object of his desire. Note that this is not limited to men: I also have low tolerance for women who spend long periods of time throwing themselves at distant alpha males who at best hook up with and then dump them. Don’t want to be hooked up with and dumped? Don’t chase alpha males whose primary attraction appears to be their unattainability. I don’t love novels whose characters’ primary problems could be solved by a single line of advice.

7) Nineteenth-century novels are not good guides to behavior in the 21st century. Hell, they’re not even good guides to behavior at Brown in the 1979–1983 period. This is as true for Madeleine as for anyone else. Literary theory is also a pretty crappy guide to real life, which may be part of the reason theory’s hold on English departments has loosened in the last 30 years. Still, perhaps the best and funniest scene involves Madeleine throwing Roland Barthes’ A Lover’s Discourse (which alleges that there is no such thing as love, only the speaking of love) at the boy she loves.

8) I can follow the inside-baseball parts of literary theory (Barthes, Derrida, and other English-department heroes appear, mostly as signals of what various characters believe), but I doubt such things would be of great interest to anyone not in English departments. This relates to #5: it turns out that the really important stuff happening in this time period is happening among tech people, not among grad students in the humanities. A novel about someone who jumps from the one to the other might be interesting, and it could dramatize events with real consequences that don’t automatically revolve around sex and death. Intellectual curiosity is an underutilized motivation in fiction.

9) Another book to read if you want campus-war stuff: Richard Russo’s Straight Man, which is also much funnier.

EDIT: 10) See my full review here.