“All American fiction is young adult fiction: Discuss”

Via Twitter Hollis Robbins offers a prompt: “‘[A]ll American fiction is young-adult fiction.’ Discuss.” Her point of departure is A. O. Scott’s excellent “The Death of Adulthood in American Culture,” which you should go read; oddly, it does not mention the show Entourage, which may be the best contemporary narrative artifact / fantasy about the perpetual party.*

American fiction tends toward comedy more than toward the young-adult because comedy = tragedy – consequences. AIDS fiction is tragic because people die. Most contemporary heterosexual love stories are comedies because the STIs tend to be curable or not that important, and people who are diligent with birth control rarely get pregnant. Facing death, starvation, and other privations has always been the adult’s lot, and adults who made sufficiently bad choices about resource allocation or politics died. Think of the numerous adults who could have done everything possible to flee the area between Russia and Germany in 1914 and didn’t, or the ones who didn’t flee after 1918 and before the Holocaust. The example is extreme, but it illustrates the principle. Frontier and farm life was relentlessly difficult and perilous.

Today, by contrast, we live in a world of second chances. America is a “victim,” although that is the wrong word, of its own success. If you color more or less inside the lines and don’t do anything horrendous, life can be awesome. People with an agreeable and conscientious disposition can experience intense pleasures and avoid serious pain for decades; not everyone takes to this (see, for example, the works of Michel Houellebecq), but many do. The literary can write essays, the scientists can do science, the philosophers can argue with each other, the business guys have a fecund environment, and the world’s major problems are usually over “there” somewhere, across the oceans. If we ever get around to legalizing drugs we’ll immediately stabilize every country from Mexico to Chile.**

What are the serious challenges that Americans face as a whole? In the larger world there are no real or serious—“serious” being a word associated with adulthood—ideological alternatives to democracy or capitalism. Dictatorships still exist, but politics are on the whole progressing instead of regressing, Russia and parts of the Middle East excepted.

One could reframe the question of all American fiction being young adult fiction to: “Why not young adult fiction?” Adults send young people to war to die; adulthood is World War II, us against them, thinking that if we don’t fight them in Saigon we’ll have to fight them in Seattle. Adults brought us Vietnam. Young people brought us rock ‘n’ roll, rap, and EDM. Adults want to be dictators, whether politically or religiously, and the young want to party and snag the girl(s) or guy(s) of their dreams.

Adulthood is associated with boredom, stagnation, suburbs, and death. Responsibility is for someone else, if possible, and those who voluntarily assume responsibility rarely seem to be rewarded for it in the ways that really count (I will be deliberately ambiguous about what those ways are). Gender politics and incentives in the U.S. and arguably Western Europe are more screwed up than many of us would want to admit, and in ways that current chat among the clerisy and intellectual class does not reflect or discuss. If adulthood means responsibility, steady jobs, and intense fidelity, then we’ve been disincentivizing it for decades, though we rarely want to confront that.

Many people are so wealthy and safe that they are bored. In the absence of real threats they invent fake ones (vaccines) or worry disproportionately about extremely unlikely events (kidnapping). Being a steady person in a steady-seeming world is thus often perceived as being dull. In contemporary dating, does the stolid guy or girl win, or does the hot, funny, and unreliable one?

A lot of guys have read the tea leaves: divorce can be a dangerous gamble, while marriage offers few relationship rewards that can’t be achieved without involving the legal establishment or the state more generally. A shockingly large number of women are willing to bear the children of men they aren’t married to: 40.7% of births now occur to unmarried women, and that number has been rising for decades.

Why take on responsibility when no one punishes you for evading it and arguably active irresponsibility is rewarded in many ways, while safety nets exist to catch those who are hurt by the consequences of their actions? That’s our world, and it’s often the world of young adulthood; in fiction we can give ourselves monsters to fight and true enduring love that lasts forever, doesn’t have bad breath in the morning, and doesn’t get bored of us in four years. Young adult fiction gives us the structure lacking in the rest of our lives.

Moreover, there has always been something childlike in the greatest scientists and artists. Children feel unconstrained by boundaries, and as they grow older they feel boundaries more and more acutely. I’m not about to argue that no one should have boundaries, but I am going to argue that retaining an adult version of children’s curiosity and freedom is useful today and in many cases always has been.

The world has gotten so efficient that vast pools of money are available for venture capitalists to fund the future and tech guys to build it. The biggest “problem” may be that so many of us want to watch TV instead of writing code, but that may be a totally bunk argument because consumption has probably always been more common and easier than production.

In this world fiction should tend towards comedy, not the seriousness too typically associated with Literature.

If American fiction is young adult fiction, that may be a sign of progress.***


* Another show, Californication, mines similar themes but with (even weaker) plots and total implausibility. Here is an essay disagreeing with Scott: Adulthood Isn’t Dead.

** Breaking Bad and innumerable crime novels would have no driving impetus without drug prohibition. The entire crime sector would be drastically smaller almost overnight were we to legalize drugs and prostitution. That would be a huge win for society but harmful to fiction writers.

*** Usually I eschew polemics but today I make an exception.

The appeal of “pickup” or “game” or “The Redpill” is a failure of education and socialization

Since posting “The inequality that matters II: Why does dating in Seattle get left out?” and “Men are where women were 30 years ago?” I’ve gotten into a couple discussions about why Neil Strauss’s The Game is popular and why adjacent subjects like “pickup” and the “Redpill” have picked up steam too. One friend wrote, “It’s so tedious to see how resentful men get—a subject much in the news lately because of the Santa Barbara shooting…”

That’s somewhat true, but underlying, longer-term trends are worth examining. The world is more complex than it used to be in many respects, and that includes sex and dating. Until relatively recently—probably the late 60s or early 70s—it was probably common for a guy to marry a local girl, maybe straight out of high school: a girl whose parents he knew and whose parents knew his. Parents, families, and religious authorities had a strong effect on what their children did, and a lot of men and women probably married as virgins. The dating script was relatively easy to follow, and relatively many people paired up early. In the 60s an explosion of divorces began, and that complicated matters in ways that are still being sorted out.

Today there are more hookups over a longer period of time and fewer universal scripts that everyone follows, or is supposed to follow. Instead, one sees a proliferation of possibilities, from the adventurous player—which is not solely a male role—to early marriage (though those early marriages tend to end in divorce). Dating “inequality” has probably increased, since the top guys are certainly having a lot more sex than the median or bottom guys. To some extent that dynamic has probably always existed, but now “top” could mean dozens of partners at a relatively early age, and the numerical top is more readily available to guys who want it. In the old regime it was probably possible for almost everyone to find a significant other of some sort (and I think families had more sway and say). Now that may be harder, especially for guys towards the bottom who don’t want to realize that the women they’re likely to attract are likely to be around the same place.

I’ve also noticed the elegiac sense that a weirdly large number of “pickup artist” or “Red Pill” (sometimes it’s used as two words, sometimes as one) or “manosphere” guys have about the past, when it was supposedly easy to find, date, and marry a woman. Much of this is probably mythological, and I don’t think most of them would be happy marrying at 20 or 24 and having two or three kids by 28 or 29.

These stereotypes are no doubt riddled with holes—see further the oeuvre of John Updike—but they probably hold up reasonably well for examining broad trends. Today almost no one gets married straight out of high school. Routine moves from city to city are normal, and each move often rips someone from the social networks that provide romantic connections. Families play a smaller and smaller role. If you don’t have the infrastructure of school, how do you meet lots of new people? Jobs are one possibility, but looking for romantic prospects at work has obvious pitfalls. Online dating is another, but people who can’t date effectively offline often aren’t any better online (and are often worse).

Technology matters too. Technologies take a long time—decades, at least—to reach fruition and for their ripples to be felt throughout societies and cultures. Virtually all big ideas start small.* That’s an important lesson from Where Good Ideas Come From, The Great Stagnation, The Enlightened Economy, and similar books about technological, economic, and social history. A suite of interrelated technologies around birth control (hormonal birth control itself, better forms of it, and easy condom distribution and acquisition) is still playing out. The same is true of antibiotics and vaccines against STIs. VOX offers one way to think about this in “From shame to game in one hundred years: An economic model of the rise in premarital sex and its de-stigmatisation.” It begins:

The last one hundred years have witnessed a revolution in sexual behaviour. In 1900, only 6% of US women would have engaged in premarital sex by the age of 19, compared to 75% today . . . Public acceptance of premarital sex has reacted with a lag.

Culture is still catching up. Pickup, game, and the Redpill are part of that, and they are responses from guys frustrated by the way their own efforts fail while some of their peers’ efforts succeed. A lot of women appear less interested in an okay guy with an okay job and an okay but not that exciting or fun life, relative to guys with a different set of qualities. Men invest in what they think women want and women invest in what they think men want, and relative wants have changed over time.

Pickup artists and those who read them are responding to a cultural milieu in which most guys get terrible socialization regarding dating and women. At the same time, guys see a smallish number of extraordinarily successful guys (though they often don’t see the work behind that success). What are those successful guys doing? How? Why? Pickup artists, whatever their flaws, are trying to answer those questions, sometimes more successfully and sometimes less. They’re also trying to answer them in a concrete way, which most people, including their detractors, aren’t. I wrote about that issue in a review of Clarisse Thorn’s Confessions of a Pickup Artist Chaser:

feminism does very little to describe, let alone evaluate, how micro, day-to-day interactions are structured. Pickup artists, or whatever one may want to call guys who are consciously building their skills at going out and getting women, are describing the specific comments, conversations, styles, and venues women respond to. The pickup artists are saying, “This is how you approach a woman in a bar, this is how you strike up a conversation at the grocery store, and so forth.” In other words, they’re looking at how people actually go about the business of getting laid. Their work is often very detailed, and the overall thrust is toward the effectiveness of getting laid rather than how male-female interactions work in theory. Feminism, in Thorn’s view, appears to be silent, or mostly silent, on the day-to-day interactions.

Who else is doing that? Almost no one. As with virtually any other topic, one can muddle along through trial and error (and mostly error) or one can try to systematically learn about it and apply that learning to the problem domain, along with the learning others have done. That’s what the pickup people are doing, or trying to do.

To be sure, the worst of the group is just trying to sell shit, and sell as much of it as possible to fools. The best of the group is saying things that almost no one else is saying. They also say it’s hard. Look at “Krauser”:

The PUA cartel saw you coming and will sell you magic pills and 3 Secrets To Make Her Wet as long as your credit card is below it’s limit. If you’re looking to score something for nothing, you’ll end up with nothing. Daygame is hard. Very very hard.

He calls out the “hack mentality” in the same post. Caricature is easy, but the guys who are really paying attention aren’t easily caricatured.

Tucker Max, Geoffrey Miller, and Nils Parker are writing Mate: The Young Man’s Guide To Sex And Dating, which is, among other things, a description of modern dating and of why so many guys do it so badly for so long. Confusion reigns, and though the book isn’t out yet, it promises to be the sort of fun-but-comprehensive read that can be given to unhappy, puzzled guys who understand something is wrong but don’t know how to fix it.

One strategy in response to new social circumstances is to figure out what you should do to be reasonably successful and how to make yourself more appealing; this is not a male-only question: virtually every issue of Cosmo is about how to attract men, retain men, and deal with female friends and rivals. Another strategy is to blame women, or withdraw from dating, or kill innocents out of frustration. If you think half the population isn’t into you, the problem is with you, not the population. There’s an important similarity to business here: if you start a business and no one wants to buy your products or services, you can blame the market or you can realize that you’re not doing what people want.

It’s easier to blame women than it is to make real changes, and there is a tendency among some of the self-proclaimed “Redpill”-types to do that. Paul Graham notes that the real secret to making wealth is to “Make something people want.” In dating the real secret (which isn’t a secret) is to be a person who people like. How to do that can be a whole book’s worth of material.

Blame is easy and improvement is hard. Short guys do have it harder than tall guys—but so what? Go ask a fat girl, or a flat-chested one, how much fun dating is for her, compared to her slenderer or better-endowed competitors. Honesty in those conversations is probably rare, but it is out there: usually in late-night conversations after a couple drinks.

I don’t hate “pickup artists” as a group, though I dislike the term and wish there were something better. Many of the things critics say are accurate. But criticizing without recognizing the impetus for the development in the first place is attacking the plant while ignoring the roots. This post, like so many of the posts I write, is looking at, or attempting to look at, the root.

Feminism didn’t come from nowhere. Neither did pickup.


* Which is not to say that all small ideas will automatically become big. Most don’t. But ideas, technologies, practices, and cultures spread much more slowly than is sometimes assumed, especially among the rah-rah tech press.

The modern art (and photography) problem

In “Modern art: I could have done that… so I did: After years of going to photography exhibitions and thinking he could do better, Julian Baggini gave it a go. But could he convince The Royal West of England Academy with his work?”, Baggini writes:

there are times when we come across something so simple, so unimpressive, and so devoid of technical merit that we just can’t help believing we could have done as well or better ourselves.

He’s right—except that this happens entirely too often and helps explain much of modern art’s bogosity. I’m not the only person to have noticed—in Glittering Images, Camille Paglia writes:

the big draws [for museums] remain Old Master or Impressionist painting, not contemporary art. No galvanizing new style has emerged since Pop Art, which killed the avant-garde by embracing commercial culture. Art makes news today only when a painting is stolen or auctioned at a record price.

She’s right too; many people have noticed this, but few in the art world itself apparently have, and that world seems to have become more interested in marketing than making (a problem afflicting the humanities in academia too). But there are enough people invested in and profiting from propagating bogosity that they can remain indifferent to countervailing indifference.

Years ago I was at the Seattle Art Museum, looking at various pieces of modern supposed “art” that consisted mostly of a couple of lines or splotches and whatnot, and they made me think: “there’s a hilarious novel in here about a director who surreptitiously hangs her own work—and no one notices.” Unfortunately, I’ve since realized that people have already done this, or things like it, in the real world—and no one cared. It’s barely possible to generate scandal in the art world anymore; conservatives have mostly learned about the Streisand effect and thus don’t react to the latest faux provocation. The artists themselves often lack both anything to say and any coherent way of saying it.

To the extent people respond to art, they respond to the art made when it took skill to be an artist.

Photography has a somewhat similar problem, except that technology created it. Until relatively recently it took a lot of time, money, and patience to become a reasonably skilled photographer. Now it doesn’t take nearly as much of any of those things: last year’s cameras and lenses still work incredibly well; improvements in autofocus, auto-exposure, and related technologies make photos look much better; and it’s possible to take, review, and edit hundreds or thousands of photos at a time, reducing the time necessary to go from “I took a picture” to expert.

The results are obvious for anyone who pays attention. Look through Flickr, or 500px, or any number of other sites and you’ll see thousands of brilliant, beautiful photos. I won’t say “anyone can do it,” but many people can. It’s also possible to take great photos by accident, with the machine doing almost all the work apart from the pointing and clicking. Adding a little bit of knowledge to the process is only likely to increase the keeper rate. Marketing seems to be one of the primary differentiators among professional photographers; tools like Lightroom expand the range of possibility for recovering from error.

One of the all-time top posts on Reddit’s photography section is “I am a professional photographer. I’d like to share some uncomfortable truths about photography,” where the author writes that “It’s more about equipment than we’d like to admit” and “Photography is easier than we’d like to admit.”

The profession is dying, for reasons not identical to painting’s but adjacent to them. In photography, we’re drowning in quality. In fine art, we’re drowning in bogosity, and few people appear interested in rescuing the victim.

Journalism, physics and other glamor professions as hobbies

The short version of this Atlantic post by Alexis C. Madrigal is “Don’t be a journalist,” and, by the way, “The Atlantic.com thinks it can get writers to work for free” (I’m not quoting directly because the article isn’t worth quoting). Apparently The Atlantic is getting writers to work for free, because many writers are capable of producing decent-quality work and the number of paying outlets is shrinking. Anyone reading this and contemplating journalism as a profession should know that they need to seek another way of making money.

The basic problems journalism faces, however, are obvious and have been for a long time. In 2001, I was the co-editor-in-chief of my high school newspaper and thought about going into journalism. But it was clear that the Internet was going to destroy a lot of careers in journalism. It has. The only thing I still find puzzling is that some people want to major in journalism in college, or attempt to be “freelance writers.”

A lot of friends know about my journalism background and writing, and they’ve asked why I don’t do freelance writing. When I tell them that there’s less money in it than in a job at Wal*Mart, they look at me like I’m a little crazy—they don’t really believe it’s true, even when I ask them how many newspapers they subscribe to (median and mode answer: zero). Many of them, however, spend hours reading stuff for free online.

In important ways I’m part of the problem, because on this blog I’m doing something that used to be paid most of the time: reviewing books. Granted, I write erratically and idiosyncratically, usually eschewing the standard practices of book reviews (dull, two-paragraph plot summaries are stupid in my view, for instance), but I nonetheless do it and often do it better than actual newspapers or magazines, which I can say with confidence because I’ve read so many dry little book reports in major or once-major newspapers. Not every review I write is a critical gem, but I like doing it and thus do it. Many of my posts also start life as e-mails to friends (as this one did). I also commit far more typos than a decently edited newspaper or magazine—though I correct them when readers point them out.

The trajectory of journalism is indicative of other trends in American society and indeed the industrialized world. For example, a friend debating whether he should consider physics grad school wrote this to me recently: “I think physics is something that is fun to study for fun, but to try to become a professional physicist is almost like too much of a good thing.” He’s right. Doing physics for fun, rather than trying to get a tenure-track job, makes more sense from a lifestyle standpoint.

A growing number of what used to be paid occupations seem to be moving in this direction. Artists got here first, but others are making their way here; journalism, as this post shows, increasingly looks like this too. The obvious question is how far this trend will go—what happens when many jobs that used to be paid become unpaid?

Tyler Cowen thinks we might be headed towards a guaranteed annual income, an idea that was last popular in the 60s and 70s. When I asked Cowen his opinion about guaranteed annual incomes, he wrote back to say that he’d address the issue in a forthcoming book. The book hasn’t arrived yet, but I look forward to reading it. As a side note, Britain apparently has, or had, a concept called the “Dole,” which many people went on, especially poor artists. Geoff Dyer wrote about this in Otherwise Known as the Human Condition. The Dole subsidized a lot of people who didn’t do much, but it also subsidized a lot of artists, which is pretty sweet; one can see student loans and grad school serving analogous roles in the U.S. today.

Even in programming, which is now the canonical “Thar be jobs!” (pirate voice intentional) profession, some parts—like languages and language development—basically aren’t remunerative. Too many people will do it for free because it’s fun, like amateur porn. In the 80s there were many language and library vendors, but nearly all have died, and libraries have become either open source or rolled into a few large companies like Apple and Microsoft. Some aspects of language development are cross-subsidized in various ways, like professors doing research or companies paying for specific components or maintenance, but it’s one field that has, in some ways, become like photography, or writing, or physics, even though programming jobs as a whole are still pretty good.

I’m not convinced that the artist lifestyle of living cheap and being poor in pursuit of some larger goal or glamor profession is good or bad, but I do think it’s spreading (that we have a lot of good cheap stuff out there, especially cheap consumer electronics, may help: it’s possible to buy or acquire a nearly free, five-year-old computer that works perfectly well as a writing box).* Of course, many starving artists adopt that as a pose—they think it’s cool to say they’re working on a novel or photography project or “a series of shorts” or whatever, but don’t actually do anything, while many people with jobs put out astonishing work. Or at least work, which is usually a precursor to astonishing work.

For some people, the growing ability to disseminate ideas and art forms without being paid is a real win. In the old days, if you wanted to write something and get it out there, you needed an editor or editors to agree with you. Now we have a direct way of resolving questions about what people actually want to read. Of course, the downside is that whole payment thing, but that’s the general downside of the new world in which we live, and, frankly, it’s one that I don’t have a society-wide solution for.

In writing, my best guess is that more people are going to book-ify blogs and try to sell the book for $1–$5, under the (probably correct) assumption that very few people want to go back and read a blog’s entire archives, but an ebook could collect and organize the material of those archives. If I read a powerful post by someone who seemed interesting, I’d buy a $4 ebook that covered their greatest hits or introduced me to their broader thinking.

This is tied into other issues around what people spend their time doing. My friend also wrote that he read “a couple of articles on Keynes’ predictions of utopia and declining work hours,” but he noted that work still takes up a huge amount of most people’s lives. He’s right, but most reports show that median hours worked in the U.S. have declined, and male labor force participation has declined precipitously. Labor force participation in general is surprisingly low. Ross Douthat has been discussing this issue in The New York Times (a paid gig, I might add), and, like most reasonable people, he has a nuanced take on what’s happening. See also this Wikipedia link on working time for arguments that working time has declined overall.

Working time, however, probably hasn’t decreased for everyone. My guess is that working time has increased for a smallish number of people at the top of their professions (think lawyers, doctors, programmers, writers, business founders), with people at the bottom often relying more on government or gray-market income sources. Douthat starts his essay by saying that we might expect working hours among the rich to decline first, so they can pursue more leisure, but he points out that the rich are working more than ever.

I am tempted to put “working” in scare quotes, though, because it seems like many of the rich are doing things they would enjoy on some level anyway; certainly a lot of programmers say they would keep programming even if they were millionaires, and many of them become millionaires and keep programming. The same is true of writers (though fewer become millionaires). Is writing a leisure or work activity for me? Both, depending. If I self-published Asking Anna tomorrow and made a zillion dollars, the day after I’d still be writing something. I would like to get paid, but some of the work I do for fun isn’t contingent on getting paid.

Turning blogs into books and self-publishing probably won’t replace the salaries that news organizations used to pay, but it’s one means for writers or would-be writers to get some traction.

Incidentally, the hobby-ification of many professions makes me feel pretty good about working as a grant writing consultant. No one thinks at 14, “I want to be a grant writer like Isaac and Jake Seliger!”, while lots of people want to be famous actors, musicians, or journalists. There is no glamor, and grant writing is an example of the classic aphorism “Where there’s shit, there’s gold” at work.

Grant writing is also challenging. Very few people have the weird intersection of skills necessary to be good, and building those skills is a decade-long process—especially for people who aren’t good writers already. The field perpetually mutates, with new RFPs appearing and old ones disappearing, so we’re not competing with proposals written two years ago (whereas many novelists, for example, are in effect still competing with their peers from the 20s or 60s or 90s).

To return to journalism as a specific example, I can think of one situation in which I’d want The Atlantic or another big publisher to publish my work: if I was worried about being sued. Journalism is replete with stories about heroic reporters being threatened by entrenched interests; Watergate and the Pentagon Papers are the best-known examples, but even small-town papers turn up corruption in city hall and so forth. As centralized organizations decline, individuals are to some extent picking up the slack, but individuals are also more susceptible to legal and other threats. If you discovered something nasty about a major corporation and knew they’d tie up your life in legal bullshit for the next ten years, would you publish, or would you listen to your wife telling you to think of the kids, or your parents telling you to think about your career and future? Most of us are not martyrs. But it’s much harder for Mega Corp or Mega Individual to threaten The Atlantic and similar outlets.

The power and wealth of a big media company has its uses.

But such a use is definitely a niche case. I could imagine some of the bigger foundations, like ProPublica, offering a legal umbrella to bloggers and other muckrakers to mitigate such risks.

I have intentionally elided the question of what people are going to do if their industries turn into hobbies. That’s for a couple of reasons: as I said above, I don’t have a good solution, and the parts of the economy I’m discussing here are pretty small—small problems don’t necessarily need “solutions,” per se. People who want to turn their hours into a lot of income should try to find ways and skills to do that, and people who want to turn their hours into fun products like writing or movies should try to find ways to do that too. Crying over industry loss or change isn’t going to turn back the clock, and just because someone once could make a career as a journalist doesn’t mean they can today.


* To some extent I’ve subsidized other people’s computers, because Macs hold their value surprisingly well and can be sold for a quarter to half of their original purchase price three to five years after they’ve been bought. Every computer replaced by my family or our business has been sold on Craigslist. It’s also possible, with a little knowledge and some online guides, to add RAM and an SSD to most computers made in the last couple of years, which will make them feel much more responsive.

Why little black books instead of phones and computers

“Despite being a denizen of the digital world, or maybe because he knew too well its isolating potential, Jobs was a strong believer in face-to-face meetings.” That’s from Walter Isaacson’s biography of Steve Jobs. It’s a strange way to begin a post about notebooks, but Jobs’ views on the power of a potentially anachronistic practice apply to other seemingly anachronistic practices. I’m a believer in notebooks, though I’m hardly a Luddite and use a computer too much.

The notebook has an immediate tactile advantage over the phone: it isn’t connected to the Internet. It’s intimate in a way computers aren’t. A notebook has never interrupted me with a screen that says, “Wuz up?” Notebooks are easy to use without thinking. I know where I have everything I’ve written on the go over the last eight years: in the same stack. It’s easy to draw on paper. I don’t have to manage files and have yet to delete something important. The only way to “accidentally delete” something is to leave the notebook submerged in water.

A notebook is the written equivalent of a face-to-face meeting. It has no distractions, no pop-up icons, and no software upgrades. For a notebook, fewer features are better and fewer options are more. If you take a notebook out of your pocket to record an idea, you won’t see nude photos of your significant other. You’re going to see the page where you left off. Maybe you’ll see another idea that reminds you of the one you’re working on, and you’ll combine the two in a novel way. If you want to flip back to an earlier page, it’s easy.

The lack of editability is a feature, not a bug, and the notebook is an enigma of stopped time. Similar writing in a computer can function this way but doesn’t for me: the text is too open and too malleable. Which is wonderful in its own way, and that way opens many new possibilities. But those possibilities are different from the notebook’s. It’s become a cliche to argue that the technologies we use affect the thoughts we have and the way we express those thoughts, but despite being cliche the basic power of that observation remains. I have complete confidence that, unless I misplace them, I’ll still be able to read my notebooks in 20 years, regardless of changes in technology.

In Distrust That Particular Flavor, William Gibson says, “Once perfected, communication technologies rarely die out entirely; rather, they shrink to fit particular niches in the global info-structure.” The notebook’s niche is perfect. I don’t think it’s a coincidence that Moleskine racks have proliferated in stores at the same time everyone has acquired cell phones, laptops, and now tablets.

In The Shallows, Nicholas Carr says: “The intellectual ethic is the message that a medium or other tool transmits into the minds and culture of its users.” Cell phones subtly change our relationship with time. Notebooks subtly change our relationship with words and drawings. I’m not entirely sure how, and if I were struggling for tenure in industrial design or psychology I might start examining the relationship; for now, it’s enough to feel it. Farhad Manjoo even cites someone who studies these things:

“The research shows that the type of content you produce is different whether you handwrite or type,” says Ken Hinckley, an interface expert at Microsoft Research who’s long studied pen-based electronic devices. “Typing tends to be for complete sentences and thoughts—you go deeper into each line of thought. Handwriting is for short phrases, for jotting ideas. It’s a different mode of thought for most people.” This makes intuitive sense: It’s why people like to brainstorm using whiteboards rather than Word documents.

I like to write in notebooks despite carrying around a smartphone. Some of this might reflect the technology I grew up with—would someone familiar with smartphone touchscreens from age seven have sufficiently dexterous fingers to be faster onscreen than on paper?—but I think the obvious answer to “handwriting or computer?” is “both, depending.” As I write this sentence, I have a printout of a novel called ASKING ANNA in front of me, covered with blue pen, because editing on the printed page feels different to me than editing on the screen. I write long-form on computers, though. The plural of anecdote is not data. Still, I have to notice that using different mediums appears to improve the final work product (insert joke about low quality here).

There’s also a shallow yet compelling reason to like notebooks: a disproportionate number of writers, artists, scientists, and thinkers like using them too, and I suspect that even contemporary writers, artists, scientists, and thinkers realize that not being connected is sometimes useful, like quiet and solitude.

In “With the decline of the wristwatch, will time become just another app?”, Matthew Battles says:

Westerners have long been keenly interested in horology, as David Landes, an economic historian, points out in Revolution in Time, his landmark study of the development of timekeeping technology. It wasn’t the advent of clocks that forced us to fret over the hours; our obsession with time was fully in force when monks first began to say their matins, keeping track of the hours out of strict religious obligation. By the 18th century, secular time had acquired the pressure of routine that would rule its modern mode. Tristram Shandy’s father, waiting interminably for the birth of his son, bemoans the “computations of time” that segment life into “minutes, hours, weeks, and months” and despairs “of clocks (I wish there were not a clock in the kingdom).” Shandy’s father fretted that, by their constant tolling of the hours, clocks would overshadow the personal, innate sense of time—ever flexible, ever dependent upon mood and sociability.

The revolution in electronic technology is wonderful in many ways, but its downsides—distraction, most obviously—are present too. The notebook combats them. Notebooks are an organizing or disorganizing principle: organizing because one keeps one’s thoughts in one place, but disorganizing because one cannot rearrange, tag, and structure thoughts in a notebook as one can on a screen (DevonThink Pro is impossible in the real world, and what Scrivener does can be done on paper only with a great deal of friction).

Once you try a notebook, you may realize that you’re a notebook person. You might realize it without trying. If you’re obsessed with this sort of thing, see Michael Loper / Rands’ Sweet Decay, which is better at validating why a notebook is important than at evaluating the notebooks at hand. It was also written in 2008, before Rhodia updated its Webbie.

Like Rands, I’ve never had a sewn binding catastrophically fail. As a result, notebooks without sewn bindings are invisible to me. I find it telling that so many people are willing to write at length about their notebooks and use a nominally obsolete technology.

Once you decide that you like notebooks, you have to decide which one you want. I used to like Moleskines, until one broke, and I began reading other stories online about the highly variable quality level.

So I’ve begun ranging further afield.

I’ve tested about a dozen notebooks. Most haven’t been worth writing about. But by now I’ve found the best reasonably available notebooks, and I can say this: you probably don’t actually want a Guildhall Pocket Notebook, which is number two. You want a Rhodia Webnotebook.

Like many notebooks, the Guildhall starts off with promise: the pages do lie flat more easily than alternatives. Lines are closely spaced, maximizing writable area, which is important in an expensive notebook that shouldn’t be replaced frequently.

I like the Guildhall, but it’s too flimsy and has a binding that appears unlikely to withstand daily carry. Mine is already bending, and I haven’t even hauled it around that much. The Rhodia is somewhat stiffer. Its pages don’t lie flat quite as easily, and its lines should go to the end of each page. But its great paper quality and durability make it better than the alternatives.

The Rhodia is not perfect. The A7 version, which I like better than the 3.5 x 5.5 American version, is only available in Europe and Australia, which entails high shipping costs. The Webbie’s lines should stretch to the bottom of the page and be spaced slightly closer together. The name is stupid; perhaps it sounds better in French. The notebook’s cover extends slightly over its paper instead of aligning perfectly. Steve Jobs would demand perfect alignment. To return to Isaacson’s biography:

The connection between the design of a product, its essence, and its manufacturing was illustrated for Jobs and Ive when they were traveling in France and went into a kitchen supply store. Ive picked up a knife he admired, but then put it down in disappointment. Jobs did the same. ‘We both noticed the tiny bit of glue between the handle and the blade,’ Ive recalled. They talked about how the knife’s good design had been ruined by the way it was being manufactured. ‘We don’t like to think of our knives as being glued together,’ Ive said. ‘Steve and I care about things like that, which ruin the purity and detract from the essence of something like a utensil, and we think alike about how products should be made to look pure and seamless.’

I wish the Rhodia were that good. But the Rhodia’s virtues are more important than its flaws: the paper quality is the highest I’ve seen, and none of the Rhodias I’ve bought have broken. If anyone knows of a notebook that combines the Rhodia’s durability with the qualities it lacks, by all means send me an e-mail.


More on the subject: The Pocket Notebooks of 20 Famous Men.

EDIT: See also Keith Devlin’s The Death of Mathematics, which is about the allure of math by hand rather than by computer; though I don’t endorse what he says, in part because it reminds me so much of Socrates decrying the advent of written over oral culture, I find it stimulating.

Subjectivity in writing and evaluating writing

This essay started its life as an e-mail to a student who wanted to know if all writing was, on some level, “just subjective,” which would imply that grading is bogus and so is much of what we do in English classes. I didn’t have time to offer a nuanced explanation of what makes good writing good, so I wrote to him later that night. He didn’t reply to this e-mail.

I was thinking about our conversation and realized that I have more to say about the issues of subjectivity and skill in writing. As you observed, there is an element of subjectivity in judging what’s good writing and what isn’t. But it’s also worth noting that dominant opinions change over time—a lot of writing from the 18th and 19th centuries, for example, was considered “good” if it contained long sentences with balanced, nested clauses. Such stylistic preferences are one reason a lot of contemporary students have trouble reading that material today: most of us now value variety in sentence structure and less overall complexity. This is normally the place where I could go off on a rant about Facebook and cell phones and texting speak and how the kids these days are going to hell, but I’ll avoid that because it doesn’t appear true overall and certainly isn’t true regarding writing. The overall trend, including among professional writers writing for other expert writers, has been towards simpler structures and informality, which probably tells you a lot about the culture as a whole.

That being said, if you want to write a paper full of long, windy clauses and abstruse classical allusions, I’m not going to stop or penalize you and may even reward you; as long as the content is strong, I’m willing to roll with somewhat unusual stylistic quirks. I’m fairly pluralistic in my view of language use.

So how do you figure out what good writing is? You practice, you read, you think about it, you practice some more, as you would if you were learning to play the guitar. I’ve never heard guitar instructors say that their students call all music subjective; playing the guitar appears to be transparently hard, in the sense that you know when you’re bad at it, in a way that writing isn’t. Still, if you’d like to know a lot more about good writing, take a look at Francine Prose’s Reading Like a Writer, James Wood’s How Fiction Works, and Jan Venolia’s Write Right! When you’re done with those, move on to B. R. Myers’ A Reader’s Manifesto. When you’re done with that, move on to the New York Times’ series Writers on Writing. Collectively, these books will teach you that every word counts and every word choice says something about the writer and the thing the writer is conveying, or trying to convey. Not only that, but every word changes, slightly, the meaning of every word around it. Good writers learn to automatically, subconsciously ask themselves, “Does this word work? Why? Why not? How should I change it? What am I trying to convey here?”

Eventually, over time, skilled writers and thinkers internalize these and other ideas, and their conscious mind moves to other issues, much like a basketball player’s shot happens via muscle memory after it’s been practiced and tweaked over 100,000 repetitions.

In addition, skilled writers are almost always skilled readers, so they have a fairly large subconscious stock of built-in phrases, ideas, and concepts. Somewhere along the line I’ve read a fair amount about how athletes practice and become good (perhaps some of that material came from Malcolm Gladwell’s Outliers, or Mihaly Csikszentmihalyi’s Flow: The Psychology of Optimal Experience). I know how important practice and repetition are to any skill-based human endeavor. So I linked skill in writing with skill in basketball, since many students are more familiar with sports than with writing. Where did that analogy come from? I don’t know, exactly, but it’s there now, along with the idea that analogies are good, and explaining what I’m doing is good, and so are many other things.

To return to the athletic analogy, skill in sports also has a subjective element. Is LeBron James now better than Michael Jordan was when Jordan ruled? You can have this argument with morons in bars all day long; I’ve heard it and find it particularly tedious because the outcome is so unimportant. But both players are very clearly good and at the top of their peer groups in their respective eras. The comparison at least makes sense.

One could also argue about whether Elmore Leonard or Alain de Botton is the better writer, although I would argue that they’re too different for a fruitful comparison; Elmore Leonard would be better matched against someone like Raymond Chandler or Patricia Highsmith. But Leonard and de Botton are both fantastically better writers than most freshmen; for one thing, most freshmen haven’t yet mastered the mechanical parts of writing, like how to use commas consistently and correctly, let alone higher questions about vocabulary, metaphor, and so on.

If you really want to get better, spend a lot of time reading, writing, and thinking about those activities. Then look back at your earlier work and judge its quality for yourself. Very few students think the first draft of their first paper is as good as the final draft, and I tend to agree.

With regard to thesis statements, good ones tend to have some aspect of how a text (I hate the term “text,” but it fits here) shows something (“Free-indirect speech in ‘She Wasn’t Soft. . .’”), what a text shows, usually symbolically (“is used to demonstrate how Paula and Jason, despite being a couple, really disdain each other”) and have some larger point to make (“which shows that what people think and how people behave don’t always match”). That’s not a great thesis statement because I’m doing it quickly and freeform, but a better one might say something like, “The use of free-indirect speech in ‘She Wasn’t Soft’ demonstrates that Paula is actually soft, despite her repeated claims to the contrary, and that Jason and Paula’s mutual loathing sustains their relationship, despite what they say.” That’s still not the sort of thesis statement I’d use to write a publishable academic paper, but it’s closer. Many if not most student papers are missing one of those elements. Not every thesis needs all three, but they’re not bad ideas to check for.

Over time and with experience, I’ve developed, and you’ll develop, a fairly good eye for thesis statements. Eventually, when you’re sufficiently practiced, you won’t necessarily use explicit thesis statements—your thesis will be implied in your writing. Neal Stephenson doesn’t really have an explicit thesis statement in “Turn On, Tune In, Veg Out,” although his last line may function as one, and Roland Barthes definitely doesn’t have an explicit one in “The Brain of Einstein.” Thesis statements aren’t necessarily appropriate to all genres, all the time.

When I started teaching, I actually thought I was going to be a revolutionary and not teach thesis statements at all. I wrote about that experience here. The experiment didn’t work. Most undergrads need thesis statements. So I started teaching them, and student papers got better and more focused, and I’ve been doing so ever since.

Your question or questions are about the inherent challenges of writing, and those don’t have easily summarized answers. Part of the problem comes from language itself, which is imprecise, or, alternately, layered with meaning; that’s where so much humor and misunderstanding comes from (and humor could be considered a kind of deliberate misunderstanding). I’ve read about how, when computer scientists first tried to build translation systems and natural-language processing systems, they ran into the ambiguity problem—and that problem still hasn’t been fully solved, as anyone who’s tried to use text-to-speech software, or Google Translate, can easily find (I wish I could find citations or discussions of this issue; if you happen to run across any, send them over).

This line of questioning also leads into issues of semiotics—how signs, signaling, and reception function—and the degree of specificity necessary to be good. Trying to specify every part of good writing is like trying to specify every aspect of good food: you get something like McDonald’s. While McDonald’s does a lot of business, I wouldn’t want to eat there, and it’s pretty obvious that something is lost in the process (Joel Spolsky’s article “Big Macs vs. the Naked Chef” (sfw) also uses McDonald’s as a cautionary tale, this time for software developers; you should definitely read it).

Actually, I’m going to interrupt this essay to quote from Joel:

The secret of Big Macs is that they’re not very good, but every one is not very good in exactly the same way. If you’re willing to live with not-very-goodness, you can have a Big Mac with absolutely no chance of being surprised in the slightest.

Bad high school teachers often try to get students to write essays that are not very good in exactly the same way. I’m trying to get students, and myself, to write essays that are good and that a human might actually want to read. This basically guarantees that different students are going to approach the problem space in different ways, some more successfully than others, and different essays are going to be good in different ways. I’m trying to get students to think about the process and, more broadly, to think not just about the solutions, but about the domain; how you conceptualize the problem domain will change what you perceive as the solution. That being said, if you ever find yourself in front of 20 or 30 novice writers, you’ll quickly see that some are much better than others, even if there’s much wiggle room between a C and C+.

I don’t get the sense that students who are unhappy with their grades are unhappy out of a deeply felt and considered sense of aesthetic disagreement about fundamental literary or philosophical principles. I suspect I feel this way partially because I actually have a fairly wide range of what I would consider “good” writing, or at least writing good enough to get through undergrad English classes, and someone with sufficient sophistication and knowledge to make a good argument about aesthetics or the philosophy of writing would be very unlikely to get a sufficiently low mark to want to argue about it. Rather, I think most students who are unhappy about their grades just want better grades, without doing the thinking and writing necessary to get them.

These issues are compounded by a meta-issue: many if not most K–12 English (and other humanities) teachers are bad. And many of them aren’t that smart or knowledgeable (which tends to overlap with “bad”). So a lot of students—especially those on the brighter side—inchoately know that their teachers are bad, and that something stinks, and therefore conclude that English is bogus anyway, as are related fields. This has a lot of unfortunate consequences on both the individual and societal level; books like C.P. Snow’s The Two Cultures are one manifestation of this larger problem.

In general, I would like for people to try and get along, see each other’s points of view, and be tolerant—not only in fields like religion and politics, but also things like the humanities / sciences, or reason / emotion, or any number of the other possibly false binaries that people love to draw for reasons of convenience.

Finally, at least on this topic, it’s worth noting that, if you think I’m completely wrong about what makes good writing (and what makes writing good), you have a huge world out there and can judge the reaction to your writing. Twilight and The Da Vinci Code are poorly written novels, yet millions of people have read and enjoyed them—far more than have read Straight Man, one of my favorite novels and one that’s vastly better written. Who’s right: the millions of teenage girls who think they’re in love with the stilted, wooden prose that makes up Edward, or me, who sees the humor in a petulant English department? It depends on what you mean by “right.” If I were a literary agent or editor, I would’ve passed on both Twilight and The Da Vinci Code. Definitions of “good” are uncertain, and the ones I embrace and impose on students are worth questioning. If you can at least understand where I’m coming from and why I hold the views I do, however, I’ll consider my work a relative success.

Most people also have different definitions of “good” at different points in their lives; I’m in my 20s and view writing very differently than I did in my teens. I would be surprised if I view writing the same way in my 40s. One major change is that I’ve done so much reading, and will probably do much more. Someone who doesn’t read very much, or doesn’t challenge themselves when they do read, may find that their standards don’t change as much either. I could write much more on this point alone, but for the most part you’ll have to trust me: your tastes will probably change.

This email is a long way of saying, “I’m not trying to bullshit you, but the problem domain itself is hard, and that domain is not easy to explain, without even getting into its solution.”

The short version of this email is “trust me,” or, alternatively, spend the next ten years of your life pondering and contemplating these issues while reading about them, and then you’ll have a pretty good grasp of what good writing means. Writing is one of those 10,000-hour skills: it probably takes 10,000 hours of deliberate practice to get good. Start now and you’ll be better in a couple of years.

Why you should become a nurse or physician assistant instead of a doctor: the underrated perils of medical school

Many if not most people who go to medical school are making a huge mistake—one they won’t realize they’ve made until it’s too late to undo.

So many medical students, residents, and doctors tell the same story that they’ve inspired me to explain, in detail, the underappreciated yet essential problems with medical school and residency. Furthermore, most people don’t realize that nursing provides many of the job-security advantages of medicine without binding practitioners to at least a decade, and probably a lifetime, of finance-induced servitude.

The big reasons to be a doctor are: a) lifetime earning potential; b) the limited number of doctors credentialed annually, which implies that doctors can restrict supply and thus will always have jobs available; c) higher perceived social status; and d) a desire to “help people” (there will be much more on the dubious value of that last one below). But there are numerous problems with these reasons: a) it takes a long time for doctors to make that money; b) it’s almost impossible to gauge whether you’ll actually like a profession, or the process of joining it, until you’re in it; c) most people underestimate opportunity costs; and d) you have to be able to help yourself before you can help other people.

Straight talk about doctors and money.

Although many doctors will eventually make a lot of money, they take a long time to get there. Nurses, by contrast, can start making real salaries of around $50,000 when they’re 22. Doctors can’t start making real money until they’re at least 29, and often not until they’re much older.

Keep that in mind when you’re reading the following numbers.

Student Doctor reports that family docs make about $130–$200K on average, which sounds high compared to what I’ve heard on the street; Student Doctor’s numbers also don’t discuss hours. The Bureau of Labor Statistics—a much more reliable source—reports that primary care physicians make an average of $186,044 per year. Notice, however, that’s an average, and it doesn’t take into account overhead. The table accompanying the BLS data indicates that more than 40% of doctors are in primary care specialties, and that family and general practice doctors make a career median annual wage of $163,510.

Nurses, by contrast, make about $70K a year. They also have a lot of market power—especially skilled nurses who might otherwise be doctors. Christine Mackey-Ross describes these economic dynamics in “The New Face of Health Care: Why Nurses Are in Such High Demand.” There’s an obvious reason for nurses’ market power: medical costs are rising and residency programs have a stranglehold on the doctor supply.

This is pretty sweet if you’re already a doctor, because it means you have very little competition and, if you choose a sufficiently demanding specialty, you can make a lot of money. But it’s pretty lousy for the healthcare system as a whole, which is lurching in the direction of finding ways to provide healthcare at lower costs. Like, say, through nurses.

Those nurses are going to end up competing with primary care docs. The New York Times recently published, “U.S. Moves to Cut Back Regulations on Hospitals,” which includes this:

Under the proposals, issued with a view to “impending physician shortages,” it would be easier for hospitals to use “advanced practice nurse practitioners and physician assistants in lieu of higher-paid physicians.” This change alone “could provide immediate savings to hospitals,” the administration said.

Primary care docs are increasingly going to see pressure on their wages from nurse practitioners for as long as health care costs outstrip inflation. For more on the subject, see “Yes, the P.A. Will See You Now,” which starts this way:

Ever since he was a hospital volunteer in high school, Adam Kelly was interested in a medical career. What he wasn’t interested in was the lifestyle attached to the M.D. degree. “I wanted to treat patients, but I wanted free time for myself, too,” he said. “I didn’t want to be 30 or 35 before I got on my feet — and then still have a lot of loans to pay back.”

To recap: nurses start making money when they’re 22, not 29, and they are eating into the market for primary care docs. Quality of care is a concern, but the evidence thus far shows no difference between nurse practitioners who act as primary-care providers and MDs who do. If you’re thinking about being an MD, you should study this issue carefully. You should also be aware that calls to lower doctor pay, like the one found in Matt Yglesias’s “America’s Overpaid Doctors: We pay our doctors way too much,” are likely to grow louder. Note that I’m not taking a moral or economic stance about whether physician pay should be higher or lower: I’m arguing that the pressure on doctors’ pay is likely to increase, and Yglesias’s arguments are one form that pressure is likely to take.

One friend, who read this and who is a medical resident, simply said that she “didn’t realize that I was looking for nursing.” Or being a physician’s assistant (P.A.). She hated her third year of medical school, as most med students do, and got shafted in her residency—which she effectively can’t leave. Adam Kelly is right: more people should realize what “the lifestyle attached to an M.D. degree” means. They should also understand “The Bullying Culture of Medical School” and residency, which is pervasive and pernicious—and contributes to the relationship failures that notoriously plague the medical world. Yet med schools and residencies can get away with this because they have you by the loans.

Why would my friend have realized that she wanted to be a nurse? Our culture doesn’t glorify nursing the way it does doctoring. So a lot of high achievers think being a doctor is the optimal road to success in the medical world. They pay attention to those eye-popping surgeon salary numbers and to the rhetoric about helping people without realizing that nurses help people too, or that their desire to help people is likely to be pounded out of them by a cold, uncaring system that uses the rhetoric of helping to sucker undergrads into mortgaging their souls to student loans. Through the magic of student loans, schools are steadily siphoning off more of doctors’ lifetime earnings. Given constraints and barriers to entry into medicine, I suspect med schools and residencies will be able to continue doing so for the foreseeable future. The logical thing for you, as an individual, to do is exit the market because you have so little control over it. Most prospective doctors who don’t exit the market live to regret it.

Sure, $160K probably sounds like a lot to a random 21-year-old college student, because it is, but after taking into account the investment value of money, student loans for undergrad, student loans for med school, how much nurses make, and residents’ salaries, most doctors’ earnings probably fail to outstrip nurses’ earnings until well after the age of 40. Dollars per hour worked probably don’t outstrip nurses’ earnings until even later.
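To make the arithmetic concrete, here’s a minimal back-of-the-envelope sketch in Python. Every number in it is an assumption chosen to echo the rough figures above (a nurse at $50–70K, a resident at $50K, a family doc at $160K, med school at $50K a year in borrowing), discounted to reflect the time value of money; it deliberately ignores loan interest, undergrad debt, taxes, and hours worked, all of which push the crossover even later.

```python
# A rough sketch of the nurse-vs-doctor earnings race. All figures are
# assumptions meant to echo the numbers cited above, not data.
RATE = 0.05  # annual discount rate: a dollar at 22 beats a dollar at 40

def nurse_income(age):
    return 50_000 if age < 30 else 70_000

def doctor_income(age):
    if age < 26: return -50_000   # med school: borrowing, no income
    if age < 29: return 50_000    # residency
    return 160_000                # attending family doc

def crossover_age(start=22, end=70):
    nurse_total = doctor_total = 0.0
    for age in range(start, end):
        discount = (1 + RATE) ** (age - start)
        nurse_total += nurse_income(age) / discount
        doctor_total += doctor_income(age) / discount
        if doctor_total > nurse_total:
            return age
    return None

print(crossover_age())  # prints 35 under these assumptions
```

Even under these doctor-friendly assumptions, the doctor’s discounted cumulative earnings don’t pass the nurse’s until the mid-30s; factor in loan interest, taxes, and dollars per hour worked, and the crossover slides toward the 40s.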

To some extent, you’re trading happiness, security, dignity, and your sex life in your 20s, and possibly early 30s, for a financial opportunity that might not pay off until your 50s.

The higher social status thing is nice, but not nearly as nice when you’re exhausted at 3 a.m. as a third-year, or exhausted at 3 a.m. as a first-year resident, or exhausted at 3 a.m. as a third-year resident and you’re 30 and you just want a quasi-normal life, damnit, and maybe some time to be an artist. Or when you’re exhausted at 3 a.m. as a regular doctor who happens to be on-call because the senior doctors at the HMO know how to stiff the newbies.

This is where prospective medical students say, “I’m not going to be a family medicine doc.” Which is okay: maybe you won’t be (have fun in seven years of residency instead of three). But don’t confuse the salaries of superstar specialties like neurosurgery and cardiology with the average experience; more likely than not you’re average. There’s this social ideal of doctors being rich. Not all of them are, even with barriers to entry in place.

To belabor the point about money, The Atlantic recently published this: “The average female primary-care physician would have been financially better off becoming a physician assistant.” Notice: “Interestingly, while the PA field started out all male, the majority of graduates today are female. The PA training program is generally 2 years, shorter than that for doctors. Unsurprisingly, subsequent hourly earnings of PAs are lower than subsequent hourly earnings of doctors.” Although the following sentence doesn’t use the word “opportunity costs,” it should: “Even though both male and female doctors both earn higher wages than their PA counterparts, most female doctors don’t work enough hours at those wages to financially justify the costs of becoming a doctor.” I’m not arguing that women shouldn’t become doctors. But I am arguing that women and men both underestimate the opportunity costs of med school, and, if they did understand those costs, fewer would go.

Plus, if you get a nursing degree, you can still go to medical school (as long as you have the prerequisite courses; hell, you can major in English and go to med school as long as you take the biology, math, physics, and chemistry courses that med schools require). Apparently some medical schools will sniff at nurses who want to become doctors because of the nursing shortage and, I suspect, because med schools want to maintain a clear class / status hierarchy with doctors at the top. After all, med schools are run by doctors. But the reality is simpler: medical schools want people with good MCAT scores and GPAs. Got a 4.0 and whatever a high MCAT score is? A med school will defect and take you.

The underrated miseries of residency.

As one friend said, “You can see why doctors turn into the kind of people they do.” He meant that the system itself lets patients abuse doctors, doctors abuse residents, and for people to generally treat each other not like people, but like cogs. At least if you get a nursing degree and hate nursing, you can quit without having taken completely obscene student loans. You can probably go back to school and get a second degree in twelve to twenty-four months.

In normal jobs, if you hear about a better opportunity in another company or industry, you can pursue it. If you’re sufficiently dissatisfied with your university, you can transfer.[1] Many academic grad schools make quitting easy. Residencies don’t. The residency market is tightly controlled by residency programs that want to restrict residents’ autonomy—and thus their wages and bargaining power. Once you’re in a residency, it’s very hard to leave, and you can only do so at particular times, in the gap between each residency year.

This is a recipe for exploitation, and many of the labor battles during the first half of the twentieth century were fought to prevent employers from wielding this kind of power. For medical residents, however, employers have absolute power over you enshrined in law. Most other fields don’t have this level of coercion.

Once a residency program has you, they can do almost anything they want to you, and you have very little leverage. In a normal employment situation, if an employer turns out to suck, you quit. You work for three months, realize it’s not for you, walk out. Residency programs don’t let you. You don’t want to be in situations where you have no leverage, but that’s precisely what happens the moment you enter the “match.”

The match occurs in the second half of your fourth year of medical school. You apply to residencies in the first half of your fourth year, interview at various places, and then list the residencies you’re interested in. Residency program directors then rank you, and the National Residency Match Program “matches” you to each other. You’re then obligated to attend that residency program. You can’t privately negotiate with other programs, as you can for, say, undergrad admissions, or med school admissions, or almost any other normal employment situation. You can’t say, “How about another 5 grand?” or “Can I modify my contract to give me fewer days?” If you refuse to accept your “match,” then you’re blackballed from re-entering for the next three years.
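The match is, at its core, a stable-matching procedure in the Gale–Shapley family (the production version, the Roth–Peranson algorithm, also handles couples, partial rank lists, and other complications). Here’s a minimal sketch of applicant-proposing deferred acceptance in Python; the names and preference lists are invented for illustration:

```python
# A toy version of applicant-proposing deferred acceptance, the core of
# stable-matching systems like the residency match. Simplified: no
# couples handling, invented names and rank lists.
def match(applicant_prefs, program_prefs, quotas):
    # Precompute each program's ranking for O(1) comparisons.
    rank = {p: {a: i for i, a in enumerate(prefs)}
            for p, prefs in program_prefs.items()}
    free = list(applicant_prefs)            # applicants still proposing
    next_choice = {a: 0 for a in free}      # index of next program to try
    tentative = {p: [] for p in program_prefs}

    while free:
        a = free.pop()
        prefs = applicant_prefs[a]
        if next_choice[a] >= len(prefs):
            continue                        # exhausted list: unmatched
        p = prefs[next_choice[a]]
        next_choice[a] += 1
        if a not in rank[p]:
            free.append(a)                  # program didn't rank a
            continue
        tentative[p].append(a)
        if len(tentative[p]) > quotas[p]:
            # Over quota: keep the most-preferred applicants, bump the worst.
            tentative[p].sort(key=lambda x: rank[p][x])
            free.append(tentative[p].pop())
    return tentative

applicants = {"alice": ["mercy", "county"],
              "bob": ["county", "mercy"],
              "carol": ["county"]}
programs = {"mercy": ["bob", "alice"],
            "county": ["carol", "alice", "bob"]}
print(match(applicants, programs, {"mercy": 1, "county": 1}))
# {'mercy': ['bob'], 'county': ['carol']}; alice goes unmatched.
```

The point of the sketch isn’t the code; it’s that the procedure is binding by design: once rank lists are submitted, the algorithm decides, and no one negotiates outside it.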

Once I realized how nasty the residency match process is and how fundamentally unfair the labor market for residents is, I was shocked: residency programs have formed a cartel designed to control costs and reduce employee autonomy, and hence salaries. I only went to law school for a year, by accident, but even I know enough law and history to recognize the kind of situation that anti-trust laws are supposed to address in order to protect workers. When my friend entered the match process like a mouse into a snake’s mouth, I became curious, because the system’s cruelty, exploitation, and unfairness to residents make it an obvious example of employers banding together to harm employees. Lawyers often get a bad rap in our society, and sometimes for good reasons, but a case like this looked ripe to me.

It turns out that I’m not a legal genius and that lawyers have noticed this anti-trust violation. So an anti-trust lawsuit was filed. You can read about it in the NYTimes, including a grimly hilarious line about how “The defendants say the Match is intended to help students and performs a valuable service.” Ha! A valuable service to employers, since employees effectively can’t quit or negotiate with individual employers. Yes, indeed, curtailing employee power by distorting markets is a valuable service. The article also noted this bit of regulatory capture:

Meanwhile, the medical establishment, growing increasingly concerned about the legal fees and the potential liability for hundreds of millions of dollars in damages, turned to Congress for help. They hired lobbyists to request legislation that would exempt the residency program from the accusations. A rider, sponsored by Senators Edward M. Kennedy, Democrat of Massachusetts, and Judd Gregg, Republican of New Hampshire, was attached to a pension act, which President Bush signed into law in April.

In other words, employers bought Congress and President Bush in order to screw residents.[2] If you attend med school, you’re agreeing to be screwed for three to eight years after you’ve incurred hundreds of thousands of dollars of debt, and you have few if any legal rights to attack the exploitive system you’ve entered.

(One question I have for knowledgeable readers: do you know of any comprehensive discussion of residents and unions? Residents can apparently unionize—which, if I were a medical resident, would be my first order of business—but the only extended treatment of the issue I’ve found so far is here, which deals with a single institution. Given how poorly many residents are treated, I’m surprised there haven’t been more unionization efforts, especially in union-friendly, resident-heavy states like California and New York. One reason might be simple: people fear being blackballed at their ultimate jobs, and a lot of residents seem to have Stockholm Syndrome.)

Residency program directors will no doubt argue that residency is set up the way it is because the residency experience is educational. So will doctors. There’s a very good reason they argue for residency being essential: they have a stake in the process. Residency directors and other administrators make money off residents who work longer hours and don’t have alternatives. So we shouldn’t be surprised that they seek other legal means of restricting competition (indeed, so much of the fight around medicine isn’t about patient care—it’s about regulatory environments and legislative initiatives. For one recent but very small example of the problems, see “When the Nurse Wants to Be Called ‘Doctor’,” concerning nursing doctorates.)

I don’t buy their arguments, and not merely for ad hominem reasons: the education at many residency programs is tenuous at best. One friend, for example, is in a program that requires residents to attend “conference,” where they are supposed to learn. But “conference” usually degenerates into someone nattering while most of the residents read or check their phones. Conference is mandatory, regardless of its utility. Residents aren’t 10-year-olds, yet they’re treated as such.

These problems are well-known (“What other profession routinely kicks out a third of its seasoned work force and replaces it with brand new interns every year?”). But there’s no political impetus to do anything about it: doctors like limiting their competition, and people are still fighting to get into medical school.

When you enter the military, you usually make a four-year commitment. Even ROTC only demands a four- to five-year commitment after graduation—at which point you can choose to do something else. Medicine is, in effect, at least a ten-year commitment: four of medical school, at least three of residency, and at least another three to pay off med school loans. At which point a smiling twenty-two-year-old graduate will be a glum thirty-two-year-old doctor who doesn’t entirely get how she became a doctor anyway, and who might wish she could tell her earlier self the things that earlier self didn’t know.

Contrast this experience with nursing, which requires only a four-year degree. At the same time, as John Goodman points out in “Why Not A Nurse?”, nursing is much less heavily or uniformly regulated than doctoring. Nurses can move to Oregon:

Take JoEllen Wynne. When she lived in Oregon, she had her own practice. As a nurse practitioner, she could draw blood, prescribe medication (including narcotics) and even admit patients to the hospital. She operated like a primary care physician and without any supervision from a doctor. But, JoEllen moved to Texas to be closer to family in 2006. She says, “I would have loved to open a practice here, but due to the restrictions, it is difficult to even volunteer.” She now works as an advocate at the American Academy of Nurse Practitioners.

and, based on the article, avoid Texas. Over time, we’ll see more articles like “Why Nurses Need More Authority: Allowing nurses to act as primary-care providers will increase coverage and lower health-care costs. So why is there so much opposition from physicians?” Doctors will oppose this, because it’s in their economic self-interest to avoid more competition.

The next problem with becoming a doctor involves what economists call “information asymmetry.” Most undergraduates making life choices don’t realize the economic problems I’ve described above, let alone some of the other problems I’m going to describe here. When I lay out the facts about becoming a doctor to my freshmen, many of those who want to be doctors look at me suspiciously, like I’m offering them a miracle weight-loss drug or have grown horns and a tail. “No,” I can see them thinking, “this can’t be true because it contradicts so much of what I’ve been implicitly told by society.” They don’t want to believe. Which is great—right up to the point they have to live their lives and see how those lives are being shaped by forces that no one told them about. Just like no one told them about opportunity costs or what residencies are really like.

Medical students and doctors have complained to me about how no one told them how bad it is. No one really told them, that is. I’m not sure how much of this I should believe, but, at the very least, if you’re reading this essay you’ve been told. I suspect a lot of now-doctors were told or had an inkling of what it’s really like, but they failed to imagine the nasty reality of 24- or 30-hour call. They ignore information that conflicts with their current belief system about the glamor of medicine to avoid cognitive dissonance (as we all do: this is part of what Jonathan Haidt points out in The Righteous Mind, as does Daniel Kahneman in Thinking, Fast and Slow). Many now-doctors, even if they were aware, probably ignored that awareness and now complain—in other words, even if they had better information, they’d have ignored it and continued on their current path. They pay attention to status and money instead of happiness.

For example, Penelope Trunk cites Daniel Gilbert’s Stumbling on Happiness and says:

Unfortunately, people are not good at picking a job that will make them happy. Gilbert found that people are ill equipped to imagine what their life would be like in a given job, and the advice they get from other people is bad, (typified by some version of “You should do what I did.”)

Here are some other vital takeaways from Stumbling on Happiness: [3]

* Making more than about $40,000/year does little to improve happiness (this should probably be greater in, say, NYC, but the main point stands: people think money and happiness show a linear correlation when they really don’t).

* Most people value friends, family, and social connections more than additional money, at least once their income reaches about $40K/year. If you’re trading time with friends and family for money, or, worse, for commuting, you’re making a tremendous, doctor-like mistake.

* Your sex life probably matters more than your job, and many people mis-optimize in this area. I’ve heard many residents and med students say they’re too busy to develop relationships or have sex with their significant others, if they manage to retain one or more, and this probably makes them really miserable.

* Making your work meaningful is important.

Go to med school without reading Gilbert at your own peril. No one in high school or college warns you of the dangers of seeking jobs that harm your sex life, because high schools are too busy trying to convince you not to have one. So I’m going to issue the warning: if you take a job that makes you too tired to have sex or too tired to engage in contemporary mate-seeking behaviors, you’re probably making a mistake. Medical students are signing up for three to six years of this condition, which may explain why so many of them are miserable; they’ve failed to optimize the things that probably would make them happy, like getting more action. They’ve made bad trade-offs without really realizing that they’ve made them.

The sex-life issue might be overblown, because people who really want to have one find a way to have one; some med students and residents are just offering the kinds of generic romantic complaints that everyone stupidly offers, and which mean nothing more than discussion about the weather. You can tell what a person really wants by observing what they do, rather than what they say. But med students and residents have shown enough agony over trade-offs and time costs to make me believe that med school does generate a genuine pall over romantic lives. There is a correlation-is-not-causation problem—maybe med school attracts the romantically inept—but I’m willing to assume for now that it doesn’t.

The title of Trunk’s post is “How much money do you need to be happy? Hint: Your sex life matters more.” If you are doing a job that consistently makes you too tired for sex, you are doing things wrong and need to re-prioritize. If you’re a nurse, you can work three twelves a week, or thirty-six total hours, and be okay. But, as described above, being a doctor doesn’t let you re-prioritize. Proto-doctors screw up their 20s and 30s, sexually speaking, because they’ve committed to a job that’s so cruel to its occupants that, if doctors were equally cruel to patients, those doctors would be sued for malpractice. And the student loans mean you effectively can’t quit. You’ve traded sex for money and gotten a raw deal. You’ll also be surrounded by people who are miserable and uptight—and who have also mis-prioritized.

You probably also don’t realize how ill-equipped you are to imagine what your life would be like as a doctor, because a lot of doctors sugarcoat their jobs, or because you don’t know any actual doctors. So you extrapolate from people who say, “That’s great” when you say you want to be a doctor. If you say you’re going to stay upwind and see what happens, they don’t say, “That’s great,” because they simply think you’re another flaky college student. But saying “I want to go to med school” or “I want to go to law school” isn’t a good way to seem level-headed (though I took the latter route; fortunately, I had the foresight to quit). Those routes, if they once led to relative success and happiness, don’t any more, at least for most people, who can’t imagine what life is like on the other end of the process. With law, at least the process is three years, not seven or more.

No one tells you this because there’s still a social and cultural meme about how smart doctors are. Some are. Lots are very good memorizers and otherwise a bit dull. And you know what? That’s okay. Average doctors seeing average patients for average complaints are fixing routine problems and directing traffic when it comes to problems they can’t solve. Medicine, meanwhile, doesn’t select for being well-rounded, innovative, or interesting; if anything, it selects against those traits through its relentless focus on test scores and so forth, which don’t appear to correlate strongly with being interesting or intellectual.

You’re not necessarily associating with the great minds of your generation by going to medical school; you may not even really be associating with great minds. You might just be associating with excellent memorizers. I didn’t realize this until I met a fair number of doctors, had repeated stabs at real conversations with them, and eventually realized that many aren’t intellectually curious and imaginative. There are, of course, plenty of smart, intellectually curious doctors, but given the meme about the intelligence of doctors, there are fewer than I expected, and plenty who see themselves as skilled technicians and little more.

A lot of doctors are the smartest stupid people you’ve ever met. Smart, because they’ve survived the academic grind. Stupid, because they signed up for med school, which is effectively signing away extraordinarily valuable options. Life isn’t a videogame. You don’t get a reset button, a do-over. Once your 20s are gone, they’re gone forever.

Maybe your 20s are supposed to be confusing. Although I’m still in that decade, I’m inclined to believe this idea. Medical school offers a trade-off: your professional life isn’t confusing and you have a clear path to a job and paycheck. If you take that path, your main job is to jump through hoops. But it offers that clarity of professional purpose at great cost in terms of hours worked, debt assumed, and, perhaps worst of all, flexibility. Given that set of trade-offs, I think a lot of people who become doctors would be better off with the standard confusion, but they take the clear path out of fear—which is the same thing that drives so many bright but unfocused liberal grads into law schools.

I’ve already mentioned prestige and money as two of the big reasons people go to med school. Here’s another: fear of the unknown. I think a lot of people start med school because it’s a clearly defined, well-lit path. The problem is that such paths are becoming increasingly crowded. You can fight the crowd, or you can find another way. Most people are scared of the other way. They shouldn’t be, and they wouldn’t be if they knew what graduate school paths are like.

For yet another perspective on the issue of not going to med school, see Ali Binazir’s “Why you should not go to medical school — a gleefully biased rant,” which has more than 200 comments as of this writing. As he says, there’s only one thing that should drive you to med school: “You have only ever envisioned yourself as a doctor and can only derive professional fulfillment in life by taking care of sick people.” If you can only derive professional fulfillment in life by taking care of sick people, however, you should remember that you can do so by being a nurse or a physician’s assistant. And notice the words Binazir chooses: he doesn’t say, “help people”—he says “taking care of sick people.” The path from this feeling to actually taking care of sick people is a long, miserable one. And you should work hard at envisioning yourself as something else before you sign up for med school.

You can help people in all kinds of ways; the most obvious ones are by having specialized, very unusual skills that lots of people value. Alternately, think of a scientist like Norman Borlaug (I only know about him through Tyler Cowen’s book The Great Stagnation; in it, Cowen also observes that “When it comes to motivating human beings, status often matters at least as much as money.” I suspect that a lot of people going to medical school are really doing it for the status). Borlaug saved millions of lives by developing hardier seeds and through other work as an agronomist. I don’t want to say something overwrought and possibly wrong like, “Borlaug has done more to help people than the vast majority of doctors,” since that raises all kinds of questions about what “more” and “help” and “vast majority” mean, but it’s fair to use him as an example of how to help people outside of being a doctor. Programmers, too, write software that can be instantly disseminated to billions of people, and yet those who want to “help” seldom think of programming as a helping profession, even though it is.

For a lot of the people who say they want to be a doctor so they can help people, greater intellectual honesty would lead them to acknowledge mixed motives in which helping people is only one and perhaps not the most powerful. On the other hand, if you really want to spend your professional life taking care of sick people, Binazir is right. But I’m not sure you can really know that before making the decision to go to medical school, and, worse, even if all you want to do is take care of sick people, you’re going to find a system stacked against you in that respect. You’re not taking the best care of people at 3 a.m. on a 12- to 24-hour shift in which your supervisors have been screaming at you and your program has been jerking your schedule around like a marionette all month, leaving your sleep schedule out of whack. Yeah, someone has to do it, but it doesn’t have to be you, and if fewer people were struggling to become doctors, the system itself would have to change to entice more people into medical school.

One other, minor point: you should get an MD and maybe a PhD if you really, really want to do medical research. But that’s a really hard thing for an 18- to 22-year-old to know, and most doctors aren’t researchers. Nonetheless, nurses (usually) aren’t involved in the same kind of research as research MDs. I don’t think this point changes the main thrust of my argument.

Very few people will tell you this, or will tell you even if you ask; Paul Graham even writes about a doctor friend in his essay “How to Do What You Love”:

A friend of mine who is a quite successful doctor complains constantly about her job. When people applying to medical school ask her for advice, she wants to shake them and yell “Don’t do it!” (But she never does.) How did she get into this fix? In high school she already wanted to be a doctor. And she is so ambitious and determined that she overcame every obstacle along the way—including, unfortunately, not liking it.

Now she has a life chosen for her by a high-school kid.

When you’re young, you’re given the impression that you’ll get enough information to make each choice before you need to make it. But this is certainly not so with work. When you’re deciding what to do, you have to operate on ridiculously incomplete information. Even in college you get little idea what various types of work are like. At best you may have a couple internships, but not all jobs offer internships, and those that do don’t teach you much more about the work than being a batboy teaches you about playing baseball.

Having a life chosen for you by a 19-year-old college student or 23-year-old wondering what to do is only marginally better.

By the way, I’m far from the first person to notice that people don’t always understand what they’ll be like when they’re older; in “Aged Wisdom,” Robin Hanson says:

You might look inside yourself and think you know yourself, but over many decades you can change in ways you won’t see ahead of time. Don’t assume you know who you will become. This applies all the more to folks around you. You may know who they are now, but not who they will become.

This doesn’t surprise me anymore. Now I acknowledge that I’m very unlikely to be able to gauge what I’ll want in the future.

Contemplate too the psychological makeup of many med students. They (or “you,” if you prefer) are almost always good rule-followers and test-takers; they tend to be very good on tracks but perhaps not so good outside of them. Prestige is very important, as is listening to one’s elders (who may or may not understand how the world is changing in fundamental ways). They may find the real world large and scary, while the academic world is small, highly directed, and sufficiently confined to prevent intellectual or monetary agoraphobia.

These issues are addressed well in two books: Excellent Sheep by William Deresiewicz and Zero to One by Peter Thiel and Blake Masters. I won’t endorse everything in either book, but pay special attention to their discussions of the psychology of elite students and especially the weaknesses that tend to appear in that psychology. It is not easy for anyone to accept criticism, but that may be particularly true of potential med students, who have probably spent much of their lives being told how “smart” they are, or supposedly are. Being smart in the sense of passing classes and acing tests may not necessarily lead you towards the right life, and, moreover, graduate schools and consulting have evolved to prey on your need for accomplishment, positive feedback, and clear metrics. You are the food they need to swallow and digest. Think long and hard about that.

If you don’t want to read Excellent Sheep and Zero to One, or think you’re “too busy,” I’m going to marvel: you’re willing to commit hundreds of thousands of dollars and years of your life to a field that you’re not willing to spend $30 and half a day to understand better? That’s a dangerous yet astonishingly common level of willful ignorance.

Another friend asked what I wanted to accomplish with this essay. The small answer: help people understand things they didn’t understand before. The larger answer—something like “change medical education”—isn’t very plausible because the forces encouraging people to be doctors are so much larger than me. The power of delusion and prestige is so vast that I doubt I can make a difference through writing alone. Almost no writer can: the best one can hope for is changes at the margin over time.

Some med school stakeholders are starting to recognize the issues discussed in this essay: for example, The New York Times has reported that New York University’s med school may be able to shorten its duration from four years to three, and “Administrators at N.Y.U. say they can make the change without compromising quality, by eliminating redundancies in their science curriculum, getting students into clinical training more quickly and adding some extra class time in the summer.” This may be a short-lived effort. But it may also be an indicator that word about the perils of med school is spreading.

I don’t expect this essay to have much impact. It would require people to a) find it, which most probably won’t do, b) read it, which most probably won’t do, c) understand it, which most of those who read it won’t or can’t do, and d) implement it. Most people don’t seem to give their own futures much real consideration. I know a staggering number of people who go to law or med or b-school because it “seems like a good idea.” Never mind the problem with following obvious paths, or the question of opportunity costs, or the difficulty in knowing what life is like on the other side. People just don’t think that far ahead, and, even if they have knowledge like that contained in this essay, I’m not sure they’ll use it. I’m already imagining people on the Internet who are thinking about going to med school and who see the length of this essay and decide it’s not worth it—as if they’d rather spend a decade of their lives gathering the knowledge they could read in an hour. They just don’t understand the low quality of life medicine entails for many if not most doctors.

I’m not telling you what to do. I rarely tell anyone what to do. I’m describing trade-offs and asking if you understand them. It appears that few people do. Have you read this essay carefully? If not, read it again. Then at least you won’t be one of the many doctors who hate what they do, warn others that doctors are sick of their profession, and wish they’d been wise enough to take a different course.

If you enjoyed this essay, you should also read my novel, Asking Anna. It’s a lot of fun for not a lot of money!


[0] Here’s another anti-doctor article: “Why I Gave Up Practicing Medicine.” The anti-med-school lit is available, if you care to seek it. Most potential med students don’t seem to. If you read the literature and understand the perils and want to go anyway, great.

[1] One could argue that many of the problems in American K – 12 education stem from a captive audience whose presence or absence in a school is based on geography and geographical accidents rather than the school’s merit.

[2] You can read more about the Match lawsuit here. Europe doesn’t have a Match-like system; there, the equivalent of medical residency is much more like a job.

[3] You should read Stumbling on Happiness; it did more to change my life and perspective than almost any other book. And I’ve read thousands of books. Maybe tens of thousands. Yet this one probably does more than any other to influence my day-to-day decisions and practices by clarifying how a lot of what people say they value they don’t, and how a lot of us make poor life choices based on perceived status that end up screwing us. Which is another way of saying we end up screwing ourselves. Which is what a lot of medical students, doctors, and residents have done; no one holds the proverbial gun to your head and orders you into med school (unless you have exceptionally fanatical parents). When you’re doing life, at least in industrialized Western countries, you mostly have yourself to blame for poor choices made without enough knowledge to choose better.

Thanks to Derek Huang and Bess Stillman for reading this essay.
