Why unpublished novelists keep writing: why not? An answer as to why this one does

Alix Christie’s “We Ten Million” asks why unpublished novelists write, the number being an estimate of the number of unpublished novels out there (hat tip Heather Horn). Very few books get published; very few that do get any attention; very few of those even make any money; and delusion is a vital skill for many who continue writing. Rationally, most of these would-be writers would probably be better off if they quit writing and did something economically and socially more productive with their time, like working for Wal-Mart, digging holes and filling them in, writing blogs about their cats, etc.

According to Horn, possible answers include: the idea of a craft, the importance of literature (even if it’s unread?), the need for story, and art as courage. I’m not sure I buy any of those, or any of Christie’s answers. I think the real reason is simpler: novelists keep writing because they basically like the act of writing novels. Publishing, fame, fortune, and all the rest would be nice, as they certainly would be for this unpublished writer with an inbox full of requests for fulls and partials (industry lingo for “send me the full manuscript” or “send me some chapters”) from agents, but the possibility of unlikely future accolades doesn’t fuel the work on a daily basis. Instead, the daily drive to succeed is about the material itself. I’ve mentioned this famous quote before and will do so again: “Robertson Davies, the great Canadian novelist, once observed: ‘There is absolutely no point in sitting down to write a book unless you feel that you must write that book, or else go mad, or die.’ ”

The people writing unpublished novels are presumably doing so in lieu of going mad or dying. They feel they have to or need to write.

In a recent post, I wrote about an exchange with a friend who’s an undergrad:

A lot of my motivation comes from a fantasy of myself-as-_____, where the role that fills the blank tends to change erratically. Past examples include: writer, poet, monk, philosopher, womanizer. How long will the physicist/professor fantasy last? 

I replied:

This is true of a lot of people. One question worth asking: Do you enjoy the day-to-day activities involved with whatever the fantasy is? For me, the “myself-as-novelist” fantasy continues to be closer to fantasy than reality, although “myself-as-writer” is definitely here. But I basically like the work of being a novelist: I like writing, I like inventing stories, I like coming up with characters, plot, etc. Do I like it every single day? No. Are there some days when it’s a chore to drag myself to the keyboard? Absolutely. And I hate query letters, dealing with agents, close calls, etc. But I like most of the stuff and think that’s what you need if you’re going to sustain something over the long term. Most people who are famous or successful for something aren’t good at the something because they want to be famous or successful; they like the something, which eventually leads to fame or success or whatever.

“I basically like the work of being a novelist,” including the writing and so forth. That’s why I keep going. I think anyone who continues for any other reason is probably already mad, to use Davies’ term. Alternatively, a lot of the would-be novelists out there are probably writing not because they want to get published, but to work out their inner demons, or signal something, or because they don’t know what else to do with their lives, or because they’re misinformed. They’re doing something other than really trying to write something that someone else might want to read.

I’m reminded of a passage from Norah Vincent’s nonfiction book Self-Made Man, in which she describes dressing like and passing as a man. Vincent, dressed as a man named “Ned,” describes going out with a woman he met on an online dating site, who “was either the most conversationally inconsiderate person I’d ever met or the most socially impervious”:

Clearly she wasn’t ready to start dating again. She wasn’t looking for a relationship. She was looking for distraction and an ear to tell her troubles to. She didn’t have enough emotional energy left to get seriously involved with Ned [. . .]

A lot of would-be writers are probably doing much the same. I’d guess that relatively few of those ten million novels are publishable, and that few of the writers of those novels have any clue what something like “publishable” might mean (I didn’t when I started, which might’ve helped me; more on that below). As Laura Miller says regarding the “slush pile” of unsolicited queries agents and publishers get:

You’ve either experienced slush or you haven’t, and the difference is not trivial. People who have never had the job of reading through the heaps of unsolicited manuscripts sent to anyone even remotely connected with publishing typically have no inkling of two awful facts: 1) just how much slush is out there, and 2) how really, really, really, really terrible the vast majority of it is. Civilians who kvetch about the bad writing of Dan Brown, Stephenie Meyer or any other hugely popular but critically disdained novelist can talk as much trash as they want about the supposedly low standards of traditional publishing. They haven’t seen the vast majority of what didn’t get published — and believe me, if you have, it’s enough to make your blood run cold, thinking about that stuff being introduced into the general population.

So you can probably knock off at least 90% of those unpublished novels as not even being serious attempts, where “serious” means “at least thinking about what makes good novels good and bad novels bad.” Of those serious attempts, a lot of them are probably written by people who will one day be good but aren’t yet (Charlie Stross, the SF writer: “[. . .] I was averaging 1-2 novels a year, for very approximate values of “novel”. (They weren’t publishable. I was writing my million words of crap. You don’t want to read them, honest.)”). John Scalzi says something similar: “Writing an entire novel is something most people have to work up to,” and it’s really hard.

I started four novels and wisely abandoned them. I finally wrote two feature-complete novels, in the sense that they had beginnings and ends and middles that more or less led to the ends, but they were terrible, and I sent them to agents and got deservedly rejected. If you were one of those slush-pile readers, I apologize, but those attempts were so far in the past that you’ve probably forgotten them. Then I wrote the last three novels over the last three or so years and started getting those requests for fulls and partials, which was a lot like the typical dating experience in that they ended with variations of “I like you, but not in that way.”

Nonetheless, I would like to think I can stand far enough back from myself to say that, at the very least, they’re publishable, and, I think, quite fun. Eventually I assume I will write something that gets a literary agent or press to agree with me—or I’ll go mad or die before that day arrives. Between now and then, I keep writing mostly because a) I’m an idiot (this shouldn’t be discounted) and b) I mostly like the work, as I described above. The second might seem a minor variation on what Christie says—“the only reason is my belief that I have got a story that I must tell”—but it’s a sufficiently important one that I’ll forward it here.

The function of stories in society and some of that other stuff is good, but I’m still guessing that my real reason (and, probably, hers) is that I like to write, which is slightly different from having a story to tell. I suspect the same is true of most artists and intellectuals and hackers; even most hacker/programmer types probably like the fact that they can change the world with their code, and so forth, but their big motivation is probably solving problems and writing code. Notice how the verb “writing” takes on a noun—code—that “writing prose” has lost. The word shows the similar impetus underlying both activities.

I’m not a hacker because, although I’ve written a little bit of code, I don’t like doing it all that much. If I did, it would’ve been vastly smarter to pursue that than to continue what I’m doing now. At least I’ve done enough to appreciate how hard it is to write code. And those who write good code are rewarded for their skill. Good hackers, programmers, or computer scientists (take your pick; each has its shades of connotation but denotes more or less the same activity) make a lot of money, and the smart ones often have an immediate, tangible effect on the world. This is sometimes but not always true of writers. But when I began writing fiction with some level of seriousness, I didn’t sit down and say to myself, “What is the optimal path?” I had some ideas and began typing. A depressingly large number of years later, I’m still doing the same basic thing in a way that might be detrimental to my own best interests. So why do I keep going? Why am I part of the ten million?

Because I like the work.

Thinking and doing: Procrastination and the life of the mind

I finally got around to reading James Surowiecki’s “What does procrastination tell us about ourselves?” (answer: maybe nothing; maybe a lot), which has been going around the Internet like herpes for a very good reason: almost all of us procrastinate, almost all of us hate ourselves for procrastinating, and almost all of us go back to procrastinating without really asking ourselves what it means to procrastinate.

According to Surowiecki, time preferences help explain procrastination. For a good introduction to the topic, see Philip Zimbardo and John Boyd’s The Time Paradox. The short, non-technical version: Some people tend to value present consumption more than future consumption, while others are the inverse. And it’s not just time preferences that change who we are; as Dan Ariely documents in Predictably Irrational, we also change our stated preferences based on whether, for example, we’re aroused. We also sometimes prefer to bind ourselves through commitments to deadlines or to external structures that will “force” us to behave a certain way. How many dissertations would be completed without the social stigma that comes from working on a project for years and failing to complete it, coupled with the threat of funding removal?

The basic issue is that we have more than one “self,” and the self closest to the specious present (which lasts about three seconds) might be the “truest.” This comes out in the form of procrastination. To quote at length from Surowiecki, who is nominally reviewing The Thief of Time: Philosophical Essays on Procrastination:

Most of the contributors to the new book agree that this peculiar irrationality stems from our relationship to time—in particular, from a tendency that economists call “hyperbolic discounting.” A two-stage experiment provides a classic illustration: In the first stage, people are offered the choice between a hundred dollars today or a hundred and ten dollars tomorrow; in the second stage, they choose between a hundred dollars a month from now or a hundred and ten dollars a month and a day from now. In substance, the two choices are identical: wait an extra day, get an extra ten bucks. Yet, in the first stage many people choose to take the smaller sum immediately, whereas in the second they prefer to wait one more day and get the extra ten bucks.

In other words, hyperbolic discounters are able to make the rational choice when they’re thinking about the future, but, as the present gets closer, short-term considerations overwhelm their long-term goals. A similar phenomenon is at work in an experiment run by a group including the economist George Loewenstein, in which people were asked to pick one movie to watch that night and one to watch at a later date. Not surprisingly, for the movie they wanted to watch immediately, people tended to pick lowbrow comedies and blockbusters, but when asked what movie they wanted to watch later they were more likely to pick serious, important films. The problem, of course, is that when the time comes to watch the serious movie, another frothy one will often seem more appealing. This is why Netflix queues are filled with movies that never get watched: our responsible selves put “Hotel Rwanda” and “The Seventh Seal” in our queue, but when the time comes we end up in front of a rerun of “The Hangover.”

The lesson of these experiments is not that people are shortsighted or shallow but that their preferences aren’t consistent over time. We want to watch the Bergman masterpiece, to give ourselves enough time to write the report properly, to set aside money for retirement. But our desires shift as the long run becomes the short run.
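The preference reversal in the hundred-dollar experiment can be sketched numerically. The standard hyperbolic discount function is V = A / (1 + kD), the perceived value of amount A received D days from now; the function and the rate k = 0.2/day are illustrative assumptions on my part, not figures from Surowiecki’s article:

```python
# Hyperbolic discounting sketch: V = A / (1 + k*D), where A is the dollar
# amount, D the delay in days, and k an (assumed) discount rate per day.
def hyperbolic_value(amount, delay_days, k=0.2):
    return amount / (1 + k * delay_days)

# Stage 1: $100 today vs. $110 tomorrow -- the immediate $100 feels bigger.
assert hyperbolic_value(100, 0) > hyperbolic_value(110, 1)   # 100.0 > ~91.7

# Stage 2: the same tradeoff pushed a month out -- now waiting the extra
# day for $110 feels bigger. That flip is the preference reversal.
assert hyperbolic_value(100, 30) < hyperbolic_value(110, 31) # ~14.3 < ~15.3
```

With exponential discounting (V = A·e^(−kD)), the ranking between the two options would never flip no matter how far out the choice is pushed, which is why hyperbolic curves are the usual model for this kind of inconsistency.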

This probably explains why you have to like the daily process of whatever you’re becoming skilled at (writing, researching, law, programming) in order to get good at it: if you have a very long term goal (“Write a great novel” or “Write an entire operating system”), you’ll probably never get there because it’s very easy to defer that until tomorrow. But if you break the task down (I’m going to write 500 words today; I’m going to work on memory management) and fundamentally like the task, you might actually do it. If your short-term desires roughly align with your long-term desires, you’re doing something right. If they don’t, and if you can’t find a way to harmonize them, you’re going to be the kind of person who looks back in 20 years and says, “Where did the time go?”

The answer is obvious: minute by minute and second by second, into activities that don’t pass what Paul Graham calls “the obituary test” in “Good and Bad Procrastination” (as with many topics others pass over, he’s already thought through this one). Are you doing something that will be mentioned in your obituary? If so, then you’re doing something right. Most of us aren’t: we’re watching TV, hanging out on Facebook, thinking that we really should clean the house, waiting for 5:00 to roll around so we can get off work, thinking we should go shopping for that essential household item. As Graham says, “The most impressive people I know are all terrible procrastinators. So could it be that procrastination isn’t always bad?” It isn’t, as long as we’re deferring something unimportant for something important, and as long as we have appropriate values for “important.”

So how do we work against bad procrastination and towards doing something useful? The question has been on my mind lately, because a friend who’s an undergrad recently wrote:

A lot of my motivation comes from a fantasy of myself-as-_____, where the role that fills the blank tends to change erratically. Past examples include: writer, poet, monk, philosopher, womanizer. How long will the physicist/professor fantasy last?

I replied:

This is true of a lot of people. One question worth asking: Do you enjoy the day-to-day activities involved with whatever the fantasy is? For me, the “myself-as-novelist” fantasy continues to be closer to fantasy than reality, although “myself-as-writer” is definitely here. But I basically like the work of being a novelist: I like writing, I like inventing stories, I like coming up with characters, plot, etc. Do I like it every single day? No. Are there some days when it’s a chore to drag myself to the keyboard? Absolutely. And I hate query letters, dealing with agents, close calls, etc. But I like most of the stuff and think that’s what you need if you’re going to sustain something over the long term. Most people who are famous or successful for something aren’t good at the something because they want to be famous or successful; they like the something, which eventually leads to fame or success or whatever.

If you essentially like the day-to-day time in the lab, in running experiments, in fixing the equipment, etc., then being a prof might be for you.

One other note: writer, poet, and philosopher all have some aspect of money involved in them. So does physicist/professor. Unless you’re Neil Strauss or Tucker Max, “womanizer” is probably a hobby more than a profession. And think of Richard Feynman as an example: he sounds like he got a lot of play, but that wasn’t his main focus; it’s just something he did on the side, so to speak. (“You mean, you just ask them?!”) The more you have some other skill (being a writer, a rock star, whatever), the easier it seems to be to get members of your preferred sex interested in you. In Assholes Finish First, Max notes that women started coming to him after his website became successful (note that I have not had the same experience writing about books and lit).

As for the physicist/prof fantasy, I have no idea how long it will last. You sound like you’re staying upwind, per Paul Graham’s essay “What You’ll Wish You’d Known,” which is important because that will let you redeploy as time goes on. To my mind, reading/writing and math are upwind of almost everything else; if you work on those two or three subjects, you’ll probably be okay.

One nice thing about grad school in physics is that you can apparently leverage that to do a lot of other things: programming; becoming a Wall Street quant; doing various kinds of business analysis; etc. It’s probably a better fantasy than monk, poet, or philosopher for that reason. The “philosopher” thing is also (relatively) easy to do on the side, and I would guess it’s probably more fun writing a philosophy blog than writing peer-reviewed philosophy papers, which sounds eminently tedious, at least to me.

Oh: and I have a pile of unposted, half-written blog posts in my Textmate project drawer:

You can see a pile of them on the left. Most will eventually get written. Some will eventually be deleted. All were started with good intentions. Some have been sitting there for a depressingly long period of time. In fact, had I not decided to write this post in a single blaze of activity, and had I not been writing about procrastination, it might have gone the way of many others: half-finished and eventually abandoned.

One reason I’ve had staying power with this blog, while so many of my friends have written a blog for a few months and then quit, is because I basically like blogging for its own sake. Blogging hasn’t brought me fame, power, money, groupies, or other markers of conventional success (so far, anyway!), and it appears unlikely to do so in the short- to medium-term (the long term is anyone’s guess). Sometimes I worry that blogging keeps me from more important work, like writing fiction, but I keep doing it because I like it and because blogging teaches me a lot about the subject I’m writing about and is an excellent forum for small ideas that might one day grow into much larger ones. This is basically the issue that “Signaling, status, blogging, academia, and ideas” discusses.

If the small projects lead to the big projects, you’re doing something right. If the small projects supplant, instead of supplementing, the big projects, you’re doing something wrong. But if you don’t like the small increments of whatever you’re working on, you’re not likely to get to the big project. You’re likely to procrastinate. You’re likely to skip from fantasy to fantasy instead of finding your place. You’re not likely to do the right kind of procrastinating. I wish I’d realized all this when I was younger. Of course, I wish I’d learned a lot of things when I was younger, but I didn’t have Surowiecki, Graham, Zimbardo, Max, and Feynman. Now I do, which enables me to say, “this blog post itself is a form of procrastination, but a productive one, and it’s therefore one I’m going to finish because I like writing it.” That sure beats improbable resolutions.

Snapshot of the new workplace and the symbolic content of Karen Owen’s “horizontal academics”

Penelope Trunk’s “Snapshot of the new workplace: Karen Owen’s PowerPoint” is one of the very few insightful posts about “An education beyond the classroom: excelling in the realm of horizontal academics.” For those of you who haven’t been caught in the media blizzard, Karen Owen turned her sex life at Duke into a PowerPoint narrative. As Trunk says, “She has bullet points, charts, and graphs. How can you not admire a woman who can graph her sex life?”

Trunk is writing about the changing workplace, but the significance of this media event goes beyond that. I mentioned the story to a literary female friend and said that agents had started calling Owen. My friend read the PowerPoint and said that she couldn’t see where agents would go with it, and I replied that it didn’t matter: Owen is a hilarious (and unusually clear) writer. It’s harder to develop voice than any other trait; if a writer has voice, structure, plotting, and the like can follow, if the writer wants them badly enough. Owen might.

Notice too Owen’s command of genre: she combines PowerPoint (typically boring), a bloggy style (think Belle de Jour: The Diary of an Unlikely Call Girl), and narrative (which most PowerPoint presentations lack) to make something that defies expectation: PowerPoint is usually stodgy and bad; blogs are nice, but Belle de Jour doesn’t use graphs (to my knowledge); and Owen’s subject (sex) is of near-universal interest, especially when it violates conventional norms, which still exist enough for Owen to capture attention.

Of course, it’s easy to argue that this affair of the moment is trivial, and in the long term it certainly is. But the incident is also emblematic of larger changes. Karen Owen’s story isn’t only interesting because she’s a good writer or because she engages the questions of genre: it’s interesting because it marks an intersection or fault point between ways of living and codes of morality. Despite the sexual revolution, parents still engage in daughter-guarding, per the 2008 Perilloux, Fleischman, and Buss journal article I’ve cited before, “The Daughter-Guarding Hypothesis: Parental Influence on, and Emotional Reactions to, Offspring’s Mating Behavior” (Evolutionary Psychology, 6, 217–233). They use strategies to restrict girls’ sexuality more than boys’, which probably contributes to the kinds of gender standards we see as adults.

Parents—who, by now, almost all came of age after the sexual revolution—still nonetheless attempt to shape the behavior of their offspring along more “traditional” lines than they might have wanted their own shaped. And that’s probably true beyond the sexual domain—consider what Paul Graham says in “Why To Not Not Start a Startup”:

… parents tend to be more conservative for their kids than they would be for themselves. This is actually a rational response to their situation. Parents end up sharing more of their kids’ ill fortune than good fortune. Most parents don’t mind this; it’s part of the job; but it does tend to make them excessively conservative. And erring on the side of conservatism is still erring. In almost everything, reward is proportionate to risk. So by protecting their kids from risk, parents are, without realizing it, also protecting them from rewards. If they saw that, they’d want you to take more risks.

“Parents tend to be more conservative for their kids” because parents will probably experience the downs more than the ups. Karen Owen presumably enjoyed her sex life (based on her description) and enjoyed writing her PowerPoint. Her parents probably derived near-zero pleasure from the former and a lot of grief from the latter, since she’s probably hiding out at home. For the rest of their lives, her parents will be hearing “Karen Owen? Name rings a bell. Was she on TV for something?” and variations on that. Unless they’re unusually snarky, they’ll probably find it difficult to deal with queries about their offspring’s supposed failings.

Parents become “excessively conservative” for their children relative to themselves, and in protecting kids from the risks of sex, they also work to protect kids from its rewards. The same is probably true of work (as Graham says) and of expression: had Owen’s parents known about their daughter’s PowerPoint, they probably would’ve discouraged her from making it. But the same creative impulse that drove Owen to write her PowerPoint might also drive her in the working world, and that’s what Trunk wants to highlight.

I don’t see any route around these fundamental preference differences between parents and children. A lot of teenagers are, from what everyone has observed in popular culture, outraged at their parents’ seemingly cruel, capricious, and arbitrary rules. But those rules often have reasons behind them, as Perilloux, Fleischman, and Buss point out in the context of sex and Graham points out in the context of career, and when one looks at the cost-benefit analyses parents make, one begins to understand why parent-child conflicts exist: the two have different risk-reward profiles.

“Parent-Offspring Conflict over Mating: The Case of Mating Age,” another article from Evolutionary Psychology, says:

Parents and offspring have asymmetrical preferences with respect to mate choice. So far, several areas of disagreement have been identified, including beauty, family background, and sexual strategies. This article proposes that mating age constitutes another area of conflict, as parents desire their children to initiate mating at a different age than the offspring desire it for themselves.

Conflicts are built into the family relationship system and are not incidental to it. This is not especially new; in his famous 1974 paper, “Parent-Offspring Conflict,” Robert Trivers discusses the problem and its implications from the perspective of biology. But realizing that this is a feature, not a bug, was new to me when I started reading more about evolutionary psychology three years ago.

One can re-read many of the various complaints about “youth these days” as ones chiefly about how preferences change as people age: younger people want fun, sex, and freedom; older people with children want their children to successfully reproduce and pass on their genes and culture, but what “successfully reproduce” means is different for younger people than older people. That conflict can sometimes be read along generational lines even when it’s more about preferences of the child versus preferences of the parent. In that light, “The New Dating Game: Back to the New Paleolithic Age” is less about what’s inherently good or bad and more about how time preferences function and how people are afraid of change, especially if they fear that change will hurt their economic or reproductive success.

Still, the social world is changing, and a concrete manifestation of abstract change can often become a major topic because it is really a symbolic repository for large-scale fears, hopes, desires, and conflict. Penelope Trunk says that the Karen Owen incident—notice the fear-mongering phrase I use because I can’t think of a better one—is about changes in the workplace and workplace power dynamics.

And this isn’t the first time female sexuality, writing disseminated online, and the workplace have come together: Heather Armstrong got fired for writing in her blog, Dooce, and the term “Dooced” now means to be fired for something one has written online. That she also sometimes wrote about religion and sex probably didn’t help, but those topics probably also widened her audience, and people like talking about religion because religious practices often function as control and regulation for sexual ones. Anna Davies addresses similar issues in “I’m done writing about my sex life: It was a great way for a young woman like me to get published. But the cost of sharing sordid tales became too high.” It got her published because people like reading about it—and it’s gotten Owen “published,” too, although perhaps not in the manner and forum she would prefer.

In his book Say Everything: How Blogging Began, What It’s Becoming, and Why It Matters, Scott Rosenberg calls the chapter on Heather Armstrong “The Perils of Keeping it Real.” Karen Owen is now being forced to navigate the same perils, and I don’t think it a coincidence that female writers face greater perils than men. Then again, Rosenberg points out that a man named Cameron Barrett might be the first person to lose his job over a blog or proto-blog post, since he “was fired […] in 1997 when colleagues found a mildly off-color piece of short fiction he’d posted to his personal website.” The issue of “mildly off-color” material arises in other circumstances, and Rosenberg cites

[…] Ellen Simonetti, a flight attendant who got sacked by Delta Airlines in 2004, apparently because she’d posted photos of herself in uniform revealing a bit too much leg (though nothing that would put a PG rating at risk). There was senatorial aide Jessica Cutler, whose salacious tales of Capitol Hill liaisons gained notoriety for her anonymous blog, Washingtonienne, but cost her her job once Wonkette named her.

Regarding the blog world, Rosenberg says that “[…] there was plainly something about blogging itself that made it hazardous to employment. Perhaps it lulled people into thinking that words in a post had a uniquely protected status and could be cordoned off from the rest of existence.” But one could remove “blogging” and put “the Internet” in place of it, or one could just acknowledge that it can be harder to maintain separate, authentic selves in a world where the reproduction of data is nearly frictionless for a large proportion of the population. The forward button can put your PowerPoint anywhere and everywhere, assuming people want to read it, and social norms haven’t caught up to that.

In Raymond Chandler’s The Big Sleep, the plot revolves around nude pictures of the sexually avaricious Sternwood daughters and whether those pictures will be revealed publicly. Today, we’re moving toward a world in which so many people have already given nude pictures to friends or lovers that real social punishment is becoming increasingly untenable. But those norms aren’t changing so fast that someone like Karen Owen can’t be caught up in the shift. Trunk says that “The rules are all different” and that “[Owen] illustrates why men are afraid of twentysomething women.” She’s right, and it’s probably unfortunate that Owen has unwittingly found herself the catalyst for those shifts. With a blogger, or a writer like Anna Davies, one knows in advance that the act of writing puts one’s self in the public. Owen didn’t realize that the act of writing and e-mailing her PowerPoint could do the same.

In a way, we’re all academics now, in that we’re all judged (and might be fired) for what we’ve written. There’s a flipside to that, however: we might find jobs because of how our writing demonstrates expertise. Karen Owen has probably made some jobs harder to acquire (it’s difficult to imagine her getting past the Google screen of your average high school principal if she wants to be a teacher), but she’s probably also opened up others: hence the calls from editors and agents if she wants to be some kind of writer. If I had a new media company of some kind, I’d be trying to find Karen Owen’s number. Sure, my last sentence sets me up for dirty jokes, but, more importantly, it shows how work and life are changing.

How to get coaching, mentoring, and attention

Introduction

Students regularly say that professors, teachers, coaches, mentors, and others don’t care about them or don’t offer real help and advice. In a recent discussion on the forum Hacker News, someone wrote, “[…] coaching/mentorship is probably found a lot more in a grad program than undergrad, where it’s pretty much nonexistent.” That commenter is somewhat right, but the deeper issue is that professors (and others with knowledge and competence) are most inclined to help people who won’t waste their time.

The challenge is to figure out who is going to waste time and who isn’t. Professors accomplish this through implicit tests. The challenge for you, the student who wants help, is to demonstrate that you’re worth the investment. I’m going to describe the incentives acting on both professors (or people with expertise) and students (or people seeking to develop expertise) and explain how to show that you’re better than the average student.

“How to get your Professors’ Attention” is biased towards universities because I’m a grad student in one and therefore more attuned to universities and the peculiar people who inhabit them. But this advice can be generalized to other situations where someone is knowledgeable and someone else is trying to seek knowledge or mentorship.

This essay is also biased toward English, which is my field. But if you’re working in computer science, for example, you’ll probably get more and better help if you walk into a professor’s office and say something like, “I’m having a problem with this program, which I suspect is related to X, but I’m not sure. I’ve tried sources Y and Z, which might be related, but I can’t figure out what’s going on. Am I missing something?” This will almost always go over better than saying, “Explain binary search trees to me” or “I don’t get this class,” which will probably yield a pointer to the relevant section of the book, with the instruction that you come back once you’ve read it and explain more explicitly where you’ve gotten lost.

Background

I majored in English and went to Clark University, where I think I got a lot of mentorship and connected with my professors. That might be because I took a lot of time to seek them out or because Clark is a small liberal arts school where professors are expected to interact with students. Even there, however, most, though not all, professors offered real mentorship and guidance only to the extent that students sought it out. When I was an undergrad, I was doing many of the things described in this essay, albeit unconsciously.

What do you care about?

The idea that professors don’t care about their students is a pernicious half-truth. Most professors do care about their students (otherwise they wouldn’t be professing), but professors know that many students don’t care about the subject or about learning—they care about grades. Professors don’t care about grades, and they often care about their students to the extent that their students care about learning.

If a student really wants to learn, the professor will usually help, but most students don’t—so the professor builds a wall between herself and her students to make sure that the only students who breach the wall are the ones who do care about learning. Professors do this through the tests described in the next section. Students often perceive this wall as indifference or callousness, when it’s really just a practical means of separating out the students whose primary goal is to get an A from the students whose primary goal is to understand why Ulysses was a major break from the tradition of the novel and why it became an emblematic text of modernism…

And so on. Life is complex and simple questions often have complex answers. Those complex answers are often found in the form of text, since good writing is far more idea-dense than speech can hope to be, which leads to my next point.

Books

Now I’m a grad student at the University of Arizona and tell my students the same thing: if they want to go beyond whatever is required in class, they should start by showing up in their professors’ office hours, ideally with somewhat smart or at least well-considered questions or comments. Most professors respond well to this and will often give recommendations on books to read and/or projects to work on. A few days ago I taught Paul Graham’s essay “What You’ll Wish You’d Known,” and students glommed onto this paragraph:

A key ingredient in many projects, almost a project on its own, is to find good books. Most books are bad. Nearly all textbooks are bad. So don’t assume a subject is to be learned from whatever book on it happens to be closest. You have to search actively for the tiny number of good books.

Professors are a good place to find good books because they’ve read so many. If you follow their recommendations and talk to them afterwards, coaching and mentorship relationships will be much more likely to form. Demonstrate interest in their subject if you want their attention.

Obviously, there are exceptions, but this principle usually works reasonably well. If you show up in office hours and say “mentor me!” you’re probably not going to get much. But if you show up and ask questions x, y, and z, then read whatever the prof recommends, then come back, you’ll probably have a much better shot at their attention.

Another person on the Hacker News discussion said, “I get the impression that some undergraduates at some colleges do get good coaching and mentorship, and I would like to hear from other HN participants if they know of examples of that.” They’re right: some undergraduates do get good coaching and mentorship, but I suspect that depends less on the college or university and more on the undergraduate—and the undergraduate realizing how things work from her professors’ perspectives.

Reading

Professors tell you to read more or read particular books / essays for two reasons. The primary one is that reading is simply more information dense than talking, as mentioned earlier. Try this sometime: copy a half hour of TV news verbatim. You’ll find that it comes to maybe a page of text. To have a reasonable conversation, it often makes sense to read something related to the topic first, then talk about where to go from there. To learn more, read more. To learn faster, read more.

Secondarily, your professor will often recommend reading to test your seriousness. If she says, “Go read X and Y,” and you do, you’ve demonstrated that you’re not wasting the professor’s time and are genuinely interested in the topic. If you go away and don’t come back, you’ve demonstrated that you would’ve wasted her time had she spent an extra hour talking to you outside of class and office hours.

In English and related fields, a deep interest in reading is a pre-condition to doing other interesting things, like knowing about the world. It’s necessary but not sufficient. You don’t need to have read obsessively since you were 12 to catch my attention—but it does help if you say something like, “Oh, yeah, I read Heart of Darkness last summer and noticed the narrative structure, with Marlow telling the story to a random guy on the deck of the boat…” If you tell your computer science professors, “I’m working on a system to save and organize the comments I leave on blogs and read about this association algorithm…” they’re probably going to be more impressed than if you say that you’re ranked on the StarCraft II Battle.net ladder.

There are a handful of people who for whatever reason can’t get around to reading. But all of us make time for what’s important to us. If you can’t make time to read whatever your professor suggests, that indicates the topic isn’t of great importance to you—and therefore your professor shouldn’t waste time on something that isn’t important to you.

Once I had a student who said in class that he didn’t like to read fiction. Fair enough; not everyone does and it doesn’t offend me when others don’t share my vices. A week or two later, however, he wanted me to edit his 43 pages of Starcraft fan fiction; when I said that it isn’t possible to be a good writer without being a good reader, he didn’t believe me. Nonetheless, I told him that if he read How Fiction Works and discussed it with me, I would read his Starcraft fan fiction. And I would have. He didn’t, of course, and acted like I had kicked his puppy when I suggested that he prove himself.

To summarize: reading teaches you faster than talking can, and it efficiently sorts people who are willing to put in some time investment from those who aren’t. It’s necessary if you’re going to do interesting work.

Doing

People know I’m a wannabe “novelist” (as Curtis Sittenfeld said of her success with Prep in “The Perils of Literary Success,” “I was excited by the thought of no longer having to use air quotes when referring to myself as a ‘writer’ working on a ‘novel’ ”) with many rejection letters and near acceptances to prove how much of a wannabe I am. Sometimes friends and others say things like, “I want to be a novelist,” or “I want to write a novel.” I usually say, “Okay: start today.” Then I tell them: write Chapter One by date X (usually two or three days out) and send it to me.

I’ve probably made this offer to between one and two dozen people over the last couple years. One person has taken me up; she sent me Chapter One, I sent her some comments, and I didn’t hear back (we’re still friends; she says she’s writing other things). When people say they want to be better writers, I tell them what I told my Starcraft fan fiction writer: read James Wood’s How Fiction Works and Francine Prose’s Reading Like a Writer. The rare ones who read show me they’re serious.

By now, I’ve been trained to assume that most people who say things like “I want to write a novel” a) have no idea how hard it is to write a novel, b) have even less idea how much harder it is to write a novel someone else might actually want to read, and c) are, in my experience, usually full of shit.

Almost everyone in the United States who wants a computer has one. If you have a working computer and two or three hours a day, you can write a novel. Nothing is stopping you: you don’t need a $10,000 piano. You don’t need a mass spectrometer.[1] You don’t need permission. You don’t need to pass a test. You don’t need to be told you’re special.

All you need to do is sit down and write every day for a couple of hours. Eventually, you’ll have a novel, or at least a very large pile of words. Few people really want to.[2]

Most people who say they do, don’t, just like most people who say they want to lose weight don’t read Michael Pollan’s In Defense of Food and then stop eating simple carbohydrates and highly processed meats. They say they want to lose weight and keep buying Coke. Comparing statements to actions reduces to, “I want to write a novel / lose weight, but not as much as I want to watch TV / drink soda.”

The funny thing is that both novel writing and losing weight are actually fields where relatively minor changes, accumulated over time, can lead to relatively large changes: try writing for one hour a day. Then two. Then three (maybe only on the weekends). Try to drink nothing but water (most drinks are just easily removed empty calories). Take most forms of bread out of your diet; eat fruit instead of candy. Go for a walk at the end of the day. You’ll eventually have a largish pile of words or drop some pounds. A large enough number of people do both to prove they’re possible—if you want them.

Your professors are asking themselves: “Does this student want it? Really want it?” The value of “it” varies by discipline, but the idea remains the same.

A lot of students say or imply that they’re not ready for a real project, or not capable of one, or that they don’t have the time. The excuses about readiness might be true, but students should still start doing something. I wasn’t capable of writing a novel anyone wanted to read when I was 19—or even of finishing one. It took me three tries to get a coherent, complete narrative together, and it was still unpublishable. But I wouldn’t have the skills I have now if I hadn’t started trying then. Here’s Curtis Sittenfeld again, this time in an interview with The Atlantic: “I don’t think that you can learn to write a book except by writing a book.”

This isn’t just true of writing books. I didn’t start or stop my work based on what classes I was in or whether I was somehow authorized or trained to do what I was doing. In effect, I mostly trained myself, which I wouldn’t have done without all those early hours writing unpublishable crap. Most novelists tell the same story: lots of early crap and rejection that they ultimately overcome.

If you have a choice between building or making something and not building or making something, always choose “building or making something,” which will be more impressive than not trying even if you fail. Plus, if you look for it, you’ll see people in almost every field saying the same thing: the only way to learn is via the work itself. Here’s Patrick Allitt in I’m the Teacher, You’re the Student:

[. . .] but the way to improve as a teacher is by actually teaching; hypothetical situations or abstract discussions are too different from the real thing. The best you can hope for, short of actually getting down to the job, is to learn a handful of principles, on the one hand, and a handful of useful techniques, on the other.

You can learn those principles and techniques, but you still have to—above all—do. And your professors, like coaches and mentors, are looking for the people who will do whatever it takes. A lot of students say, “I’m just a student, and the president of club X, and I have homework to do, and I want to have sex with my boyfriend / girlfriend / neighbor / person-from-the-party-whose-name-I-forget, and my parents are breathing down my neck…” That might all be true, and all of those are fine things to do or worry about. I have to worry about many of them myself.

But you’ll only have more work over time, and the work done in college is nothing compared to the real work people do to support themselves. From what friends have told me, college schoolwork and life are nothing like the work of having a baby and being responsible for feeding and keeping alive a small, helpless, somewhat boring human. So in your professors’ minds, saying that you have so many responsibilities often reduces to an excuse not to start now. A base excuse. The best time to start anything is now. Today.

People who really want to do something… do it. Or they make changes so they can; you might notice that most people are not too busy to find time to date and/or have sex with the person of their dreams. But most people say they want to do something and then they don’t (I’ve repeated this a couple of times in the hopes that it sticks). Over time, others notice this (like me), and they start to assume that most people who say they want to do or know something are full of shit, in part because experts can’t distinguish at first glance who’s full of shit and who is genuine and thus worth investing in.

So experts assume that someone is full of shit until they prove otherwise. In the case of someone who wants to write a novel, I assume they’re no longer full of shit if they’ve written a complete first novel and started on a second one (the first one is almost certainly no good, although there might be useful lessons to draw from it. That was true for me). In the case of someone who wants to lose weight, I assume they’re full of shit until they start carrying around a Nalgene bottle and a bag of peanuts instead of a Coke and a Snickers. Your professors will start to think you’re not full of shit when you read the books they recommend, ask for more recommendations, read those, and come back for more.

In addition, if you do enough stuff, you’ll have something to bring to the table. A random person with no skills is less appealing than a random person who can say, “I’ll get your blog up and running” or “I’ll write the first draft of the boring NIH proposal for you” or even “I’m obsessed with coffee and will make you a single-origin brew in a Chemex.” People who develop skills tend to develop the meta-skill of developing skills, and they’re more appealing because of the skills they already have.

Caveats

This basic advice won’t always work: some professors won’t pay any attention to you no matter what you do. They might be more interested in their own research than teaching, or they might be having personal problems, or they might be off in their own world, or they might be burned out. Some professors will go out of their way to try to inflict mentoring on students who don’t particularly want it, although I don’t think there are very many of these professors, especially in big public schools; most who try will probably encounter enough apathy to scale the approach back after being rebuffed enough times.

There are probably also variations by field: enough people have reported that professors in technical fields are less inclined to work with undergrads to make me wonder if there is some truth to this stereotype. I suspect that science professors just have a different mode of mentoring, which goes something like: “Come to the lab, we’ll see if you can do anything there.”[3] Most professors, however, will fall somewhere in the middle of this spectrum, and those are the professors who can most be reached via this guide. It would be very unusual to find a school where following the basic outline presented here will result in nothing.

A story…

I had a student who I’ll call “Joe.” He habitually wanted to hang out and chat after class. This is good: at first I interpreted it as meaning that he was intellectually curious and driven.

But as the semester went on, I got progressively more annoyed because he’d ask questions that couldn’t be reduced to sound bites. I kept telling him to drop by office hours if he wanted to really talk, but he never showed up. I’d suggest he read X, and when I asked him about it a week later, he’d say he’d been busy, but he was never too busy to waste ten or fifteen minutes of my time in class. We were reading Jane Austen’s Pride and Prejudice, and he said something about her place in literary history that was… unlikely, let us say, so I told him to read a few of the essays in the back of the Norton critical edition. I don’t think he did.

Before their first papers are due, I usually meet with my freshmen individually to go over their work. I close read, edit, talk to them about ideas, catch disastrously bad papers so they can be rewritten, and so on. Joe didn’t show up to his conference; he didn’t come to my office hours; and when I finally did read his paper, it had incredible howlers in terms of both fact and interpretation, my favorite being his assertion that the Toyota Prius is in some way like a perpetual motion machine, which demonstrated that he didn’t know anything about physics or perpetual motion machines or even general knowledge.

Joe got back a paper that was charitably graded, given its quality, and he dropped the class. Joe is an extreme example of a time waster: I think he would’ve been more than happy to chat for an hour after class each day, shooting the breeze while I had pressing concerns. I get at least one Joe every year. I separate Joe from students who want to learn by a) telling them to read something and b) seeing if they do it. The ones who do, I spend as much time talking to outside of class as they want—because I know they’re not wasting time. I love chatting with students who are engaged by the material and by life, and I’ll spend a lot of time with them, as long as they’re not bogus.

Criticism

Most of us don’t like being criticized: we’d prefer to imagine that we’re good at everything, that we don’t need the help of others, and that whatever we’re working on is perfect. Don’t change a thing! We get prickly when people try to help us and often denigrate the person giving us advice, assuming that person doesn’t understand our genius or is too hard a grader or has malice in their heart.

Grades are a form of criticism and a form of ranking you against other people: they’re a statement from your professor to you about how well the professor thinks you’ve mastered the material. Even in an era of rampant grade inflation, grades can still sting, and few students achieve a 4.0. A small but noisy minority of students will come back after every semester to fight about their grades, which is one of the least pleasant aspects of teaching.

Most people who are nominally looking for help in truth want to have their current ideas or beliefs gratified and validated. If professors offer real, constructive criticism, it’s often viewed as a personal attack. The student on the receiving end is then hostile to the critic; that hostility turns into negative responses on the end-of-semester evaluations, awkward moments when the professor and student run into each other on campus or at a bar, and so on.

Still, some fields are culturally disposed towards rapid, yes/no assessment. One friend who read this essay mentioned that his vector calculus professor often says things like, “No, you’re doing it wrong—here’s how it should be done.” My friend said it took him aback at first, until he realized that the professor’s honesty could be mistaken for cruelty and indifference. But the professor’s demeanor is actually about efficiency: he wants his students to get the right answer as fast as possible. Most of us aren’t used to being told we’re wrong on a regular basis, so we interpret this as hostility when it’s not.

“Don’t shoot the messenger” is a cliché because few people are capable of listening dispassionately to criticism, evaluating it, and ignoring it if they think it’s invalid and accepting it if they think it’s valid. Most of us suffer from some level of confirmation bias, which is a term psychologists use to describe what Wikipedia calls “a tendency for people to favor information that confirms their preconceptions or hypotheses regardless of whether the information is true.”[4] We all want to believe we are smart and capable. But we often aren’t, and we don’t like to accept it when people tell us this or imply it. When students do attempt something, fail, and accept criticism, it’s almost as impressive as if they had gotten it right on the first try.

From the professor’s perspective, it’s easier to avoid giving the real criticism necessary for improvement. If you’re a student who wants to learn, you’ll need to demonstrate that you’re capable of taking criticism, that your ego is not overly inflated, and that you’re willing to accept that you don’t know everything and that you could be wrong. Some people never learn how to do this. Others do only after a great struggle. Professors will assume that you can’t take criticism until you show you can. This problem inhibits your professors from forming real bonds and sharing real knowledge with you, especially if that knowledge contradicts what you already believe to be true. If a professor gives you real commentary, use it to improve.

That doesn’t mean you have to believe your professor or take all the advice anyone gives you, but you should at least not be hostile to it. If the professor is right, modify your behavior; if the professor is wrong, pity them for their ignorance or incorrect interpretation. But don’t get angry because someone is trying to help you, however imperfectly.

Professors, and most people who do good or interesting work, need to have a peculiar temperament: they need an open mind (Paul Graham in “What You Can’t Say”: “To do good work you need a brain that can go anywhere”) but also the rigor not to become too infatuated with or attached to particular ideas. Few people achieve this balance, and very few people have the kind of openness that I associate with great intelligence, which manifests itself in a willingness to take in new ideas and be wrong when necessary. When I see these kinds of traits in anyone, they arrest my attention. This is doubly true for students, because so few students have or manifest them.

Real education

In “Who Are You and What Are You Doing Here?”, Mark Edmundson writes:

If you want to get a real education in America you’re going to have to fight—and I don’t mean just fight against the drugs and the violence and against the slime-based culture that is still going to surround you. I mean something a little more disturbing. To get an education, you’re probably going to have to fight against the institution that you find yourself in—no matter how prestigious it may be. (In fact, the more prestigious the school, the more you’ll probably have to push.) You can get a terrific education in America now—there are astonishing opportunities at almost every college—but the education will not be presented to you wrapped and bowed. To get it, you’ll need to struggle and strive, to be strong, and occasionally even to piss off some admirable people.

This guide is basically teaching you how “to fight,” because the regular education that you get solely from sitting in classes won’t be real impressive. You won’t learn as much from formal, explicit education as you will from informal, tacit education. Both have their place, but you have to go beyond the given to get the tacit education. That’s where the “struggle and strive” come from. If you’re perceptive and attending a big American school, you’ve probably noticed that you’re not getting much out of a 500- or 1,000-person lecture class.

Of course you aren’t—those classes are designed to balance the university’s budget, since they cost only marginally more to run than ten-person seminars, yet the university charges you, the student, the same amount per credit hour as it does to the ten seminarians. If you’re not perceptive or you just want to party and get laid, it probably doesn’t matter. But if you are that student who really wants to get something more than a particular kind of fun from the college experience, you need to know how to “get a terrific education,” which “will not be presented to you wrapped and bowed.” You have to take it for yourself—you have to prove yourself. In movies about sports, you may notice that the team or individual doesn’t get to the championship match or fight the first time it hits the field or enters the ring.

You won’t either. You have to prove to your professors and to others that you have what it takes. That you have tenacity, grit, strength. That you want the education, not merely the piece of paper at the end that says you’ve sat through four years of stultifying classes and managed not to fail out. Depending on your major, it’s shockingly hard to fail, as Richard Arum and Josipa Roksa show in Academically Adrift: Limited Learning on College Campuses.

It’s important to learn how to cultivate teachers. In A Jane Austen Education: How Six Novels Taught Me About Love, Friendship, and the Things That Really Matter, William Deresiewicz writes:

The need for teachers: there is something in the modern spirit that bridles at the notion. It seems inegalitarian, undemocratic. It injures our self-esteem, the idea of having to confess our incompleteness and submerge our ego beneath another person. It outrages our Romantic temper, which feels that the self is autonomous and the self is supreme. [ . . .] But Austen accepted it, even celebrated it. Nearly all of her heroines have teachers of one kind or another, and in her own life, we know, her mentors were many and crucial.

Most teachers are not very good, despite our need for them. But we need to learn how much we need them, if we’re really going to do the things we want to do in our lives. We might be “autonomous,” but we also need to have someone else’s perspective and experience.

Conclusion

Many professors will help you, but you need to know how to make them want to help you. You need to learn how to signal a willingness to learn, which you can do mostly by formulating good questions and doing the reading or projects your professor suggests. As stated earlier, some professors won’t help you no matter what. They’re not very common: if professors didn’t have a strong desire to teach, they’d have gone into a more lucrative field, and there are few fields less lucrative than teaching at the university level (adjusted for education and opportunity costs). Many, however, will have been burned by students who are dilettantes and time wasters. You need to prove you’re not one of them and learn how to breach their defenses. This is a guide to doing so, but reading the guide is the easy part. The hard part is doing the reading and finishing the projects. That is up to you.

Thanks to Bess Stillman, Derek Huang, and Andrew Melton for reading this essay. For further reading, consider William Deresiewicz’s Excellent Sheep: The Miseducation of the American Elite and the Way to a Meaningful Life. Leading a meaningful life is not easily accomplished, and for evidence of that assertion I’d submit the tragically small number of people who seem to do so.


[1] But really, who doesn’t want one?

[2] Maybe they are afraid of ending up with that very large pile of words.

[3] They want to know: Are you competent? Can you do math? Will you break the $10,000 PCR machine? Okay, go play with chemicals, read this paper, get back to me in a week.

[4] Learning about confirmation bias is one of the first steps toward combating it, which Steve Joordens discusses in his lecture “You Can Lead Students to Knowledge, But How Do You Make Them Think?” The lecture is about critical thinking, but it’s really about how to think and why.

How Universities Work, or: What I Wish I’d Known Freshman Year: A Guide to American University Life for the Uninitiated

Note that you can also read this essay as a .pdf.

Introduction

Fellow graduate students sometimes express shock at how little many undergraduates know about the structure and purpose of universities. It’s not astonishing to me: I didn’t understand the basic facts of academic life or the hierarchies and incentives universities present to faculty and students when I walked into Clark University at age 18. I learned most of what’s expressed here through osmosis, implication, inference, discussion with professors, and random reading over seven years.

Although most of it seems obvious now, as a freshman I was like a medieval peasant who conceived of the earth as the center of the universe; Copernicus’ heliocentric[1] revolution hadn’t reached me, and the much more accurate view of the universe discovered by later thinkers wasn’t even a glimmer to me. Consequently, I’m writing this document to explain, as clearly and concisely as I can, how universities work and how you, a freshman or sophomore, can thrive in them.

The biggest difference between a university and a high school is that universities are designed to create new knowledge, while high schools are designed to disseminate existing knowledge. That means universities give you far greater autonomy and in turn expect far more from you in terms of intellectual curiosity, personal interest, and maturity.

Universities are also supposed to help students help themselves. That is, you, the student, are or should be most responsible for your own learning.

Degrees

This section might make your eyes glaze over, but it’s important for understanding how universities work. If you’re a freshman in college, you’ve probably just received your high school diploma. Congratulations: you’re now probably working toward your B.A. (bachelor of arts) or B.S. (bachelor of science), which will probably take four years. If you earn that, you’ll have received your undergraduate degree.

From your B.A./B.S., if you wish to, you’ll be able to go on to professional degrees like law (J.D.), medicine (M.D.), or business (M.B.A.), or to further academic degrees, which usually come in the form of an M.A., or Master’s Degree. An M.A. usually takes one to two years after a B.A. After or concurrently with an M.A., one can pursue a Ph.D., or Doctor of Philosophy degree, which usually takes four to ten years after a B.A.

The M.A. and Ph.D. are known as research degrees, meaning that they are conferred for performing original research on a specific topic (remember: universities exist to create new knowledge). Professional degrees are designed to give their holder the knowledge necessary to be a professional: a lawyer, a doctor, or a business administrator.

Many if not most people who earn Ph.D.s ultimately hope to become professors, as described in the next section. The goal of someone earning a Ph.D. is essentially to become the foremost expert in a particular, narrow subject.

Professors, Adjuncts, and Graduate Students

There are three main groups—one could even call them species—you’ll interact with in a university: professors, adjunct professors, and graduate students.

Professors almost always have a Ph.D. Many will have written important books and articles in their field of expertise. They can be divided into two important classes: those with tenure—a word you’ll increasingly hear as you move through the university system—and those without. “Tenure,” as defined by the New Oxford American Dictionary that comes with Mac OS X 10.6, is “guaranteed permanent employment, esp. as a teacher or professor, after a probationary period.” It means that the university can’t fire the professor, who in turn has proven himself or herself through the publication of those aforementioned books and papers, along with a commitment to teaching. Such a professor will probably spend the rest of her career at her current university.

Those without tenure but hoping to achieve it are on the “tenure track,” which means that, sometime between three and six years after they’re hired, a committee composed of their peers in the department will, along with university administrators and others, decide whether to offer tenure. Many professors on the tenure track are working feverishly on books and articles meant for publication. Without those publications, they will be denied tenure and fired from their position.

Adjuncts, sometimes called adjunct professors, usually have at least an M.A. and often have a Ph.D. They do not have tenure and are not on the “tenure track” that could lead to tenure. They usually teach more classes than tenured or tenure-track professors, and they also have less job security. Usually, but not always, adjuncts teach lower-level classes. They are not expected to do research as a condition of staying at the university.

Graduate Students (like me, as of this writing) have earned a B.A. or equivalent and are working towards either an M.A. or a Ph.D. From the time they begin, most graduate students will spend another two to eight years in school. They take a set number of small, advanced classes followed by tests and/or the writing of a dissertation, which is an article- or book-length project designed to show mastery in their field.

Many—also like me—teach or help teach classes as part of their contract with the university. In my case, I teach two classes most semesters, usually consisting of English 101, 102, or 109 for the University of Arizona. As such, I take and teach classes. In return, the university doesn’t charge me tuition and pays me a small stipend. Most graduate students who teach you ultimately want to become professors. To get a job as a professor, they need to show excellence in research—usually by writing articles and/or books—as well as in teaching.

For all three groups, much of professional life revolves around tenure, which brings additional job security, income, and prestige.

Two Masters

Most graduate students and non-tenured professors serve two masters: teaching and research. As an undergraduate, you primarily see their teaching side, and your instructors might seem like another version of high school teachers. For some if not most instructors, however, teaching is not their primary duty and interest; rather, they primarily want to conduct original research, which usually takes the form of writing articles (also sometimes called “papers”) and books. The papers you are assigned for many classes are supposed to help you prepare for more advanced writing and research.

Graduate students and professors feel constant tension between their teaching and their research / writing responsibilities. Good ones try to balance the two. For most graduate students and professors, however, published research leads to career advancement, better jobs, and, ultimately, tenure.

Many of your instructors will have stronger incentives to work on research than teaching. This doesn’t mean they will shirk teaching, but many do. Some teach creatively and diligently, as they should. But it’s nonetheless wise to understand the two masters most of your instructors face; they are usually rewarded much more for research than teaching.

In graduate school multiple professors told me to minimize my time spent teaching and maximize my time spent researching. This isn’t unusual advice. Grad students and non-tenured professors are often explicitly told not to waste time on teaching, since that doesn’t lead to advancement, and often imbibe a cultural atmosphere that denigrates teaching. This is important if you’re wondering why your professors seem distracted or uninterested in the classroom. Professors are often incentivized not to focus on teaching. Professional academics understand these facts well, but they’re surprisingly poorly understood by everyone else:

There is only one problem with telling students to seek out good teaching in college. They’re going to have some trouble finding it, because academic institutions usually don’t care about it. Oh, they’ll tell you otherwise, in their promotional material. But I advise you to be skeptical. The profession’s whole incentive structure is biased against teaching, and the more prestigious the school, the stronger the bias is likely to be. (Deresiewicz 180-1)

I personally think teaching is of great importance and that schools ought to reward teaching, but “what I personally think” and “what is true” are different in this situation.

Interacting with Professors, Adjuncts, and Graduate Students

To earn tenure (or a Ph.D. on the way to it), many professors and grad students spend long periods intensely studying a subject, most often but not exclusively through reading. They expect you to read the assigned material and to have some background in reading more generally; if you don’t, expect a difficult time in universities.

Professors and other instructors have devoted or are devoting much of their lives to their subjects. As you might imagine, having someone say that they find a subject boring, worthless, or irrelevant often irritates professors, since if professors found their subject boring, worthless, or irrelevant, they wouldn’t have spent or be planning to spend their lives studying it.

Most make their subject their lives and vice versa. They could in theory earn more money in other professions but choose not to, often because they are excited by knowledge itself and want to find others who share that excitement. If you say or imply that their classes are worthless, you’ve said or implied that their entire lives are worthless. Most people do not like to think that their lives are worthless.

Professors can sometimes seem aloof or demanding. This is partially due to the demands placed on them (see “Two Masters,” above). Being aloof or demanding doesn’t mean a professor doesn’t like you. Most professors are interested in their students to the extent that students are interested in the subject being taught. Engaged professors often try to stir students’ interest in a subject, but actively hostile or uninterested students will often find their instructors uninterested in them. Motivated and interested students often inspire the same in their professors.[2] It’s a virtuous cycle.

To be sure, there are exceptions: some professors will be hostile or uninterested regardless of how much effort a student shows, and some will be martyrs who try to reach even the most distant, disgruntled student. But most professors are in the middle, looking for students who are engaged and focusing on those students.

Nearly all your instructors have passed through the trials and tests they’re giving you: if they hadn’t done so, and excelled, they wouldn’t be teaching you. Thus, few are impressed when you allocate time poorly, try to cram before tests, appear hungover in class, and show up late to or miss class repeatedly. On the other hand, many will cut slack for diligent students who show promise.

One reason professors don’t think much of student excuses is because many students have different priorities than professors. As undergraduates, most professors were part of the “academic culture” on campus, to use Murray Sperber’s term (5); in contrast, many undergraduates are part of the collegiate (interested in the Greek system, parties, and football games) or vocational (interested in job training) cultures. The academic culture, according to Sperber, “[has a] minimal understanding of, and sympathy for, the majority of their undergraduate students” (7) at big public schools.

I think Sperber is too harsh, but the principle is accurate: if you aren’t in school to learn and develop your intellect—and most students in most schools aren’t, as Sperber shows—you probably won’t understand your professors and their motivations. But they will understand yours. Academics are a disproportionately small percentage of the student population at most schools but an extraordinarily large proportion of grad students and professors.

Another book, Paying for the Party: How College Maintains Inequality, describes how many universities have evolved two or more tracks, but those tracks are mostly concealed from the students. One track is primarily academic, with hard, usually technical, majors that are highly demanding and that usually lead to developing important skills. The other track is primarily social and leaves students with fewer skills but lots of time to party. The latter track works reasonably well, or is at least not catastrophic, for students from wealthy and/or well-connected families that can get intellectually weak, low-skill students jobs upon graduation—even graduation with a dubious degree and four years of intense partying. The party/social track doesn’t work well for students with poorer or disconnected families. The more time I spend in the system the more apparent the two tracks become—and the more I wish students were explicitly told about them.

Requirements for Undergraduates

You can only graduate from a university if you pick a major and fulfill its requirements. Clark called its undergraduate requirements “Perspectives,” while the University of Arizona calls them “Gen Eds” or “General Education Requirements.” There is no way to avoid fulfilling requirements, and most requirements demand that you spend a certain amount of time with your rear end in a seat in a certain number of classes. Fulfill as many requirements as possible as soon as you realize those requirements exist, assuming you want to graduate on time.

You’ll often be assigned an “academic advisor,” whose job it is to help keep you on track to graduate and to help you pick courses. Don’t be afraid of this person: he or she will often help you or point you to people who can help you. At bigger schools, your advisor will often seem harried or uninterested, but even so, remember that he or she is still a valuable resource. And if you can’t get help from your advisor, find the requirements of potential majors or all majors and work toward checking them off, because you won’t be able to get out of them.

As an undergrad, I tried and found that there is virtually no negotiating with requirements, even if some are or seem silly. For example, Clark required that students take a “science perspective” course. In studying my schedule and options, I figured that astronomy was the easiest way out. Considering how useless astronomy looked, I decided to petition the Dean of Students to be excused from it so I could take better classes, arguing that I’d taken real science classes in high school and that I could be more productively engaged elsewhere. The answer came quickly: “no.”

Astronomy, as it was taught to me, consisted of tasks like memorizing the distances of the planets from the sun, what the Kuiper Belt is[3], and the like. Tests asked for things like the size of each planet—in other words, they demanded the regurgitation of facts that one can find in two seconds on Google, which is how I found out what the Kuiper Belt is again. The professor teaching it no longer appeared to have a firm grasp of his mental faculties; I think he was in his 80s. At least it was relatively easy: the only thing worse would’ve been having to take, say, chemistry, or another real science class.

That astronomy class was probably the most useless I took, and Clark’s tuition at that time was something like $22,000. I received a scholarship toward tuition, room, and board, so my tuition was probably closer to $16,000, or $8,000 per semester. Undergrads took four classes, so the useless astronomy class cost around $2,000. Would I have rather taken another English class, or computer science class, or a myriad of other subjects? You bet. But I couldn’t, and if I didn’t take some kind of science class, I wouldn’t have been able to graduate, no matter the uselessness of the class.
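The per-class arithmetic above can be sketched as a quick back-of-the-envelope calculation; the dollar figures are the rough estimates from the text, not exact tuition data:

```python
# Back-of-the-envelope cost per class, using the essay's rough figures.
annual_tuition = 16_000        # ~$22,000 sticker price minus scholarship
semesters_per_year = 2
classes_per_semester = 4

cost_per_semester = annual_tuition / semesters_per_year    # $8,000
cost_per_class = cost_per_semester / classes_per_semester  # $2,000

print(f"Each class costs roughly ${cost_per_class:,.0f}")
```

The exact numbers matter less than the habit: divide what you pay by the classes you take, and a useless requirement acquires a price tag.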

What should I major in?

I have a theory that virtually everything you learn in universities (and maybe life) is the substance or application of two (or three, depending on how you wish to count) abilities: math and reading/writing. Regardless of what you major in, work on building those two skills.

In the liberal arts, that most often means philosophy, English, and history; other majors vary by university, but those requiring a lot of reading and writing are almost always better than those that don’t. In the hard sciences and economics you’ll be left to develop your reading and writing skills on your own. And this does apply to you, whether you realize it or not. As software company founder and rich guy Joel Spolsky wrote:

Even on the small scale, when you look at any programming organization, the programmers with the most power and influence are the ones who can write and speak in English clearly, convincingly, and comfortably. Also it helps to be tall, but you can’t do anything about that.

The difference between a tolerable programmer and a great programmer is not how many programming languages they know, and it’s not whether they prefer Python or Java. It’s whether they can communicate their ideas. By persuading other people, they get leverage.

So if you want leverage, learn how to write. And if liberal arts majors don’t want to be bamboozled by statistics, they better learn some math.

In short, I have no idea what you should major in. But you probably shouldn’t major in business, communication, sociology, or criminal justice, all of which are worthy subjects that, for most undergraduates, are sufficiently watered down that you’re unlikely to challenge yourself much. Odds are that you’ll even make more money as a philosophy major than a business management major (“Salary Increase by Major”).

Paul Graham wrote:

Thomas Huxley said “Try to learn something about everything and everything about something.” Most universities aim at this ideal.

But what’s everything? To me it means, all that people learn in the course of working honestly on hard problems. All such work tends to be related, in that ideas and techniques from one field can often be transplanted successfully to others. Even others that seem quite distant. For example, I write essays the same way I write software: I sit down and blow out a lame version 1 as fast as I can type, then spend several weeks rewriting it.

The reality is that your specific major probably doesn’t matter nearly as much as your tenacity, ability to learn, and the consistent application of that ability to learn to specific problems. One way people—friends, employers, graduate schools, colleagues, etc.—measure this is by measuring the way you speak and write, which together are a proxy for how much and how deeply you’ve read.

A great deal of college is about teaching you how to learn, and reading is probably the fastest way to learn. Once you’ve mastered the art of reading, you’ll be set for life, provided you keep exercising the skills you develop at a university. Keep that in mind as you search for majors: those that assign more reading, more writing, and more math are probably more worthwhile than those that don’t.

Many people have many opinions about what you should major in, and most of them are probably wrong. This one included. As I said previously, it probably doesn’t matter in the long run, so don’t worry much about what to major in—worry about finding something you’re passionate about and something you love. In Prelude to Mathematics, W.W. Sawyer wrote: “An activity engaged in purely for its consequences, without any pleasure for the activity itself, is likely to be poorly executed” (16 – 17). If possible, find something to major in which you enjoy for itself, or which you can learn to enjoy for itself.

Regardless of what you major in, let me reiterate something I wrote in the introduction: you are or should be most responsible for your own learning. This is true not only in school but in your entire life. You will get some bad teachers, some bad bosses, some bad clients, and some bad situations in your life. Nonetheless it is your responsibility to keep learning, to overcome obstacles, and to help yourself.

Students often want to be spoon-fed everything, but that’s not how the world works. People generally pay other people to solve their problems. Your goal is to develop the skills it takes to solve the problems other people have, so that they pay you. Let’s look at some professions and how, in an ideal world, each profession solves a problem:

  • Cop: Solves the need for public safety.
  • Scientist: Solves the need for learning how things actually work, and, tangentially to that, how to turn ideas and facts into products.
  • Petroleum Engineer: Solves the need for energy, which people require to get from point A to B via car, plane, or train, and for electricity.
  • Teacher: Solves the need for education, and helps turn economically useless children into productive adults (Senior).
  • Social Media Analyst: Solves the need to advertise through numerous electronic platforms.

You can occasionally find situations in which it’s possible to get paid without solving someone’s problem, but they’re rare. There are also important jobs that are nonetheless illegal but can be analyzed through the same method as the bullets above (for example, prostitutes solve the need for sex, and drug dealers solve the need for different experiences). People on the cutting edge of technology and social change often solve needs for themselves—Mark Zuckerberg needed a way to communicate with others online before most people really noticed that need.

Your teachers and professors, including me, are often not that good at identifying such needs.

Finally, note that you often can’t predict what will be useful and what won’t be. It’s also possible that the people designing your curriculum know more about the subject than you do.

How do I get an A?

One thing you shouldn’t do is say that all you want to do is get an A: as stated above, most professors are completely and utterly invested in their subject. When you ask how you get an “A,” they’re likely to be annoyed because you’re indicating you don’t care about learning, which is the best way to earn an A. Instead, you care about the badge. It’s like asking how you become poet laureate, as Ebenezer Cooke does in The Sot-Weed Factor: the question itself is wrong, because the right question is how you become a poet, and the laureateship will follow (Barth 73). If you ask professors how to get an A, they’ll also tell you what you already know: work hard at the class, show up, read the book(s) and related materials, form study groups, etc.

Another grad student in English said that she’s almost relieved when students say they just want to get an A, because it means she doesn’t have to worry about them or their grade. Paradoxically, when you say that you just want an A/B/C, you lower the probability that you’ll actually get it.

To get that A/B/C, demonstrate that you’re interested in the material, do all the reading, and show up to class every day. Go to the professor’s office hours to ask intelligent questions—like whether you’re on the right track regarding a paper—or what you could’ve done better on a quiz. By doing so, you’re showing that you’re interested in doing better, rather than saying you are. Novelists have a saying: “show, don’t tell,” which means that you should show what a character is thinking and why they are acting in a certain way rather than telling the reader. Readers are smart and will figure it out for themselves. Your professors will be able to figure out in a million ways whether you’re interested in a subject, and when you ask how you get an A, they’ll know you aren’t.

Oh, and don’t fear the library—it’s the big place with the books. If you conduct research with books, your professors will be impressed. And learn to use the online journals. If you don’t know what this means, ask a librarian, who will assist you. They very seldom bite and are there to help, and most schools also conduct library help sessions at the beginning of each year. Indeed, almost everyone at a university is there to help you learn; you just need to a) want to learn and b) ask. Many students never get to point a, and of those who do, more should get to point b.

Reflection

I wrote this now because I’m old enough to, I think, have some perspective on universities while still being young enough to remember the shock and bewilderment of the first semester of my freshman year. This document reflects my academic training and preoccupation: it contains allusions and references to other work and is structured in such a way that you can skip easily from section to section. As a trade-off for its detail, however, weaker or uninterested students might lose interest in it before they come to the end, which is unfortunate because it describes the world they will largely be inhabiting for somewhere between one week and six or more years.

Anecdotes from my own academic experience are included because discovering facts about the incentives in university life didn’t occur all at once for me. No one gave me a document like this; I was expected to either already know or understand most of what you just read, and as a result, I spent years drawing a mental map of universities. The professors and graduate students had spent long enough in the university atmosphere that they knew how universities were structured with the thoroughness with which you know your native language. I’ve written this in the hope that it will better explain to you (in the plural sense) what I’ve explained to many individuals.

When I notice that I have to repeat the same things over and over, my natural impulse is to consider how I might convey them to a large number of people, and then to write those things down so that they might be read, since writing is a vastly more efficient information-transfer mechanism than speech. Nonetheless, I realize that this document and my explanations are probably not perfect, so if you’ve read this to the best of your ability and still have questions, don’t be afraid to ask them. One thing universities should inculcate is inquisitiveness, and I hope I do so as a teacher and as a person.

Notice that this document has a version number in the upper-right corner: as time goes on and I receive questions or comments, I’ll probably change this document to reflect new concerns. When you ask questions, you’re not only helping yourself discover something: you’re helping the person you’re asking better understand the subject at hand and the nature of what they’re trying to say. By asking me questions about this document, you might help me ultimately improve it, and ultimately help those who read it in the future. If there is one cultural advantage universities should impart more than any other, it is the ability to ask questions about even the most fundamental things; confusion and uncertainty are often the sources of new knowledge.

As Paul Krugman, who won the 2008 Nobel Prize in Economics, said of his own research (which led him to the prize):

The models I wrote down that winter and spring were incomplete, if one demanded of them that they specify exactly who produced what. And yet they told meaningful stories. It took me a long time to express clearly what I was doing, but eventually I realized that one way to deal with a difficult problem is to change the question — in particular by shifting levels.

He also has a section called “question the question,” in which he recursively asks himself whether the question he has asked is the right one. For him, as for many people, questions are at the center of the learning universe, and if you learn to ask them promiscuously and then seek the answers, whether from me, your other professors, or from books, you’ll be better equipped to find the answers, do well in college, and do well in life. One challenge is often learning enough to be able to formulate the right questions, and with this in mind, I hope you know how to ask important questions about the institution you’re attending.

As noted previously, you can also download this essay in .pdf form.

Works Cited [4]

Barth, John. The Sot-Weed Factor. New York: Anchor Books, 1987.

Deresiewicz, William. Excellent Sheep: The Miseducation of the American Elite and the Way to a Meaningful Life. New York: Free Press, 2014.

Graham, Paul. “Undergraduation.” Personal website. March 2005. Accessed 7 December 2008. <http://paulgraham.com/college.html>

Krugman, Paul. “How I Work.” Personal website. Accessed 11 November 2008. <http://web.mit.edu/krugman/www/howiwork.html>

“Salary Increase by Major.” The Wall Street Journal. Undated. Accessed 7 December 2008. <http://online.wsj.com/public/resources/documents/info-Degrees_that_Pay_you_Back-sort.html?mod=googlenews_wsj>

Sawyer, W.W. Prelude to Mathematics. New York: Dover Publications, 1982.

Sperber, Murray. Beer and Circus: How Big-Time College Sports Is Crippling Undergraduate Education. New York: Henry Holt and Company, 2001.

Spolsky, Joel. “Advice for Computer Science College Students.” Personal website. 2 January 2005. Accessed 7 December 2008. <http://joelonsoftware.com/articles/CollegeAdvice.html>

“tenure.” The New Oxford American Dictionary. 2010. Mac OS X 10.6 Operating System.


[1] One useful study tip: if you read or hear a word you don’t know, look it up. You’ll expand your vocabulary and, concomitantly, the range of your thinking.

[2] In the hard sciences, for example, it’s often wise to ask professors if you can join their research labs, where you’ll gain valuable experience and make important connections. But most undergraduates don’t seem to realize that the first thing they have to do is ask. The second thing they need to do is show their professors that they won’t be a waste of time.

[3] A bunch of icy rocks beyond Neptune’s orbit, for those of you wondering.

[4] Writers include works cited pages so others can draw on the sources used to construct an argument. Contrary to popular belief among freshmen, they’re not just pointless hoops teachers set up, and they become progressively more important as you matriculate.

So you wanna be a writer: What Anthony Bourdain can tell you even when he's not talking about writing

There’s a great essay called “So You Wanna Be a Chef” by Anthony Bourdain, who wrote Kitchen Confidential. Based on “So You Wanna Be a Chef,” culinary schools sound rather like MFA programs. Money drives both decisions, even when artistry is supposed to be the driver:

But the minute you graduate from school—unless you have a deep-pocketed Mommy and Daddy or substantial savings—you’re already up against the wall. Two nearly unpaid years wandering Europe or New York, learning from the masters, is rarely an option. You need to make money NOW.

You could replace “cooking” with “writing” and “being a chef” with “being a writer” in Bourdain’s essay and have more or less the same outcome. Going into the “hotels and country clubs” side of the business is like getting tenure as a professor. There are a few differences between the fields—you’re never too old to be a writer—but similarities proliferate. Like this:

Male, female, gay, straight, legal, illegal, country of origin—who cares? You can either cook an omelet or you can’t. You can either cook five hundred omelets in three hours—like you said you could, and like the job requires—or you can’t. There’s no lying in the kitchen.

You can either sit (or stand) at a computer for years, producing words, or you can’t. There’s no lying at the keyboard. If you want to be a writer, the keyboard is where you’re going to spend a lot of your time (Michael Chabon on book tour in Seattle for The Yiddish Policemen’s Union: “If you want to write a novel you have to sit on your ass.” I can testify that the same is true of writing a blog). All the chatter in the world about how you prefer early Ian McEwan to late Ian McEwan isn’t going to help you produce words.

As with many disciplines, what’s important is not just being good or adequate—it’s being amazing. “There is, as well, a big difference between good work habits (which I have) and the kind of discipline required of a cook at Robuchon.” There is a big difference between good work habits and being an artist: a surprisingly large number of people can crap out a novel if given sufficient time and motivation. Milan Kundera in The Curtain:

Every novel created with real passion aspires quite naturally to a lasting aesthetic value, meaning to a value capable of surviving its author. To write without having that ambition is cynicism: a mediocre plumber may be useful to people, but a mediocre novelist who consciously produces books that are ephemeral, commonplace, conventional—thus not useful, thus burdensome, thus noxious—is contemptible.

This overstates the case: an indifferent or “mediocre” novel by a “mediocre novelist” does not tangibly hurt anyone, and its most likely fate is to be ignored—which is the most likely fate of any novelist. But the writer needs to aspire “to a lasting aesthetic value,” which means that merely existing and producing something isn’t enough. Hence my derogatory phrase: “crap out a novel.”

Instead of traveling to “Find out how other people live and eat and cook,” as Bourdain tells the chef to do, the writer must read widely and voraciously and omnivorously. If you’re writing in a genre, read the classics. If you’re a literary novelist, read some of the better genre fiction (it’s out there). Read books about writing. Read books not about writing to learn how the world works. Get out of your literary comfort zone with some frequency. You’ll need it.

Also wise: “Treating despair with drugs and alcohol is a time-honored tradition—I’d just advise you to assess honestly if it’s really as bad and as intractable a situation as you think.” Stephen King writes in On Writing about his own problems with drugs. He points out that drinking or taking drugs doesn’t make you a writer—if you’re a writer, you might drink or take drugs, but skipping straight to the drugs doesn’t do anything for you.

The bottom line: creative fields and top performers in many disciplines appear to have more in common than not. From what I’ve read, the same basic dynamic described by Bourdain applies not just to cooking and writing, but to software hacking, most kinds of research, athletics, architecture, music, and most forms of art. Don’t pursue these fields unless you want to master them. And you probably don’t. And if you do, you might be better off not realizing how difficult they are before you start, because you might never start.

So you wanna be a writer: What Anthony Bourdain can tell you even when he’s not talking about writing

There’s a great essay called “So You Wanna Be a Chef” by Anthony Bourdain, who wrote Kitchen Confidential. Based on “So You Wanna Be a Chef,” culinary schools sound rather like MFA programs. Money drives both decisions, even when artistry is supposed to:

But the minute you graduate from school—unless you have a deep-pocketed Mommy and Daddy or substantial savings—you’re already up against the wall. Two nearly unpaid years wandering Europe or New York, learning from the masters, is rarely an option. You need to make money NOW.

You could replace “cooking” with “writing” and “being a chef” with “being a writer” in Bourdain’s essay and have more or less the same outcome. Going into the “hotels and country clubs” side of the business is like getting tenure as a professor. There are a few differences between the fields—you’re never too old to be a writer—but similarities proliferate. Like this:

Male, female, gay, straight, legal, illegal, country of origin—who cares? You can either cook an omelet or you can’t. You can either cook five hundred omelets in three hours—like you said you could, and like the job requires—or you can’t. There’s no lying in the kitchen.

You can either sit (or stand) at a computer for years, producing words, or you can’t. There’s no lying at the keyboard. If you want to be a writer, the keyboard is where you’re going to spend a lot of your time (Michael Chabon on book tour in Seattle for The Yiddish Policemen’s Union: “If you want to write a novel you have to sit on your ass.” I can testify that the same is true of writing a blog). All the chatter in the world about how you prefer early Ian McEwan to late Ian McEwan isn’t going to help you produce words.

As with many disciplines, what’s important is not just being good or adequate—it’s being amazing. “There is, as well, a big difference between good work habits (which I have) and the kind of discipline required of a cook at Robuchon.” There is a big difference between good work habits and being an artist: a surprisingly large number of people can crap out a novel if given sufficient time and motivation. Milan Kundera in The Curtain:

Every novel created with real passion aspires quite naturally to a lasting aesthetic value, meaning to a value capable of surviving its author. To write without having that ambition is cynicism: a mediocre plumber may be useful to people, but a mediocre novelist who consciously produces books that are ephemeral, commonplace, conventional—thus not useful, thus burdensome, thus noxious—is contemptible.

This overstates the case: an indifferent or “mediocre” novel by a “mediocre novelist” does not tangibly hurt anyone, and its most likely fate is to be ignored—which is the most likely fate of any novelist. But the writer needs to aspire “to a lasting aesthetic value,” which means that merely existing and producing something isn’t enough. Hence my derogatory phrase: “crap out a novel.”

Instead of traveling to “Find out how other people live and eat and cook,” as Bourdain tells the chef to do, the writer must read widely and voraciously and omnivorously. If you’re writing in a genre, read the classics. If you’re a literary novelist, read some of the better genre fiction (it’s out there). Read books about writing. Read books not about writing to learn how the world works. Get out of your literary comfort zone with some frequency. You’ll need it.

Also wise: “Treating despair with drugs and alcohol is a time-honored tradition—I’d just advise you to assess honestly if it’s really as bad and as intractable a situation as you think.” Stephen King writes in On Writing about his own problems with drugs. He points out that drinking or taking drugs doesn’t make you a writer—if you’re a writer, you might drink or take drugs, but skipping straight to the drugs doesn’t do anything for you.

The bottom line: creative fields and top performers in many disciplines appear to have more in common than not. From what I’ve read, the same basic dynamic described by Bourdain applies not just to cooking and writing, but to software hacking, most kinds of research, athletes, architecture, music, and most forms of art. Don’t pursue these fields unless you want to master them. And you probably don’t. And if you do, you might be better off not realizing how difficult they are before you start, because you might never start.

Eat, Pray, Love and the misery of the literary agent

Literary agents are flooded with pitches for the next Eat, Pray, Love. Fortunately, one of the few things I haven’t done wrong in searching for an agent is pitching the next Eat, Pray, Love, which probably isn’t a surprise since I read about 15 pages of the original, thought it was dumb, and gave it back to the woman who had a copy (without my observation on its literary merit). To me, the oddest thing about the book is that it states or implies that going to exotic countries allows you to discover yourself, or whatever. But to my mind, you can eat good food here (I try to and usually succeed), pray wherever, and love… well, that’s around too. Less common in the suburbs, I suppose, but still.

Mostly I’m reminded of friends in college who were like, “We’re going to MEXICO for spring break to get drunk and hook up!!!” (Sometimes the destination would be Europe, the Caribbean, etc., and usually they’d say “party” as a euphemism for “get drunk and hook up.”) To which I would usually respond, “Can’t you do that sort of thing at home?” Usually they’d look at me strangely, like I’d suggested they consider eating a tarantula. It’s the same look I get when I suggest that You Will Suffer Humiliation When The Sports Team From My Area Defeats The Sports Team From Your Area.

I wonder if people implicitly believe that traveling changes the rules and social norms to which they’re accustomed, creating a Midsummer Night’s Dream-style scenario. If so, couldn’t they change the rules where they live by deciding, “I’m not going to play by the standard rules anyway”? After all, Western culture has a rich tradition of this kind of thing: think of the Transcendentalists, Herman Hesse, Gay Talese, and Baywatch (Okay, that last one is a test of who’s paying attention). The epiphany is a regular occurrence in Joyce, especially A Portrait of the Artist as a Young Man. If we need to be “transformed by an experience that allowed us to step outside ourselves,” we might find that in fiction as easily as Indonesia. Katie Roiphe says that the TV show Mad Men offers “The Allure of Messy Lives.” We can make a mess and find self-fulfillment at home as easily as elsewhere!

Still, the Slate article says Gilbert is a good writer overall, and I read the book long enough ago not to keep slagging that part of it. To me, the setup sounds like the silliest part, but the money shot of the article comes at the end: “So be warned. If your proposal mentions a book that’s been on the bestseller list for more than 180 weeks, it may be a sign that your book isn’t worth writing.”

If your idea for life fulfillment comes from a book that’s been on the bestseller list for more than 180 weeks, it may be a sign that you’re seeking fulfillment in the wrong place.

David Shields’ Reality Hunger and James Wood’s philosophy of fiction

In describing novels from the first half of the 19th Century, David Shields writes in Reality Hunger: A Manifesto that “All the technical elements of narrative—the systematic use of the past tense and the third person, the unconditional adoption of chronological development, linear plots, the regular trajectory of the passions, the impulse of each episode toward a conclusion, etc.—tended to impose the image of a stable, coherent, continuous, unequivocal, entirely decipherable universe.”

I’m not so sure; the more interesting novels didn’t necessarily have “the unconditional adoption of chronological development” or the other features Shields ascribes to them. Caleb Williams is the most obvious example I can immediately cite: the murderers aren’t really punished in it and madness is perpetual. Gothic fiction of the 19th Century had a highly subversive quality that didn’t feature “the regular trajectory of the passions.” To my mind, the novel has always had unsettling features and an unsettling effect on society, producing change even when that change isn’t immediately measurable or apparent, or when we can’t get away from the fundamental constraints of first- or third-person narration. Maybe I should develop this thought more: but Shields doesn’t in Reality Hunger, so maybe innuendo ought to be enough for me too.

Shields is very good at making provocative arguments and less good at making those arguments hold up under scrutiny. He says, “The creators of characters, in the traditional sense, no longer manage to offer us anything more than puppets in which they themselves have ceased to believe.” Really? I believe if the author is good enough. And I construct coherence where it sometimes appears to be lacking. Although I’m aware that I can’t shake hands with David Kepesh of The Professor of Desire, he and the characters around him feel like “more than puppets” in which Roth has ceased to believe.

Shields wants something made new. Don’t we all? Don’t we all want to throw off dead convention? Alas: few of us know how to do so successfully, and that word “successfully” is especially important. You could write a novel that systematically eschews whatever system you think the novel imposes (this is the basic idea behind the anti-novel), but most people probably won’t like it—a point that I’ll come back to. We won’t like it because it won’t seem real. Most of us have ideas about reality that are informed by some combination of lived experience and cultural conditioning. That culture shifts over time. Shields starts Reality Hunger with a premise that is probably less contentious than much of the rest of the manifesto: “Every artistic movement from the beginning of time is an attempt to figure out a way to smuggle more of what the artist thinks is reality into the work of art.” I can believe this, though I suspect that artists begin getting antsy when you try to pin them down on what reality is: I would call it this thing we all appear to live in but that no one can quite represent adequately.

That includes Shields. Reality Hunger doesn’t feel as new as it should; it feels more like a list of N things. It’s frustrating even when it makes one think. Shields says, “Culture and commercial languages invade us 24/7.” But “commercial languages” only invade us because we let them: TV seems like the main purveyor, and if we turn it off, we’ll probably cut most of the advertising from our lives. If “commercial languages” are invading my life to the extent I’d choose the word “invade,” I’m not aware of it, partially because I conspicuously avoid those languages. Shields says, “I try not to watch reality TV, but it happens anyway.” This is remarkable: I’ve never met anyone who’s tried not to watch reality TV and then been forced to, or had reality TV happen to them, like a car accident or freak weather.

Still, we need to think about how we experience the world and depict it, since that helps us make sense of the world. For me, the novel is the genre that does this best, especially when it bursts its perceived bounds in particularly productive ways. I can’t define those ways with any rigor, but the novel has far more going on than its worst and best critics imagine.

Both the worst and best critics tend to float around the concept of reality. To use Luc Sante’s description in “The Fiction of Memory,” a review of Reality Hunger:

The novel, for all the exertions of modernism, is by now as formalized and ritualized as a crop ceremony. It no longer reflects actual reality. The essay, on the other hand, is fluid. It is a container made of prose into which you can pour anything. The essay assumes the first person; the novel shies from it, insisting that personal experience be modestly draped.

I’m not sure what a “crop ceremony” is or how the novel is supposed to reflect “actual reality.” Did it ever? What is this thing called reality that the novel is attempting to mirror? Its authenticity or lack thereof has, as far as I know, always been in question. The search for realism is always a search and never a destination, even when we feel that some works are more realistic than others.

Yet Sante and Shields are right about the dangers of rigidity; as Andrew Potter writes in The Authenticity Hoax: How We Get Lost Finding Ourselves, “One effect of disenchantment is that pre-existing social relations come to be recognized not as being ordained by the structure of the cosmos, but as human constructs – the product of historical contingencies, evolved power relations, and raw injustices and discriminations.”

Despite this, however, we feel realism—if none of us did, we’d probably stop using the term. Our definitions might blur when we approach a precise definition, but that doesn’t mean something isn’t there.

Sante writes, quoting Shields, that “‘Anything processed by memory is fiction,’ as is any memory shaped into literature.” Maybe: but consider these three statements, if I were to make them to you (keep in mind the context of Reality Hunger, with comments like “Try to make it real—compared to what?”):

Aliens destroyed Seattle in 2004.

I attended Clark University.

Alice said she was sad.

One of them is, to most of us, undoubtedly fiction. One of them is true. The other I made up: no doubt there is an Alice somewhere who has said she is sad, but I don’t know her and made her up for the purposes of example. The second example might be “processed by memory,” but I don’t think that makes it fiction, even if I can’t give you a firm, rigorous, absolute definition of where the gap between fact and interpretation begins. Jean Bricmont and Alan Sokal give it a shot in Fashionable Nonsense: “For us, as for most people, a ‘fact’ is a situation in the external world that exists irrespective of the knowledge that we have (or don’t have) of it—in particular, irrespective of any consensus or interpretation.”

They go on to observe that scientists actually face some problems of definition that I see as similar to those of literature and realism:

Our answer [as to what makes science] is nuanced. First of all, there are some general (but basically negative) epistemological principles, which go back at least to the seventeenth century: to be skeptical of a priori arguments, revelation, sacred texts, and arguments from authority. Moreover, the experience accumulated during three centuries of scientific practice has given us a series of more-or-less general methodological principles—for example, to replicate experiments, to use controls, to test medicines in double-blind protocols—that can be justified by rational arguments. However, we do not claim that these principles can be codified in a definite way, nor that the list is exhaustive. In other words, there does not exist (at least at present) a complete codification of scientific rationality; rationality is always an adaptation to a new situation.

They lay out some criteria (beware of “revelation, sacred texts, and arguments from authority”) and “methodological principles” (“replicate experiments”) and then say “we do not claim that these principles can be codified in a definite way.” Neither can the principles of realism. James Wood does as good a job of exploring them as anyone. But I would posit that, despite our inability to pin down realism, either as convention or not, most of us recognize it: when I tell people that I attended Clark University, none have told me that my experience is an artifact of memory, or made up, or that there is no such thing as reality and therefore I didn’t. Such realism might merely be convention or training—or it might be real.

In the first paragraph of his review of Chang-Rae Lee’s The Surrendered, James Wood lays out the parameters of the essential question of literary development or evolution:

Does literature progress, like medicine or engineering? Nabokov seems to have thought so, and pointed out that Tolstoy, unlike Homer, was able to describe childbirth in convincing detail. Yet you could argue the opposite view; after all, no novelist strikes the modern reader as more Homeric than Tolstoy. And Homer does mention Hector’s wife getting a hot bath ready for her husband after a long day of war, and even Achilles, as a baby, spitting up on Phoenix’s shirt. Perhaps it is as absurd to talk about progress in literature as it is to talk about progress in electricity—both are natural resources awaiting different forms of activation. The novel is peculiar in this respect, because while anyone painting today exactly like Courbet, or composing music exactly like Brahms, would be accounted a fraud or a forger, much contemporary fiction borrows the codes and conventions—the basic narrative grammar—of Flaubert or Balzac without essential alteration.

I don’t think literature progresses “like medicine or engineering.” Using medical or engineering knowledge as it stood in 1900 would be extremely unwise if you’re trying to understand the genetic basis of disease or build a computer chip. Papers tend to decay within five to ten years of publication in the sciences.

But I do think literature progresses in some other, less obvious way, as we develop wider ranges of techniques and social constraints allow for wider ranges of subject matter or direct depiction: which is why Nabokov can point out that “Tolstoy, unlike Homer, was able to describe childbirth in convincing detail,” and I can point out that mainstream literature effectively couldn’t depict explicit sexuality until the 20th Century.

While that last statement can be qualified some, it is hard to miss the difference between a group of 19th Century writers like Thackeray, Dickens, Trollope, George Eliot, George Meredith, and Thomas Hardy (who J. Hillis Miller discusses in The Form of Victorian Fiction) and a group of 20th Century writers like D.H. Lawrence, James Joyce, Norman Rush, and A.S. Byatt, who are free to explicitly describe sexual relationships to the extent they see fit and famously use words like “cunt” that simply couldn’t be effectively used in the 19th Century.

In some ways I see literature as closer to math: the quadratic equation doesn’t change with time, but I wouldn’t want to be stuck in a world with only the quadratic equation. Wood gets close to this when he says that “Perhaps it is as absurd to talk about progress in literature as it is to talk about progress in electricity—both are natural resources awaiting different forms of activation.” The word “perhaps” is essential in this sentence: it gives a sense of possibility and realization that we can’t effectively answer the question, however much we might like to. But both question and answer give a sense of some useful parameters for the discussion. Most likely, literature isn’t exactly like anything else, and its development (or not) is a matter as much of the person doing the perceiving and ordering as anything intrinsic to the medium.

I have one more possible quibble with Wood’s description when he says that “the basic narrative grammar—of Flaubert or Balzac without essential alteration.” I wonder if it really hasn’t undergone “essential alteration,” and what would qualify as essential. Novelists like Elmore Leonard, George Higgins, or that Wood favorite Henry Green all feel quite different from Flaubert or Balzac because of how they use dialog to convey ideas. The characters in Tom Perrotta’s Election speak in a much more slangy, informal style than do any in Flaubert or Balzac, so far as I know. Bellow feels more erratic than the 19th Century writers and closer to the psyche, although that might be an artifact of how I’ve been trained by Bellow and writers after Bellow to perceive the novel and the idea of psychological realism. Taken together, however, the writers mentioned make me think that maybe “the basic narrative grammar” has changed for writers who want to adopt new styles. Yes, we’re still stuck with first- and third-person perspectives, but we get books that are heavier on dialog and lighter on formality than their predecessors.

Wood is a great chronicler of what it means to be real: his interrogation of this seemingly simple term runs through the essays collected in The Irresponsible Self: On Laughter and the Novel, The Broken Estate: Essays on Literature and Belief, and, most comprehensively, in the book How Fiction Works. Taken together, they ask how the “basic narrative grammar” of fiction works or has worked up to this point. In setting out some of the guidelines that allow literary fiction to work, Wood is asking novelists to find ways to break those guides in useful and interesting ways. In discussing Reality Hunger, Wood says, “[Shields’] complaints about the tediousness and terminality of current fictional convention are well-taken: it is always a good time to shred formulas.” I agree and doubt many would disagree, but the question is not merely one of “shred[ding] formulas,” but how and why those formulas should be shredded. One doesn’t shred the quadratic formula: it works. But one might build on it.

By the same token, we may have this “basic narrative grammar” not because novelists are conformist slackers who don’t care about finding a new way forward: we may have it because it’s the most satisfying or useful way of conveying a story. I don’t think this is true, but it might be. Maybe most people won’t find major changes to the way we tell stories palatable. Despite modernism and postmodernism, fewer people appear to enjoy the narrative confusion and choppiness of Joyce than enjoy the streamlined feel of the latest thriller. That doesn’t mean the latter is better than the former—by my values, it’s not—but it does mean that the overall thrust of fiction might remain where it is.

Robert McKee, in his not-very-good-but-useful book Story: Substance, Structure, Style, and the Principles of Screenwriting, gives three major kinds of plots, which blend into one another: “arch plots,” which are causal in nature and finish their story lines; “mini plots,” which he says are open and “strive for simplicity and economy while retaining enough of the classical […] to satisfy the audience”; and “antiplots,” which are where absurdism and the like fall.

He says that as one moves “toward the far reaches of Miniplot, Antiplot, and Non-plot, the audience shrinks” (emphasis in original). From there:

The atrophy has nothing to do with quality or lack of it. All three corners of the story triangle gleam with masterworks that the world treasures, pieces of perfection for our imperfect world. Rather, the audience shrinks for this reason: Most human beings believe that life brings closed experiences of absolute, irreversible change; that their greatest sources of conflict are external to themselves; that they are the single and active protagonists of their own existence; that their existence operates through continuous time within a consistent, causally interconnected reality; and that inside this reality events happen for explainable and meaningful reasons.

The connection between this and Wood’s “basic narrative grammar” might appear tenuous, but McKee and Wood are both pointing towards the ways stories are constructed. Wood is more concerned with language; although plot and its expression (whether in language or in video) can’t be separated from one another, they can still be analyzed independently enough of one another to make a distinction.

The conventions that underlie the “arch plots,” however, can become tedious over time. This is what Wood is highlighting when he discusses Roland Barthes’ “reality effect,” which fiction can achieve: “All this silly machinery of plotting and pacing, this corsetry of chapters and paragraphs, this doxology of dialogue and characterization! Who does not want to explode it, do something truly new, and rouse the implication slumbering in the word ‘novel’?” Yet we need some kind of form to contain story; what is that form? Is there an ideal method of conveying story? If so, what if we’ve found it and are now mostly tinkering, rather than creating radical new forms? If we take out “this silly machinery of plotting and pacing” and dialog, we’re left with something closer to philosophy than to a novel.

Alternately, maybe we need the filler and coordination that so many novels consist of if those novels are to be felt true to life, which appears to be one definition of what people mean by “realistic.” This is where Wood parts with Barthes, or at least makes a distinct case:

Convention may be boring, but it is not untrue simply because it is conventional. People do lie on their beds and think with shame about all that has happened during the day (at least, I do), or order a beer and a sandwich and open their computers; they walk in and out of rooms, they talk to other people (and sometimes, indeed, feel themselves to be talking inside quotation marks); and their lives do possess more or less traditional elements of plotting and pacing, of suspense and revelation and epiphany. Probably there are more coincidences in real life than in fiction. To say “I love you” is to say something at millionth hand, but it is not, then, necessarily to lie.

“Convention may be boring, but it is not untrue simply because it is conventional,” and the parts we think of as conventional might be necessary to realism. In Umberto Eco’s Reflections on The Name of the Rose, he says that “The postmodern reply to the modern consists of recognizing that the past, since it cannot really be destroyed, because its destruction leads to silence, must be revisited: but with irony, not innocently.” That is often the job of novelists dealing with the historical weight of the past and with conventions that are “not untrue simply because [they are] conventional.” Eco and Wood both use the example of love to demonstrate similar points. Wood’s is above; Eco says:

I think of the postmodern attitude as that of a man who loves a very cultivated woman and knows he cannot say to her, ‘I love you madly,’ because he knows that she knows (and that she knows that he knows) that these words have already been written by Barbara Cartland. Still, there is a solution. He can say, ‘As Barbara Cartland would put it, I love you madly.’ At this point, having avoided false innocence, having said clearly that it is no longer possible to speak innocently, he will nevertheless have said what he wanted to say to the woman: that he loves her, but he loves her in an age of lost innocence. If the woman goes along with this, she will have received a declaration of love all the same. Neither of the two speakers will feel innocent, both will have accepted the challenge of the past, of the already said, which cannot be eliminated […]

I wonder if every age thinks of itself as “an age of lost innocence,” only to be later looked on as pure, naive, or unsophisticated. Regardless, for Eco postmodernism requires that we look to the past long enough to wink and then move on with the story we’re going to tell in the manner we’re going to tell it. Perhaps Chang-Rae Lee doesn’t do so in The Surrendered, which is the topic of Wood’s essay—but like so many essays and reviews, Wood’s starts with a long and very useful consideration before coming to the putative topic of its discussion. Wood speaks of reading “Chang-Rae Lee’s new novel, ‘The Surrendered’ (Riverhead; $26.95)—a book that is commendably ambitious, extremely well written, powerfully moving in places, and, alas, utterly conventional. Here the machinery of traditional, mainstream storytelling threshes efficiently.” I haven’t read The Surrendered and so can’t evaluate Wood’s assessment.

Has Wood merely overdosed on the kind of convention that Lee uses, as opposed to convention itself? If so, it’s not clear how that “machinery” could be fixed or improved on, and the image itself is telling because Wood begins his essay by asking whether literature is like technology. My taste in literature changes: as a teenager I loved Frank Herbert’s Dune and now find it almost unbearably tedious. Other revisited novels hold up poorly because I’ve overdosed on their conventions and start to crave something new—a lot of fantasy flattens over time like opened soda.

Still, I usually don’t know what “something new” entails until I read it. That’s the problem with saying that the old way is conventional or boring: that much is easier to observe than the fix. Wood knows it, and he’s unusually good at pointing to the problems of where we’ve been and pointing to places that we might go to fix it (see, for example, his recent essay on David Mitchell, who I now feel obliged to read). This, I suspect, is why he is so beloved by so many novelists, and why I spend so much time reading him, even when I don’t necessarily love what he loves. The Quickening Maze struck me as self-indulgent and lacking in urgency, despite the psychological insight Adam Foulds offers into a range of characters’ minds: a teenage girl, a madman, an unsuccessful inventor.

I wanted more plot. In How Fiction Works, Wood quotes from Adam Smith writing in the eighteenth century regarding how writers use suspense to maintain reader interest and then says that “[…] the novel [as an art form; one could also say the capital-N Novel] soon showed itself willing to surrender the essential juvenility of plot […]” Yet I want and crave this element that Wood dismisses—perhaps because of my (relatively) young age: Wood says that Chang-Rae Lee’s Native Speaker was “published when the author was just twenty-nine,” older than I am. I like suspense and the sense of something major at stake, and that could imply that I have a weakness for weak fiction. If so, I can do little more about it than someone who wants chocolate over vanilla, or someone who wants chocolate despite having heard the virtues of cherries extolled.

When I hear about the versions of the real, reality, and realism that get extolled, I often begin to think about chocolate, vanilla, and cherries, and why some novelists write in such a way that I can almost taste the cocoa while others are merely cardboard colored brown. Wood is very good at explaining this, and his work taken together represents some of the best answers to the questions that we have.

Even the best answers lead us toward more questions that are likely to be answered best by artists in a work of art that makes us say, “I’ve never seen it that way before,” or, better still, “I’ve never seen it.” Suddenly we do see, and we run off to describe to our friends what we’ve seen, and they look at us and say, “I don’t get it,” and we say, “maybe you just had to see it for yourself.” Then we pass them the book or the photo or the movie and wait for them to say, “I’ve already seen this somewhere before,” while we argue that they haven’t, and neither have we. But we press on, reading, watching, thinking, hoping to come across the thing we haven’t seen before so we can share it again with our friends, who will say, like the critics do, “I’ve seen it before.”

So we have. And we’ll see it again. But I still like the sights—and the search.

The Shallows: What the Internet is Doing to Our Brains — Nicholas Carr

One irony of this post is that you’re reading a piece on the Internet about a book that is in part about how the Internet is usurping the place of books. In The Shallows, Carr argues that the Internet encourages short attention spans, skimming, shallow knowledge, and distraction, and that this is a bad thing.

He might be right, but his argument misses one essential component: the causal link between the Internet and distraction. He cites suggestive research but never quite crosses the bridge from the premise that the Internet is inherently distracting—both because of links and because of the overwhelming amount of material out there—to the conclusion that we as a society and as a people are now endlessly distracted. Along the way, there are many soaring sentiments (“Our rich literary tradition is unthinkable without the intimate exchanges that take place between reader and writer within the crucible of a book”) and clever quotes (Nietzsche as quoted by Carr: “Our writing equipment takes part in the forming of our thoughts”), but that causal link is still weak.

I liked many of the points Carr made; that one about Nietzsche is something I’ve meditated on before, as shown here and here (I’ve now distracted you, and you’re probably less likely to finish this post than you would be otherwise; if I offered you $20 for repeating the penultimate sentence in the comments section, I’d probably get no takers). I think our tools do cause us to think differently in some way, which might explain why I pay more attention to them than some bloggers do. And posts on tools and computer setups and so forth seem to generate a lot of hits; Tools of the Trade—What a Grant Writer Should Have is among the more popular Grant Writing Confidential posts.

I use Devonthink Pro as described by Steven Berlin Johnson; it supplements my memory and acts as a research tool, commonplace book, and quote database, and it probably weakens my memory while allowing me to write deeper blog posts and papers. Maybe I remember less in my mind and more in my computer, but it still takes my mind to give context to the material copied into the database.

In fact, Devonthink Pro helped me figure out a potential contradiction in Carr’s writing. On page 209, he says:

Even as our technologies become extensions of ourselves, we become extensions of our technologies […] every tool imposes limitations even as it opens possibilities. The more we use it, the more we mold ourselves to its form and function.

But on page 47 he says: “Sometimes our tools do what we tell them to. Other times, we adapt ourselves to our tools’ requirements.” So if “sometimes our tools do what we tell them to,” then is it true that “the more we use it, the more we mold ourselves to its form and function”? The two statements aren’t quite mutually exclusive, but they’re close. Maybe reading Heidegger’s Being and Time and Graham Harman’s Tool-Being will clear up or deepen whatever confusion exists, since Heidegger a) went deep but b), like many philosophers, is hard to read and is closer to a machine for generating multiple interpretations than an illuminator and simplifier of problems. The same could be said of philosophy in general as seen from the outside.

This post mirrors some of Carr’s tendencies, like the detour in the preceding paragraph. I’ll get back to the main point for a moment: Carr’s examples don’t necessarily add up to proving his argument, and some of them feel awfully tenuous. Some are also inaccurate; on page 74 he mentions a study that used brain scans to “examine what happens inside people’s heads as they read fiction” and cites Nicole K. Speer’s journal article “Reading Stories Activates Neural Representations of Visual and Motor Experiences,” which doesn’t mention fiction and uses a memoir from 1951 as its sample text.

Oops.

That’s a relatively minor issue, however, and one that I only discovered because I found the study interesting enough to look up.

Along the way in The Shallows we get lots of digressions, many of them well-trod: the history of the printing press; the origins of the commonplace book; the early artificial intelligence program ELIZA; Frederick Winslow Taylor and his interest in efficiency; the plasticity of the brain; technologies that have been put to various uses, including metaphor.

Those digressions almost add up to one of my common criticisms of nonfiction books, which is that they’d be better as long magazine articles. The Shallows started as one, and one I’ve mentioned before: “Is Google Making Us Stupid?” The answer: maybe. The answer now, two years and 200 pages later: maybe. Is the book a substantial improvement on the article? Maybe. You’ll probably get 80% of the book’s content from the article, which makes me think you’d be better off following the link to the article and printing it—the better not to be distracted by the rest of The Atlantic. This might tie into the irony that I mentioned in the first line of this post, which you’ve probably forgotten by now because you’re used to skimming works on the Internet, especially moderately long ones that make somewhat subtle arguments.

Offline, Carr says, you’re used to linear reading—from start to finish. Online, you’re used to… something else. But we’re not sure what, or how to label the reading that leads away from the ideal we’ve been living in: “Calm, focused, undistracted, the linear mind is being pushed aside by a new kind of mind that wants and needs to take in and dole out information in short, disjointed, often overlapping bursts—the faster, the better.”

Again, maybe, which is the definitive word for analyzing The Shallows: we don’t actually have a name for this kind of mind, and it’s not apparent that the change is as major as Carr describes. Haven’t we always made disparate connections among many things? Haven’t we always skimmed until we’ve found what we’re looking for, and then decided to dive in? His point is that we no longer do dive in, and he might be right, for some people; but for me, online surfing, skimming, and reading coexist with long-form book reading. Otherwise I wouldn’t have had the fortitude to get through The Shallows.

Still, I don’t like reading on my Kindle very much because I’ve discovered that I often want to hop back and forth between pages, which the device makes awkward. In addition, grad school requires citations that favor conventional books. And for all my carping about the lack of causal certainty in Carr’s argument, I do think he’s on to something, based on my own experience. He says:

Over the last few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going—so far as I can tell—but it’s changing. I’m not thinking the way I used to think. I feel it most strongly when I’m reading. I used to find it easy to immerse myself in a book or a lengthy article. My mind would get caught up in the twists of the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration starts to drift after a page or two. I get fidgety, lose the thread, begin looking for something else to do. I feel like I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.

I think I know what’s going on. For well over a decade now, I’ve been spending a lot of time online, searching and surfing and sometimes adding to the great databases of the Internet.

He says friends have reported similar experiences. I feel the same way he and his friends do: the best thing I’ve found for improving my productivity and making reading and writing easier is a program called Freedom, which prevents me from getting online unless I reboot my iMac. It throws enough of a barrier between me and the Internet that I can’t easily distract myself with e-mail or Hacker News. (Freedom has also made writing this post slightly harder, because I couldn’t add links to the appropriate places during the first draft, but I think it worth the trade-off; I didn’t realize I was going to write this post when I turned it on.) Paul Graham has enough money that he uses a second computer for the same purpose, as he describes in the linked essay, titled, appropriately enough, “Disconnecting Distraction” (sample: “After years of carefully avoiding classic time sinks like TV, games, and Usenet, I still managed to fall prey to distraction, because I didn’t realize that it evolves.” Guess what distraction evolved into: the Internet).

Another grad student in English Lit expressed shock when I told him that I check my e-mail at most once a day, and shoot for once every two days, primarily in an effort not to distract myself with electronic kibble, or kipple. Carr himself had to do the same thing: he moved to Colorado, jettisoned much of his electronic life, and “throttled back my e-mail application […] I reset it to check only once an hour, and when that still created too much of a distraction, I began keeping the program closed much of the day.” I work better that way. And I think I read better, or deeper, offline.

For me, reading a book is a very different experience from searching the web, in part because most of the websites I visit are exhaustible much faster than books. I have a great pile of them from the library waiting to be read, and an even greater number bought or gifted over the years. Books worth reading seem to go on forever. Websites don’t.

But if I don’t have that spark of discipline to stay off the Internet for a few hours at a time, I’m tempted to do the RSS round-robin and triple check the New York Times for hours, at which point I look up and say, “What did I do with my time?” If I read a book—like The Shallows, or Carlos Ruiz Zafon’s The Shadow of the Wind, which I’m most of the way through now—I look up in a couple of hours and know I’ve done something. This is particularly helpful for me because, as previously mentioned, I’m in grad school, which means I have to be a perpetual reader (if I didn’t want to be, I’d find another occupation).

To my mind, getting offline can become a comparative advantage because, like Carr, “I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain,” and that someone is me and that someone is the Internet. But I can’t claim this is true for all people in all places, even as I tell my students to try turning off their Internet access and cell phones when they write their papers. Most of them no doubt don’t. But the few who do learn how to turn off the electronic carnival are probably getting something very useful out of that advice. The ones who don’t would probably benefit from reading The Shallows, because they’d at least become aware of the possibility that the Internet is rewiring our brains in ways that might not be beneficial to us, however tenuous the evidence (notice my hedging language: “at least,” “the possibility,” “might not”).

Alas: they’re probably the ones least likely to read it.