How to get coaching, mentoring, and attention

Introduction

Students regularly say that professors, teachers, coaches, mentors, and others don’t care about them or don’t offer real help and advice. In a recent discussion on the forum Hacker News, someone wrote, “[…] coaching/mentorship is probably found a lot more in a grad program than undergrad, where it’s pretty much nonexistent.” That commenter is somewhat right, but the deeper issue is that professors (and others with knowledge and competence) are most inclined to help people who won’t waste their time.

The challenge is to figure out who is going to waste time and who isn’t. Professors accomplish this through implicit tests. The challenge for you, the student who wants help, is to demonstrate that you’re worth the investment. I’m going to describe the incentives acting on both professors (or people with expertise) and students (or people seeking to develop expertise) and explain how to show that you’re better than the average student.

“How to get your Professors’ Attention” is biased towards universities because I’m a grad student in one and therefore more attuned to universities and the peculiar people who inhabit them. But this advice can be generalized to other situations where someone is knowledgeable and someone else is trying to seek knowledge or mentorship.

This essay is also biased toward English, which is my field. But if you’re working in computer science, for example, you’ll probably get more and better help if you walk into a professor’s office and say something like, “I’m having a problem with this program, which I suspect is related to X, but I’m not sure. I’ve tried sources Y and Z, which might be related, but I can’t figure out what’s going on. Am I missing something?” This will almost always go over better than saying, “Explain binary search trees to me” or “I don’t get this class,” which will probably yield a pointer to the relevant section of the book, with the instruction that you come back once you’ve read it and explain more explicitly where you’ve gotten lost.

Background

I majored in English and went to Clark University, where I think I got a lot of mentorship and connected with my professors. That might be because I took a lot of time to seek them out or because Clark is a small liberal arts school where professors are expected to interact with students. Even there, however, most, though not all, professors offered real mentorship and guidance only to the extent that students sought it. When I was an undergrad, I was doing many of the things described in this essay, albeit unconsciously.

What do you care about?

The idea that professors don’t care about their students is a pernicious half-truth. Most professors do care about their students (otherwise they wouldn’t be professing), but professors know that many students don’t care about the subject or about learning—they care about grades. Professors don’t care about grades, and they often care about their students to the extent that their students care about learning.

If a student really wants to learn, the professor will usually help, but most students don’t—so the professor builds a wall between herself and her students to make sure that the only students who breach the wall are the ones who do care about learning. Professors do this through the tests described in the next section. Students often perceive this wall as indifference or callousness, when it’s really just a practical means of separating out the students whose primary goal is to get an A from the students whose primary goal is to understand why Ulysses was a major break from the tradition of the novel and why it became an emblematic text of modernism…

And so on. Life is complex and simple questions often have complex answers. Those complex answers are often found in the form of text, since good writing is far more idea-dense than speech can hope to be, which leads to my next point.

Books

Now I’m a grad student at the University of Arizona and tell my students the same thing: if they want to go beyond whatever is required in class, they should start by showing up in their professors’ office hours, ideally with somewhat smart or at least well-considered questions or comments. Most professors respond well to this and will often give recommendations on books to read and/or projects to work on. A few days ago I taught Paul Graham’s essay “What You’ll Wish You’d Known,” and students glommed onto this paragraph:

A key ingredient in many projects, almost a project on its own, is to find good books. Most books are bad. Nearly all textbooks are bad. So don’t assume a subject is to be learned from whatever book on it happens to be closest. You have to search actively for the tiny number of good books.

Professors are a good place to find good books because they’ve read so many. If you follow their recommendations and talk to them afterwards, coaching and mentorship relationships will be much more likely to form. Demonstrate interest in their subject if you want their attention.

Obviously, there are exceptions, but this principle usually works reasonably well. If you show up in office hours and say “mentor me!” you’re probably not going to get much. But if you show up and ask questions x, y, and z, then read whatever the prof recommends, then come back, you’ll probably have a much better shot at their attention.

Another person in the Hacker News discussion said, “I get the impression that some undergraduates at some colleges do get good coaching and mentorship, and I would like to hear from other HN participants if they know of examples of that.” They’re right: some undergraduates do get good coaching and mentorship, but I suspect that depends less on the college or university and more on the undergraduate—and the undergraduate realizing how things work from her professors’ perspectives.

Reading

Professors tell you to read more or read particular books / essays for two reasons. The primary one is that reading is simply more information dense than talking, as mentioned earlier. Try this sometime: transcribe a half hour of TV news verbatim. You’ll find that it comes to maybe a page of text. To have a reasonable conversation, it often makes sense to read something related to the topic first, then talk about where to go from there. To learn more, read more. To learn faster, read more.

Secondarily, your professor will often recommend reading to test your seriousness. If she says, “Go read X and Y,” and you do, you’ve demonstrated that you’re not wasting the professor’s time and are genuinely interested in the topic. If you go away and don’t come back, you’ve demonstrated that you would’ve wasted her time had she spent an extra hour talking to you outside of class and office hours.

In English and related fields, a deep interest in reading is a pre-condition to doing other interesting things, like knowing about the world. It’s necessary but not sufficient. You don’t need to have read obsessively since you were 12 to catch my attention—but it does help if you say something like, “Oh, yeah, I read Heart of Darkness last summer and noticed the narrative structure, with Marlow telling the story to a random guy on the deck of the boat…” If you tell your computer science professors, “I’m working on a system to save and organize the comments I leave on blogs and read about this association algorithm…” they’re probably going to be more impressed than if you say that you’re ranked on the StarCraft II Battle.net ladder.

There are a handful of people who for whatever reason can’t get around to reading. But all of us make time for what’s important to us. If you can’t make time to read whatever your professor suggests, that indicates the topic isn’t of great importance to you—and therefore your professor shouldn’t waste time on something that’s not important to you.

Once I had a student who said in class that he didn’t like to read fiction. Fair enough; not everyone does and it doesn’t offend me when others don’t share my vices. A week or two later, however, he wanted me to edit his 43 pages of Starcraft fan fiction; when I said that it isn’t possible to be a good writer without being a good reader, he didn’t believe me. Nonetheless I told him that if he read How Fiction Works and discussed it with me, I would read his Starcraft fan fiction. And I would have. He didn’t, of course, and acted like I had kicked his puppy when I suggested that he prove himself.

To summarize: reading teaches you faster than talking can, and it efficiently sorts people who are willing to put in some time investment from those who aren’t. It’s necessary if you’re going to do interesting work.

Doing

People know I’m a wannabe “novelist” (as Curtis Sittenfeld said of her success with Prep in “The Perils of Literary Success,” “I was excited by the thought of no longer having to use air quotes when referring to myself as a ‘writer’ working on a ‘novel’ ”) with many rejection letters and near acceptances to prove how much of a wannabe I am. Sometimes friends and others say things like, “I want to be a novelist,” or “I want to write a novel.” I usually say, “Okay: start today.” Then I tell them: write Chapter One by date X (usually two or three days out) and send it to me.

I’ve probably made this offer to between one and two dozen people over the last couple years. One person has taken me up; she sent me Chapter One, I sent her some comments, and I didn’t hear back (we’re still friends; she says she’s writing other things). When people say they want to be better writers, I tell them what I told my Starcraft fan fiction writer: read James Wood’s How Fiction Works and Francine Prose’s Reading Like a Writer. The rare ones who read show me they’re serious.

By now, I’ve been trained to assume that most people who say things like “I want to write a novel” a) have no idea how hard it is to write a novel, b) have even less idea how much harder it is to write a novel someone else might actually want to read, and c) are, based on experience, probably full of shit.

Almost everyone in the United States who wants a computer has one. If you have a working computer and two or three hours a day, you can write a novel. Nothing is stopping you: you don’t need a $10,000 piano. You don’t need a mass spectrometer.[1] You don’t need permission. You don’t need to pass a test. You don’t need to be told you’re special.

All you need to do is sit down and write every day for a couple of hours. Eventually, you’ll have a novel, or at least a very large pile of words. Few people really want to.[2]

Most people who say they do, don’t, just like most people who say they want to lose weight don’t read Michael Pollan’s In Defense of Food and then stop eating simple carbohydrates and highly processed meats. They say they want to lose weight and keep buying Coke. Comparing statements to actions reduces to, “I want to write a novel / lose weight, but not as much as I want to watch TV / drink soda.”

The funny thing is that both novel writing and losing weight are actually fields where relatively minor changes, accumulated over time, can lead to relatively large changes: try writing for one hour a day. Then two. Then three (maybe only on the weekends). Try to drink nothing but water (most drinks are just easily removed empty calories). Take most forms of bread out of your diet; eat fruit instead of candy. Go for a walk at the end of the day. You’ll eventually have a largish pile of words or drop some pounds. A large enough number of people do both to prove they’re possible—if you want them.

Your professors are asking themselves: “Does this student want it? Really want it?” The value of “it” varies by discipline, but the idea remains the same.

A lot of students say or imply that they’re not ready for a real project, or incapable of doing one, or that they don’t have the time. The excuses about readiness might be true, but students should still start doing something. I wasn’t capable of writing a novel anyone wanted to read when I was 19—or even finishing one. It took me three tries to get a coherent, complete narrative together, which was still unpublishable. But I wouldn’t have the skills I have now if I hadn’t started trying then. Here’s Curtis Sittenfeld again, this time in an interview with The Atlantic: “I don’t think that you can learn to write a book except by writing a book.”

This isn’t just true of writing books. I didn’t start or stop my work based on what classes I was in or whether I was somehow authorized or trained to do what I was doing. In effect, I mostly trained myself, which I wouldn’t have done without all those early hours writing unpublishable crap. Most novelists tell the same story: lots of early crap and rejection that they ultimately overcome.

If you have a choice between building or making something and not building or making something, always choose “building or making something,” which will be more impressive than not trying even if you fail. Plus, if you look for it, you’ll see people in almost every field saying the same thing: the only way to learn is via the work itself. Here’s Patrick Allitt in I’m the Teacher, You’re the Student:

[. . .] but the way to improve as a teacher is by actually teaching; hypothetical situations or abstract discussions are too different from the real thing. The best you can hope for, short of actually getting down to the job, is to learn a handful of principles, on the one hand, and a handful of useful techniques, on the other.

You can learn those principles and techniques, but you still have to—above all—do. And your professors, like coaches and mentors, are looking for the people who will do whatever it takes. A lot of students say, “I’m just a student, and the president of club X, and I have homework to do, and I want to have sex with my boyfriend / girlfriend / neighbor / person-from-the-party-whose-name-I-forget, and my parents are breathing down my neck…” That might all be true, and all of those are fine things to do or worry about. I have to worry about many of them myself.

But you’ll only have more work over time, and the work done in college is nothing compared to the real work people do to support themselves. From what friends have told me, college schoolwork and life is nothing like the work of having a baby and being responsible for feeding and keeping alive a small, helpless, somewhat boring human. So in your professors’ minds, saying that you have so many responsibilities often reduces to an excuse not to start now. A base excuse. The best time to start anything is now. Today.

People who really want to do something… do it. Or they make changes so they can; you might notice that most people are not too busy to find time to date and/or have sex with the person of their dreams. But most people say they want to do something and then they don’t (I’ve repeated this a couple of times in the hopes that it sticks). Over time, others notice this (like me), and they start to assume that most people who say they want to do or know something are full of shit, in part because experts can’t distinguish at first glance who’s full of shit and who is genuine and thus worth investing in.

So experts assume that someone is full of shit until they prove otherwise. In the case of someone who wants to write a novel, I assume they’re no longer full of shit if they’ve written a complete first novel and started on a second one (the first one is almost certainly no good, although there might be useful lessons to draw from it. That was true for me). In the case of someone who wants to lose weight, I assume they’re full of shit until they start carrying around a Nalgene bottle and a bag of peanuts instead of a Coke and a Snickers. Your professors will start to think you’re not full of shit when you read the books they recommend, ask for more recommendations, read those, and come back for more.

In addition, if you do enough stuff, you’ll have something to bring to the table. A random person with no skills is less appealing than a random person who can say, “I’ll get your blog up and running” or “I’ll write the first draft of the boring NIH proposal for you” or even “I’m obsessed with coffee and will make you a single-origin brew in a Chemex.” People who develop skills tend to develop the meta-skill of developing skills, and they’re more appealing because of the skills they already have.

Caveats

This basic advice won’t always work: some professors won’t pay any attention to you no matter what you do. They might be more interested in their own research than teaching, or they might be having personal problems, or they might be off in their own world, or they might be burned out. Some professors will go out of their way to try to inflict mentoring on students who don’t particularly want it, although I don’t think there are very many of these professors, especially in big public schools; most professors who try this approach will encounter enough apathy to scale it back after being rebuffed enough times.

There are probably also variations by field: enough people have reported that professors in technical fields are less inclined to work with undergrads to make me wonder if there is some truth to this stereotype. I suspect that science professors just have a different mode of mentoring, which goes something like: “Come to the lab, we’ll see if you can do anything there.”[3] Most professors, however, will fall somewhere in the middle of this spectrum, and those are the professors who can most be reached via this guide. It would be very unusual to find a school where following the basic outline presented here will result in nothing.

A story…

I had a student who I’ll call “Joe.” He habitually wanted to hang out and chat after class. This is good: at first I interpreted it as meaning that he was intellectually curious and driven.

But as the semester went on, I got progressively more annoyed because he’d ask questions that couldn’t be reduced to sound bites. I kept telling him to drop by office hours if he wanted to really talk, but he never showed up. I’d suggest he read X, and when I asked him about it a week later, he’d say he’d been busy, but he was never too busy to waste ten or fifteen minutes of my time in class. We were reading Jane Austen’s Pride and Prejudice, and he said something about her place in literary history that was… unlikely, let us say, so I told him to read a few of the essays in the back of the Norton critical edition. I don’t think he did.

Before their first papers are due, I usually meet with my freshmen individually to go over their work. I close read, edit, talk to them about ideas, catch disastrously bad papers so they can be rewritten, and so on. Joe didn’t show up to his conference; he didn’t come to my office hours; and when I finally did read his paper, it had incredible howlers in terms of both fact and interpretation, my favorite being his assertion that the Toyota Prius is in some way like a perpetual motion machine, which demonstrated that he didn’t know anything about physics or perpetual motion machines or even have much general knowledge.

Joe got back a paper that was charitably graded, given its quality, and he dropped the class. Joe is an extreme example of a time waster: I think he would’ve been more than happy to chat for an hour after class each day, shooting the breeze while I had places to be and other pressing concerns. But I get at least one Joe every year. I separate Joe from students who want to learn by a) telling them to read something and b) seeing if they do it. The ones who do, I spend as much time talking to outside of class as they want—because I know they’re not wasting my time.

Criticism

Most of us don’t like being criticized: we’d prefer to imagine that we’re good at everything, that we don’t need the help of others, and that whatever we’re working on is perfect—we shouldn’t change a thing. We get prickly when people try to help us and often denigrate the person giving us advice, assuming that person doesn’t understand our genius or is too hard a grader or has malice in their heart.

Grades are a form of criticism and a form of ranking you against other people: they’re a direct statement from your professor to you about how well the professor thinks you’ve mastered the material. Even in an era of rampant grade inflation, grades can still sting, and very few students achieve a 4.0. A small but noisy minority of students will come back after every semester to fight about their grades, which is one of the least pleasant aspects of teaching.

Professors know that most people who are looking for help mostly want to have their current ideas or beliefs gratified and validated. If professors offer real, constructive criticism, it’s often viewed as a personal attack by the person on the receiving end, who will then be hostile to the critic; that hostility will turn into negative responses on the end-of-semester evaluations, awkward moments when the professor and student run into each other on campus or at a bar, and so on.

Still, some fields are culturally disposed towards rapid, yes/no assessment. One friend who read this essay mentioned that his vector calculus professor often says things like, “No, you’re doing it wrong—here’s how it should be done.” My friend said it took him aback at first, but he eventually realized that the professor’s honesty could be mistaken for cruelty and indifference. The professor’s demeanor is actually about efficiency: the math professor wants his students to get the right answer as fast as possible. Most of us, however, aren’t used to being told we’re wrong on a regular basis, so we interpret this as hostility when it’s not.

We’ve all heard the phrase, “Don’t shoot the messenger,” which is a cliché precisely because very few people are capable of listening dispassionately to criticism, evaluating it, ignoring it if they think it invalid, and accepting it if they think it valid. Most of us suffer from some level of confirmation bias, which is a term psychologists use to describe what Wikipedia calls “a tendency for people to favor information that confirms their preconceptions or hypotheses regardless of whether the information is true.”[4] We all want to believe we are smart and capable. But we often aren’t, and we don’t like to accept it when people tell us this or imply it. When students do attempt something, fail, and accept criticism, it’s almost as impressive as if they get it on the first try.

From the professor’s perspective, it’s easier to avoid giving the real criticism necessary for improvement. If you’re a student who wants to learn, you’ll need to demonstrate that you’re capable of taking criticism, that your ego is not overly inflated, and that you’re willing to accept that you don’t know everything and that you could be wrong. Some people never learn how to do this. Others do only after a great struggle. Professors will assume that you can’t take criticism until you show you can. This problem inhibits your professors from forming real bonds and sharing real knowledge with you, especially if that knowledge contradicts what you already believe to be true. If a professor gives you real commentary, use it to improve.

That doesn’t mean you have to believe your professor or take all the advice anyone gives you, but you should at least not be hostile to it. If the professor is right, modify your behavior; if the professor is wrong, pity them for their ignorance or incorrect interpretation. But don’t get angry because someone is trying to help you, however imperfectly.

Professors, and most people who do good or interesting work, need to have a peculiar temperament: they need an open mind (Paul Graham in “What You Can’t Say:” “To do good work you need a brain that can go anywhere”) but also the rigor not to become too infatuated with or attached to particular ideas. Few people achieve this balance, and very few people have the kind of openness that I associate with great intelligence, which manifests itself in a willingness to take in new ideas and be wrong when necessary. When I see these kinds of traits in anyone, they arrest my attention. This is doubly true for students, because so few students have or manifest them.

Real education

In “Who Are You and What Are You Doing Here?”, Mark Edmundson writes:

If you want to get a real education in America you’re going to have to fight—and I don’t mean just fight against the drugs and the violence and against the slime-based culture that is still going to surround you. I mean something a little more disturbing. To get an education, you’re probably going to have to fight against the institution that you find yourself in—no matter how prestigious it may be. (In fact, the more prestigious the school, the more you’ll probably have to push.) You can get a terrific education in America now—there are astonishing opportunities at almost every college—but the education will not be presented to you wrapped and bowed. To get it, you’ll need to struggle and strive, to be strong, and occasionally even to piss off some admirable people.

This guide is basically teaching you how “to fight,” because the regular education that you get solely from sitting in classes won’t be very impressive. You won’t learn as much from formal, explicit education as you will from informal, tacit education. Both have their place, but you have to go beyond the given to get the tacit education. That’s where the “struggle and strive” come in. If you’re perceptive and attending a big American school, you’ve probably noticed that you’re not getting much out of a 500- or 1,000-person lecture class.

Of course you aren’t—those classes are designed to balance the university’s budget, since they cost only marginally more to run than ten-person seminars, yet the university charges you, the student, the same amount per credit hour as it does to the ten seminarians. If you’re not perceptive or you just want to party and get laid, it probably doesn’t matter. But if you are that student who really wants to get something more than a particular kind of fun from the college experience, you need to know how to “get a terrific education,” which “will not be presented to you wrapped and bowed.” You have to take it for yourself—you have to prove yourself. In movies about sports, you may notice that the team or individual doesn’t get to the championship match or fight the first time it hits the field or enters the ring.

You won’t either. You have to prove to your professors and to others that you have what it takes. That you have tenacity, grit, strength. That you want the education, not merely the piece of paper at the end that says you’ve sat through four years of stultifying classes and managed not to fail out. Depending on your major, it’s shockingly hard to fail, as Richard Arum and Josipa Roksa show in Academically Adrift: Limited Learning on College Campuses.

It’s important to learn how to cultivate teachers. In A Jane Austen Education: How Six Novels Taught Me About Love, Friendship, and the Things That Really Matter, William Deresiewicz writes:

The need for teachers: there is something in the modern spirit that bridles at the notion. It seems inegalitarian, undemocratic. It injures our self-esteem, the idea of having to confess our incompleteness and submerge our ego beneath another person. It outrages our Romantic temper, which feels that the self is autonomous and the self is supreme. [. . .] But Austen accepted it, even celebrated it. Nearly all of her heroines have teachers of one kind or another, and in her own life, we know, her mentors were many and crucial.

Most teachers are not very good, despite our need for them. But we need to learn how much we need them, if we’re really going to do the things we want to do in our lives. We might be “autonomous,” but we also need to have someone else’s perspective and experience.

Conclusion

Many professors will help you, but you need to know how to make them want to help you. You need to learn how to signal a willingness to learn, which you can do mostly by formulating good questions and doing the reading or projects your professor suggests. As stated earlier, some professors won’t help you no matter what. They’re not very common: if they didn’t have a strong desire to teach, they’d have gone into a more lucrative field, and there are few fields less lucrative than teaching at the university level (adjusted for education and opportunity costs). Many, however, will have been burned by students who are dilettantes and time wasters. You need to prove you’re not one of them and learn how to breach their defenses. This is a guide to doing so, but reading the guide is the easy part. The hard part is doing the reading and finishing the projects. That is up to you.

Thanks to Bess Stillman, Derek Huang, and Andrew Melton for reading this essay. For further reading, consider Excellent Sheep: The Miseducation of the American Elite and the Way to a Meaningful Life. Leading a meaningful life is not easily accomplished, and for evidence of that assertion I’d submit the tragically small number of people who seem to do so.


[1] But really, who doesn’t want one?

[2] Maybe they are afraid of ending up with that very large pile of words.

[3] They want to know: Are you competent? Can you do math? Will you break the $10,000 PCR machine? Okay, go play with chemicals, read this paper, get back to me in a week.

[4] Learning about confirmation bias is one of the first steps toward combating it, which Steve Joordens discusses in his lecture “You Can Lead Students to Knowledge, But How Do You Make Them Think?” The lecture is about critical thinking, but it’s really about how to think and why.

David Shields’ Reality Hunger and James Wood’s philosophy of fiction

In describing novels from the first half of the 19th Century, David Shields writes in Reality Hunger: A Manifesto that “All the technical elements of narrative—the systematic use of the past tense and the third person, the unconditional adoption of chronological development, linear plots, the regular trajectory of the passions, the impulse of each episode toward a conclusion, etc.—tended to impose the image of a stable, coherent, continuous, unequivocal, entirely decipherable universe.”

I’m not so sure; the more interesting novels didn’t necessarily have “the unconditional adoption of chronological development” or the other features Shields ascribes to them. Caleb Williams is the most obvious example I can immediately cite: the murderers aren’t really punished in it and madness is perpetual. Gothic fiction of the 19th Century had a highly subversive quality that didn’t feature “the regular trajectory of the passions.” To my mind, the novel has always had unsettling features and an unsettling effect on society, producing change even when that change isn’t immediately measurable or apparent, or when we can’t get away from the fundamental constraints of first- or third-person narration. Maybe I should develop this thought more: but Shields doesn’t in Reality Hunger, so maybe innuendo ought to be enough for me too.

Shields is very good at making provocative arguments and less good at making those arguments hold up under scrutiny. He says, “The creators of characters, in the traditional sense, no longer manage to offer us anything more than puppets in which they themselves have ceased to believe.” Really? I believe if the author is good enough. And I construct coherence where it sometimes appears to be lacking. Although I’m aware that I can’t shake hands with David Kepesh of The Professor of Desire, he and the characters around him feel like “more than puppets” in which Roth has ceased to believe.

Shields wants something made new. Don’t we all? Don’t we all want to throw off dead convention? Alas: few of us know how to do so successfully, and that word “successfully” is especially important. You could write a novel that systematically eschews whatever system you think the novel imposes (this is the basic idea behind the anti-novel), but most people probably won’t like it—a point that I’ll come back to. We won’t like it because it won’t seem real. Most of us have ideas about reality that are informed by some combination of lived experience and cultural conditioning. That culture shifts over time. Shields starts Reality Hunger with a premise that is probably less contentious than much of the rest of the manifesto: “Every artistic movement from the beginning of time is an attempt to figure out a way to smuggle more of what the artist thinks is reality into the work of art.” I can believe this, though I suspect that artists begin getting antsy when you try to pin them down on what reality is: I would call it this thing we all appear to live in but that no one can quite represent adequately.

That includes Shields. Reality Hunger doesn’t feel as new as it should; it feels more like a list of N things. It’s frustrating even when it makes one think. Shields says, “Culture and commercial languages invade us 24/7.” But “commercial languages” only invade us because we let them: TV seems like the main purveyor, and if we turn it off, we’ll probably cut most of the advertising from our lives. If “commercial languages” are invading my life to the extent I’d choose the word “invade,” I’m not aware of it, partially because I conspicuously avoid those languages. Shields says, “I try not to watch reality TV, but it happens anyway.” This is remarkable: I’ve never met anyone who’s tried not to watch reality TV and then been forced to, or had reality TV happen to them, like a car accident or freak weather.

Still, we need to think about how we experience the world and depict it, since that helps us make sense of the world. For me, the novel is the genre that does this best, especially when it bursts its perceived bounds in particularly productive ways. I can’t define those ways with any rigor, but the novel has far more going on than its worst and best critics imagine.

Both the worst and best critics tend to float around the concept of reality. To use Luc Sante’s description in “The Fiction of Memory,” a review of Reality Hunger:

The novel, for all the exertions of modernism, is by now as formalized and ritualized as a crop ceremony. It no longer reflects actual reality. The essay, on the other hand, is fluid. It is a container made of prose into which you can pour anything. The essay assumes the first person; the novel shies from it, insisting that personal experience be modestly draped.

I’m not sure what a “crop ceremony” is or how the novel is supposed to reflect “actual reality.” Did it ever? What is this thing called reality that the novel is attempting to mirror? Its authenticity or lack thereof has, as far as I know, always been in question. The search for realism is always a search and never a destination, even when we feel that some works are more realistic than others.

Yet Sante and Shields are right about the dangers of rigidity; as Andrew Potter writes in The Authenticity Hoax: How We Get Lost Finding Ourselves, “One effect of disenchantment is that pre-existing social relations come to be recognized not as being ordained by the structure of the cosmos, but as human constructs – the product of historical contingencies, evolved power relations, and raw injustices and discriminations.”

Despite this, however, we feel realism—if none of us did, we’d probably stop using the term. Our definitions might blur as we approach precision, but that doesn’t mean nothing is there.

Sante writes, quoting Shields, that “‘Anything processed by memory is fiction,’ as is any memory shaped into literature.” Maybe: but consider these three statements, if I were to make them to you (keep in mind the context of Reality Hunger, with comments like “Try to make it real—compared to what?”):

Aliens destroyed Seattle in 2004.

I attended Clark University.

Alice said she was sad.

One of them is, to most of us, undoubtedly fiction. One of them is true. The other I made up: no doubt there is an Alice somewhere who has said she is sad, but I don’t know her and made her up for the purposes of example. The second example might be “processed by memory,” but I don’t think that makes it fiction, even if I can’t give you a firm, rigorous, absolute definition of where the gap between fact and interpretation begins. Jean Bricmont and Alan Sokal give it a shot in Fashionable Nonsense: “For us, as for most people, a ‘fact’ is a situation in the external world that exists irrespective of the knowledge that we have (or don’t have) of it—in particular, irrespective of any consensus or interpretation.”

They go on to observe that scientists actually face some problems of definition that I see as similar to those of literature and realism:

Our answer [as to what makes science] is nuanced. First of all, there are some general (but basically negative) epistemological principles, which go back at least to the seventeenth century: to be skeptical of a priori arguments, revelation, sacred texts, and arguments from authority. Moreover, the experience accumulated during three centuries of scientific practice has given us a series of more-or-less general methodological principles—for example, to replicate experiments, to use controls, to test medicines in double-blind protocols—that can be justified by rational arguments. However, we do not claim that these principles can be codified in a definite way, nor that the list is exhaustive. In other words, there does not exist (at least at present) a complete codification of rationality, which is always an adaptation to a new situation.

They lay out some criteria (beware of “revelation, sacred texts, and arguments from authority”) and “methodological principles” (“replicate experiments”) and then say “we do not claim that these principles can be codified in a definite way.” Neither can the principles of realism. James Wood does as good a job of exploring them as anyone. But I would posit that, despite our inability to pin down realism, either as convention or not, most of us recognize it: when I tell people that I attended Clark University, none have told me that my experience is an artifact of memory, or made up, or that there is no such thing as reality and therefore I didn’t. Such realism might merely be convention or training—or it might be real.

In the first paragraph of his review of Chang-Rae Lee’s The Surrendered, James Wood lays out the parameters of the essential question of literary development or evolution:

Does literature progress, like medicine or engineering? Nabokov seems to have thought so, and pointed out that Tolstoy, unlike Homer, was able to describe childbirth in convincing detail. Yet you could argue the opposite view; after all, no novelist strikes the modern reader as more Homeric than Tolstoy. And Homer does mention Hector’s wife getting a hot bath ready for her husband after a long day of war, and even Achilles, as a baby, spitting up on Phoenix’s shirt. Perhaps it is as absurd to talk about progress in literature as it is to talk about progress in electricity—both are natural resources awaiting different forms of activation. The novel is peculiar in this respect, because while anyone painting today exactly like Courbet, or composing music exactly like Brahms, would be accounted a fraud or a forger, much contemporary fiction borrows the codes and conventions—the basic narrative grammar—of Flaubert or Balzac without essential alteration.

I don’t think literature progresses “like medicine or engineering.” Using medical or engineering knowledge as it stood in 1900 would be extremely unwise if you’re trying to understand the genetic basis of disease or build a computer chip. Papers tend to decay within five to ten years of publication in the sciences.

But I do think literature progresses in some other, less obvious way, as we develop wider ranges of techniques and social constraints allow for wider ranges of subject matter or direct depiction: hence why Nabokov can point out that “Tolstoy, unlike Homer, was able to describe childbirth in convincing detail,” and I can point out that mainstream literature effectively couldn’t depict explicit sexuality until the 20th Century.

While that last statement can be qualified some, it is hard to miss the difference between a group of 19th Century writers like Thackeray, Dickens, Trollope, George Eliot, George Meredith, and Thomas Hardy (who J. Hillis Miller discusses in The Form of Victorian Fiction) and a group of 20th Century writers like D.H. Lawrence, James Joyce, Norman Rush, and A.S. Byatt, who are free to explicitly describe sexual relationships to the extent they see fit and famously use words like “cunt” that simply couldn’t be effectively used in the 19th Century.

In some ways I see literature as closer to math: the quadratic equation doesn’t change with time, but I wouldn’t want to be stuck in a world with only the quadratic equation. Wood gets close to this when he says that “Perhaps it is as absurd to talk about progress in literature as it is to talk about progress in electricity—both are natural resources awaiting different forms of activation.” The word “perhaps” is essential in this sentence: it gives a sense of possibility and realization that we can’t effectively answer the question, however much we might like to. But both question and answer give a sense of some useful parameters for the discussion. Most likely, literature isn’t exactly like anything else, and its development (or not) is a matter as much of the person doing the perceiving and ordering as anything intrinsic to the medium.

I have one more possible quibble with Wood’s description when he says that much contemporary fiction borrows “the basic narrative grammar” of Flaubert or Balzac “without essential alteration.” I wonder if it really hasn’t undergone “essential alteration,” and what would qualify as essential. Novelists like Elmore Leonard, George Higgins, or that Wood favorite Henry Green all feel quite different from Flaubert or Balzac because of how they use dialog to convey ideas. The characters in Tom Perrotta’s Election speak in a much more slangy, informal style than do any in Flaubert or Balzac, so far as I know. Bellow feels more erratic than the 19th Century writers and closer to the psyche, although that might be an artifact of how I’ve been trained by Bellow and writers after Bellow to perceive the novel and the idea of psychological realism. Taken together, however, the writers mentioned make me think that maybe “the basic narrative grammar” has changed for writers who want to adopt new styles. Yes, we’re still stuck with first- and third-person perspectives, but we get books that are heavier on dialog and lighter on formality than their predecessors.

Wood is a great chronicler of what it means to be real: his interrogation of this seemingly simple term runs through the essays collected in The Irresponsible Self: On Laughter and the Novel, The Broken Estate: Essays on Literature and Belief, and, most comprehensively, in the book How Fiction Works. Taken together, they ask how the “basic narrative grammar” of fiction works or has worked up to this point. In setting out some of the guidelines that allow literary fiction to work, Wood is asking novelists to find ways to break those guides in useful and interesting ways. In discussing Reality Hunger, Wood says, “[Shields’] complaints about the tediousness and terminality of current fictional convention are well-taken: it is always a good time to shred formulas.” I agree and doubt many would disagree, but the question is not merely one of “shredd[ing] formulas,” but how and why those formulas should be shredded. One doesn’t shred the quadratic formula: it works. But one might build on it.

By the same token, we may have this “basic narrative grammar” not because novelists are conformist slackers who don’t care about finding a new way forward: we may have it because it’s the most satisfying or useful way of conveying a story. I don’t believe this is so, but I concede it might be. Maybe most people won’t find major changes to the way we tell stories palatable. Despite modernism and postmodernism, fewer people appear to enjoy the narrative confusion and choppiness of Joyce than enjoy the streamlined feel of the latest thriller. That doesn’t mean the latter is better than the former—by my values, it’s not—but it does mean that the overall thrust of fiction might remain where it is.

Robert McKee, in his not-very-good-but-useful book Story: Substance, Structure, Style and The Principles of Screenwriting, gives three major kinds of plots, which blend into one another: “arch plots” that are causal in nature and finish their story lines; “mini plots,” which he says are open and “strive for simplicity and economy while retaining enough of the classical […] to satisfy the audience”; and “antiplots,” where absurdism and the like fall.

He says that as one moves “toward the far reaches of Miniplot, Antiplot, and Non-plot, the audience shrinks” (emphasis in original). From there:

The atrophy has nothing to do with quality or lack of it. All three corners of the story triangle gleam with masterworks that the world treasures, pieces of perfection for our imperfect world. Rather, the audience shrinks for this reason: Most human beings believe that life brings closed experiences of absolute, irreversible change; that their greatest sources of conflict are external to themselves; that they are the single and active protagonists of their own existence; that their existence operates through continuous time within a consistent, causally interconnected reality; and that inside this reality events happen for explainable and meaningful reasons.

The connection between this and Wood’s “basic narrative grammar” might appear tenuous, but McKee and Wood are both pointing towards the ways stories are constructed. Wood is more concerned with language; although plot and its expression (whether in language or in video) can’t be separated from one another, they can still be analyzed independently enough of one another to make a distinction.

The conventions that underlie the “arch plots,” however, can become tedious over time. This is what Wood is highlighting when he discusses Roland Barthes’ “reality effect,” which fiction can achieve: “All this silly machinery of plotting and pacing, this corsetry of chapters and paragraphs, this doxology of dialogue and characterization! Who does not want to explode it, do something truly new, and rouse the implication slumbering in the word ‘novel’?” Yet we need some kind of form to contain story; what is that form? Is there an ideal method of conveying story? If so, what if we’ve found it and are now mostly tinkering, rather than creating radical new forms? If we take out “this silly machinery of plotting and pacing” and dialog, we’re left with something closer to philosophy than to a novel.

Alternately, maybe we need the filler and coordination that so many novels consist of if those novels are to be felt true to life, which appears to be one definition of what people mean by “realistic.” This is where Wood parts with Barthes, or at least makes a distinct case:

Convention may be boring, but it is not untrue simply because it is conventional. People do lie on their beds and think with shame about all that has happened during the day (at least, I do), or order a beer and a sandwich and open their computers; they walk in and out of rooms, they talk to other people (and sometimes, indeed, feel themselves to be talking inside quotation marks); and their lives do possess more or less traditional elements of plotting and pacing, of suspense and revelation and epiphany. Probably there are more coincidences in real life than in fiction. To say “I love you” is to say something at millionth hand, but it is not, then, necessarily to lie.

“Convention may be boring, but it is not untrue simply because it is conventional,” and the parts we think of as conventional might be necessary to realism. In Umberto Eco’s Reflections on The Name of the Rose, he says that “The postmodern reply to the modern consists of recognizing that the past, since it cannot really be destroyed, because its destruction leads to silence, must be revisited: but with irony, not innocently.” That is often the job of novelists dealing with the historical weight of the past and with conventions that are “not untrue simply because [they are] conventional.” Eco and Wood both use the example of love to demonstrate similar points. Wood’s is above; Eco says:

I think of the postmodern attitude as that of a man who loves a very cultivated woman and knows he cannot say to her, ‘I love you madly,’ because he knows that she knows (and that she knows that he knows) that these words have already been written by Barbara Cartland. Still, there is a solution. He can say, ‘As Barbara Cartland would put it, I love you madly.’ At this point, having avoided false innocence, having said clearly that it is no longer possible to speak innocently, he will nevertheless have said what he wanted to say to the woman: that he loves her, but he loves her in an age of lost innocence. If the woman goes along with this, she will have received a declaration of love all the same. Neither of the two speakers will feel innocent, both will have accepted the challenge of the past, of the already said, which cannot be eliminated […]

I wonder if every age thinks of itself as “an age of lost innocence,” only to be later looked on as pure, naive, or unsophisticated. Regardless, for Eco postmodernism requires that we look to the past long enough to wink and then move on with the story we’re going to tell in the manner we’re going to tell it. Perhaps Chang-Rae Lee doesn’t do so in The Surrendered, which is the topic of Wood’s essay—but like so many essays and reviews, Wood’s starts with a long and very useful consideration before coming to the putative topic of its discussion. Wood speaks of reading “[…] Chang-Rae Lee’s new novel, ‘The Surrendered’ (Riverhead; $26.95)—a book that is commendably ambitious, extremely well written, powerfully moving in places, and, alas, utterly conventional. Here the machinery of traditional, mainstream storytelling threshes efficiently.” I haven’t read The Surrendered and so can’t evaluate Wood’s assessment.

Has Wood merely overdosed on the kind of convention that Lee uses, as opposed to convention itself? If so, it’s not clear how that “machinery” could be fixed or improved on, and the image itself is telling because Wood begins his essay by asking whether literature is like technology. My taste in literature changes: as a teenager I loved Frank Herbert’s Dune and now find it almost unbearably tedious. Other revisited novels hold up poorly because I’ve overdosed on their conventions and start to crave something new—a lot of fantasy flattens over time like opened soda.

Still, I usually don’t know what “something new” entails until I read it. That’s the problem with saying that the old way is conventional or boring: the problem is easier to observe than the fix. Wood knows it, and he’s unusually good at pointing out the problems of where we’ve been and the places we might go to fix them (see, for example, his recent essay on David Mitchell, who I now feel obliged to read). This, I suspect, is why he is so beloved by so many novelists, and why I spend so much time reading him, even when I don’t necessarily love what he loves. The Quickening Maze struck me as self-indulgent and lacking in urgency, despite the psychological insight Adam Foulds offers into a range of characters’ minds: a teenage girl, a madman, an unsuccessful inventor.

I wanted more plot. In How Fiction Works, Wood quotes Adam Smith, writing in the eighteenth century, on how writers use suspense to maintain reader interest, and then says that “[…] the novel [as an art form; one could also say the capital-N Novel] soon showed itself willing to surrender the essential juvenility of plot […]” Yet I want and crave this element that Wood dismisses—perhaps because of my (relatively) young age: Wood says that Chang-Rae Lee’s Native Speaker was “published when the author was just twenty-nine,” older than I am. I like suspense and the sense of something major at stake, and that could imply that I have a weakness for weak fiction. If so, I can do little more about it than someone who prefers chocolate over vanilla, or who wants chocolate despite having heard the virtues of cherries extolled.

When I hear about the versions of the real, reality, and realism that get extolled, I often begin to think about chocolate, vanilla, and cherries, and why some novelists write in such a way that I can almost taste the cocoa while others are merely cardboard colored brown. Wood is very good at explaining this, and his work taken together represents some of the best answers to the questions that we have.

Even the best answers lead us toward more questions that are likely to be answered best by artists in a work of art that makes us say, “I’ve never seen it that way before,” or, better still, “I’ve never seen it.” Suddenly we do see, and we run off to describe to our friends what we’ve seen, and they look at us and say, “I don’t get it,” and we say, “maybe you just had to see it for yourself.” Then we pass them the book or the photo or the movie and wait for them to say, “I’ve already seen this somewhere before,” while we argue that they haven’t, and neither have we. But we press on, reading, watching, thinking, hoping to come across the thing we haven’t seen before so we can share it again with our friends, who will say, like the critics do, “I’ve seen it before.”

So we have. And we’ll see it again. But I still like the sights—and the search.

More on science fiction (again)

Earlier posts on science fiction (see here too) and fantasy have elicited some reactions worth considering; John Markley, who writes Vast and Cool and Unsympathetic, has a post called “The stigma of imagination” that I mostly agree with until the last paragraph:

Respectability for fantasy or science fiction is most likely a hopeless cause, at least in the current cultural climate. It has the stigma of childishness and Nerd Cooties at the same time. A genre might be able to get away with one; you won’t get away with both.

Maybe—but I’m not so sure. One very positive outcome of Deconstructionism has been the relative rise of genre fiction and an increase in the perceived merit of texts that aren’t purely in the tradition of Flaubertian realism. Raymond Chandler and Philip K. Dick have Library of America volumes dedicated to them, cultural studies flourishes, Tolkien has a peer-reviewed journal named “Tolkien Studies,” and Clark University, my alma mater, offers English courses in science fiction. Michael Chabon’s genre bending has engendered widespread critical admiration, and he defends the idea of genre as part of literature in his wonderful essay collection Maps and Legends, at one point saying:

Yet all mystery resides there, in the margins, between life and death, childhood and adulthood, Newtonian and quantum, “serious” and “genre” literature. And it is from the confrontation with mystery that the truest stories have always drawn their power.
Like a house on the borderlands, epic fantasy is haunted […]

To be sure, Chabon could be the exception that proves the rule. Nonetheless, I don’t think so; I mentioned Chandler and Dick already, and Philip Pullman has earned a strong and real reputation that brings him a spot along with Le Guin among major literary figures. Chabon’s aware that some double standard still exists, saying that “From time to time some writer, through a canny shift in subject matter or focus, or through the coming to literary power of his or her lifelong fans, or through sheer, undeniable literary chops, manages to break out,” but I think he’s overstating the case and that the double standard he’s implicitly writing about is shrinking by the year. William Gibson and Neal Stephenson wield as much literary authority as anyone this side of Ian McEwan and Louis Menand, and Chabon is busily demolishing whatever barriers might be left.

The result, however, will mean that science fiction is judged relative to other literary books, and by this standard it still too often doesn’t reach high or far enough. Beware the walls that come down: they let you into the world, and they also let the world into you. My problem isn’t with science fiction and fantasy as genres, but with the formula of genre being used by bad writers and then defended by those who don’t appear to have really thought about what great writing means or done the heavy lifting real criticism demands. Some writers—Robert Jordan, I’m looking at you, and The Name of the Wind counts too—find vociferous followers whose overall literary knowledge often seems low, causing the rest of us who defend the genre but not bad manifestations of it much angst.

I have one other partial quibble, not with his assertion but with the reasons behind it, when Markley writes:

That might explain why magical realism is usually considered legit literature: it has imaginative elements, which is iffy, but it doesn’t compound the sin by thinking about the imaginative elements rationally.

Part of the reason magical realism gets good marks is that it’s associated with what academics like me call post-colonialism, which has been a major topic (or fad, depending on perspective) in universities. This is probably more political than aesthetic, but it partially explains why magical realism has been more accepted than fantasy. Nonetheless, the distinction, if there is one, has been fading, and is likely to continue to fade, like the idea that genre literature isn’t real literature. Notice that magical realism began growing in earnest after Deconstructionism, just like respect for fantasy and science fiction. In addition, speculative elements have long been in literature, as something like Henry James’ The Turn of the Screw or the vast body of myth and myth criticism demonstrates. In some ways, the acceptance of fantasy and science fiction is more a return than an all-out change.

(On a side note, Markley’s post on science fiction and ideology is also worth a read. I’ll add a comment a former professor kept repeating, which is that fantastic literature inevitably returns to comment on the society in which it is produced. I suppose this is opposed to the art-for-art’s-sake school, but I’m buying it nonetheless.)

The Barnes & Noble Book Clubs forums host a fairly low-level discussion, and I’d like to respond to one poster who says:

I am not familiar with the author of the post on that blog, but what I am assuming they mean is the academic defintion of literary merit. Whether or not one agrees with that point of view (some people see “academic” as elitist), there is a particular approach to evaluating texts seen as standard. However, even from that approach that is a small list. Interestingly, though, in a college course I had on SF and Fantasy lit a few years ago we did read Solaris, Left Hand, Canticle, and Ubik (PKD).

(Mistakes in original)

I responded in the thread with a variation on this post, saying that I’m approaching science fiction from an overall aesthetic and literary perspective that isn’t really academic. Rather, I think the issue is that some science fiction readers and literary critics are talking past one another, like the Martian and Tomás in Ray Bradbury’s The Martian Chronicles.

By that I mean too much science fiction and fantasy isn’t sufficiently concerned with freshness and vividness in language and expression, which is the positive way of saying it’s too often filled with flatness and cliché, whether in character or plot. So is much literary fiction, but the best rises. What I’m describing will no doubt be misinterpreted: I’m speaking at a very broad level, and to understand it in full would demand reading books like Francine Prose’s Reading Like a Writer, James Wood’s How Fiction Works, Martin Amis’ The War Against Cliché, or even Stanislaw Lem’s Microworlds, a book that preempted many of my criticisms of science fiction. Some authors transcend this—in addition to Lem and Le Guin, I might add Stephenson’s Snow Crash and The Diamond Age.

My position isn’t that science fiction automatically is or isn’t literary, but that it can be literary and too infrequently is. Unfortunately, much of the conversation in blogland and print tends to want binaries and fights, and too often the background reading necessary to really contribute to the conversation hasn’t been done. Consequently, as one critic’s comment about the fantasy du jour goes, “There’s not one beautiful sentence in the entire first three books of the Twilight series.” It’s true, at least of the first half of the first one, but if you haven’t put in the time and reading to think about what makes a beautiful sentence, that judgment probably just comes off as snobbery when it’s not. Real snobs wouldn’t give fantasy or science fiction real attention in the first place, while the rest of us are looking for what we’re always looking for: vigor, crispness, vivacity, and fidelity. If only we could find it more readily, whether in science fiction or elsewhere.

What’s that about technophobic English professors?

* I graduated from Clark University in the not-too-distant past, though back then we read by candlelight and there was no department blog. The blog issue now being resolved—as the preceding link demonstrates—helps kill a common enough conception that a poster at Rate Your Students summarized:

Unfortunately, the business world stereotypes English profs as probably the least useful among all academics: tweed-clad, bookish anachronisms who, if they’re interesting at all, drive 1960’s English sports cars (but can’t find the gas cap) and make witty chit-chat at parties (but are flummoxed by modern fads like telephones, ball-point pens, and air travel).

Not at Clark! But witty chit-chat is still in vogue. Whether this former student’s blog is a testament to the department or a mark of shame has yet to be decided.

* In other news, The New York Times published “Eureka! It Really Takes Years of Hard Work,” about the nature of sudden realizations and creativity:

Epiphany has little to do with either creativity or innovation. Instead, innovation is a slow process of accretion, building small insight upon interesting fact upon tried-and-true process. Just as an oyster wraps layer upon layer of nacre atop an offending piece of sand, ultimately yielding a pearl, innovation percolates within hard work over time.

The same is true of literature and criticism: the great novel always comes after long reading and effort, and the great insight about the great novel doesn’t usually come from the first reading, even if the germ of it can.

* Finally, in still other New York Times news, an essay discusses—yet again—the supposed divide between highbrow and lowbrow literature. My dream? That one day we can just discuss what’s good and bad, rather than what section of Barnes & Noble a book appears in.

What's that about technophobic English professors?

* I graduated from Clark University in the not-too-distant past, though back then we read by candlelight and there was no department blog. The blog issue being resolved—as the preceding link demonstrates—also helps kill what a common enough conception that a poster at Rate Your Students summarized:

Unfortunately, the business world stereotypes English profs as probably the least useful among all academics: tweed-clad, bookish anachronisms who, if they’re interesting at all, drive 1960’s English sports cars (but can’t find the gas cap) and make witty chit-chat at parties (but are flummoxed by modern fads like telephones, ball-point pens, and air travel).

Not at Clark! But witty chit-chat is still vogue. Whether this former student’s blog is a testament to the department or a mark of shame has yet to be decided.

* In other news, The New York Times published “Eureka! It Really Takes Years of Hard Work,” about the nature of sudden realizations and creativity:

Epiphany has little to do with either creativity or innovation. Instead, innovation is a slow process of accretion, building small insight upon interesting fact upon tried-and-true process. Just as an oyster wraps layer upon layer of nacre atop an offending piece of sand, ultimately yielding a pearl, innovation percolates within hard work over time.

The same is true of literature and criticism: the great novel always comes after long reading and effort, and the great insight about the great novel doesn’t usually come from the first reading, even if the germ of it can.

* Finally, in still other New York Times news, an essay discusses, yet again, the supposed divide between highbrow and lowbrow literature. My dream? That one day we can just discuss what’s good and bad, rather than what section of Barnes & Noble a book appears in.

Finally! Someone else notices that the best instructors aren’t necessarily the most credentialed

Finally! Someone else notices that a lot of academic practices don’t make any sense: “Pictures from an Institution: Leon Botstein made Bard College what it is, but can he insure that it outlasts him?” makes me like Bard; this in particular stands out: “In the thirty-nine years that Botstein has been president of Bard, the college has served as a kind of petri dish for his many pedagogical hypotheses [. . . including] that public intellectuals are often better teachers than newly minted Ph.D.s are.” Why isn’t anyone else following the Bard model?

The question is partially rhetorical. College presidents and trustees are probably systematically selected for conformity, but I’ve gotta think there are other people out there who are going, “Aping the Ivy League model is not going to work for us. What can we do differently?” The current order of things, driven by bogus ranking systems, discourages this sort of thinking. Colleges love the rhetoric of being different, but very few follow that rhetoric to actually being different. Perhaps rising costs will eventually force them to differentiate or die. Then again, the article says that Bard may be on its way to death or drastic restructuring because of financial problems. Still, I don’t see overspending as being fundamentally and intrinsically linked with other issues. Instead, it seems that being a maverick in one field may simply translate to being a maverick in many, including places one doesn’t want mavericks (like finances).

A few weeks ago I wrote about donating to Clark, my alma mater. Although I still think Clark a good school, I’d love to see it move in a more Bard-ish direction. The current president and trustees, however, appear to have come through the system and do not seem like shake-it-up types, regardless of their rhetoric.

Jonah Lehrer’s Imagine is still worth reading

Jonah Lehrer, as is now well known, repeatedly misrepresented research and plagiarized other people’s writing in Imagine: How Creativity Works. But, as Roy Peter Clark points out, “Jonah Lehrer’s ‘Imagine’ is worth reading, despite the problems.” Clark goes on to say, “not all the sins [Lehrer commits . . .] are equally grievous,” but, despite that, “the reading of the book ‘Imagine’ helped me understand my world and my craft, and what else can you hope for from a non-fiction book.”

I’ve found the same thing after reading Imagine based on Clark’s endorsement. But reading it in light of Lehrer’s indiscretions reveals new potential layers of meaning, because a couple of passages have a very different resonance, like this one, about Shakespeare’s milieu:

His [Shakespeare’s] peers repeatedly accused him of plagiarism, and he was often guilty, at least by contemporary standards. What these allegations failed to take into account, however, was that Shakespeare was pioneering a new creative method in which every conceivable source informed his art. For Shakespeare, the act of creation was inseparable from the act of connection. {Lehrer “Imagine”@221}

Lehrer seems to be using the same method. But the age of the Internet makes tracking sources much, much easier than it used to be. And he goes on:

The point isn’t that Shakespeare stole. It’s that, for the first time in a long time, there was stuff worth stealing—and nobody stopped him. Shakespeare seemed to know this—he was intensely aware that his genius depended on the culture around him. {Lehrer “Imagine”@221}

In retrospect, this reads as a preemptive defense of Lehrer’s own method. But I don’t get why Lehrer made stuff up: most of what he invented doesn’t seem to be very important, and it’s the kind of peripheral material that makes for good reading but isn’t essential. Given contemporary attitudes towards plagiarism—the passages above show that he knows and understands those attitudes—why risk so much for so little gain? It’s like a millionaire stealing a pair of $20 jeans. Why tarnish success? I can imagine some possible answers to these questions, but none of them are very satisfying, and I ultimately want to ascribe Lehrer’s lies to simple human vanity.

Imagine is still pretty interesting. I doubt it’s a perfect book, and I wouldn’t cite Lehrer in my neuroscience PhD dissertation. But I am now conscious of the tension between free-form creative thought and focused attention to a particular, grinding problem (“We need structure or everything falls apart. But we also need spaces that surprise us. Because it is the exchanges we don’t expect, with the people we just met, that will change the way we think about everything”); I am conscious of the need for both longtime collaborators and for new faces; and I am conscious of how people with deep domain expertise may benefit from applying that expertise elsewhere. Some of Lehrer’s points, like his description of the virtues of cities or the eccentric greatness of Paul Erdos, are already familiar. But he helps me see them in new ways. A moment like this, for example, shows me something important about my own writing and creative work:

Friedrich Nietzsche, in The Birth of Tragedy, distinguished between two archetypes of creativity, both borrowed from Greek mythology. There was the Dionysian drive—Dionysus was the god of wine and intoxication—which led people to embrace their unconscious and create radically new forms of art. [. . .] The Apollonian artist, by contrast, attempted to resolve the messiness and impose a sober order onto the disorder of reality. Like Auden, creators in the spirit of Apollo distrust the rumors of the right hemisphere. Instead, they insist on paying careful attention, scrutinizing their thoughts until they make sense. Auden put it best: ‘All genuine poetry is in a sense the formation of private spheres out of public chaos.’ {Lehrer “Imagine”@64}

I am far more in the Apollonian mode than the Dionysian mode, but, perhaps for that reason, I’m fascinated by and perhaps even envious of Dionysian thinking, acting, and living. A novel like The Secret History thus becomes all the more important to me, because it has an Apollonian narrator, Richard, dealing with the aftermath of an attempt to reach Dionysian ecstasy. In the novel, not surprisingly, the outcomes are pretty bad, but the idea of deliberately trying to reach an ecstatic experience resonates with my temperament.

There are some moments that appear, on the surface, self-contradictory. Lehrer says, “The most creative ideas, it turns out, don’t occur when we’re alone. Rather, they emerge from our social circles, from collections of acquaintances who inspire novel thoughts. Sometimes the most important people in life are the people we barely know” {Lehrer “Imagine”@204}.

Earlier in Imagine, however, Lehrer discusses how many creative ideas arrive when people are taking morning showers—where most are presumably alone. So do creative ideas emerge from chatting with others, or when our minds are in a relaxed state that lets them make disparate connections among ideas? The answer appears to be “both,” but Lehrer doesn’t explicitly discuss the implied contradictions. I’m not saying he couldn’t reconcile them, but I am saying that someone should’ve pointed these kinds of contradictions out.

Even if all of Imagine’s research and stories are somehow wrong—and I don’t think they are—the book still offers novel ways to think about creativity and how to structure one’s life or work more effectively and in ways that I hadn’t foreseen. I wish the publisher hadn’t withdrawn it altogether. Used copies on Amazon now start at $25. It may be that the existing copies thus continue to rise in value because of their scarcity; alternately, readers might turn to pirate editions on the Internet, which I can only assume are easy enough to find (my book came from the University of Arizona’s library).

Deception, commerce, and book blurbs

I’ve read about how you can’t trust book blurbs—the tidbits of praise on the covers and in the front matter of books. Usually I buy without seeing the cover because a book has been recommended, whether through friends, reviews, or reputation, and so don’t notice the blurbs, but an example from Gregory Clark’s A Farewell to Alms: A Brief Economic History of the World is so egregious that I double-checked. On the front, the only text aside from the title and author reads: “‘…the next blockbuster in economics.’ —New York Times.”

“That’s interesting!” I thought, because I recalled Tyler Cowen’s piece in that very paper and, although it was positive, it wasn’t that predictive. The ellipsis gave away the problem, as Cowen actually said: “Professor Clark’s idea-rich book may just prove to be the next blockbuster in economics.” (I would link to the article, but it hides behind a paywall. The headline: “ECONOMIC SCENE; What Makes a Nation Wealthy? Maybe It’s the Working Stiff.”) The blurb fundamentally and unfairly changes what Cowen wrote, which is an unscrupulous move on the part of A Farewell to Alms‘ publisher, Princeton University Press.

Despite the lie on the cover, the book itself is worth reading and makes a series of fascinating, if possibly incorrect, arguments about the story of human development and its consequences for the modern world. Clark acknowledges this in the preface: “Doubtless some of the arguments developed here will prove over-simplified, or merely false […] [b]ut far better such error than the usual dreary academic sins, which now seem to define so much writing in the humanities, of willful obfuscation and jargon-laden vanity.” What refreshing candor! The first 50 pages offer a myriad of ideas about how humanity became both massively wealthier and modestly poorer than ever before. I wish all the nonfiction I read were so enlightening—and A Farewell to Alms doesn’t need the deceptive blurb on the front to be worthwhile.

EDIT: I just noticed that the blurbs on the back include the fuller quote from Cowen, but my original point stands.
