Why these assignment sheets: The world isn't going to be a routine place, and writing projects shouldn't be either

Phil Bowermaster writes:

Increasingly, perhaps, a job is something that we each have to create. We can’t count on someone else to create one for us. That model is disappearing. We have to carve something out for ourselves, something that the machines won’t immediately grab.

Bowermaster is describing on a macro scale what I try to do on a micro scale with the papers I assign to students. The important part of my assignment sheets for freshman composition papers is only two paragraphs long, and students sometimes find them frustrating, but I do them this way because the world is headed in a direction that offers less direction and more power to do the right or wrong thing. Here’s an example of an assignment sheet:

Purpose: To explain and interpret a possible message or messages suggested by a) a text or texts we have read for class, b) a text or texts in Writing as Revision, or c) a book of your own choosing. If you write on a book of your own, you must clear your selection with me first. Your goal should be to persuade readers of your interpretation using the texts studied and outside reading material.

You should construct a thesis that is specific and defensible and then explicate it through points, illustrations, and explanation. See Chapters 8 and 9 of A Student’s Guide To First-Year Writing for more information on the nature of textual analysis.

That’s it. Students can read more about the assignment if they want to, and they have a lot of freedom in picking a topic. Students often want more direction, which I give to some extent, but I don’t give step-by-step instructions because a) step-by-step instructions yield boring papers and b) in their real-life writing, the real challenge isn’t the writing. It’s deciding what to write about and how to write it once you’ve decided to start. The writing assignment often isn’t given; the writing assignment is made.

It’s a big leap to go from “write-a-good-paper” assignment sheets to conceptualizing “a job [as] something that we each have to create.” Maybe too big a leap. But the thinking and rationale behind my decision are clear: jobs that can be easily codified and described as a series of steps—jobs that are easily explained, in other words—are increasingly going away, either to off-shoring or automation. The ones that persist will be the ones that don’t exist now because no one has thought to do them. But a lot of school still appears to consist of a person in front of the room saying, “Follow these steps,” having the students follow the steps, and then moving on.

That model isn’t totally wrong—you can’t create something from nothing—but maybe we should more often be saying, “Here’s the kind of thing you should be doing. What steps should you take? How should you take them? Do something and then come talk to me about it.” That kind of model might be more time-consuming and less easily planned, and I wouldn’t want to use it in every hour of every day. Many basic skills still need to be taught along the lines of “This is how you use a comma,” or “this is how an array works.” But we should be collectively moving towards saying, “Here are some constraints. Show me you can think. Show me you can make something from this.” And class isn’t totally devoid of support: unlike the real world, class has mandatory drafts due, lots of discussion about what makes strong writing strong, and the chance to see other people’s work. The imposed, artificial deadlines are particularly important. It’s not like I hand out assignment sheets and shove students out to sea, to flounder or float completely on their own.

Still, from what I can see, the world is increasingly rewarding adaptability and flexibility. I don’t see that trend changing; if anything, it seems likely to accelerate. If schools are going to (collectively) do a better job, they probably need to work on learning how to teach adaptability in the process of teaching subject-specific material. Offering the kinds of assignments I do is a microscopically small step in that direction, but big changes usually consist of a series of small steps. The assignments are one such step. This post is another.

In “A Welcome Call to Greatness,” John Hagel discusses That Used to Be Us, a book by Tom Friedman and Michael Mandelbaum that discusses what Hagel calls “creative creators” – “people who do their nonroutine work in distinctively nonroutine ways.” And that’s what I’m trying to do above: train students into being able to do nonroutine writing of a sort that will be distinctive, interesting, and well-done, but without a great deal of obvious managerial oversight from someone else. Great writing seldom springs from someone micromanaging: it springs from discussions, ideas, unexpected metaphors, connections, seeing old things in new ways, and from a plenitude of other places that can’t be easily described.

In “The Age of the Essay,” Paul Graham says:

Anyone can publish an essay on the Web, and it gets judged, as any writing should, by what it says, not who wrote it. Who are you to write about x? You are whatever you wrote.

Popular magazines made the period between the spread of literacy and the arrival of TV the golden age of the short story. The Web may well make this the golden age of the essay. And that’s certainly not something I realized when I started writing this.

He’s right. The most challenging writing most of my students will do isn’t even going to have the opportunity for someone else to micromanage it. The writing will increasingly be online. It will increasingly be their own decision to write or not write. As Penelope Trunk says, it will increasingly be essential for a good career. It won’t be routine. As I said above, routine work that can be codified and described in a series of steps will be exported to the lowest bidder. Valuable work will be the work nobody has dreamt up. Jobs will be “something that we each have to create.” I’m sure a lot of people will be unhappy with the change, but the secular forces moving in this direction look too great to be overcome by any individual. You surf the waves life and society throw at you, or you fall off the board and struggle. The worst cases never get back on the board and drown. I want students to have the best possible shot at staying on the board, and that means learning they can’t assume someone else is going to create a job—or an assignment—for them. They have to learn to do it themselves. They need to be creative. As Hagel quotes Mandelbaum and Friedman as saying, “Continuous innovation is not a luxury anymore – it is becoming a necessity.” I worry that too few students are getting the message.

I think of some of my friends who are unemployed, and when I ask them what they do all day, they say they spend time searching for a job, hanging out, watching TV. To me, this is crazy. If I were unemployed, I’d be writing, or learning Python, or posting on Craigslist with offers to work doing whatever I can imagine doing. The last thing I’d be doing is watching TV. In other words, I’d be doing something similar to what I’m doing now, even when I am employed: building skills, trying new things, and not merely sitting around waiting for good things to come to me. They won’t. Good things are the things one makes. Most of my employed friends seem to get this on some level, or have found their way into protected niches like teaching or nursing. I wonder if my unemployed friends had teachers and professors who forced them to think for themselves, or if they had teachers and professors who were content to hand them well-defined assignments that didn’t require much thinking about the “how” instead of the “what.”

From the Department of "No Shit:" technology and computers are not silver bullets for education

File this New York Times article under “no shit:”

But to many education experts, something is not adding up — here and across the country. In a nutshell: schools are spending billions on technology, even as they cut budgets and lay off teachers, with little proof that this approach is improving basic learning.

This conundrum calls into question one of the most significant contemporary educational movements.

There is no silver bullet and no New Jesus for education. There never will be, but the search goes on, because it’s easier to search for the magic methodology that will solve seemingly intractable problems than it is to admit the thing Google, Facebook, and Microsoft have realized about software engineers: talent, motivation, and tenacity vary greatly among individuals, and you can’t merely take someone who lacks all three, put them through some kind of system or give them some kind of tool, and expect everyone to be equally good on the other side. That kind of thing works passably well if you’re building widgets on an assembly line, but it works terribly for any kind of creative or intellectual work.

Nonetheless, it’s much easier to search for that magic methodology to improve very old skills that are surprisingly resistant to methodology: reading, writing, and math. None of those fundamental skills has changed much in the last century. Yet we keep searching for that formula that doesn’t exist, because teaching and learning are inherently hard, like software engineering, math, writing, and any number of other complex but vital skills. Technologies—now we have special blackboards! They blink! They light up! They’re new!—might mask the essential difficulty of the task, but they can’t remove it, much as some programming IDEs try to hide some of the essential difficulty of coding. But fields that are essentially difficult can’t be mastered by sleight-of-hand or new whizzy gadgets.

They can only be mastered by people who are dedicated to the craft and to continuous self-improvement. The people who, because they believe they can, can. The ones whose tenacity is boundless and who aren’t willing to blame external circumstances.

You need a weird set of skills to teach effectively: you need to empathize with your students—to be able to see from their point of view—without becoming mired in their point of view. You need to master your field, but not in such a way that you lose the beginner’s mindset required to master the field in the first place. You need the stoic’s attitude of realizing you can’t control everything while still having the achiever’s mindset that you must strive to do the best you can, no matter what. You need to be willing to try new things and ideas while not leaving behind the old ones that work. You need to remember not everyone is interested in the things you’re interested in, and you need to do whatever it takes to make subjects more interesting than they would be otherwise. You need to find that profitable zone of challenge for most students—something hard enough to make them struggle but not so hard that it’s impossible to accomplish. It’s reasonable to expect college freshmen to be able to read a story or article on their own, but it’s not reasonable to expect them to pick up and digest every nuance on their own. Some will. Most won’t. You need to be enthusiastic, because enthusiasm is as contagious as boredom, but your job isn’t to be a cheerleader, and enthusiasm can’t substitute for knowledge. You need, in other words, a bunch of paradoxical traits that balance each other.

You also need to realize that students need things broken down into steps, and need to learn by example and through discussion. Last week I taught Neal Stephenson’s 2005 New York Times opinion piece, “Turn On, Tune In, Veg Out.” Whenever I do, I let the class talk for a while at the beginning; when discussion dies, I ask students to do a simple activity: write the essay’s main point in a sentence or two. Then I come around and look at the sentences.

It should be simple, right? Read the piece, find the main point. But it’s not simple. It’s actually quite hard, and most people are bad readers (myself included). When I go around and look at sentences, lots of students get caught up on the distinction between geeking and vegging out. Others think the piece is primarily about Star Wars. Only a few—usually around five of fifty—get the essential elements of the main point.

Stephenson basically says, twice, that he’s using Star Wars as a metaphor: once in the third paragraph: “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” and once more in the last paragraph: “If the “Star Wars” movies are remembered a century from now, it’ll be because they are such exact parables for this state of affairs” (emphasis added). But most students haven’t learned how to think metaphorically, as writers do. Metaphor is one of those essential ways of thinking that people need to be effective writers. In On Writing Stephen King says:

The use of simile and other figurative language is one of the chief delights of fiction—reading it and writing it, as well. When it’s on target, a simile delights us in much the same way meeting an old friend in a crowd of strangers does. By comparing two seemingly unrelated objects—a restaurant bar and a cave, a mirror and a mirage—we are sometimes able to see an old thing in a new and vivid way. Even if the result is mere clarity instead of beauty, I think writer and reader are participating together in a kind of miracle. Maybe that’s drawing it a little strong, but yeah—it’s what I believe.

In How Fiction Works, James Wood says:

Metaphor is analogous to fiction, because it floats a rival reality. It is the entire imaginative process in one move. If I compare the slates on a roof to an armadillo’s back, or – as I did earlier – the bald patch on the top of my head to a crop circle (or on very bad days, to the kind of flattened ring of grass that a helicopter’s blades make when it lands in a field), I am asking you to do what Conrad said fiction should make you do – see. I am asking you to imagine another dimension, to picture likeness. Every metaphor or simile is a little explosion of fiction within the larger fiction of the novel or story.

Again: that’s hard. And technology isn’t going to make it any easier to start thinking about metaphors, which is probably a precursor to writing in a way that uses metaphor deftly. Before you can do that, you’re probably going to need to recognize when other writers are doing it, and yet, even though Stephenson says twice that he is, most students don’t pick up on it. This isn’t to blame them, by the way—a lot of my graduate seminars are still about what the writer actually says. Some of you are probably getting caught up on this discussion of metaphor and thinking that I’m really writing about how it’s important for students to learn, when this is only a subsidiary point supporting my main point about the place of technology in classrooms. Here’s Wood again on the subject of learning to read:

You only have to teach literature to realise that most young readers are poor noticers. I know from my own old books, wantonly annotated twenty years ago when I was a student, that I routinely underlined for approval details and images and metaphors that strike me now as commonplace, while serenely missing things which now seem wonderful. We grow, as readers, and twenty-year-olds are relative virgins.

If he wasn’t a good noticer at 20, what hope is there for the rest of us? And how is having a laptop going to help someone become a better noticer? Consider too one other thing to notice: in “Turn On, Tune In, Veg Out,” Stephenson isn’t using any complex or convoluted vocabulary. His sentence structure isn’t very complex; there aren’t lots of nasty nested clauses you have to mentally sort out to figure out what’s being talked about, as there often are in abstruse literary theory and philosophy. His piece isn’t hard to read. But it’s still evidently hard for many freshmen to understand. So I spend a lot of time working towards understanding, towards reading for detail, towards asking, “Where do you see that?” Technology isn’t going to help that process very much. It may even hurt it by offering a proliferating number of distractions: if you interrupt your reading of “Turn On, Tune In, Veg Out” four times for text messages and once for an e-mail, are you going to remember how Stephenson said “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” by the end?

I’m teaching honors students, which is easier than teaching standard classes, which is in turn easier than teaching in tough inner-city schools. So I don’t face the same challenges as some of the teachers mentioned in the NYT article. But sometimes I think about Daniel Singal’s Atlantic article, “The Other Crisis in American Education: A college professor looks at the forgotten victims of our mediocre educational system–the potentially high achievers whose SAT scores have fallen, and who read less, understand less of what they read, and know less than the top students of a generation ago.” As the subtitle implies, he argues that the best students aren’t as challenged as they once were. I can’t tell if he’s right or if he’s hearkening back to a mythical golden age, but I do think about his work sometimes when I see what’s going on around me: other grad students and professors want to watch movies in class, or they aren’t even focused on imparting and enhancing basic reading and writing skills—the same ones pretty much everyone needs. Are the strongest students really getting something out of their classes? Is the technology really helping? If not, could it be part of what’s actually causing “our mediocre educational system?” I’m not saying it does, but I am saying it’s worth pondering.

Still, I think the strongest thinkers and learners—the ones who are now running Google and Facebook, the ones who are now partners in law firms and building their own businesses—are doing fine. Better than ever, maybe. Generation X was supposed to be the slacker generation, but its members built large blocks of the Internet—the same set of technologies you’re almost certainly using to read this. But I wonder if there’s not a growing bifurcation between the people who are doing really well and the ones who aren’t. In income terms, that’s certainly true, but I wonder if it’s happening in intellectual terms too. Stephenson thinks so: “Nothing is more seductive than to think that we, like the Jedi, could be masters of the most advanced technologies while living simple lives: to have a geek standard of living and spend our copious leisure time vegging out.” But the people who spend all that time vegging out aren’t going to create whatever the next iPod and Facebook will be. And they won’t reap those rewards, either. They’re the ones who might be “learning less,” as Singal has it. The people who make the next iPod and Facebook will be the ones who are focused on “geeking out” regarding important topics. The ones who will, I hope, have teachers—whether in honors or not—who are focused on the essential questions that imparting knowledge involves.

By the way, I’m not trying to beat up college freshmen—if I were, I wouldn’t have the empathy necessary to be a good teacher. A lot of college seniors are little better than my freshmen, which I found out by working for Steven Klein at the Steven Klein LSAT Company. The LSAT is mostly a test of reading. If you can read effectively, you’ll do pretty well. But a lot of 22- to 24-year-old college graduates had a lot of trouble on reading comprehension because they couldn’t or hadn’t been trained to look at every word, evaluate it in relation to other words and in relation to the context of the passage, and understand what it means. I think back to those experiences when I read books like Richard Arum and Josipa Roksa’s Academically Adrift: Limited Learning on College Campuses or articles like the one about cool whizzy tech stuff in classrooms. The whizzy tech stuff isn’t going to help readers when they’re facing the LSAT.

A second “by the way” is in order: I’m neither trying to denigrate technology nor be a luddite—I say so as the guy typing on a fancy keyboard, ergonomic chair, and 27″ iMac, with a bunch of Textmate windows open. Computers make the mechanical process of writing easier, so that the hard stuff—the stuff that goes on in the mind—can dominate. Technology is great—in its place. The University of Arizona has computers and projectors and other neat stuff in many classrooms, and if that neat stuff is available, I use it.

But technology complements other skills; it doesn’t substitute for them. You can only use computers effectively to the extent you can read, write, and do simple math effectively—try programming without algebra. Or try to extract information from man pages without strong reading comprehension skills; hell, I like to imagine myself as being at least moderately literate, and I find some of them tough. So this is not one of those tedious essays in which Old Man Withers shakes his cane and complains about the kids with those damn beeping gizmos and sending those darned pictures of each other’s privates around and get off my damn lawn. Plus, I’m too young to shake my cane; I ran a modest but real number of miles yesterday. Even when I do have a cane someday, I hope 1) that it has a hidden sword, because that kind of thing is cool, and 2) that I haven’t ossified to the point where I’m not willing to learn new things.

But this is an essay that points out how basic skills and the means of imparting those basic skills haven’t changed so much, as Amanda Ripley’s Atlantic article, “What Makes a Great Teacher?” makes clear in its discussion of what great teachers do:

First, great teachers tended to set big goals for their students. They were also perpetually looking for ways to improve their effectiveness. [. . .] Superstar teachers had four other tendencies in common: they avidly recruited students and their families into the process; they maintained focus, ensuring that everything they did contributed to student learning; they planned exhaustively and purposefully—for the next day or the year ahead—by working backward from the desired outcome; and they worked relentlessly, refusing to surrender to the combined menaces of poverty, bureaucracy, and budgetary shortfalls.

Notice the thing absent from this list: use computers, iPads, and so forth. Sure, those great teachers could use technology, but they don’t need to. And the technology is not going to automatically make an indifferent teacher set big goals or recruit families or maintain focus or plan. Used poorly, it’s just going to provide some flash and pizazz and some distractions. Check out this Marginal Revolution discussion of a study looking at how introducing computers in poor households actually decreased student grades because students spent more time playing games on them than doing homework:

Not surprisingly, with all that game playing going on, the authors find that the voucher program actually resulted in a decline in grades although there was also some evidence for an increase in computer proficiency and perhaps some improvement in a cognitive test.

See also “Computers at Home: Educational Hope vs. Teenage Reality:” “Students posted significantly lower math test scores after the first broadband service provider showed up in their neighborhood, and significantly lower reading scores as well when the number of broadband providers passed four.” These reports should give technology cheerleaders pause: you aren’t going to get better results simply by lashing a computer to a teacher’s back and telling him to use it.*

To be a good teacher, you still need that weird skill set and mindset mentioned above. If you don’t have it or aren’t willing to develop it, I doubt anything else imposed on an individual teacher from the outside, like mandates to use technology, is going to do much for that teacher or for his or her students. If you want to really improve teaching, you’ll need to take an approach similar to the one Facebook and Google take to hiring hackers, which means a relentless focus not on degrees that offer dubious value in predicting achievement but on finding the best people and making sure they stay. Finding the best teachers is different from finding programmers—you probably can’t tell who’s going to be a good teacher before they hit the classroom—but you can at least acknowledge that you’re not going to get people who are good merely by saying, “use iPads in the classroom.” Steve Jobs and Bill Gates didn’t have iPads in their classrooms growing up, and maybe that’s part of what made Jobs able to have the vision necessary to Turn On, Geek Out, and make the iPad.


* I had a computer in middle and early high school that I used to master Starcraft and various other computer games, until I somehow realized I was wasting my life and smashed my Starcraft disks in the driveway. I sometimes use this analogy when I explain the situation to friends: some people can handle snorting the occasional line of coke without getting addicted; it’s just a fun way of spending a Saturday night. Some people can handle computer games in the same way. I discovered, at the time, that I’m not one of them, and, worse, I’ll never get those three or so wasted years back. Now I tend to find video games boring on average and can’t play for longer than half an hour to an hour at a stretch, while I’ve trained myself up to being able to write effectively for three to six hours at a time. The first draft of this essay, for example, took me about two hours.

From the Department of “No Shit:” technology and computers are not silver bullets for education

File this New York Times article under “no shit:”

But to many education experts, something is not adding up — here and across the country. In a nutshell: schools are spending billions on technology, even as they cut budgets and lay off teachers, with little proof that this approach is improving basic learning.

This conundrum calls into question one of the most significant contemporary educational movements.

There is no silver bullet and no New Jesus for education. There never will be, but the search goes on, because it’s easier to search for the magic methodology that will solve seemingly intractable problems than it is to admit the thing Google, Facebook, and Microsoft have realized about software engineers: talent, motivation, and tenacity vary greatly among individuals and that you can’t merely take someone who lacks all three, put them through some kind of system or give them some kind of tool, and expect everyone to be equally good on the other side. That kind of thing works passable well if you’re building widgets on an assembly line, but it works terribly for any kind of creative or intellectual work.

Nonetheless, it’s much easier to search for that magic methodology to improve very old skills that are surprisingly resistant to methodology: reading, writing, and math. None of those fundamental skills has changed much in the last century. Yet we keep searching for that formula that doesn’t exist, because teaching and learning are inherently hard, like software engineering, math, writing, and any number of other complex but vital skills. Technologies—now we have special blackboards! They blink! They light up! They’re new!—might mask the essential difficulty of task, but they can’t remove it, much as some programming IDEs try to hide some of the essential difficulty of coding. But fields that are essential difficult can’t be mastered by sleight-of-hand or new whizzy gadgets.

They can only be mastered by people who are dedicated to the craft and to continuous self-improvement. The people who, because they believe they can, can. The ones whose tenacity is boundless and who aren’t willing to blame external circumstances.

You need a weird set of skills to teach effectively: you need to empathize with your students—to be able to see from their point of view—without becoming mired in their point of view. You need to master your field, but not in such a way that you lose the beginner’s mindset required to master the field in the first place. You need the stoic’s attitude of realizing you can’t control everything while still having the achiever’s mindset that you must strive to do the best you can, no matter what. You need to be willing to try new things and ideas while not leaving behind the old ones that work. You need to remember not everyone is interested in the things you’re interested in, and you need to do whatever it takes to make subjects more interesting than they would be otherwise. You need to find that profitable zone of challenge for most students—something hard enough to make them struggle but not so hard that it’s impossible to accomplish—it’s reasonable to expect college freshmen to be able to read a story or article on their own, but it’s not reasonable to expect them to pick up and digest every nuance on their own. Some will. Most won’t. You need to be enthusiastic, because enthusiasm is as contagious as boredom, but your job isn’t to be a cheerleader and enthusiasm can’t substitute for knowledge. You need, in other words, a bunch of paradoxical traits that balance each other.

You also need to realize that students need things broken down in steps, and need to learn by example and through discussion. Last week I taught Neal Stephenson’s 2005 New York Times opinion piece, “Turn On, Tune In, Veg Out.” Whenever I do, I let the class talk for a while at the beginning; when discussion dies, I ask students to do a simple activity: write the essay’s main point in a sentence or two. Then I come around and look at the sentences.

It should be simple, right? Read the piece, find the main point. But it’s not simple. It’s actually quite hard, and most people are bad readers (myself included). When I go around and look at sentences, lots of students get caught up on the distinction between geeking and vegging out. Others think the piece is primarily about Star Wars. Only a few—usually around five of fifty—get the essential elements of the main point.

Stephenson basically says, twice, that he’s using Star Wars as a metaphor: once in the third paragraph: “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” and once more in the last paragraph: “If the “Star Wars” movies are remembered a century from now, it’ll be because they are such exact parables for this state of affairs” (emphasis added). But most students haven’t learned how to think metaphorically, as writers do. Metaphor is one of those essential ways of thinking that people need to be effective writers. In On Writing Stephen King says:

The use of simile and other figurative language is one of the chief delights of fiction—reading it and writing it, as well. When it’s on target, a simile delights us in much the same way meeting an old friend in a crowd of strangers does. By comparing two seemingly unrelated objects—a restaurant bar and a cave, a mirror and a mirage—we are sometimes able to see an old thing in a new and vivid way. Even if the result is mere clarity instead of beauty, I think writer and reader are participating together in a kind of miracle. Maybe that’s drawing it a little strong, but yeah—it’s what I believe.

In How Fiction Works, James Wood says:

“Metaphor is analogous to fiction, because it floats a rival reality. It is the entire imaginative process in one move. If I compare the slates on a roof to an armadillo’s back, or – as I did earlier – the bald patch on the top of my head to a crop circle (or on very bad days, to the kind of flattened ring of grass that a helicopter’s blades make when it lands in a field), I am asking you to do what Conrad said fiction should make you do – see. I am asking you to imagine another dimension, to picture likeness. Every metaphor or simile is a little explosion of fiction within the lager fiction of the novel or story.

Again: that’s hard. And technology isn’t going to make it any easier to start thinking about metaphors, which is probably a precursor to writing in a way that uses metaphor deftly. Before you can do that, you’re probably going to need to recognize when other writers are doing it, and yet, though Stephenson says twice that he’s doing it, most students don’t pick up on it. This isn’t to blame them, by the way—a lot of my graduate seminars are still about what the writer actually says. Some of you are probably getting caught up on this discussion of metaphor and thinking that I’m really writing about how it’s important for students to learn, when this is only a subsidiary point supporting my main point about the place of technology in classrooms. Here’s Wood again on the subject of learning to read:

You only have to teach literature to realise that most young readers are poor noticers. I know from my own old books, wantonly annotated twenty years ago when I was a student, that I routinely underlined for approval details and images and metaphors that strike me now as commonplace, while serenely missing things which now seem wonderful. We grow, as readers, and twenty-year-olds are relative virgins.

If he wasn’t a good noticer at 20, what hope is there for the rest of us? And how is having a laptop going to help someone become a better noticer? Consider too one other thing to notice: in “Turn On, Tune In, Veg Out,” Stephenson isn’t using any complex or convoluted vocabulary. His sentence structure isn’t very complex; there aren’t lots of nasty nested clauses you have to mentally sort out to figure out what’s being talked about, as there often are in abstruse literary theory and philosophy. His piece isn’t hard to read. But it’s still evidently hard for many freshmen to understand. So I spend a lot of time working towards understanding, towards reading for detail, towards asking, “Where do you see that?” Technology isn’t going to help that process very much. It may even hurt it by offering a proliferating number of distractions: if you interrupt your reading of “Turn On, Tune In, Veg Out” four times for text messages and once for an e-mail, are you going to remember how Stephenson said “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” by the end?

I’m teaching honors students, which is easier than teaching standard classes, which is in turn easier than teaching in tough inner-city schools. So I don’t face the same challenges as some of the teachers mentioned in the NYT article. But sometimes I think about Daniel Singal’s Atlantic article, “The Other Crisis in American Education: A college professor looks at the forgotten victims of our mediocre educational system–the potentially high achievers whose SAT scores have fallen, and who read less, understand less of what they read, and know less than the top students of a generation ago.” As the subtitle implies, he argues that the best students aren’t as challenged as they once were. I can’t tell if he’s right or if he’s hearkening back to a mythical golden age, but I do think about his work sometimes when I see what’s going on around me: other grad students and professors want to watch movies in class, or they aren’t even focused on imparting and enhancing basic reading and writing skills—the same ones pretty much everyone needs. Are the strongest students really getting something out of their classes? Is the technology really helping? If not, could it be part of what’s actually causing “our mediocre educational system?” I’m not saying it does, but I am saying it’s worth pondering.

Still, I think the strongest thinkers and learners—the ones who are now running Google and Facebook, the ones who are now partners in law firms and building their own businesses—are doing fine. Better than ever, maybe. Generation X was supposed to be the slacker generation, but its members built large blocks of the Internet—the same set of technologies you’re almost certainly using to read this. But I wonder if there’s not a growing bifurcation between the people who are doing really well and the ones who aren’t. In income terms, that’s certainly true, but I wonder if it’s happening in intellectual terms too. Stephenson thinks so: “Nothing is more seductive than to think that we, like the Jedi, could be masters of the most advanced technologies while living simple lives: to have a geek standard of living and spend our copious leisure time vegging out.” But the people who spend all that time vegging out aren’t going to create whatever the next iPod and Facebook will be. And they won’t reap those rewards, either. They’re the ones who might be “learning less,” as Singal has it. The people who make the next iPod and Facebook will be the ones who are focused on “geeking out” regarding important topics. The ones who will, I hope, have teachers—whether in honors or not—who are focused on the essential questions that imparting knowledge involves.

By the way, I’m not trying to beat up college freshmen—if I were, I wouldn’t have the empathy necessary to be good. A lot of college seniors are little better than my freshmen, which I found out by working for Steven Klein at the Steven Klein LSAT Company. The LSAT is mostly a test of reading. If you can read effectively, you’ll do pretty well. But a lot of 22-to-24-year-old college graduates had a lot of trouble on reading comprehension because they couldn’t, or hadn’t been trained to, look at every word, evaluate it in relation to other words and in relation to the context of the passage, and understand what it means. I think back to those experiences when I read books like Richard Arum and Josipa Roksa’s Academically Adrift: Limited Learning on College Campuses or articles like the one about cool whizzy tech stuff in classrooms. The whizzy tech stuff isn’t going to help readers when they’re facing the LSAT.

A second “by the way” is in order: I’m neither trying to denigrate technology nor be a luddite—I say so as a guy typing on a fancy keyboard, sitting in an ergonomic chair, in front of a 27″ iMac, with a bunch of TextMate windows open. Computers make the mechanical process of writing easier, so that the hard stuff—the stuff that goes on in the mind—can dominate. Technology is great—in its place. The University of Arizona has computers and projectors and other neat stuff in many classrooms, and if that neat stuff is available, I use it.

But technology complements other skills; it doesn’t substitute for them. You can only use computers effectively to the extent you can read, write, and do simple math effectively—try programming without algebra. Or try to extract information from man pages without strong reading comprehension skills; hell, I like to imagine myself as being at least moderately literate, and I find some of them tough. So this is not one of those tedious essays in which Old Man Withers shakes his cane and complains about the kids with those damn beeping gizmos and sending those darned pictures of each other’s privates around and get off my damn lawn. Plus, I’m too young to shake my cane; I ran a modest but real number of miles yesterday. Even when I do have a cane someday, I hope 1) that it has a hidden sword, because that kind of thing is cool, and 2) that I haven’t ossified to the point where I’m not willing to learn new things.

But this is an essay that points out how basic skills and the means of imparting those basic skills haven’t changed so much, as Amanda Ripley’s Atlantic article, “What Makes a Great Teacher?” makes clear in its discussion of what great teachers do:

First, great teachers tended to set big goals for their students. They were also perpetually looking for ways to improve their effectiveness. [. . .] Superstar teachers had four other tendencies in common: they avidly recruited students and their families into the process; they maintained focus, ensuring that everything they did contributed to student learning; they planned exhaustively and purposefully—for the next day or the year ahead—by working backward from the desired outcome; and they worked relentlessly, refusing to surrender to the combined menaces of poverty, bureaucracy, and budgetary shortfalls.

Notice the thing absent from this list: use computers, iPads, and so forth. Sure, those great teachers could use technology, but they don’t need to. And the technology is not going to automatically make an indifferent teacher set big goals or recruit families or maintain focus or plan. Used poorly, it’s just going to provide some flash and pizazz and some distractions. Check out this Marginal Revolution discussion of a study looking at how introducing computers in poor households actually decreased student grades because students spent more time playing games on them than doing homework:

Not surprisingly, with all that game playing going on, the authors find that the voucher program actually resulted in a decline in grades although there was also some evidence for an increase in computer proficiency and perhaps some improvement in a cognitive test.

See also “Computers at Home: Educational Hope vs. Teenage Reality:” “Students posted significantly lower math test scores after the first broadband service provider showed up in their neighborhood, and significantly lower reading scores as well when the number of broadband providers passed four.” These reports should give technology cheerleaders pause: you aren’t going to get better results simply by lashing a computer to a teacher’s back and telling him to use it.*

To be a good teacher, you still need that weird skill set and mindset mentioned above. If you don’t have it or aren’t willing to develop it, I doubt anything else imposed on an individual teacher from the outside, like mandates to use technology, is going to do much for that teacher or for his or her students. If you want to really improve teaching, you’ll need to take an approach similar to the one Facebook and Google take to hiring hackers, which means a relentless focus not on degrees that offer dubious value in predicting achievement but on finding the best people and making sure they stay. Finding the best teachers is different from finding programmers—you probably can’t tell who’s going to be a good teacher before they hit the classroom—but you can at least acknowledge that you’re not going to get people who are good merely by saying, “use iPads in the classroom.” Steve Jobs and Bill Gates didn’t have iPads in their classrooms growing up, and maybe that’s part of what made Jobs able to have the vision necessary to Turn On, Geek Out, and make the iPad.


* I had a computer in middle and early high school that I used to master Starcraft and various other computer games, until I somehow realized I was wasting my life and smashed my Starcraft disks in the driveway. I sometimes use this analogy when I explain the situation to friends: some people can handle snorting the occasional line of coke without getting addicted; it’s just a fun way of spending a Saturday night. Some people can handle computer games in the same way. I discovered, at the time, that I’m not one of them, and, worse, I’ll never get those three or so wasted years back. Now I tend to find video games boring on average and can’t play for longer than half an hour to an hour at a stretch, while I’ve trained myself up to being able to write effectively for three to six hours at a time. The first draft of this essay, for example, took me about two hours.

Looking them in the eye and Peter Shankman’s story

There’s a back-and-forth going on at Hacker News over Peter Shankman’s post, “The Greatest Customer Service Story Ever Told, Starring Morton’s Steakhouse.” First, read Shankman’s post, because what Morton’s did is, in fact, amazing, and I say this in an age where most of what people call “amazing” is, in fact, not. And I don’t want to spoil the surprise, beyond noting that he sent this Tweet as a joke: “Hey @Mortons – can you meet me at newark airport with a porterhouse when I land in two hours? K, thanks. :)”

The top Hacker News commenter said, “With modern communication systems, flying in airplanes to lunch meetings and flying back that night is such an absurd waste of resources it qualifies as obscene.” Someone else disagreed: “The difference of quality between meeting face to face and through a webcam is so high that it’s sometimes worth taking a plane just for one lunch.”

The second person is right: sometimes the quality of a face-to-face meeting is worth a plane trip. As Paul Graham said in “Cities and Ambition:” “The physical world is very high bandwidth, and some of the ways cities send you messages are quite subtle.” You don’t reproduce the same effect meeting someone in person when you “meet” them online. This doesn’t just apply to romantic partners, either, although that’s an obvious example of the effect described: how many people want to see their lovers solely on the Internet?

The high-bandwidth physical world argues for meetings, and I suspect Shankman wasn’t just going to New Jersey for the meeting—he was going to communicate how important the meeting was. You don’t just spend hours on a plane for something frivolous; he was sending a signal and reaping face-to-face rewards. If someone flew to Tucson solely to meet with me, I’d be impressed. Very few people fly on a whim.

A brief story, although it’s not on the same scale as Shankman’s: I’m a grad student in English Lit at the University of Arizona, which means I teach freshman composition. Students e-mail me all the time. Constantly. Unless there’s some compelling reason to reply, I usually answer their emails in class; if they want or need a longish explanation, I tell them to come to office hours (note: if they can’t make office hours, I also do office hours by appointment, so I’m not doing this to stiff them).

This strategy has a three-fold benefit: it cuts down on the amount of e-mail I receive over the course of the semester because students realize I won’t answer frivolous e-mails twelve hours after they’re sent. If I have follow-up questions, or the student does, those questions are easier to ask face-to-face. Misunderstandings caused by not being face-to-face are avoided; it’s hard to ascertain context from e-mail. I think everyone has had misunderstandings and hurt feelings caused by dashed-off e-mails that aren’t carefully considered. Finally, if students want me to read their papers or other work and show up to office hours, I know they really want me to edit their work, and their desire to get feedback isn’t just a passing fancy as ephemeral as a Facebook status update. The back-and-forth that can come from reading work and immediately responding to it can’t be easily duplicated—especially among non-professionals—over e-mail or other asynchronous communications.

I meant to list three things, I really did. But the reasons kept popping into my head, and I think they’re all valid.

Taken together, these issues point to why I doubt face-to-face meetings will disappear any time soon. If anything, their value might rise as they become less common. When I can, I interview writers, and I only do such interviews face-to-face because I think they’re more valuable. I’ve signaled to the writer that their time is valuable enough for me to come to them, and the conversations that result are, on average, deeper than I think they would be otherwise. There’s something about a person sitting across from you that you don’t get over the phone or Internet.

EDIT: Regarding students and e-mail, this story by Wired’s Chris Anderson also gets it right: “Why is e-mail volume getting ever worse? I believe it’s because of a simple fact: E-mail is easier to create than to respond to. This seems counterintuitive — after all, it’s quicker to read than to write. But reading a message is just the start. It may contain a hard-to-answer question, such as ‘What are your thoughts on this?'” The solution is to reduce e-mail wherever possible by making e-mail as expensive for senders as it is for receivers. Office hours help do this, and showing up at them signals that the student isn’t merely wasting time.

This principle affects other scales, too. Big tech companies still have central offices, usually in very high-rent areas, where they make sure everyone in the company gets together on a regular basis. The willingness of companies to pay for offices indicates they still get a lot of utility from having large numbers of people hanging out in a concentrated area—perhaps because of knowledge-spillover effects, which is Edward Glaeser’s explanation in The Triumph of the City.

If there weren’t such spillover effects, companies would disperse to avoid paying for office space, people would move to rural areas with fast internet and low real estate costs, education wouldn’t consist of a group getting together in a classroom, and the world would look much different than it does. Even in an age of social media, a lot of people want to live in Manhattan—a city not exactly renowned for its wonderful weather. Commentators have been predicting the death of distance and the death of place and so on since the dawn of the Internet, if not earlier, and so far they’ve proven wrong. As long as humans remain basically as we are today—in the absence, in other words, of some singularity-type event—I don’t think people are going to want to stop seeing each other across a table, or standing next to each other in a room. Social media has not turned our world into Snow Crash, at least not yet. Shankman knows this. Digital technologies complement, rather than substitute for, real-world experience. He uses Twitter and flies for face-to-face meetings. That’s the essence of one aspect of modernity: being able to handle multiple registers of communication fluently and realizing that most of them have their place for most people.

Innovation You — Jeff DeGraff

I started Innovation You because of this Arnold Kling post. Suggestion: read his post and this one instead of the whole book. If you’re interested in how innovation and ideas work, try Steven Berlin Johnson’s Where Good Ideas Come From and Derek Sivers’ Anything You Want. They cover similar territory infinitely better. DeGraff asks a lot of questions that feel absurd and obvious at the same time, like, “How do you innovate you?” The answer is obvious: read, write, try new things. If you don’t know how to do that, there might be no hope for you. Or very little hope.

So little hope that you’d be like a student I had who I’ll call “Sarah.” She didn’t know what to write her second paper on, so she came to my office hours for help. This isn’t at all uncommon and is exactly what you, if you’re a student, should do, and if you’re one of my students who happens to be reading this, make sure you do come to office hours. Anyway, Sarah didn’t know what to write about, so I asked if she liked anything we’d read. No. Okay, did she like anything we read in the first unit? No. What classes was she taking? It was something like business, econ, a humanities class. Did she like any of them? No. What did she like? She didn’t know—shopping, hanging out with friends. What was important to her? Getting a job when she graduated, her family. How was she going to get a job if she didn’t like any of her classes? She didn’t know. I backpedalled: almost all my assignment sheets include a caveat that, if you’d like to write about a book of your own choosing, you can as long as you clear it with me first (this is to weed out the people who want to write about Twilight or self-help books or things like that). I suggested that she use that option and write on a book of her own choosing. Sarah’s response: “I have no books.” That’s a direct quote. Mind you, this is on a university campus with a giant library and equally giant bookstore. She was beyond my help; I think she’s the only student who’s come into office hours who I’ve been utterly unable to assist.

Sarah might be helped by Innovation You.

DeGraff says things like, “These days, people from all walks of life come to me for individual guidance. Who am I?” Fortunately, the question of “Who am I?” is a very contemporary one, like asking, “Should I get the iPhone with less storage space or pay for more?”, and it has no history or background whatsoever. If you’re the kind of person who smacks your head and says, ” ‘Who am I?’ is a question I’ve never thought to ask before!”, this book is for you. It has lots of very short stories that reduce people to pawns. Don’t read this book, though there are some worthwhile bits. Here’s one, where DeGraff describes a woman who kept looking for a synagogue like the one she went to as a child and not finding one that met her standards, whatever those might be, because

She was evaluating and criticizing, not creating. She reminded me of certain older, unmarried people I’ve known who decide later in life that they want a spouse after all. They are experts at going on dates and evaluating what’s wrong with every possible candidate. And they’re right—there’s something wrong with everyone. We’re human. But we’re worthwhile anyway. People don’t marry when they’ve found perfection because there is no perfection. They marry when they’ve found someone they love whose faults they can accept, and who can accept their faults in return. {DeGraff “You”@34}

Very true. It’s a lot of what Lori Gottlieb says in Marry Him! (link goes to a Megan McArdle discussion of said book). A lot of what DeGraff says is said better in other books. Here are the other two quotes worth going into Devonthink Pro:

“Most of the distractions and wasted time in your life tend to be created by a small number of distracting, wasteful people. So today, many of us focus on trying to do more for the most important clients or customers and to avoid whoever is wasteful or doesn’t show results” {DeGraff “You”@42}.

We also avoid a small number of behaviors, like obsessively checking e-mail. And:

“At a personal level, almost anyone you know will tell you that they are overly busy and overly stressed, but who controls that? The person saying so. So we suffer our ‘do it all’ mentality and inadvertently create a melange of mediocrity. Trying to have it all, all at the same time, is at best difficult, and, at worst, destructive” {DeGraff “You”@89}.

Congrats. I’ve now saved you from spending the $6 (Amazon used) or $14 (Amazon new) that you might otherwise have spent, because you’ve got almost all the book’s contentful sections in a handful of quotes. If you’re wondering how to live your life, read the epiphany posts at Hacker News and you’ll get basically the same thing.

Why don’t novels with love stories describe how characters come to like each other?

I was talking to a friend about Anita Shreve’s Testimony, which has a bunch of characters who fall in love or lust with one another, including the four whose taped orgy unleashes emergent destructive forces on everyone around them. Or, rather, the reaction to the video unleashes those forces; the video itself is harmless save for how others treat it. The important thing for this post, however, is how those moments of love or lust are depicted. The short version is that they aren’t. In one sentence, characters are going about their business; in another, they are noticing one another in a potentially erotic way; many sentences later, they’re in bed with each other. But the moments when real interest develops are never really portrayed, save maybe through action or sudden thought. It’s like trying to describe the moment when an idea hits: we can resort to metaphor, but we can’t truly describe what it’s like to be in a state of flow.

My best answer to the question posed by the title is that in real life very few people decide they like or love each other. It just. . . happens, like an idea. You might see manifestations of it; in Testimony, the relationship between Mike and Anna really starts with the touch of a hand. The one between Silas and Noelle begins with them spending more time together. The attraction is partly physical and partly something else. The “something else” interests me.

I wouldn’t be surprised if, in evolutionary terms, we’re not even supposed to understand or analyze our feelings; they’re just supposed to guide us to survival and reproduction. Based on the large number of studies cited in The Evolutionary Biology of Human Female Sexuality and elsewhere that show how much we understand subconsciously, this probably shouldn’t surprise us. But it does, especially in the context of stories, since so many of them have or should have reasons behind the characters’ actions. When we push those reasons, however, we begin to see that they’re not so firm as we might once have imagined. I’d like to know about the limits of stories and how they reflect the way people act because sussing out the limits helps us figure out how, if at all, we can or should transcend them.

Why don't novels with love stories describe how characters come to like each other?

I was talking to a friend about Anita Shreve’s Testimony, which has a bunch of characters who fall in love or lust with one another, including the four whose taped orgy unleashes emergent destructive forces on everyone around them. Or, rather, the reaction to the video unleashes those forces; the video itself is harmless save for how others treat it. The important thing for this post, however, is how those moments of love or lust are depicted. The short version is that they aren’t. In one sentence, characters are going about their business; in another, they are noticing one another in a potentially erotic way; many sentences later, they’re in bed with each other. But the moments when real interest develops are never really portrayed, save maybe through action or sudden thought. It’s like trying to describe the moment when an idea hits: we can resort to metaphor, but we can’t truly describe what it’s like to be in a state of flow.

My best guess to the question posed by the title is that in real life very few people decide they like or love each other. It just... happens, like an idea. You might see manifestations of it; in Testimony, the relationship between Mike and Anna really starts with the touch of a hand. The one between Silas and Noelle begins with them spending more time together. The attraction is partly physical and partly something else. The “something else” interests me.

I wouldn’t be surprised if, in evolutionary terms, we’re not even supposed to understand or analyze our feelings; they’re just supposed to guide us to survival and reproduction. Based on the large number of studies cited in The Evolutionary Biology of Human Female Sexuality and elsewhere that show how much we understand subconsciously, this probably shouldn’t surprise us. But it does, especially in the context of stories, since so many of them have, or should have, reasons behind the characters’ actions. When we push those reasons, however, we begin to see that they’re not as firm as we might once have imagined. I’d like to know about the limits of stories and how they reflect the way people act, because sussing out the limits helps us figure out how, if at all, we can or should transcend them.

Are teachers underpaid? It depends.

There’s a meme going around that teachers are “underpaid”; you can read one manifestation of it in this Hacker News comment, but I’m sure you’ll run across lots of other examples if you read the news. Here’s the poster’s main point about teaching: “It’s way harder than you think, and unless you’re a tenured professor at a university, teachers make shit.” I’ve never taught high school, but I’m a grad student and teach freshmen, so I have some experience standing in front of people for long periods of time and trying to be both interesting and informative at the same time. A few observations:

1) The first time you teach a class, it’s incredibly hard and time consuming, but the difficulty drops like a logarithm to a relatively low plateau after you’ve done it a few times. This appears to be reflected in data. According to Bureau of Labor Statistics (BLS) data, the average teacher works slightly less than 40 hours per week. If you have better data, I’d like to see it. Note too that people getting teaching degrees at the graduate level get substantially lower GRE scores than those in almost all other disciplines. This will come up later.

2) At one point I thought about teaching high school English. Seattle Public Schools paid about $36K / yr with a Master’s or $30K / yr without, and those numbers topped out at around $70K and $55K after 30 years (IIRC, Bellevue Public School teachers made something like ~$10K more). You can verify that as of July 2011 through the 2010 – 2011 salary schedule (it’s actually a little higher than I remembered, or raises have been substantial). That doesn’t count retirement; teaching is unusual because a lot of the benefits are backloaded in the form of retirement pay. One woman in my grad program taught English for 26 years in Michigan and took an early retirement offer; I think she gets 70% of her last year’s salary for life. Granted, those deals are going away because of the budget crisis, but a lot of the retirement stuff is still baked in.

3) You can multiply those numbers by 1.2 or so because teachers only have mandatory work for nine months of the year. People in most professions get two weeks to a month off.

4) After two to three years, you effectively can’t be fired because of union rules (unless you sleep with a student and get caught in a flagrant manner, don’t show up, etc.). See this post for lots of citations on that, as well as a lot of the information that’s going into this comment. Not being able to be fired has value. Paul Graham figured this out a while ago, and wrote in an essay that “Economic statistics are misleading because they ignore the value of safe jobs. An easy job from which one can’t be fired is worth money; exchanging the two is one of the commonest forms of corruption. A sinecure is, in effect, an annuity.”
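The back-of-the-envelope adjustment in point 3 can be sketched in a few lines of code. This is a minimal illustration of my own, not anything from the sources above: the eleven-month comparison work year (twelve months minus roughly a month of vacation) is an assumption, and the $36K figure is just the Seattle starting salary quoted in point 2.

```python
# Sketch of the "multiply by 1.2" adjustment: teachers have mandatory
# work for ~9 months per year, while most professions work ~11 months
# (12 minus roughly a month of vacation). The ratio 11/9 is where the
# ~1.2 multiplier comes from.

def annualized_equivalent(nine_month_salary, teacher_months=9, typical_months=11):
    """Scale a nine-month teaching salary to a typical full-year schedule."""
    return nine_month_salary * (typical_months / teacher_months)

print(round(11 / 9, 2))                      # 1.22, close to the post's 1.2
print(round(annualized_equivalent(36_000)))  # the $36K starting salary becomes 44000
```

Under those assumptions, the $36K starting salary is worth something like $44K on a full-year basis, before counting the backloaded retirement benefits.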

Note: there are major downsides to teaching. You have to like working with relatively undeveloped people (if you’re teaching high school) or children (if you’re teaching elementary school). In teaching, it’s very hard to make substantially more money even if you really want to; whether you’re a good or bad teacher isn’t likely to change your pay. Still, you’ll hit the median household income neighborhood of $40,000 pretty quickly. My big impression is that teaching isn’t going to make you rich, but you’re also unlikely to ever be poor. To say that “teachers make shit” isn’t really true. It is true to say that teachers have back-loaded compensation packages that tend to be high in benefits (e.g. good health care, retirement) and low in upfront salary.

Given this, we’re still left with the question of whether this is “too much” or “too little.” Some teachers are probably “underpaid” and some “overpaid,” depending on the demand for their field. To understand why, look at Payscale.com’s salary data for college majors. Humanities and social science majors are on the low end of the starting salary scale—not far from education majors, who start at $35K and have a mid-career median of $55K. This isn’t far from the pay at Seattle Public Schools, although Seattle probably has a higher cost of living than most places in the country. Salaries also vary by district; there’s been a lot of fury over, for example, New Jersey teacher salaries, since they’re relatively high, especially when one factors in health care. Arizona, by contrast, does not appear to suffer from that problem. The “underpaid” teachers are in fields with major shortages—math, science, computer science, and so forth—where private-sector jobs start in the vicinity of $50K and have a mid-career median in the $100K range. Those fields start close to where teachers can expect to be after 15 years. If you’re teaching computer science instead of taking a job that starts at $100K from Microsoft, Google, or Facebook, you’re underpaid. That’s why it’s so hard for districts to find really good math or science teachers.

There’s also the issue of student quality. In Seattle, there’s a strong north-south divide, with most of the southern schools being really tough and much more dangerous than the northern schools (the breakdown occurs along racial lines, as discussed in this 2006 Wall Street Journal article). If the pay is the same—and in Seattle, it is—most teachers will prefer the easier schools.

In the U.S., pay is in part proportionate to risk. Bill Gates isn’t just rich because he’s smart and hardworking; he also spent long hours in a company he created that could’ve easily netted him nothing. To some extent, teachers have collectively traded higher salaries for lower firing risk. Among other things, educational reformers are trying to sever this link, since getting great performance out of people who have no incentive for great performance save the goodness of their own hearts is problematic. Most people who experienced public schools—which is to say, most people—are probably aware of this on some level. There’s a movement afoot to make teachers more accountable, and I think it’s going to succeed. This should drive more money to great teachers, less to lousy ones, and more to people in technical fields. If teachers as a whole want more money, they had better be ready to take more risk and to have their performance evaluated—as it is in virtually every other white-collar profession.

EDIT: Here’s a countervailing view, which observes that U.S. teachers don’t appear to be as productive as teachers in other countries, perhaps because of pay problems. Still, I wonder what alternate professional opportunities are like in other countries relative to the U.S.

How should teenage characters speak? Engaging Janet Reid, Gossip Girl, and 90210

In a query letter critique, Janet Reid says:

If you’re writing a 14-year old character, you need to know how they talk: “Threatening us with violence” sounds like a sociologist; “told us he’d mess us up” sounds like what the kids on my corner say to each other.

This might be true in many instances, but I’m reminded of something this Salon.com review of “Gossip Girl” and “90210” says:

Where Blair and Serena’s lines snap, crackle and pop with wit and cleverness, the soggy stars of “90210” stumble over one cliché after another. “Awkward!” Annie blurts at Ethan after they encounter Ethan’s ex Naomi, then Annie does her best impression of the cynical teenage eye roll, as Ethan mutters, “Good times!” Oof.

But every scene is filled with such teen-bot tripe: “Whatever works for you.” “Helloo-ooo?” “Shut up!” “Me and Ethan? Not so much.” Maybe real teens sound like that, but real teens are repellent and worthless, remember? Plus, nothing’s worse than shoving such drivel into the mouths of a bunch of airbrushed anorexics and overgrown child actors. “90210’s” Annie has more in common with Broadway’s Annie than a real human being. Putting teen lingo in her mouth is like dressing a cat in a little nurse outfit. It’s sort of cute at first, but then it just gets sad.

“Repellent and worthless” is overstatement, very. The characters from “Gossip Girl” are more interesting because they don’t speak like teenagers, or at least not like regular ones. If they did, they’d be boring (or, depending on your view, more boring than they already are). I suspect that’s why so many teen narrators are “precocious.” The alternative is dull. If you have a normal 15-year-old, even 15-year-olds will find them boring and insipid on the page. So you need a precocious 15-year-old whom adults can tolerate, and perhaps enjoy, while 15-year-olds imagine themselves to be equally witty even if they’re probably not. But your precocious 15-year-old still probably shouldn’t sound like Umberto Eco. That’s the challenge: giving a character enough of a voice and enough intelligence to make them interesting while not overwriting them. This challenge might explain why so many books starring teenagers are narrated from a distant, adult future recalling events from the past.

The first season of Gossip Girl, by the way, is pretty funny. I referenced it in class one day because, being a West Coaster, I had no idea what a “cotillion” was until an episode featured one. That allusion elicited shock from students. Apparently I come across as a very hardcore literary type. Which, of course, I am. A friend and I watched a few episodes and the last one from the second season, but it was so repetitive that we gave up.

On another note, the rest of Reid’s advice on the query letter is typically accurate. If you’ve ever wanted to try to get a literary agent or publisher to read your manuscript, take a couple of hours to look through Query Shark first.