Geek Heresy: Rescuing Social Change From the Cult of Technology — Kentaro Toyama

My review on Grant Writing Confidential is actually germane to readers of The Story’s Story, too, so I’ll start by directing you there. The book’s central and brilliant point is simple: for at least a century, people have imagined that better and more widely spread technology will solve all sorts of social ills and improve all sorts of institutions, with education being perhaps the most obvious. It hasn’t, and it won’t, because, as Toyama argues, technology amplifies existing human forces rather than substituting for them.

There are many other fascinating points—too many to describe all of them here. To take one, it’s often hard to balance short- and long-term wants. Many people want to write a novel but don’t want to write right now. Over time, that means the novel never gets written, because novels get written one sentence and one day at a time. Technology does not resolve this challenge. If anything, Internet access may make it worse. Many of us have faced an important long-term project only to diddle around on websites:

Short-term pleasure often leads to long-term dissatisfaction. That intuition underlies the psychologist’s distinction between hedonia and eudaimonia. Pleasure-seeking hedonism is questionable, but maybe long-term eudaimonic life satisfaction is good.

One sees these issues all over. Porn remains ridiculously popular (though some consumers of it are no doubt fine). Many people drink soda despite how incredibly detrimental soda is to health, and in my view how bad soda tastes compared to, say, ice cream. TV watching time is still insanely high, though it may be slightly down from its previous highs. There are various ways one can try to remove agency from the people watching porn while drinking soda and keeping one eye on a TV in the background, but the simpler solution is to look at people’s actions and see revealed preferences at work.

Most people don’t have the souls of artists and innovators trapped in average everyday lives. Most people want their sodas and breads and sugars and TV and SUVs and all the other things that elite media critics decry (often reasonably, in my view). Most people don’t connect what they’re doing right now to their long-term outcomes. Most people don’t want to be fat, but the soda is right here. A lot of people want a better love life, but in the meantime let’s check out Pornhub. Most people want amazing Silicon Valley tech jobs, but Netflix is here right now and Coursera seems far away.

And, to repeat myself, technology doesn’t fix any of that. As Toyama says of one project that gives computer access to children, “technology amplifies the children’s propensities. To be sure, children have a natural desire to learn and play and grow. But they also have a natural desire to distract themselves in less productive ways. Digital technology amplifies both of these appetites.” I had access to computers as a teenager. I wasted more time than I want to contemplate playing games on them, rather than building the precursors to Facebook. Large markets and social issues emerge from individual choices, and a lot of elite media types want to blame environment instead of individual. But each individual chooses computer games—or something real.

It turns out that “Low-cost technology is just not an effective way to fight inequality, because the digital divide is much more a symptom than a cause of other divides. Under the Law of Amplification, technology – even when it’s equally distributed – isn’t a bridge, but a jack. It widens existing disparities.” But those disparities emerge from individual behaviors. People who want to be writers need to write, now. People who want better partners or sex lives need to quit the sugar, now. One could pair any number of behaviors and outcomes in this style, and one could note that most people don’t do those things. The why seems obvious to me but maybe not to others. The people who become elite developers often say coding is fun for them in a way it apparently isn’t to others (including me). Writing is fun to me in a way it apparently isn’t to others. So I do a lot of it, less because it’s good for me than because it’s fun, for whatever temperamental reason. Root causes interest me, as they do many people with academic temperaments. Root causes don’t interest most people.

Let me speak to my own life. I’ve said variations on this before, but when I was an undergrad I remember how astounded some of my professors were when they’d recommend a book and I’d read it and then show up in office hours. I didn’t understand why they were astounded until I started teaching, and then I realized what most students are like and how different the elite thinkers and doers are from the average. And this is at pretty decent colleges and universities! I’m not even dealing with the people who never started.

Most of the techno-optimists, though—I used to be one—don’t know the history of the promise that technology will solve our problems:

As a computer scientist, my education included a lot of math and technology but little of the history or philosophy of my own field. This is a great flaw of most science and engineering curricula. We’re obsessed with what works today, and what might be tomorrow, but we learn little about what came before.

Yet technology doesn’t provide motivation. It’s easy to forget this. Still, I wonder if giving 100 computers to 100 kids might be useful because one of them will turn out to be very important. The idea that a small number of people drive almost all human progress is underrated. In The Enlightened Economy Joel Mokyr observes that the Industrial Revolution may have been driven primarily by ten to thirty thousand people. That’s a small enough number that the addition or removal of a single individual from the network may have serious consequences.

This isn’t an idea that I necessarily buy but it is one I find intriguing and possibly applicable to a large number of domains. Toyama’s work may reinforce it.

Steve Jobs passes and the Internet speaks

I’ve never felt sad at the death of a famous person or someone I didn’t know. The recent news, however, does make me sad—probably because it seems like Steve Jobs’s personality infused everything Apple made. Maybe that’s just Apple’s marketing magic working on me, but if so, I’m still impressed, and I’m still not sure how to analyze a feeling of sadness about a person I never met, or how to go beyond what others have said about the loss of someone whose work and life’s work is so insanely great.

Like so many people writing about Jobs, I feel compelled to mention the hardware on which I’m doing it: a 27″ iMac with an impressively fast SSD and an incredibly small footprint given the monitor’s size. Since getting an aluminum PowerBook in 2004, each subsequent Mac has been more amazing than the one preceding it—especially because, each time, I didn’t think it was possible to be more amazed than I was by the one before. There’s an iPhone sitting nearby, and in the near future that might become an iPhone 4S. So few devices feel so right, and I think people respond to Apple because it understands the link between technological function and feelings as few others do or can.

I look around to see what else I use and think about whether I know anything about the people behind those things: certainly not the Honda Civic I drive. Not the tea kettle I use to boil water. Not the Dell secondary monitor, whose badge could be stripped and another appended with absolutely no one noticing. I know a little about Jeff Weber, who designed the Embody with Bill Stumpf, but that’s mostly because of wonky interest on my part. Try as I might, I can’t think of anyone else remotely like Jobs in achievement, fame, importance, and ubiquity. That person might be out there, but I don’t know who he is; his work is anonymous in a way Jobs’s has never been. Jobs made stuff with character in a world where so much stuff utterly lacks it.

Take the Apple logo off the iMac, and you’ll still have a machine that makes one stop and take account. And those improvements! Jobs offers lessons to the ambitious: Good is never good enough; you can always go further; done is never done enough; and, even if those things aren’t true, time will make them true. I wouldn’t be surprised if, 200 years from now, Jobs is still taken to be one of the pillars of his age, known to some extent by non-specialists, like Edison or Ford.

The Internet is saying a lot about Jobs. People are linking to the text version of his 2005 Stanford graduation speech. The Atlantic is explaining Why We Mourn Steve Jobs. Here’s someone almost as obscure as me writing Steve Jobs, 1955 – 2011: “Today marked the end of an era. Each of the quotes below is linked to a eulogy or collection of reflections on the passing of Steve Jobs.” Stephen Wolfram of Mathematica and A New Kind of Science fame remembers Jobs and Jobs’s encouragement too. There are probably more tributes and commentaries than anyone could read, even if they had the inclination. Let me add to the pile, and to the pile of people saying they feel a strange connection to the man, however ridiculous that feeling might be. It’s ridiculous, but it’s real, like that connection between person and tool, user and computer. The connection is real in part because Jobs helped make it real.


EDIT: See also Megan McArdle on the Jobs speech:

The problem is, the people who give these sorts of speeches are the outliers: the folks who have made a name for themselves in some very challenging, competitive, and high-status field. No one ever brings in the regional sales manager for a medical supplies firm to say, “Yeah, I didn’t get to be CEO. But I wake up happy most mornings, my kids are great, and my golf game gets better every year.”

In addition, I usually hate watching videos on the Internet because most are overrated, but Colbert on Jobs is not. Also available in the funny / genuine vein: “Last American Who Knew What The Fuck He Was Doing Dies,” courtesy of the Onion.

Here’s the tech columnist Walt Mossberg on Jobs.

From the Department of “No Shit:” technology and computers are not silver bullets for education

File this New York Times article under “no shit:”

But to many education experts, something is not adding up — here and across the country. In a nutshell: schools are spending billions on technology, even as they cut budgets and lay off teachers, with little proof that this approach is improving basic learning.

This conundrum calls into question one of the most significant contemporary educational movements.

There is no silver bullet and no New Jesus for education. There never will be, but the search goes on, because it’s easier to search for the magic methodology that will solve seemingly intractable problems than it is to admit the thing Google, Facebook, and Microsoft have realized about software engineers: talent, motivation, and tenacity vary greatly among individuals, and you can’t merely take someone who lacks all three, put them through some kind of system or give them some kind of tool, and expect everyone to be equally good on the other side. That kind of thing works passably well if you’re building widgets on an assembly line, but it works terribly for any kind of creative or intellectual work.

Nonetheless, it’s much easier to search for that magic methodology to improve very old skills that are surprisingly resistant to methodology: reading, writing, and math. None of those fundamental skills has changed much in the last century. Yet we keep searching for a formula that doesn’t exist, because teaching and learning are inherently hard, like software engineering, math, writing, and any number of other complex but vital skills. Technologies—now we have special blackboards! They blink! They light up! They’re new!—might mask the essential difficulty of the task, but they can’t remove it, much as some programming IDEs try to hide some of the essential difficulty of coding. Fields that are essentially difficult can’t be mastered by sleight-of-hand or new whizzy gadgets.

They can only be mastered by people who are dedicated to the craft and to continuous self-improvement. The people who, because they believe they can, can. The ones whose tenacity is boundless and who aren’t willing to blame external circumstances.

You need a weird set of skills to teach effectively: you need to empathize with your students—to be able to see from their point of view—without becoming mired in their point of view. You need to master your field, but not in such a way that you lose the beginner’s mindset required to master the field in the first place. You need the stoic’s attitude of realizing you can’t control everything while still having the achiever’s mindset that you must strive to do the best you can, no matter what. You need to be willing to try new things and ideas while not leaving behind the old ones that work. You need to remember not everyone is interested in the things you’re interested in, and you need to do whatever it takes to make subjects more interesting than they would be otherwise. You need to find that profitable zone of challenge for most students—something hard enough to make them struggle but not so hard that it’s impossible to accomplish. It’s reasonable to expect college freshmen to be able to read a story or article on their own, but it’s not reasonable to expect them to pick up and digest every nuance on their own. Some will. Most won’t. You need to be enthusiastic, because enthusiasm is as contagious as boredom, but your job isn’t to be a cheerleader, and enthusiasm can’t substitute for knowledge. You need, in other words, a bunch of paradoxical traits that balance each other.

You also need to realize that students need things broken down in steps, and need to learn by example and through discussion. Last week I taught Neal Stephenson’s 2005 New York Times opinion piece, “Turn On, Tune In, Veg Out.” Whenever I do, I let the class talk for a while at the beginning; when discussion dies, I ask students to do a simple activity: write the essay’s main point in a sentence or two. Then I come around and look at the sentences.

It should be simple, right? Read the piece, find the main point. But it’s not simple. It’s actually quite hard, and most people are bad readers (myself included). When I go around and look at sentences, lots of students get caught up on the distinction between geeking and vegging out. Others think the piece is primarily about Star Wars. Only a few—usually around five of fifty—get the essential elements of the main point.

Stephenson basically says, twice, that he’s using Star Wars as a metaphor: once in the third paragraph: “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” and once more in the last paragraph: “If the “Star Wars” movies are remembered a century from now, it’ll be because they are such exact parables for this state of affairs” (emphasis added). But most students haven’t learned how to think metaphorically, as writers do. Metaphor is one of those essential ways of thinking that people need to be effective writers. In On Writing Stephen King says:

The use of simile and other figurative language is one of the chief delights of fiction—reading it and writing it, as well. When it’s on target, a simile delights us in much the same way meeting an old friend in a crowd of strangers does. By comparing two seemingly unrelated objects—a restaurant bar and a cave, a mirror and a mirage—we are sometimes able to see an old thing in a new and vivid way. Even if the result is mere clarity instead of beauty, I think writer and reader are participating together in a kind of miracle. Maybe that’s drawing it a little strong, but yeah—it’s what I believe.

In How Fiction Works, James Wood says:

Metaphor is analogous to fiction, because it floats a rival reality. It is the entire imaginative process in one move. If I compare the slates on a roof to an armadillo’s back, or – as I did earlier – the bald patch on the top of my head to a crop circle (or on very bad days, to the kind of flattened ring of grass that a helicopter’s blades make when it lands in a field), I am asking you to do what Conrad said fiction should make you do – see. I am asking you to imagine another dimension, to picture likeness. Every metaphor or simile is a little explosion of fiction within the larger fiction of the novel or story.

Again: that’s hard. And technology isn’t going to make it any easier to start thinking about metaphors, which is probably a precursor to writing in a way that uses metaphor deftly. Before you can do that, you’re probably going to need to recognize when other writers are doing it, and yet, although Stephenson says twice that he is, most students don’t pick up on it. This isn’t to blame them, by the way—a lot of my graduate seminars are still about what the writer actually says. Some of you are probably getting caught up in this discussion of metaphor and thinking that I’m really writing about how important it is for students to learn metaphor, when this is only a subsidiary point supporting my main point about the place of technology in classrooms. Here’s Wood again on the subject of learning to read:

You only have to teach literature to realise that most young readers are poor noticers. I know from my own old books, wantonly annotated twenty years ago when I was a student, that I routinely underlined for approval details and images and metaphors that strike me now as commonplace, while serenely missing things which now seem wonderful. We grow, as readers, and twenty-year-olds are relative virgins.

If he wasn’t a good noticer at 20, what hope is there for the rest of us? And how is having a laptop going to help someone become a better noticer? Consider too one other thing to notice: in “Turn On, Tune In, Veg Out,” Stephenson isn’t using any complex or convoluted vocabulary. His sentence structure isn’t very complex; there aren’t lots of nasty nested clauses you have to mentally sort out to figure out what’s being talked about, as there often are in abstruse literary theory and philosophy. His piece isn’t hard to read. But it’s still evidently hard for many freshmen to understand. So I spend a lot of time working towards understanding, towards reading for detail, towards asking, “Where do you see that?” Technology isn’t going to help that process very much. It may even hurt it by offering a proliferating number of distractions: if you interrupt your reading of “Turn On, Tune In, Veg Out” four times for text messages and once for an e-mail, are you going to remember how Stephenson said “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” by the end?

I’m teaching honors students, which is easier than teaching standard classes, which is in turn easier than teaching in tough inner-city schools. So I don’t face the same challenges as some of the teachers mentioned in the NYT article. But sometimes I think about Daniel Singal’s Atlantic article, “The Other Crisis in American Education: A college professor looks at the forgotten victims of our mediocre educational system–the potentially high achievers whose SAT scores have fallen, and who read less, understand less of what they read, and know less than the top students of a generation ago.” As the subtitle implies, he argues that the best students aren’t as challenged as they once were. I can’t tell if he’s right or if he’s hearkening back to a mythical golden age, but I do think about his work sometimes when I see what’s going on around me: other grad students and professors want to watch movies in class, or they aren’t even focused on imparting and enhancing basic reading and writing skills—the same ones pretty much everyone needs. Are the strongest students really getting something out of their classes? Is the technology really helping? If not, could it be part of what’s actually causing “our mediocre educational system?” I’m not saying it does, but I am saying it’s worth pondering.

Still, I think the strongest thinkers and learners—the ones who are now running Google and Facebook, the ones who are now partners in law firms and building their own businesses—are doing fine. Better than ever, maybe. Generation X was supposed to be the slacker generation, but its members built large blocks of the Internet—the same set of technologies you’re almost certainly using to read this. But I wonder if there’s not a growing bifurcation between the people who are doing really well and the ones who aren’t. In income terms, that’s certainly true, but I wonder if it’s happening in intellectual terms too. Stephenson thinks so: “Nothing is more seductive than to think that we, like the Jedi, could be masters of the most advanced technologies while living simple lives: to have a geek standard of living and spend our copious leisure time vegging out.” But the people who spend all that time vegging out aren’t going to create whatever the next iPod and Facebook will be. And they won’t reap those rewards, either. They’re the ones who might be “learning less,” as Singal has it. The people who make the next iPod and Facebook will be the ones who are focused on “geeking out” regarding important topics. The ones who will, I hope, have teachers—whether in honors or not—who are focused on the essential questions that imparting knowledge involves.

By the way, I’m not trying to beat up college freshmen—if I were, I wouldn’t have the empathy necessary to be good. A lot of college seniors are little better than my freshmen, which I found out by working for Steven Klein at the Steven Klein LSAT Company. The LSAT is mostly a test of reading. If you can read effectively, you’ll do pretty well. But a lot of 22 – 24-year-old college graduates had a lot of trouble on reading comprehension because they couldn’t or hadn’t been trained to look at every word, evaluate it in relation to other words and in relation to the context of the passage, and understand what it means. I think back to those experiences when I read books like Richard Arum and Josipa Roksa’s Academically Adrift: Limited Learning on College Campuses or articles like the one about cool whizzy tech stuff in classrooms. The whizzy tech stuff isn’t going to help readers when they’re facing the LSAT.

A second “by the way” is in order: I’m neither trying to denigrate technology nor be a Luddite—I say so as the guy typing at a fancy keyboard, in an ergonomic chair, on a 27″ iMac, with a bunch of TextMate windows open. Computers make the mechanical process of writing easier, so that the hard stuff—the stuff that goes on in the mind—can dominate. Technology is great—in its place. The University of Arizona has computers and projectors and other neat stuff in many classrooms, and if that neat stuff is available, I use it.

But technology complements other skills; it doesn’t substitute for them. You can only use computers effectively to the extent you can read, write, and do simple math effectively—try programming without algebra. Or try to extract information from man pages without strong reading comprehension skills; hell, I like to imagine myself as being at least moderately literate, and I find some of them tough. So this is not one of those tedious essays in which Old Man Withers shakes his cane and complains about the kids with those damn beeping gizmos and sending those darned pictures of each others’ privates around and get off my damn lawn. Plus, I’m too young to shake my cane; I ran a modest but real number of miles yesterday. Even when I do have a cane someday, I hope 1) that it has a hidden sword, because that kind of thing is cool, and 2) that I haven’t ossified to the point where I’m not willing to learn new things.

But this is an essay that points out how basic skills and the means of imparting those basic skills haven’t changed so much, as Amanda Ripley’s Atlantic article, “What Makes a Great Teacher?” makes clear in its discussion of what great teachers do:

First, great teachers tended to set big goals for their students. They were also perpetually looking for ways to improve their effectiveness. [. . .] Superstar teachers had four other tendencies in common: they avidly recruited students and their families into the process; they maintained focus, ensuring that everything they did contributed to student learning; they planned exhaustively and purposefully—for the next day or the year ahead—by working backward from the desired outcome; and they worked relentlessly, refusing to surrender to the combined menaces of poverty, bureaucracy, and budgetary shortfalls.

Notice the thing absent from this list: use computers, iPads, and so forth. Sure, those great teachers could use technology, but they don’t need to. And the technology is not going to automatically make an indifferent teacher set big goals or recruit families or maintain focus or plan. Used poorly, it’s just going to provide some flash and pizazz and some distractions. Check out this Marginal Revolution discussion of a study looking at how introducing computers in poor households actually decreased student grades because students spent more time playing games on them than doing homework:

Not surprisingly, with all that game playing going on, the authors find that the voucher program actually resulted in a decline in grades although there was also some evidence for an increase in computer proficiency and perhaps some improvement in a cognitive test.

See also “Computers at Home: Educational Hope vs. Teenage Reality:” “Students posted significantly lower math test scores after the first broadband service provider showed up in their neighborhood, and significantly lower reading scores as well when the number of broadband providers passed four.” These reports should give technology cheerleaders pause: you aren’t going to get better results simply by lashing a computer on a teacher’s back and telling him to use it.*

To be a good teacher, you still need the weird skill set and mindset mentioned above. If you don’t have it or aren’t willing to develop it, I doubt anything imposed on an individual teacher from the outside, like mandates to use technology, is going to do much for that teacher or for his or her students. If you want to really improve teaching, you’ll need to take an approach similar to the one Facebook and Google take to hiring hackers, which means a relentless focus not on degrees that offer dubious value in predicting achievement but on finding the best people and making sure they stay. Finding the best teachers is different from finding programmers—you probably can’t tell who’s going to be a good teacher before they hit the classroom—but you can at least acknowledge that you’re not going to get people who are good merely by saying, “use iPads in the classroom.” Steve Jobs and Bill Gates didn’t have iPads in their classrooms growing up, and maybe that’s part of what made Jobs able to have the vision necessary to Turn On, Geek Out, and make the iPad.


* I had a computer in middle and early high school that I used to master Starcraft and various other computer games, until I somehow realized I was wasting my life and smashed my Starcraft disks in the driveway. I sometimes use this analogy when I explain the situation to friends: some people can handle snorting the occasional line of coke without getting addicted; it’s just a fun way of spending a Saturday night. Some people can handle computer games in the same way. I discovered, at the time, that I’m not one of them, and, worse, I’ll never get those three or so wasted years back. Now I tend to find video games boring on average and can’t play for longer than half an hour to an hour at a stretch, while I’ve trained myself up to being able to write effectively for three to six hours at a time. The first draft of this essay, for example, took me about two hours.

The writer’s life and Donald Knuth’s Selected Papers on Computer Science

Doing almost anything right is hard. Few of us appreciate how hard, and how much we have to specialize if we’re to accomplish anything significant.

Still, it’s useful to fertilize ourselves by looking at distant fields. I’m fascinated by other people’s professions and what non-specialists can take from them; I’ve never wanted to be a politician, but Chris Matthews’ Hardball offers surprisingly deep life insights. Non-soldiers love The Art of War, and Anthony Bourdain demonstrates how much we want to know about what goes on in the kitchen—and not just sexually. Being who I am, I tend to look for lessons about writing (and material to write about) in other people’s work, and Donald Knuth’s Selected Papers on Computer Science is the latest to catch my eye, especially when he discusses just how hard writing programs really is:

Software creation not only takes time, it’s also much more difficult than I thought it would be. Why is this so? I think the main reason is that a longer attention span is needed when working on a large computer program than when doing other intellectual tasks. A great deal of technical information must be kept in one’s head, all at once, in high-speed random-access memory somewhere in the brain. I found to my dismay that I could not be writing large programs while teaching my regular classes; I simply couldn’t do justice to both activities simultaneously, nor could I be happy if the programs were left unwritten. So I reluctantly took occasional leaves of absence from university teaching. In this sense I believe that program-writing is substantially more demanding than book-writing.

Another reason that programming is harder than the writing of books and research papers is that programming demands a significantly higher standard of accuracy. Programs don’t simply have to make sense to another human being, they must make sense to a computer.

Writing a novel isn’t quite as hard as writing a complex software system, but it’s close. It’s useful to keep story information in “high-speed random-access memory,” and I’m usually thinking subconsciously about whatever primary novel I’m working on regardless of what else is going on around me.

The only major exception is when I’m deeply into writing a proposal, which requires similar bandwidth. Program-writing might be substantially more demanding than book writing, but book writing is still shockingly demanding, especially if one is to pay attention to its details. That, in essence, is what it means to be an expert on something: to be aware of its details. I’ll return to my first sentence and reiterate it by saying that I don’t think most of us realize the sheer number of details involved in virtually every thing or activity around us. We don’t see the moss on the bark of the trees; we see the forest from 30,000 feet.

Knuth sees at many scales, and he’s also good at telling people to work with each other, regardless of boundaries. He says, for example, that “Scientists always find it easiest to write for colleagues who share their own subspecialty. But George Forsythe told me in 1970 that I should be prepared to explain things to a wider group of people [. . .]” This is true of literary theorists too. We should write for those we might not know well, or who might not know us well, and I think the inverse of Knuth’s advice also holds: people who mostly write popular works would probably be better off writing for a specialist audience at times, in order to deepen knowledge of their own field. Know everything about something and something about everything, to paraphrase a quote that Google attributes to a wide array of people. Knuth, one senses, does: “When I try to characterize my own life’s work, I think of it primarily as an attempt to balance theoretical studies with practical achievements.”

The idea of theory and practice balancing each other comes up over and over again. It resonated with me because I’ve long hated the false binary between the two. He notes that some fields are more like each other than some of us imagine too:

The best programs are written so that computing machines can perform them quickly and so that human beings can understand them clearly. A programmer is ideally an essayist who works with traditional aesthetic and literary forms as well as mathematical concepts, to communicate the way that an algorithm works and to convince a reader that the results will be correct. Programs often need to be modified, because requirements and equipment change. Programs often need to be combined with other programs. Success at these endeavors is directly linked to the effectiveness of a programmer’s expository skill.

Numerous analogies can be drawn between hacking, in the sense Paul Graham uses the term, and writing. My awareness of the likeness probably stems from my deeper-than-average knowledge of writing and my very shallow knowledge of programming, along with the sense that metaphors for writing abound. Plus, very few people compare programmers and essayists, as Knuth does here, and yet his reasons are surprisingly convincing. The key word is “surprisingly”: two fields that don’t seem especially similar somehow are, and finding that kind of likeness is what good writers do. Maybe it’s what good programmers do too.
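To make Knuth’s “programmer as essayist” idea concrete, here is a minimal sketch of my own (the example and the names in it are mine, not Knuth’s): a small Python function in which the comments try to convince a human reader that the result is correct, while the statements satisfy the machine.

    # Illustrative toy, not Knuth's code: the comments carry the argument for
    # correctness; the statements carry the instructions for the computer.
    def binary_search(items, target):
        """Return the index of target in the sorted list items, or -1 if absent."""
        lo, hi = 0, len(items) - 1
        # Invariant: if target is anywhere in items, its index lies in [lo, hi].
        while lo <= hi:
            mid = (lo + hi) // 2
            if items[mid] == target:
                return mid
            elif items[mid] < target:
                lo = mid + 1  # everything at or to the left of mid is too small
            else:
                hi = mid - 1  # everything at or to the right of mid is too large
        # The interval [lo, hi] is now empty, so the invariant says target is absent.
        return -1

    print(binary_search([2, 3, 5, 7, 11, 13], 7))  # prints 3

Whether those comments earn their keep is exactly the expository judgment Knuth is describing.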

In Founders at Work Philip Greenspun said:

People don’t like to write. It’s hard. The people who were really good software engineers were usually great writers; they had tremendous ability to organize their thoughts and communicate. The people who were sort of average-quality programmers and had trouble thinking about the larger picture were the ones who couldn’t write.

People don’t like to write, and they don’t like to hack much, either. I like to write, and while I’m often interested in computer science and computer-related topics, I never felt toward them the weird, driving need I felt towards writing. To quote Graham again: “I know a handful of super-hackers, so I sat down and thought about what they have in common. Their defining quality is probably that they really love to program.” That’s a defining quality of writers too. It feels more like writing chose me than I chose writing, even though I would posit that, logically, computer scientists and engineers have had the greatest impact on average daily life of any group over the last, say, 30 years. But both writing and programming are hard; unambiguous communication is hard (as is communication that’s artistically ambiguous).

Elsewhere Knuth notes that “Anyone who has prepared a computer program will appreciate the fact that an algorithm must be very precisely defined, with an attention to detail that is unusual in comparison with the other things people do.” Detail applies to writing as well: one not bad definition for a writer might be, “a person who attends to the details of their writing.”

Do you?

Eight years of writing and the first busted Moleskine

Most of my writing happens on a computer, which means it’s pretty hard to depict the final product in a visually satisfying way.* But I also carry around a pretentious Moleskine™ notebook for the random ideas that strike in grocery stores or at parties. The latest notebook, however, developed a split binding:

I’ve been using Moleskines for about eight years, which means I go through about two of them per publishable novel:

Notice how none of the others have the binding split that afflicted the latest one. I haven’t consciously treated this one differently from its predecessors or used it any longer. Maybe the quality control at Moleskine central has declined, although people have made claims in that direction for a very long time.

Regardless of the reason, the latest notebook has about twelve usable pages left; I tend to write nonfiction, blog post ideas, things I need to remember, reminders about e-mails, entries from an unkept diary, and stuff like that in the back. Ideas, quotes, things people say, and other material related to fiction goes in front. When back and front meet in the middle, it’s time to get a new one.

When I start working on a new novel, I usually go back through all the old notebooks at the beginning to see what material might be usable and when I started taking ideas for that specific project. Some ideas for novels have been burbling in the back of my mind for a very long time, waiting for me to have the time and skill to move them from a couple of scrawled lines to 80,000 words of story. The oldest Moleskines I have were bought in the 2002 neighborhood. They’ve held up pretty well; the ones I started buying in the 2005 neighborhood are showing their age. Tough to say if this is an indication of falling quality control or something else altogether.

While Googling around for the complaint about Moleskine quality I linked to above, I also found a site that recommends the Guildhall Notebook. I’ve already ordered one, although apparently Guildhall doesn’t have a U.S. distributor, so I have to wait for mine to ship from the UK. I hope the improved binding is worth the wait. EDIT 1: It wasn’t worth the wait, or the hassle. If that weren’t enough, Christine Nusse of Exaclair Inc. / Quo Vadis Planners, which distributes or distributed Guildhall notebooks, said in an e-mail that her understanding is that the notebooks are being discontinued. She recommends the Quo Vadis Habana instead (although I think it too big) or a Rhodia notebook (which I think just right, as I say below).

So even if you want a Guildhall pocket notebook, you probably won’t be able to find one for long; fortunately, the Rhodia Webbie is a better alternative.

EDIT 2: Someone found me by asking, “are moleskines pretentious”? Answer, in post form: “Are Moleskines pretentious? Yup. Guildhall Notebooks are worse.”

EDIT 3: I’ve settled on the Rhodia Webbie as a full-time notebook: it’s expensive but much more durable than other notebooks I’ve found. I’ll write a full review at some point.

EDIT 4: I posted an updated photo of the stack.


* Even describing it using conventional prepositions is tough: do I write “on” or “in” or “with” a computer? Good arguments exist for any of the three.
