Geek Heresy: Rescuing Social Change From the Cult of Technology — Kentaro Toyama

My review on Grant Writing Confidential is germane to readers of The Story’s Story, too, so I’ll start by directing you there. The book’s central and brilliant point is simple: for at least a century, various people have imagined that better technology and the spread of technology will solve all sorts of social ills and improve all sorts of institutions, with education being perhaps the most obvious. That hope keeps being disappointed, because technology amplifies existing human forces rather than substituting for them.

There are many other fascinating points—too many to describe all of them here. To take one, it’s often hard to balance short- and long-term wants. Many people want to write a novel but don’t want to write right now. Over time, that means the novel never gets written, because novels get written one sentence and one day at a time. Technology does not resolve this challenge. If anything, Internet access may make it worse. Many of us have faced an important long-term project only to diddle around on websites:

Short-term pleasure often leads to long-term dissatisfaction. That intuition underlies psychologists’ distinction between hedonia and eudaimonia: pleasure-seeking hedonism is of questionable value, while long-term eudaimonic life satisfaction is probably the better goal.

One sees these issues all over. Porn remains ridiculously popular (though some consumers of it are no doubt fine). Many people drink soda despite how detrimental it is to health and, in my view, how bad it tastes compared to, say, ice cream. TV watching time is still insanely high, though it may be slightly down from its previous highs. There are various ways one can try to remove agency from the people watching porn while drinking soda and keeping one eye on a TV in the background, but the simpler approach is to look at people’s actions and see revealed preferences at work.

Most people don’t have the souls of artists and innovators trapped in average everyday lives. Most people want their sodas and breads and sugars and TV and SUVs and all the other things that elite media critics decry (often reasonably, in my view). Most people don’t connect what they’re doing right now to their long-term outcomes. Most people don’t want to be fat, but the soda is right here. A lot of people want a better love life, but in the meantime let’s check out Pornhub. Most people want amazing Silicon Valley tech jobs, but Netflix is here right now and Coursera seems far away.

And, to repeat myself, technology doesn’t fix any of that. As Toyama says of one project that gives computer access to children, “technology amplifies the children’s propensities. To be sure, children have a natural desire to learn and play and grow. But they also have a natural desire to distract themselves in less productive ways. Digital technology amplifies both of these appetites.” I had access to computers as a teenager. I wasted more time than I want to contemplate playing games on them, rather than building the precursors to Facebook. Large markets and social issues emerge from individual choices, and a lot of elite media types want to blame environment instead of individual. But each individual chooses computer games—or something real.

It turns out that “Low-cost technology is just not an effective way to fight inequality, because the digital divide is much more a symptom than a cause of other divides. Under the Law of Amplification, technology – even when it’s equally distributed – isn’t a bridge, but a jack. It widens existing disparities.” But those disparities emerge from individual behaviors. People who want to be writers need to write, now. People who want better partners or sex lives need to quit the sugar, now. One could pair any number of behaviors and outcomes in this style, and one could note that most people don’t do those things. The why seems obvious to me but maybe not to others. The people who become elite developers often say coding is fun for them in a way it apparently isn’t to others (including me). Writing is fun to me in a way it apparently isn’t to others. So I do a lot of it, less because it’s good for me than because it’s fun, for whatever temperamental reason. Root causes interest me, as they do many people with academic temperaments. Root causes don’t interest most people.

Let me speak to my own life. I’ve said variations on this before, but when I was an undergrad I remember how astounded some of my professors were when they’d recommend a book and I’d read it and then show up in office hours. I didn’t understand why they were astounded until I started teaching, and then I realized what most students are like and how different the elite thinkers and doers are from the average. And this is at pretty decent colleges and universities! I’m not even dealing with the people who never started.

Most of the techno-optimists, though—I used to be one—don’t know the history of technology’s promise to solve problems:

As a computer scientist, my education included a lot of math and technology but little of the history or philosophy of my own field. This is a great flaw of most science and engineering curricula. We’re obsessed with what works today, and what might be tomorrow, but we learn little about what came before.

Yet technology doesn’t provide motivation. It’s easy to forget this. Still, I wonder if giving 100 computers to 100 kids might be useful because one of them will turn out to be very important. The idea that a small number of people drive almost all human progress is underrated. In The Enlightened Economy, Joel Mokyr observes that the Industrial Revolution may have been driven primarily by ten to thirty thousand people. That’s a small enough number that the addition or subtraction of a single individual from the network may have serious consequences.

This isn’t an idea that I necessarily buy but it is one I find intriguing and possibly applicable to a large number of domains. Toyama’s work may reinforce it.

Steve Jobs passes and the Internet speaks

I’ve never felt sad at the death of a famous person or someone I didn’t know. The recent news, however, does make me sad—probably because it seems like Steve Jobs’s personality infused everything Apple made. Maybe that’s just Apple’s marketing magic working on me, but if so, I’m still impressed, and I’m still not sure how to analyze a feeling of sadness about a person I never met, or how to go beyond what others have said about the loss of someone whose life’s work is so insanely great.

Like so many people writing about Jobs, I feel compelled to mention the hardware on which I’m doing it: a 27″ iMac with an impressively fast SSD and an incredibly small footprint given the monitor’s size. Since getting an aluminum PowerBook in 2004, each subsequent Mac has been more amazing than the one preceding it—especially because I didn’t think it was possible to be more amazed than I was by the one before. There’s an iPhone sitting nearby, and in the near future that might become an iPhone 4S. So few devices feel so right, and I think people respond to Apple because it understands the link between technological function and feelings as few others do or can.

I look around to see what else I use and think about whether I know anything about the people behind those things: certainly not the Honda Civic I drive. Not the tea kettle I use to boil water. Not the Dell secondary monitor, whose badge could be stripped and another appended with absolutely no one noticing. I know a little about Jeff Weber, who designed the Embody chair with Bill Stumpf, but that’s mostly because of wonky interest on my part. Try as I might, I can’t think of anyone else remotely like Jobs in achievement, fame, importance, and ubiquity. That person might be out there, but I don’t know who he is; his work is anonymous in a way Jobs’s has never been. Jobs, by contrast, made stuff with character in a world where so much stuff utterly lacks it.

Take the Apple logo off the iMac, and you’ll still have a machine that makes one stop and take account. And those improvements! Jobs offers lessons to the ambitious: Good is never good enough; you can always go further; done is never done enough; and, even if those things aren’t true, time will make them true. I wouldn’t be surprised if, 200 years from now, Jobs is still taken to be one of the pillars of his age, known to some extent by non-specialists, like Edison or Ford.

The Internet is saying a lot about Jobs. People are linking to the text version of his 2005 Stanford graduation speech. The Atlantic is explaining Why We Mourn Steve Jobs. Here’s someone almost as obscure as me writing Steve Jobs, 1955 – 2011: “Today marked the end of an era. Each of the quotes below is linked to a eulogy or collection of reflections on the passing of Steve Jobs.” Stephen Wolfram of Mathematica and A New Kind of Science fame remembers Jobs and Jobs’s encouragement too. There are probably more tributes and commentaries than anyone could read, even if they had the inclination. Let me add to the pile, and to the pile of people saying they feel a strange connection to the man, however ridiculous that feeling might be. It’s ridiculous, but it’s real, like that connection between person and tool, user and computer. The connection is real in part because Jobs helped make it real.


EDIT: See also Megan McArdle on the Jobs speech:

The problem is, the people who give these sorts of speeches are the outliers: the folks who have made a name for themselves in some very challenging, competitive, and high-status field. No one ever brings in the regional sales manager for a medical supplies firm to say, “Yeah, I didn’t get to be CEO. But I wake up happy most mornings, my kids are great, and my golf game gets better every year.”

In addition, I usually hate watching videos on the Internet because most are overrated, but Colbert on Jobs is not. Also available in the funny / genuine vein: “Last American Who Knew What The Fuck He Was Doing Dies,” courtesy of the Onion.

Here’s the tech columnist Walt Mossberg on Jobs.

From the Department of “No Shit:” technology and computers are not silver bullets for education

File this New York Times article under “no shit:”

But to many education experts, something is not adding up — here and across the country. In a nutshell: schools are spending billions on technology, even as they cut budgets and lay off teachers, with little proof that this approach is improving basic learning.

This conundrum calls into question one of the most significant contemporary educational movements.

There is no silver bullet and no New Jesus for education. There never will be, but the search goes on, because it’s easier to search for the magic methodology that will solve seemingly intractable problems than it is to admit the thing Google, Facebook, and Microsoft have realized about software engineers: talent, motivation, and tenacity vary greatly among individuals, and you can’t merely take someone who lacks all three, put them through some kind of system or give them some kind of tool, and expect everyone to be equally good on the other side. That kind of thing works passably well if you’re building widgets on an assembly line, but it works terribly for any kind of creative or intellectual work.

Nonetheless, it’s much easier to search for that magic methodology to improve very old skills that are surprisingly resistant to methodology: reading, writing, and math. None of those fundamental skills has changed much in the last century. Yet we keep searching for a formula that doesn’t exist, because teaching and learning are inherently hard, like software engineering, math, writing, and any number of other complex but vital skills. Technologies—now we have special blackboards! They blink! They light up! They’re new!—might mask the essential difficulty of the task, but they can’t remove it, much as some programming IDEs try to hide some of the essential difficulty of coding. But fields that are essentially difficult can’t be mastered by sleight-of-hand or new whizzy gadgets.

They can only be mastered by people who are dedicated to the craft and to continuous self-improvement. The people who, because they believe they can, can. The ones whose tenacity is boundless and who aren’t willing to blame external circumstances.

You need a weird set of skills to teach effectively: you need to empathize with your students—to be able to see from their point of view—without becoming mired in their point of view. You need to master your field, but not in such a way that you lose the beginner’s mindset required to master the field in the first place. You need the stoic’s attitude of realizing you can’t control everything while still having the achiever’s mindset that you must strive to do the best you can, no matter what. You need to be willing to try new things and ideas while not leaving behind the old ones that work. You need to remember not everyone is interested in the things you’re interested in, and you need to do whatever it takes to make subjects more interesting than they would be otherwise. You need to find that profitable zone of challenge for most students—something hard enough to make them struggle but not so hard that it’s impossible to accomplish. It’s reasonable to expect college freshmen to be able to read a story or article on their own, but it’s not reasonable to expect them to pick up and digest every nuance on their own. Some will. Most won’t. You need to be enthusiastic, because enthusiasm is as contagious as boredom, but your job isn’t to be a cheerleader, and enthusiasm can’t substitute for knowledge. You need, in other words, a bunch of paradoxical traits that balance each other.

You also need to realize that students need things broken down in steps, and need to learn by example and through discussion. Last week I taught Neal Stephenson’s 2005 New York Times opinion piece, “Turn On, Tune In, Veg Out.” Whenever I do, I let the class talk for a while at the beginning; when discussion dies, I ask students to do a simple activity: write the essay’s main point in a sentence or two. Then I come around and look at the sentences.

It should be simple, right? Read the piece, find the main point. But it’s not simple. It’s actually quite hard, and most people are bad readers (myself included). When I go around and look at sentences, lots of students get caught up on the distinction between geeking and vegging out. Others think the piece is primarily about Star Wars. Only a few—usually around five of fifty—get the essential elements of the main point.

Stephenson basically says, twice, that he’s using Star Wars as a metaphor: once in the third paragraph: “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” and once more in the last paragraph: “If the “Star Wars” movies are remembered a century from now, it’ll be because they are such exact parables for this state of affairs” (emphasis added). But most students haven’t learned how to think metaphorically, as writers do. Metaphor is one of those essential ways of thinking that people need to be effective writers. In On Writing Stephen King says:

The use of simile and other figurative language is one of the chief delights of fiction—reading it and writing it, as well. When it’s on target, a simile delights us in much the same way meeting an old friend in a crowd of strangers does. By comparing two seemingly unrelated objects—a restaurant bar and a cave, a mirror and a mirage—we are sometimes able to see an old thing in a new and vivid way. Even if the result is mere clarity instead of beauty, I think writer and reader are participating together in a kind of miracle. Maybe that’s drawing it a little strong, but yeah—it’s what I believe.

In How Fiction Works, James Wood says:

Metaphor is analogous to fiction, because it floats a rival reality. It is the entire imaginative process in one move. If I compare the slates on a roof to an armadillo’s back, or – as I did earlier – the bald patch on the top of my head to a crop circle (or on very bad days, to the kind of flattened ring of grass that a helicopter’s blades make when it lands in a field), I am asking you to do what Conrad said fiction should make you do – see. I am asking you to imagine another dimension, to picture likeness. Every metaphor or simile is a little explosion of fiction within the larger fiction of the novel or story.

Again: that’s hard. And technology isn’t going to make it any easier to start thinking about metaphors, which is probably a precursor to writing in a way that uses metaphor deftly. Before you can do that, you’re probably going to need to recognize when other writers are doing it, and yet, even though Stephenson says twice that he’s doing it, most students don’t pick up on it. This isn’t to blame them, by the way—a lot of my graduate seminars are still about what the writer actually says. Some of you are probably getting caught up in this discussion of metaphor and think that I’m really writing about how important it is for students to learn metaphor, when this is only a subsidiary point supporting my main point about the place of technology in classrooms. Here’s Wood again on the subject of learning to read:

You only have to teach literature to realise that most young readers are poor noticers. I know from my own old books, wantonly annotated twenty years ago when I was a student, that I routinely underlined for approval details and images and metaphors that strike me now as commonplace, while serenely missing things which now seem wonderful. We grow, as readers, and twenty-year-olds are relative virgins.

If he wasn’t a good noticer at 20, what hope is there for the rest of us? And how is having a laptop going to help someone become a better noticer? Consider too one other thing to notice: in “Turn On, Tune In, Veg Out,” Stephenson isn’t using any complex or convoluted vocabulary. His sentence structure isn’t very complex; there aren’t lots of nasty nested clauses you have to mentally sort out to figure out what’s being talked about, as there often are in abstruse literary theory and philosophy. His piece isn’t hard to read. But it’s still evidently hard for many freshmen to understand. So I spend a lot of time working towards understanding, towards reading for detail, towards asking, “Where do you see that?” Technology isn’t going to help that process very much. It may even hurt it by offering a proliferating number of distractions: if you interrupt your reading of “Turn On, Tune In, Veg Out” four times for text messages and once for an e-mail, are you going to remember how Stephenson said “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” by the end?

I’m teaching honors students, which is easier than teaching standard classes, which is in turn easier than teaching in tough inner-city schools. So I don’t face the same challenges as some of the teachers mentioned in the NYT article. But sometimes I think about Daniel Singal’s Atlantic article, “The Other Crisis in American Education: A college professor looks at the forgotten victims of our mediocre educational system–the potentially high achievers whose SAT scores have fallen, and who read less, understand less of what they read, and know less than the top students of a generation ago.” As the subtitle implies, he argues that the best students aren’t as challenged as they once were. I can’t tell if he’s right or if he’s hearkening back to a mythical golden age, but I do think about his work sometimes when I see what’s going on around me: other grad students and professors want to watch movies in class, or they aren’t even focused on imparting and enhancing basic reading and writing skills—the same ones pretty much everyone needs. Are the strongest students really getting something out of their classes? Is the technology really helping? If not, could it be part of what’s actually causing “our mediocre educational system”? I’m not saying it is, but I am saying it’s worth pondering.

Still, I think the strongest thinkers and learners—the ones who are now running Google and Facebook, the ones who are now partners in law firms and building their own businesses—are doing fine. Better than ever, maybe. Generation X was supposed to be the slacker generation, but its members built large blocks of the Internet—the same set of technologies you’re almost certainly using to read this. But I wonder if there’s not a growing bifurcation between the people who are doing really well and the ones who aren’t. In income terms, that’s certainly true, but I wonder if it’s happening in intellectual terms too. Stephenson thinks so: “Nothing is more seductive than to think that we, like the Jedi, could be masters of the most advanced technologies while living simple lives: to have a geek standard of living and spend our copious leisure time vegging out.” But the people who spend all that time vegging out aren’t going to create whatever the next iPod and Facebook will be. And they won’t reap those rewards, either. They’re the ones who might be “learning less,” as Singal has it. The people who make the next iPod and Facebook will be the ones who are focused on “geeking out” regarding important topics. The ones who will, I hope, have teachers—whether in honors or not—who are focused on the essential questions that imparting knowledge involves.

By the way, I’m not trying to beat up college freshmen—if I were, I wouldn’t have the empathy necessary to be good. A lot of college seniors are little better than my freshmen, which I found out by working for Steven Klein at the Steven Klein LSAT Company. The LSAT is mostly a test of reading. If you can read effectively, you’ll do pretty well. But a lot of 22 – 24-year-old college graduates had a lot of trouble on reading comprehension because they couldn’t or hadn’t been trained to look at every word, evaluate it in relation to other words and in relation to the context of the passage, and understand what it means. I think back to those experiences when I read books like Richard Arum and Josipa Roksa’s Academically Adrift: Limited Learning on College Campuses or articles like the one about cool whizzy tech stuff in classrooms. The whizzy tech stuff isn’t going to help readers when they’re facing the LSAT.

A second “by the way” is in order: I’m neither trying to denigrate technology nor be a Luddite—I say so as the guy typing this at a fancy keyboard, in an ergonomic chair, in front of a 27″ iMac, with a bunch of TextMate windows open. Computers make the mechanical process of writing easier, so that the hard stuff—the stuff that goes on in the mind—can dominate. Technology is great—in its place. The University of Arizona has computers and projectors and other neat stuff in many classrooms, and if that neat stuff is available, I use it.

But technology complements other skills; it doesn’t substitute for them. You can only use computers effectively to the extent you can read, write, and do simple math effectively—try programming without algebra. Or try to extract information from man pages without strong reading comprehension skills; hell, I like to imagine myself as being at least moderately literate, and I find some of them tough. So this is not one of those tedious essays in which Old Man Withers shakes his cane and complains about the kids with those damn beeping gizmos and sending those darned pictures of each other’s privates around and get off my damn lawn. Plus, I’m too young to shake my cane; I ran a modest but real number of miles yesterday. Even when I do have a cane someday, I hope 1) that it has a hidden sword, because that kind of thing is cool, and 2) that I haven’t ossified to the point where I’m not willing to learn new things.
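To make the “try programming without algebra” aside concrete, here is a toy sketch (my own example, not anything from the article or from a classroom): even a one-line temperature converter asks you to rearrange F = C * 9/5 + 32 on paper before you can type it.

# Toy illustration (mine): converting Fahrenheit to Celsius means first
# rearranging F = C * 9/5 + 32 to solve for C; the algebra comes before the code.
def fahrenheit_to_celsius(f: float) -> float:
    return (f - 32) * 5 / 9

print(round(fahrenheit_to_celsius(98.6), 1))  # 37.0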

But this is an essay that points out how basic skills and the means of imparting those basic skills haven’t changed so much, as Amanda Ripley’s Atlantic article, “What Makes a Great Teacher?” makes clear in its discussion of what great teachers do:

First, great teachers tended to set big goals for their students. They were also perpetually looking for ways to improve their effectiveness. [. . .] Superstar teachers had four other tendencies in common: they avidly recruited students and their families into the process; they maintained focus, ensuring that everything they did contributed to student learning; they planned exhaustively and purposefully—for the next day or the year ahead—by working backward from the desired outcome; and they worked relentlessly, refusing to surrender to the combined menaces of poverty, bureaucracy, and budgetary shortfalls.

Notice the thing absent from this list: use computers, iPads, and so forth. Sure, those great teachers could use technology, but they don’t need to. And the technology is not going to automatically make an indifferent teacher set big goals or recruit families or maintain focus or plan. Used poorly, it’s just going to provide some flash and pizazz and some distractions. Check out this Marginal Revolution discussion of a study looking at how introducing computers in poor households actually decreased student grades because students spent more time playing games on them than doing homework:

Not surprisingly, with all that game playing going on, the authors find that the voucher program actually resulted in a decline in grades although there was also some evidence for an increase in computer proficiency and perhaps some improvement in a cognitive test.

See also “Computers at Home: Educational Hope vs. Teenage Reality:” “Students posted significantly lower math test scores after the first broadband service provider showed up in their neighborhood, and significantly lower reading scores as well when the number of broadband providers passed four.” These reports should give technology cheerleaders pause: you aren’t going to get better results simply by lashing a computer to a teacher’s back and telling him to use it.*

To be a good teacher, you still need the weird skill set and mindset mentioned above. If you don’t have them or aren’t willing to develop them, I doubt anything else imposed on an individual teacher from the outside, like a mandate to use technology, is going to do much for that teacher or for his or her students. If you want to really improve teaching, you’ll need to take an approach similar to the one Facebook and Google take to hiring hackers, which means a relentless focus not on degrees that offer dubious value in predicting achievement but on finding the best people and making sure they stay. Finding the best teachers is different from finding programmers—you probably can’t tell who’s going to be a good teacher before they hit the classroom—but you can at least acknowledge that you’re not going to get people who are good merely by saying, “use iPads in the classroom.” Steve Jobs and Bill Gates didn’t have iPads in their classrooms growing up, and maybe that’s part of what made Jobs able to have the vision necessary to Turn On, Geek Out, and make the iPad.


* I had a computer in middle and early high school that I used to master Starcraft and various other computer games, until I somehow realized I was wasting my life and smashed my Starcraft disks in the driveway. I sometimes use this analogy when I explain the situation to friends: some people can handle snorting the occasional line of coke without getting addicted; it’s just a fun way of spending a Saturday night. Some people can handle computer games in the same way. I discovered, at the time, that I’m not one of them, and, worse, I’ll never get those three or so wasted years back. Now I tend to find video games boring on average and can’t play for longer than half an hour to an hour at a stretch, while I’ve trained myself up to being able to write effectively for three to six hours at a time. The first draft of this essay, for example, took me about two hours.


Eight years of writing and the first busted Moleskine

Most of my writing happens on a computer, which means it’s pretty hard to depict the final product in a visually satisfying way.* But I also carry around a pretentious Moleskine™ notebook for the random ideas that strike in grocery stores or at parties. The latest notebook, however, developed a split binding:

I’ve been using Moleskines for about eight years, which means I go through about two of them per publishable novel:

Notice how none of the others have the binding split that afflicted the latest one. I haven’t consciously treated this one differently from its predecessors or used it any longer. Maybe the quality control at Moleskine central has declined, although people have made claims in that direction for a very long time.

Regardless of the reason, the latest notebook has about twelve usable pages left; I tend to write nonfiction, blog post ideas, things I need to remember, reminders about e-mails, entries from an unkept diary, and stuff like that in the back. Ideas, quotes, things people say, and other material related to fiction goes in front. When back and front meet in the middle, it’s time to get a new one.

When I start working on a new novel, I usually go back through all the old notebooks at the beginning to see what material might be usable and when I started taking ideas for that specific project. Some ideas for novels have been burbling in the back of my mind for a very long time, waiting for me to have the time and skill to move them from a couple of scrawled lines to 80,000 words of story. The oldest Moleskines I have were bought in the 2002 neighborhood. They’ve held up pretty well; the ones I started buying in the 2005 neighborhood are showing their age. Tough to say if this is an indication of falling quality control or something else altogether.

While Googling around for the complaint about Moleskine quality I linked to above, I also found a site that recommends The Guildhall Notebook. I’ve already ordered one, although apparently Guildhall doesn’t have a U.S. distributor, so I have to wait for mine to ship from the UK. I hope the improved binding is worth the wait. EDIT 1: It wasn’t worth the wait, or the hassle; if that weren’t enough, Christine Nusse of Exaclair Inc./Quo Vadis Planners, which distributes or distributed Guildhall notebooks, said in an e-mail that her understanding is that the notebooks are being discontinued. She recommends the Quo Vadis Habana instead (although I think it too big) or a Rhodia notebook (which I think is just right, as I say below).

So even if you want a Guildhall pocket notebook, you probably won’t be able to find one for long; fortunately, the Rhodia Webbie is a better alternative.

EDIT 2: Someone found me by asking, “are moleskines pretentious”? Answer, in post form: “Are Moleskines pretentious? Yup. Guildhall Notebooks are worse.”

EDIT 3: I’ve settled on the Rhodia Webbie as a full-time notebook: it’s expensive but much more durable than other notebooks I’ve found. I’ll write a full review at some point.

EDIT 4: I posted an updated photo of the stack. Or you can see it here.


* Even describing it using conventional prepositions is tough: do I write “on” or “in” or “with” a computer? Good arguments exist for any of the three.

Computers and network effects: Why your computer is “slow”

“Going Nowhere Really Fast, or How Computers Only Come in Two Speeds” is half-right. Here’s the part that’s right:

[…] it remains obvious that computers come in just two speeds: slow and fast. A slow computer is one which cannot keep up with the operator’s actions in real time, and forces the hapless human to wait. A fast computer is one which can, and does not.

Today’s personal computers (with a few possible exceptions) are only available in the “slow” speed grade.

So far so good: I wish I didn’t have to wait as long as I do for Word to load or to open documents, or for OS X to become responsive after a reboot. But then there’s the reason offered as to why computers feel subjectively slower in many respects than they once did:

The GUI of my 4MHz Symbolics 3620 lisp machine is more responsive on average than that of my 3GHz office PC. The former boots (into a graphical everything-visible-and-modifiable programming environment, the most expressive ever created) faster than the latter boots into its syrupy imponade hell.

This implies that the world is filled with “bloat.” But such an argument reminds me of Joel Spolsky’s Bloatware and the 80/20 myth. He says:

A lot of software developers are seduced by the old “80/20” rule. It seems to make a lot of sense: 80% of the people use 20% of the features. So you convince yourself that you only need to implement 20% of the features, and you can still sell 80% as many copies.

Unfortunately, it’s never the same 20%. Everybody uses a different set of features.

Exactly. And he goes on to quote Jamie Zawinski saying, “Convenient though it would be if it were true, Mozilla [Netscape 1.0] is not big because it’s full of useless crap. Mozilla is big because your needs are big. Your needs are big because the Internet is big. There are lots of small, lean web browsers out there that, incidentally, do almost nothing useful.”

That’s correct; Stanislav’s 4MHz Symbolics 3620 lisp machine was and is no doubt a nice computer. But ultra-responsive modern computers don’t exist, not because people like bloat, but because people in the aggregate choose trade-offs that favor a very wide diversity of uses. Not enough people want to make the trade-offs that fast responsiveness implies for there to be a market for such a computer.

Nothing is stopping someone from making a stripped-down version of, say, Linux that will boot into “a graphical everything-visible-and-modifiable programming environment, the most expressive ever created” faster than a modern PC boots into its “syrupy imponade hell.” But most people evidently prefer the features that modern OSes and programs offer. Or, rather, they prefer that modern OSes support THEIR pet feature and make everything as easy to accomplish as possible at the expense of speed. If you take out their favorite feature… well, then you can keep your superfast response time and they’ll stick with Windows.

To his credit, Stanislav responded to a version of what I wrote above, noting some of the possible technical deficiencies of Linux:

If you think that a static-language-kernel abomination like Linux (or any other UNIX clone) could be turned into a civilized programming environment, you are gravely mistaken.

That may be true: my programming skill and knowledge end around simple scripting and CS 102. But whatever the weaknesses of Linux, OS X, and Windows, taken together they represent uncounted hours of programming and debugging time and effort. For those of you who haven’t tried it, I can only say that programming is an enormous challenge. To try to replicate all that modern OSes offer would be hard—and probably effectively impossible. If Stanislav wants to do it, though, I’d be his first cheerleader—but the history of computing is also rife with massive rewrites of existing software and paradigms that fail. See GNU/Hurd for a classic example: it has been in development since 1990. Did it fail for technical or social reasons? I have no idea, but the history of new operating systems, however technically advanced, is not a happy one.

Stanislav goes on to say:

And if only the bloat and waste consisted of actual features that someone truly wants to use.

The problem, as Joel Spolsky points out, is that one man’s feature is another’s bloat, and vice versa. That’s why the computing experience looks the way it does today: people hate bloat, unless it’s their bloat, in which case they’ll tolerate it.

He links to a cool post on regulated utilities as seen in New York (go read it). But I don’t think the power grid metaphor is a good one because transmission lines do one thing: move electricity. Computers can be programmed to do effectively anything, and, because users’ needs vary so much, so does the software. You don’t have to build everything from APIs to photo manipulation utilities to web browsers on top of power lines.

Note the last line of Symbolics, Inc.: A failure of heterogeneous engineering, which is linked from Stanislav’s “About” page:

Symbolics is a classic example of a company failing at heterogeneous engineering. Focusing exclusively on the technical aspects of engineering led to great technical innovation. However, Symbolics did not successfully engineer its environment, custormers [sic], competitors and the market. This made the company unable to achieve long term success.

That sounds, to me, like the kind of thinking that leads one to lament how “slow” modern computers are. They are—from one perspective. From another, they enable things the Lisp machine didn’t have (like, say, YouTube).

However, I’m a random armchair quarterback, and code talks while BS walks. If you think you can produce an OS that people want to use, write it. But when it doesn’t support X, where “X” is whatever they want, don’t be surprised when those people don’t use it. Metcalfe’s Law is strong in computing, and there is a massive amount of computing history devoted to the rewrite syndrome; for another example, see Dreaming in Code, a book that describes how an ostensibly simple task became an engineering monster.
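For readers who haven’t run into it, Metcalfe’s Law is the rough claim that a network’s value grows with the square of its users, since each new user can connect to everyone already there; it is part of why an incompatible rewrite starts so far behind. A minimal sketch of the arithmetic (my illustration, not anything from the posts discussed above; the user counts are made up):

# Rough illustration of Metcalfe's Law: a network of n users has
# n * (n - 1) / 2 possible pairwise links, so its value grows roughly
# with the square of the user count.
def pairwise_links(n: int) -> int:
    """Distinct user-to-user connections in a network of n users."""
    return n * (n - 1) // 2

for users in (10, 1_000, 100_000):
    print(f"{users:>7} users -> {pairwise_links(users):>13,} possible links")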

Steve Jobs’ prescient comment

“The desktop computer industry is dead. Innovation has virtually ceased. Microsoft dominates with very little innovation. That’s over. Apple lost. The desktop market has entered the dark ages, and it’s going to be in the dark ages for the next 10 years, or certainly for the rest of this decade.”

(Emphasis added.)

—That’s from a 1996 interview with Jobs, and he was completely right: little of interest happened to the desktop interface virtually everyone uses until around 2003 or 2004, when OS X 10.3 was released. The first major useful change in desktops that I recall during the period was Spotlight in OS X 10.4, which was, not coincidentally, around the time I got a PowerBook.
