Briefly noted: Kindle Voyage

For a while I’ve had a Kindle Voyage. It’s functional and the screen is nice. Not much has changed since this 2010 post: Amazon still has no good system for organizing and sorting books, and Amazon doesn’t want you to use desktop computers, which shows in the design of its whole ecosystem.

The Voyage hardware is, at best, slightly better than the last Kindle iteration I used. Really, though, the improvements are so marginal that I can’t imagine anyone buying the new version unless their old one dies or is lost, as happened to me: Amazon will often knock some money off the new version if you ask them to “repair” the old version. To get the discount, Amazon requires that you send the broken Kindle to them. I don’t know what happens after that. Probably Amazon trashes it, but I’d like to imagine that it’s refurbished.

A lot about the Kindle Voyage is okay, but there’s little to love. If you’re going to bother buying a Kindle, the Voyage is a better choice than the regular Kindle Paperwhite because it has buttons, albeit buttons that aren’t as prominent or tactile as I’d like.

I don’t use the Kindle for books much, because I still prefer paper and Instapaper is my killer app. At the margins, I now read more nonfiction and fewer books in general, including novels. You’ve probably read or noticed that too many popular nonfiction books are just unsatisfactorily elongated articles. Instapaper makes it easier to read the underlying articles rather than just clicking the “buy book” button.

This review is thorough and says most of what I’d say. I don’t know how people produce many thousands of words in Kindle reviews. It’s a device without a personality, which isn’t bad: it just is. There are good use cases for it; they just aren’t mine.

I still find accidental button presses annoyingly easy.

 

Review: The CODE Keyboard (With Cherry MX “Clear” switches)

In the last three months, a bunch of people have written to ask if I’ve tried any keyboards since 2011’s “Further thoughts on the Kinesis Advantage, Unicomp Space Saver, and Das Keyboards” (evidently Google has brought my keyboard articles to the top of its search rankings again). The short answer is yes, but only one, and I bought it: the 87-key CODE Keyboard with Cherry MX “Clear” switches. The switches are slightly quieter than the Kinesis Advantage’s Cherry MX “Brown” switches, while still retaining excellent tactile feel. If I were using a conventional keyboard, I’d very slightly prefer the Unicomp Ultra Classic to the CODE keyboard, but in real-world usage the difference is tiny. Anyone who is noise-sensitive or works in the same room as other people should use the CODE Keyboard, however: it’s substantially quieter at little cost in feel.

87-key CODE keyboard

There isn’t much more to say about the CODE Keyboard: it has backlighting, which is nice if you care about that sort of thing (I don’t). It comes in an 87-key version, which is also nice because it’s smaller and because many of us don’t need extensive number pad use. It feels durable, and in the two or so years I’ve had it I haven’t detected wear. The Unicomp Ultra Classic has a slight edge in the durability rankings because its predecessor—the IBM Model M—has been in service for decades. Unicomp and IBM keyboards are so good that Unicomp suffers because it sells a product that doesn’t need to be replaced. The CODE Keyboard is likely to be similarly durable, though it’s only been on the market for a couple of years. If it has any weaknesses they’re not apparent to me. The profile is close to as slim as it can be without compromising function. I’ve never been a fan of Apple’s chiclet-style keyboards, though they’re obviously necessary for laptops.

I haven’t been posting about keyboards, though, because companies haven’t been sending them lately—I guess that since Anandtech and Ars Technica have begun reviewing keyboards, a site targeting writers and readers rather than hackers and gamers gets bumped to the bottom of the priority queue. Coverage has proliferated elsewhere too: there’s an active Reddit subsection devoted to mechanical keyboards, and Googling “mechanical keyboards” brings up more background than I have time or inclination to digest. Writers also tend to be less vocal about their love for gadgets than tech people.

I’m also much less interested in experimenting with different keyboards because I don’t perceive much room for improvement over the good keyboards we have now. Until neural implants arrive and keyboards become as weird a curiosity as Victorian-era telegraph machines are today, the current designs are probably about as good as they’re likely to get.

When I first bought a Unicomp Ultra Classic I was in college and a couple of tiny companies made mechanical keyboards, including Unicomp and Matias (whose early products were so screwed up I never tried the later ones). Today web startups are common and people are used to buying things online; a dozen or more companies are making mechanical keyboards, and it’s hard to pick a bad one. Any of them will be much better than the keyboards that ship with most computers.

Plus, as said earlier, the good options have mostly driven out the bad, or forced the weak early keyboards to improve. Matias’s more recent keyboards apparently don’t have the ghosting that made earlier versions useless. Companies like Vortex are producing physically small keyboards with programmable keys, such as the obnoxiously named POK3R. Other companies have produced wireless Bluetooth versions and versions with extra USB ports, neither of which matters to me. A profusion of ultra-minor variations is a hallmark of maturity. Most of the keyboards still have Darth Vader or computer-nerd aesthetics, but that probably speaks to their target audience. Except, possibly, for the CODE Keyboard, none of the current mechanical keyboards seem like Apple made them.

But the real news is no news: a bunch of keyboards exist and they’re all pretty good. The word “slight” appears three times in the first two paragraphs because there aren’t clear winners. If you type a lot and aren’t interested in the minutiae, get a CODE Keyboard and put the rest out of your mind. If you want a more ergonomic experience and have the cash, get a Kinesis Advantage and learn how to use it (and be ready for weird looks from your friends when they see it). Options are beautiful, but don’t let them drive you mad.

Two visions for the future, inadvertently juxtaposed: Nell Zink and Marc Andreessen

Last week’s New Yorker inadvertently offers two visions for the future: one in a profile of the writer Nell Zink and the other in a profile of the venture capitalist Marc Andreessen. Both profiles are excellent. One of their subjects, however, is mired in a fixed, contemporary mindset, while the other subject looks to a better future.

This is how Schultz, the profile’s author, describes Zink: “Zink writes about the big stuff: the travesty of American apartheid; the sexual, economic, and intellectual status of women; the ephemerality of desire and its enduring consequences.” Is any of that stuff really big? Does it matter? Or is it just a list of somewhat transitory issues that obsess modern intellectuals who are talking to each other through magazines like The New Yorker? The material well-being of virtually any American is so much higher than it was in, say, 1900, as to diminish the relative importance of many of the ideas Zink or Schultz considers “big.” At one point Zink “delivered a short lecture on income stagnation: a bird ridiculing its fellow-bird for stupidity.” But global inequality is falling and, moreover, the more interesting question may be absolute material conditions rather than relative ones. One gets the sense that Zink is a more parochial thinker than she thinks. I sense from The Wallcreeper that she writes about the motiveless and pathless.

Here, by contrast, is Andreessen as described by Tad Friend:

Andreessen is tomorrow’s advance man, routinely laying out “what will happen in the next ten, twenty, thirty years,” as if he were glancing at his Google calendar. He views his acuity as a matter of careful observation and extrapolation, and often invokes William Gibson’s observation “The future is already here—it’s just not very evenly distributed.” Jet packs have been around for half a century, but you still can’t buy them at Target.

and:

The game in Silicon Valley, while it remains part of California, is not ferocious intelligence or a contrarian investment thesis: everyone has that. It’s not even wealth [. . . .] It’s prescience. And then it’s removing every obstacle to the ferocious clarity of your vision: incumbents, regulations, folkways, people. Can you not just see the future but summon it?

Having a real vision counts, and it seems that too few people have a vision for the future. Andreessen is thinking not of today but of what can be made better tomorrow. I would not deny the impact of slavery on contemporary culture or the importance of desire in life, but I would ask Zink: if the U.S. is doing things poorly, who is doing them better? And if the U.S. is doing things so poorly, why is Silicon Valley the center of the future?

One of these people reads as an optimist, the other as a pessimist. One reads as someone who makes things happen and the other as someone who complains about things that other people do. One reads as a person with a path. The other doesn’t.

Don’t get me wrong. I liked The Wallcreeper when I read it a couple of months ago. I didn’t have much to say about it on this blog because it seemed interesting but left me without much feeling. But I can’t help thinking that Andreessen’s vision for the future is big, while Zink’s vision of the present is small.

As a bonus, check out “All Hail the Grumbler! Abiding Karl Kraus,” which is poorly titled but describes Jonathan Franzen’s relationship to art, technology, and other matters. He’s in the Zink school; perhaps something about studying German inculcates an anti-technology backlash among writers, since Germany and the U.S. are both among the most technophilic societies in the world (for good reasons, I would argue). From the article:

Kraus’s savage criticism of popular newspapers, suspicion of technology, and defense of art all appeal to Franzen, whose nonfiction essays strike similar notes. For instance, in the spirit of Kraus, Franzen has attacked the intrusiveness of cellphones and the loss of private space as people bark out the dreck of their lives.

But even “privacy” is a relatively new idea: being alone to read books only really got going in the 18th century, when books got cheap enough for normal people to borrow them from libraries. The luddites of the day lamented the withdrawal from the public sphere into onanistic privacy. They asked: Why wrap yourself in some imaginary world when the big real world is out there?

As you may imagine I’m more neutral towards these developments. Like many literary types I think the world would be a better place with more reading and less reality TV, but I’ll also observe that the kind of people who share that view are likely to read this blog and the kind of people who don’t aren’t likely to give a shit about what I or anyone like me says.

Much later in the essay its author, Russell Jacoby, writes: “Denouncing capitalist technology has rarely flourished on the left, which, in general, believes in progress.” I get what he’s saying, but denouncing technology in general has always been a fool’s game because a) pretty much everyone uses it and b) to the extent one generation (or a member of a generation) refuses a given technology, the next generation takes it up entirely. Franzen may not like technology circa 2015 but he is very fond of the technology of the printing press. At what point does Franzen think “good” technology stopped?

I’m reminded, unfairly perhaps, of the many noisy environmentalists I’ve known who do things like bring reusable bags to the grocery store but then fly on planes at least a couple of times a year. But flying pollutes more than almost anything else an individual does. A lot of SUV-drivers living in exurbs actually create less pollution than urban cosmopolitans who fly every two months. By the same token, the people who denounce one set of technical innovations are often dependent on, or in love with, some other set of technical innovations.

Almost no one really wants to go backward in time, technologically speaking. Look at behaviors rather than words. I do believe that Franzen doesn’t use Facebook or write a blog or whatever, but he probably uses other stuff, and, if he has kids, they probably want smartphones and video games because all their friends have smartphones and video games.

I’m not saying smartphones and video games are good—quite the opposite, really—and I’m sympathetic to Zimbardo’s claim that “video games and porn are destroying men.” But I am saying that claims about modern technology doing terrible things to people or culture go back centuries and have rarely if ever proven true, and the people making such claims are usually, when viewed in the correct light, hypocrites on some level. Jacoby does hit a related point: “Presumably, if enough people like SUVs, reality TV, and over-priced athletic footwear, little more may be said. The majority has spoken.” But I want to emphasize the point and say more, not about the banal cultural stuff like bad TV (and obviously there is interesting TV), but about the deeper stuff, like technology.

The Andreessens of the world are right. There is no way back. The only way is forward, whether we want to admit it or not. The real problem with our cultural relationship to technology—and this is a Peter Thielian point—is that we’re in denial about dependence, need, and the need to build the future.

GeekDesk “Max” sit-stand desk review: Two years with a motorized desk

The single most important thing about this GeekDesk review can be encapsulated in a single sentence: I’d never return, full-time and voluntarily, to a conventional desk. The rest is mere commentary. Detailed commentary, to be sure, but the important stuff should be up front.

I’m going to divide this review into two major sections: the first is about using the sit-stand desk, and the second is about installing it.

Usage

There is by now extensive evidence that sitting for long periods of time is terrible for both health and concentration. The former has only recently hit the news; see this New York Times story or “The health hazards of sitting” from The Washington Post. Others may be easily found. Yet standing for long periods is also unlikely to be good for you, as anyone who has worked retail or restaurant jobs already knows. Hence the sit-stand desk. Sites like Hacker News are rife with testimonials about standing desks. Let me add to the cacophony.

The latter issue—concentration—is less easily measured, but many of us who do brainwork at desks know the impossible-to-ignore feeling that we must stand and pace. A standing desk facilitates that kind of concentration.

For a long time I got a magical “wow” feeling when I tired of standing and watched the desk lower, or when I tired of sitting and watched the desk rise. Very few products of any sort offer that “wow.” By now, however, having a sit-stand desk is mundane. We can acclimate to almost anything—in one particular domain acclimation is called the “Coolidge Effect”—but, as mentioned earlier, I wouldn’t want to go back.

Like any sort of change there is a break-in period, and someone used to sitting for most of the day shouldn’t go straight to standing for most of it. Start with half an hour or an hour at a time. To the extent I have a method it’s simple: when I’m tired of standing I sit, and vice versa.

One other point: pretty much everyone I’ve seen who has tried a mat recommends getting a mat (see, for example, this thread for a wide array of testimonials). I haven’t seen anyone who tried a mat and didn’t like or recommend one. The good ones cost at least $60. GeekDesk now sells a mat, and I’m sure theirs is fine.

Otherwise, I don’t have much to report about usage—which is probably good; like any tool, a desk exists to support some other end. The memory feature on the GeekDesk Max works well. I haven’t thought about it in ages. The desk’s motor (or, more properly, motors: I believe it has one in each leg) is quiet and smooth. In the years I’ve had mine I’ve discerned no changes in the quality of the motor. I suppose it could die tomorrow, and the official warranty is only for two years, but GeekDesk seems like the kind of company that’ll either replace the desk if it dies the day after the warranty expires or cut you a deal on a new one. More on the quality of the company is below.

My desk also has a Humanscale keyboard tray attached. The Humanscale systems have become much more expensive since I bought mine, but I don’t know much about the alternatives. I do know that used versions are available on eBay and Craigslist. I also know that the keyboard trays will last for decades, because my parents bought theirs about twenty years ago. Most of the keyboard trays also offer a 360-degree range of motion, which can be handy.


Cost

I like to think of money spent on computer and desk setups as allocated pretty damn well, considering how many hours a week I spend working at a computer. But people are funny about money: in Predictably Irrational, Dan Ariely describes some of the ways people misallocate money based on anchor points.

Most people, if you press them, have some important indulgence they think “worth” spending money on. It might be shoes, lingerie, cars, boats, sports, travel, or hobbies, but it’s almost always there. The surprise at the expense of really good desks is, to my mind, an indication of priorities more than any comment on the absolute value or lack thereof in a workspace.

It’s almost impossible to say whether something like this is “worth it” to another person, but the usual points in favor of a sit-stand desk are simple: many people spend 20 to 60 hours a week at a desk. On a cost-per-day basis, a good desk costs less than coffee or ramen. Put that way the upfront costs seem much smaller. It is interesting that many people are willing to pay four figures for minor car creature comforts but spend much less on desks or beds, which are often occupied for ten times as long as a car. Nonetheless anchoring effects are strong and perhaps they can’t be overcome.
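
To make the cost-per-day point concrete, here is a minimal back-of-the-envelope sketch in Python; the price, lifespan, and schedule are assumptions chosen for illustration, not quoted GeekDesk figures:

```python
# Back-of-the-envelope desk amortization using assumed numbers:
# roughly a $1,000 purchase price, used 5 days a week for 5 years.
desk_cost = 1000.00                        # assumed price in dollars, not a quoted figure
days_per_week = 5
years = 5
working_days = days_per_week * 52 * years  # about 1,300 days spent at the desk

cost_per_day = desk_cost / working_days
print(f"~${cost_per_day:.2f} per working day")  # roughly $0.77 per day, less than a coffee
```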

I don’t know how long the motor on this desk will last. A conventional desk can probably endure for decades, and that’s obviously not true of a desk with moving parts. This desk comes with a two-year warranty, though, as I said above, GeekDesk seems like the kind of company that will knock some cash off a new version or do something else nice if the desk dies the day after the warranty expires.

Installation

If you’re not accustomed to using power tools and building things on your own, pay the $95 or so to have your desk built for you. If I’d done this, I would’ve saved a lot of time and hassle. When I first bought the desk, I had no idea what I was doing and screwed up the screwing-in process by not having a drill-bit extender. Seems like an obvious point in retrospect, but at the time I messed up the installation and ended up stripping a screw and installing others at an angle. In addition, although the screws GeekDesk sent were “self-starting,” they should still be installed with pilot holes.

I’m getting ahead of myself, and I could tell the long and somewhat boring story about how this happened, but the short version is that I called GeekDesk not really sure about what I’d fucked up. GeekDesk’s customer service is insanely fabulous. About 80–85% of the problems were my own damn fault: I should’ve been more careful when I assembled the desk, and I should’ve been more careful with the screws. But I wasn’t, and when I gave up and punted, Isaiah at GeekDesk actually hired an installer at the company’s expense. I volunteered to pay, but they said they’d do it. Very few companies go this far.

The installer came from a third-party company; perhaps not surprisingly, GeekDesk does not have a horde of desk installers across the nation. Unfortunately, the guy GeekDesk sent installed the screws at an angle just like I did, and in the process of screwing around (haha!) with them managed to strip two heads, which then caused him to go to Home Depot for some more screws. This doesn’t inspire confidence in him, or in the self-starting screw system.

In my case, I’m an amateur who has seldom had any need to use power tools. But his entire profession involves putting things like desks together correctly. He was also ready to leave the mis-screwed screws, until I pointed out that the desk was still wobbling.

Anyway, for the Humanscale track I drilled small pilot holes, and now the desk doesn’t wobble, provided that it’s braced against a wall firmly enough to absorb shock but not so firmly as to impede the motor. The desk doesn’t feel as solid as the Maxon Series 1000 desk it replaced, but I haven’t noticed any monitor shake either.

(A side note: most reviews for newspapers or magazines appear after the writer has tried the product for a few days or weeks. I prefer to write them after a few months or years: that’s often how long it takes to really evaluate value.)

There are other, similar sit-stand desks, like the NextDesk Terra, but it’s $1,500 and I can’t discern any obvious improvements. It’s also wider, at 63 inches, than the GeekDesk. The Wirecutter’s reviewers like it better; still, I’d rather save the money and use the GeekDesk.


Here is an earlier post on GeekDesk; note the datestamp.

Facebook and cellphones might be really bad for relationships

There’s some possibly bogus research about “How your cell phone wrecks your relationships — even when you’re not using it.” I say “possibly bogus” because these kinds of social science studies are notoriously unreliable and unreproducible.* Nonetheless, this one reinforces some of my pre-existing biases and is congruent with things that I’ve observed in my own life and the lives of friends, so I’m not going to be too skeptical of its premises and will instead jump into uninformed speculation.

It seems like cell phones and Facebook cordon off a large part of your life from your significant other (assuming you have one or aspire to have one) and encourage benign-seeming secrecy in that other part of your life. In the “old days,” developing friendships or quasi-friendships with new people required face-to-face time, talking on the phone (which, at home, was easily enough overheard), or writing letters (which are slow, and which a lot of people aren’t very good at or don’t like to write). Now you can be developing new relationships with other people while your significant other is in the same room, and the significant other won’t know about the relationship happening via text message. You can also solicit instant attention, especially by posting provocative pictures or insinuating song lyrics, while simultaneously lying to yourself about what you’re doing in a way that would be much harder without Facebook and cell phones.

Those new relationships start out innocently, only to evolve, out of sight, into something more. Another dubious study made the rounds of the Internet a couple months ago, claiming that Facebook was mentioned in a third of British divorce petitions. Now, it’s hard to distinguish correlation from causation here—people with bad relationships might be more attached to their phones and Facebook profiles—but it does seem like Facebook and cellphones enable behavior that would have been much more difficult before they became ubiquitous.

I don’t wish to pine for a mythical golden age, which never existed anyway. But it is striking how many of my friends’ and peers’ relationships seem to founder on the shoals of technology. Technology seems to be enabling a bunch of behaviors that undermine real relationships, and, if so, then some forms of technology might be pushing us towards shorter, faster relationships; it might also be encouraging us to simply hop into the next boat if we’re having trouble, rather than trying to right the boat we’re already in. Facebook also seems to encourage a “perpetual past,” by letting people from the past instantly and quietly “re-connect.” Sometimes this is good. Sometimes less so. How many married people want their husband or wife chatting again with a high school first love? With a summer college flame? With a co-worker discussing intimate details of her own failing relationship?

Perhaps relationship norms will evolve to discourage the use of online media (“Are we serious enough to deactivate each other’s Facebook accounts?” If the answer is “no,” then we’re not serious and, if I’m looking for something serious, I should move on). Incidentally, I don’t think blogs have the same kind of effect; this blog, for instance, is reasonably popular by the standards of occasional bloggers, and has generated a non-zero number of groupies, but the overall anonymity of readers in relation to me (and the kind of content I tend to post) probably puts a damper on the kinds of relationship problems that may plague Facebook and cell phones.

EDIT: See also “I’m cheating on you right now: An admiring like on your Facebook page. A flirty late-night text. All while my partner’s right there next to me,” which mentions, unsurprisingly:

A study in 2013 at the University of Missouri surveyed 205 Facebook users aged 18–82 and found that “a high level of Facebook usage is associated with negative relationship outcomes” such as “breakup/divorce, emotional cheating, and physical cheating.”

Again, I want to maintain some skepticism and am curious about studies that don’t find a difference and thus aren’t published. But some research does match my own anecdotal impressions.


* If you’d like to read more, “Scientific Utopia: II – Restructuring Incentives and Practices to Promote Truth Over Publishability” is a good place to start, though it will strike horror in the epistemologist in you. Or, alternatively, as Clay Shirky points out in The Cognitive Surplus, “[…] our behavior contributes to an environment that encourages some opportunities and hinders others.” In the case of cell phones and Facebook, I think the kinds of behaviors encouraged are pretty obvious.

Product Review: The Leuchtturm 1917 notebook

The Leuchtturm 1917 is perfectly competent. It’s slightly larger than a Moleskine, when a notebook should be, if anything, slightly smaller. This is a small point. The paper quality is, to my eye and hand, indistinguishable from Moleskine’s, which in turn is very similar to Guildhall and most of the other non-Rhodia notebooks I’ve tried. It has one other annoying feature: the last 30 or so pages are perforated; this is another way of saying, “They’ll eventually fall out.” If you’re the kind of person who wants to desecrate your notebook by tearing out pages, then the Leuchtturm 1917 is for you. To be sure, perforated pages are a minor annoyance. But unless you’re deliberately trying to avoid such minor annoyances, stick with Moleskines, since they’re widely available.

The only major problem with the Leuchtturm 1917 is simple: it doesn’t offer any major, obvious improvements over the Moleskine. It doesn’t have any real disadvantages, either, other than its departure from the canonical 3.5 x 5.5 size and the perforated final pages. Unlike the Quo Vadis Habana, however, the Leuchtturm 1917 isn’t so much larger that carrying it around becomes a chore.

If this review seems slight, that’s because it is—the differences between this notebook and a Moleskine are trivial. They experience the same corner tearing, although I didn’t use the Leuchtturm long enough for the tears to develop into the cover partially coming off. If you’ve used a Moleskine, you’ve already in effect used this notebook; both are decent, but neither beats the Rhodia Webbie.

More on that soon.

From the Department of “No Shit:” technology and computers are not silver bullets for education

File this New York Times article under “no shit:”

But to many education experts, something is not adding up — here and across the country. In a nutshell: schools are spending billions on technology, even as they cut budgets and lay off teachers, with little proof that this approach is improving basic learning.

This conundrum calls into question one of the most significant contemporary educational movements.

There is no silver bullet and no New Jesus for education. There never will be, but the search goes on, because it’s easier to search for the magic methodology that will solve seemingly intractable problems than it is to admit the thing Google, Facebook, and Microsoft have realized about software engineers: talent, motivation, and tenacity vary greatly among individuals, and you can’t merely take someone who lacks all three, put them through some kind of system or give them some kind of tool, and expect everyone to be equally good on the other side. That kind of thing works passably well if you’re building widgets on an assembly line, but it works terribly for any kind of creative or intellectual work.

Nonetheless, it’s much easier to search for that magic methodology to improve very old skills that are surprisingly resistant to methodology: reading, writing, and math. None of those fundamental skills has changed much in the last century. Yet we keep searching for that formula that doesn’t exist, because teaching and learning are inherently hard, like software engineering, math, writing, and any number of other complex but vital skills. Technologies—now we have special blackboards! They blink! They light up! They’re new!—might mask the essential difficulty of the task, but they can’t remove it, much as some programming IDEs try to hide some of the essential difficulty of coding. But fields that are essentially difficult can’t be mastered by sleight-of-hand or new whizzy gadgets.

They can only be mastered by people who are dedicated to the craft and to continuous self-improvement. The people who, because they believe they can, can. The ones whose tenacity is boundless and who aren’t willing to blame external circumstances.

You need a weird set of skills to teach effectively: you need to empathize with your students—to be able to see from their point of view—without becoming mired in their point of view. You need to master your field, but not in such a way that you lose the beginner’s mindset required to master the field in the first place. You need the stoic’s attitude of realizing you can’t control everything while still having the achiever’s mindset that you must strive to do the best you can, no matter what. You need to be willing to try new things and ideas while not leaving behind the old ones that work. You need to remember not everyone is interested in the things you’re interested in, and you need to do whatever it takes to make subjects more interesting than they would be otherwise. You need to find that profitable zone of challenge for most students—something hard enough to make them struggle but not so hard that it’s impossible to accomplish. It’s reasonable to expect college freshmen to be able to read a story or article on their own, but it’s not reasonable to expect them to pick up and digest every nuance on their own. Some will. Most won’t. You need to be enthusiastic, because enthusiasm is as contagious as boredom, but your job isn’t to be a cheerleader and enthusiasm can’t substitute for knowledge. You need, in other words, a bunch of paradoxical traits that balance each other.

You also need to realize that students need things broken down into steps, and need to learn by example and through discussion. Last week I taught Neal Stephenson’s 2005 New York Times opinion piece, “Turn On, Tune In, Veg Out.” Whenever I do, I let the class talk for a while at the beginning; when discussion dies, I ask students to do a simple activity: write the essay’s main point in a sentence or two. Then I come around and look at the sentences.

It should be simple, right? Read the piece, find the main point. But it’s not simple. It’s actually quite hard, and most people are bad readers (myself included). When I go around and look at sentences, lots of students get caught up on the distinction between geeking and vegging out. Others think the piece is primarily about Star Wars. Only a few—usually around five of fifty—get the essential elements of the main point.

Stephenson basically says, twice, that he’s using Star Wars as a metaphor: once in the third paragraph: “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” and once more in the last paragraph: “If the “Star Wars” movies are remembered a century from now, it’ll be because they are such exact parables for this state of affairs” (emphasis added). But most students haven’t learned how to think metaphorically, as writers do. Metaphor is one of those essential ways of thinking that people need to be effective writers. In On Writing Stephen King says:

The use of simile and other figurative language is one of the chief delights of fiction—reading it and writing it, as well. When it’s on target, a simile delights us in much the same way meeting an old friend in a crowd of strangers does. By comparing two seemingly unrelated objects—a restaurant bar and a cave, a mirror and a mirage—we are sometimes able to see an old thing in a new and vivid way. Even if the result is mere clarity instead of beauty, I think writer and reader are participating together in a kind of miracle. Maybe that’s drawing it a little strong, but yeah—it’s what I believe.

In How Fiction Works, James Wood says:

Metaphor is analogous to fiction, because it floats a rival reality. It is the entire imaginative process in one move. If I compare the slates on a roof to an armadillo’s back, or – as I did earlier – the bald patch on the top of my head to a crop circle (or on very bad days, to the kind of flattened ring of grass that a helicopter’s blades make when it lands in a field), I am asking you to do what Conrad said fiction should make you do – see. I am asking you to imagine another dimension, to picture likeness. Every metaphor or simile is a little explosion of fiction within the larger fiction of the novel or story.

Again: that’s hard. And technology isn’t going to make it any easier to start thinking about metaphors, which is probably a precursor to writing in a way that uses metaphor deftly. Before you can do that, you’re probably going to need to recognize when other writers are doing it, and yet, even though Stephenson says twice that he’s doing it, most students don’t pick up on it. This isn’t to blame them, by the way—a lot of my graduate seminars are still about what the writer actually says. Some of you are probably getting caught up in this discussion of metaphor and thinking that I’m really writing about how important it is for students to learn metaphor, when this is only a subsidiary point supporting my main point about the place of technology in classrooms. Here’s Wood again on the subject of learning to read:

You only have to teach literature to realise that most young readers are poor noticers. I know from my own old books, wantonly annotated twenty years ago when I was a student, that I routinely underlined for approval details and images and metaphors that strike me now as commonplace, while serenely missing things which now seem wonderful. We grow, as readers, and twenty-year-olds are relative virgins.

If he wasn’t a good noticer at 20, what hope is there for the rest of us? And how is having a laptop going to help someone become a better noticer? Consider too one other thing to notice: in “Turn On, Tune In, Veg Out,” Stephenson isn’t using any complex or convoluted vocabulary. His sentence structure isn’t very complex; there aren’t lots of nasty nested clauses you have to mentally sort out to figure out what’s being talked about, as there often are in abstruse literary theory and philosophy. His piece isn’t hard to read. But it’s still evidently hard for many freshmen to understand. So I spend a lot of time working towards understanding, towards reading for detail, towards asking, “Where do you see that?” Technology isn’t going to help that process very much. It may even hurt it by offering a proliferating number of distractions: if you interrupt your reading of “Turn On, Tune In, Veg Out” four times for text messages and once for an e-mail, are you going to remember how Stephenson said “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” by the end?

I’m teaching honors students, which is easier than teaching standard classes, which is in turn easier than teaching in tough inner-city schools. So I don’t face the same challenges as some of the teachers mentioned in the NYT article. But sometimes I think about Daniel Singal’s Atlantic article, “The Other Crisis in American Education: A college professor looks at the forgotten victims of our mediocre educational system–the potentially high achievers whose SAT scores have fallen, and who read less, understand less of what they read, and know less than the top students of a generation ago.” As the subtitle implies, he argues that the best students aren’t as challenged as they once were. I can’t tell if he’s right or if he’s hearkening back to a mythical golden age, but I do think about his work sometimes when I see what’s going on around me: other grad students and professors want to watch movies in class, or they aren’t even focused on imparting and enhancing basic reading and writing skills—the same ones pretty much everyone needs. Are the strongest students really getting something out of their classes? Is the technology really helping? If not, could it be part of what’s actually causing “our mediocre educational system?” I’m not saying it does, but I am saying it’s worth pondering.

Still, I think the strongest thinkers and learners—the ones who are now running Google and Facebook, the ones who are now partners in law firms and building their own businesses—are doing fine. Better than ever, maybe. Generation X was supposed to be the slacker generation, but its members built large blocks of the Internet—the same set of technologies you’re almost certainly using to read this. But I wonder if there’s not a growing bifurcation between the people who are doing really well and the ones who aren’t. In income terms, that’s certainly true, but I wonder if it’s happening in intellectual terms too. Stephenson thinks so: “Nothing is more seductive than to think that we, like the Jedi, could be masters of the most advanced technologies while living simple lives: to have a geek standard of living and spend our copious leisure time vegging out.” But the people who spend all that time vegging out aren’t going to create whatever the next iPod and Facebook will be. And they won’t reap those rewards, either. They’re the ones who might be “learning less,” as Singal has it. The people who make the next iPod and Facebook will be the ones who are focused on “geeking out” regarding important topics. The ones who will, I hope, have teachers—whether in honors or not—who are focused on the essential questions that imparting knowledge involves.

By the way, I’m not trying to beat up college freshmen—if I were, I wouldn’t have the empathy necessary to be a good teacher. A lot of college seniors are little better than my freshmen, which I found out by working for Steven Klein at the Steven Klein LSAT Company. The LSAT is mostly a test of reading. If you can read effectively, you’ll do pretty well. But a lot of 22- to 24-year-old college graduates had a lot of trouble on reading comprehension because they couldn’t or hadn’t been trained to look at every word, evaluate it in relation to other words and in relation to the context of the passage, and understand what it means. I think back to those experiences when I read books like Richard Arum and Josipa Roksa’s Academically Adrift: Limited Learning on College Campuses or articles like the one about cool whizzy tech stuff in classrooms. The whizzy tech stuff isn’t going to help readers when they’re facing the LSAT.

A second “by the way” is in order: I’m neither trying to denigrate technology nor be a luddite—I say so as the guy typing on a fancy keyboard, sitting in an ergonomic chair, in front of a 27″ iMac with a bunch of Textmate windows open. Computers make the mechanical process of writing easier, so that the hard stuff—the stuff that goes on in the mind—can dominate. Technology is great—in its place. The University of Arizona has computers and projectors and other neat stuff in many classrooms, and if that neat stuff is available, I use it.

But technology complements other skills; it doesn’t substitute for them. You can only use computers effectively to the extent you can read, write, and do simple math effectively—try programming without algebra. Or try to extract information from man pages without strong reading comprehension skills; hell, I like to imagine myself as being at least moderately literate, and I find some of them tough. So this is not one of those tedious essays in which Old Man Withers shakes his cane, complains about the kids with those damn beeping gizmos sending those darned pictures of each other’s privates around, and yells at everyone to get off his damn lawn. Plus, I’m too young to shake my cane; I ran a modest but real number of miles yesterday. Even when I do have a cane someday, I hope 1) that it has a hidden sword, because that kind of thing is cool, and 2) that I haven’t ossified to the point where I’m not willing to learn new things.

But this is an essay that points out how basic skills and the means of imparting those basic skills haven’t changed so much, as Amanda Ripley’s Atlantic article, “What Makes a Great Teacher?” makes clear in its discussion of what great teachers do:

First, great teachers tended to set big goals for their students. They were also perpetually looking for ways to improve their effectiveness. [. . .] Superstar teachers had four other tendencies in common: they avidly recruited students and their families into the process; they maintained focus, ensuring that everything they did contributed to student learning; they planned exhaustively and purposefully—for the next day or the year ahead—by working backward from the desired outcome; and they worked relentlessly, refusing to surrender to the combined menaces of poverty, bureaucracy, and budgetary shortfalls.

Notice what’s absent from this list: using computers, iPads, and so forth. Sure, those great teachers could use technology, but they don’t need to. And the technology is not going to automatically make an indifferent teacher set big goals or recruit families or maintain focus or plan. Used poorly, it’s just going to provide some flash and pizazz and some distractions. Check out this Marginal Revolution discussion of a study looking at how introducing computers into poor households actually decreased student grades because students spent more time playing games on them than doing homework:

Not surprisingly, with all that game playing going on, the authors find that the voucher program actually resulted in a decline in grades although there was also some evidence for an increase in computer proficiency and perhaps some improvement in a cognitive test.

See also “Computers at Home: Educational Hope vs. Teenage Reality:” “Students posted significantly lower math test scores after the first broadband service provider showed up in their neighborhood, and significantly lower reading scores as well when the number of broadband providers passed four.” These reports should give technology cheerleaders pause: you aren’t going to get better results simply by lashing a computer to a teacher’s back and telling him to use it.*

To be a good teacher, you still need the weird skill set and mindset mentioned above. If you don’t have it or aren’t willing to develop it, I doubt anything else imposed on an individual teacher from the outside, like mandates to use technology, is going to do much for that teacher or for his or her students. If you want to really improve teaching, you’ll need to take an approach similar to the one Facebook and Google take to hiring hackers, which means a relentless focus not on degrees that offer dubious value in predicting achievement but on finding the best people and making sure they stay. Finding the best teachers is different from finding programmers—you probably can’t tell who’s going to be a good teacher before they hit the classroom—but you can at least acknowledge that you’re not going to get people who are good merely by saying, “use iPads in the classroom.” Steve Jobs and Bill Gates didn’t have iPads in their classrooms growing up, and maybe that’s part of what made Jobs able to have the vision necessary to Turn On, Geek Out, and make the iPad.


* I had a computer in middle and early high school that I used to master Starcraft and various other computer games, until I somehow realized I was wasting my life and smashed my Starcraft disks in the driveway. I sometimes use this analogy when I explain the situation to friends: some people can handle snorting the occasional line of coke without getting addicted; it’s just a fun way of spending a Saturday night. Some people can handle computer games in the same way. I discovered, at the time, that I’m not one of them, and, worse, I’ll never get those three or so wasted years back. Now I tend to find video games boring on average and can’t play for longer than half an hour to an hour at a stretch, while I’ve trained myself up to being able to write effectively for three to six hours at a time. The first draft of this essay, for example, took me about two hours.

From the Department of "No Shit:" technology and computers are not silver bullets for education

File this New York Times article under “no shit:”

But to many education experts, something is not adding up — here and across the country. In a nutshell: schools are spending billions on technology, even as they cut budgets and lay off teachers, with little proof that this approach is improving basic learning.

This conundrum calls into question one of the most significant contemporary educational movements.

There is no silver bullet and no New Jesus for education. There never will be, but the search goes on, because it’s easier to search for the magic methodology that will solve seemingly intractable problems than it is to admit the thing Google, Facebook, and Microsoft have realized about software engineers: talent, motivation, and tenacity vary greatly among individuals and that you can’t merely take someone who lacks all three, put them through some kind of system or give them some kind of tool, and expect everyone to be equally good on the other side. That kind of thing works passable well if you’re building widgets on an assembly line, but it works terribly for any kind of creative or intellectual work.

Nonetheless, it’s much easier to search for that magic methodology to improve very old skills that are surprisingly resistant to methodology: reading, writing, and math. None of those fundamental skills has changed much in the last century. Yet we keep searching for that formula that doesn’t exist, because teaching and learning are inherently hard, like software engineering, math, writing, and any number of other complex but vital skills. Technologies—now we have special blackboards! They blink! They light up! They’re new!—might mask the essential difficulty of task, but they can’t remove it, much as some programming IDEs try to hide some of the essential difficulty of coding. But fields that are essential difficult can’t be mastered by sleight-of-hand or new whizzy gadgets.

They can only be mastered by people who are dedicated to the craft and to continuous self-improvement. The people who, because they believe they can, can. The ones whose tenacity is boundless and who aren’t willing to blame external circumstances.

You need a weird set of skills to teach effectively: you need to empathize with your students—to be able to see from their point of view—without becoming mired in their point of view. You need to master your field, but not in such a way that you lose the beginner’s mindset required to master the field in the first place. You need the stoic’s attitude of realizing you can’t control everything while still having the achiever’s mindset that you must strive to do the best you can, no matter what. You need to be willing to try new things and ideas while not leaving behind the old ones that work. You need to remember not everyone is interested in the things you’re interested in, and you need to do whatever it takes to make subjects more interesting than they would be otherwise. You need to find that profitable zone of challenge for most students—something hard enough to make them struggle but not so hard that it’s impossible to accomplish—it’s reasonable to expect college freshmen to be able to read a story or article on their own, but it’s not reasonable to expect them to pick up and digest every nuance on their own. Some will. Most won’t. You need to be enthusiastic, because enthusiasm is as contagious as boredom, but your job isn’t to be a cheerleader and enthusiasm can’t substitute for knowledge. You need, in other words, a bunch of paradoxical traits that balance each other.

You also need to realize that students need things broken down in steps, and need to learn by example and through discussion. Last week I taught Neal Stephenson’s 2005 New York Times opinion piece, “Turn On, Tune In, Veg Out.” Whenever I do, I let the class talk for a while at the beginning; when discussion dies, I ask students to do a simple activity: write the essay’s main point in a sentence or two. Then I come around and look at the sentences.

It should be simple, right? Read the piece, find the main point. But it’s not simple. It’s actually quite hard, and most people are bad readers (myself included). When I go around and look at sentences, lots of students get caught up on the distinction between geeking and vegging out. Others think the piece is primarily about Star Wars. Only a few—usually around five of fifty—get the essential elements of the main point.

Stephenson basically says, twice, that he’s using Star Wars as a metaphor: once in the third paragraph: “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” and once more in the last paragraph: “If the “Star Wars” movies are remembered a century from now, it’ll be because they are such exact parables for this state of affairs” (emphasis added). But most students haven’t learned how to think metaphorically, as writers do. Metaphor is one of those essential ways of thinking that people need to be effective writers. In On Writing Stephen King says:

The use of simile and other figurative language is one of the chief delights of fiction—reading it and writing it, as well. When it’s on target, a simile delights us in much the same way meeting an old friend in a crowd of strangers does. By comparing two seemingly unrelated objects—a restaurant bar and a cave, a mirror and a mirage—we are sometimes able to see an old thing in a new and vivid way. Even if the result is mere clarity instead of beauty, I think writer and reader are participating together in a kind of miracle. Maybe that’s drawing it a little strong, but yeah—it’s what I believe.

In How Fiction Works, James Wood says:

“Metaphor is analogous to fiction, because it floats a rival reality. It is the entire imaginative process in one move. If I compare the slates on a roof to an armadillo’s back, or – as I did earlier – the bald patch on the top of my head to a crop circle (or on very bad days, to the kind of flattened ring of grass that a helicopter’s blades make when it lands in a field), I am asking you to do what Conrad said fiction should make you do – see. I am asking you to imagine another dimension, to picture likeness. Every metaphor or simile is a little explosion of fiction within the lager fiction of the novel or story.

Again: that’s hard. And technology isn’t going to make it any easier to start thinking about metaphors, which is probably a precursor to writing in a way that uses metaphor deftly. Before you can do that, you’re probably going to need to recognize when other writers are doing it, and yet, while Stephenson says that he is twice, most students don’t pick up on it. This isn’t to blame them, by the way—a lot of my graduate seminars are still about what the writer actually says. Some of you are probably getting caught up on this discussion of metaphor and think that I’m really writing about how it’s important for students to learn, when this is only a subsidiary point supporting my main point about the place of technology in classrooms. Here’s Wood again on the subject of learning to read:

You only have to teach literature to realise that most young readers are poor noticiers. I know from my own old books, wantonly annotated twenty years ago when I was a student, that I routinely underlined for approval details and images and metaphors that strike me now as commonplace, while serenely missing things which now seem wonderful. We grow, as readers, and twenty-year-olds are relative virgins.

If he wasn’t a good noticer at 20, what hope is there for the rest of us? And how is having a laptop going to help someone become a better noticer? Consider too one other thing to notice: in “Turn On, Tune In, Veg Out,” Stephenson isn’t using any complex or convoluted vocabulary. His sentence structure isn’t very complex; there aren’t lots of nasty nested clauses you have to mentally sort out to figure out what’s being talked about, as there are often are in abstruse literary theory and philosophy. His piece isn’t hard to read. But it’s still evidently hard for many freshmen to understand. So I spend a lot of time working towards understanding, towards reading for detail, towards asking, “Where do you see that?” Technology isn’t going to help that process very much. It may even hurt it by offering a proliferating number of distractions: if you interrupt your reading of “Turn On, Tune In, Veg Out” four times for text messages and once for an e-mail, are you going to remember how Stephenson said “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” by the end?

I’m teaching honors students, which is easier than teaching standard classes, which is in turn easier than teaching in tough inner-city schools. So I don’t face the same challenges as some of the teachers mentioned in the NYT article. But sometimes I think about Daniel Singal’s Atlantic article, “The Other Crisis in American Education: A college professor looks at the forgotten victims of our mediocre educational system–the potentially high achievers whose SAT scores have fallen, and who read less, understand less of what they read, and know less than the top students of a generation ago.” As the subtitle implies, he argues that the best students aren’t as challenged as they once were. I can’t tell if he’s right or if he’s hearkening back to a mythical golden age, but I do think about his work sometimes when I see what’s going on around me: other grad students and professors want to watch movies in class, or they aren’t even focused on imparting and enhancing basic reading and writing skills—the same ones pretty much everyone needs. Are the strongest students really getting something out of their classes? Is the technology really helping? If not, could it be part of what’s actually causing “our mediocre educational system?” I’m not saying it does, but I am saying it’s worth pondering.

Still, I think the strongest thinkers and learners—the ones who are now running Google and Facebook, the ones who are now partners in law firms and building their own businesses—are doing fine. Better than ever, maybe. Generation X was supposed to be the slacker generation, but its members built large blocks of the Internet—the same set of technologies you’re almost certainly using to read this. But I wonder if there’s not a growing bifurcation between the people who are doing really well and the ones who aren’t. In income terms, that’s certainly true, but I wonder if it’s happening in intellectual terms too. Stephenson thinks so: “Nothing is more seductive than to think that we, like the Jedi, could be masters of the most advanced technologies while living simple lives: to have a geek standard of living and spend our copious leisure time vegging out.” But the people who spend all that time vegging out aren’t going to create whatever the next iPod and Facebook will be. And they won’t reap those rewards, either. They’re the ones who might be “learning less,” as Singal has it. The people who make the next iPod and Facebook will be the one who are focused on “geeking out” regarding important topics. The ones who will, I hope, have teachers—whether in honors or not—who are focused on the essential questions that imparting knowledge involves.

By the way, I’m not trying to beat up on college freshmen—if I were, I wouldn’t have the empathy necessary to be good at teaching them. A lot of college seniors are little better than my freshmen, which I found out by working for Steven Klein at the Steven Klein LSAT Company. The LSAT is mostly a test of reading. If you can read effectively, you’ll do pretty well. But many 22- to 24-year-old college graduates had a lot of trouble with reading comprehension because they couldn’t, or hadn’t been trained to, look at every word, evaluate it in relation to other words and to the context of the passage, and understand what it means. I think back to those experiences when I read books like Richard Arum and Josipa Roksa’s Academically Adrift: Limited Learning on College Campuses or articles like the one about cool whizzy tech stuff in classrooms. The whizzy tech stuff isn’t going to help readers when they’re facing the LSAT.

A second “by the way” is in order: I’m neither trying to denigrate technology nor to be a Luddite—I say so as the guy typing on a fancy keyboard, sitting in an ergonomic chair, in front of a 27″ iMac with a bunch of TextMate windows open. Computers make the mechanical process of writing easier, so that the hard stuff—the stuff that goes on in the mind—can dominate. Technology is great—in its place. The University of Arizona has computers and projectors and other neat stuff in many classrooms, and if that neat stuff is available, I use it.

But technology complements other skills; it doesn’t substitute for them. You can only use computers effectively to the extent that you can read, write, and do simple math effectively—try programming without algebra. Or try to extract information from man pages without strong reading comprehension skills; hell, I like to imagine myself as at least moderately literate, and I find some of them tough. So this is not one of those tedious essays in which Old Man Withers shakes his cane and complains about the kids with those damn beeping gizmos, sending those darned pictures of each other’s privates around—and get off my damn lawn. Plus, I’m too young to shake my cane; I ran a modest but real number of miles yesterday. Even when I do have a cane someday, I hope 1) that it has a hidden sword, because that kind of thing is cool, and 2) that I haven’t ossified to the point where I’m not willing to learn new things.

But this is an essay that points out how basic skills, and the means of imparting those basic skills, haven’t changed much, as Amanda Ripley’s Atlantic article “What Makes a Great Teacher?” makes clear in its discussion of what great teachers do:

First, great teachers tended to set big goals for their students. They were also perpetually looking for ways to improve their effectiveness. [. . .] Superstar teachers had four other tendencies in common: they avidly recruited students and their families into the process; they maintained focus, ensuring that everything they did contributed to student learning; they planned exhaustively and purposefully—for the next day or the year ahead—by working backward from the desired outcome; and they worked relentlessly, refusing to surrender to the combined menaces of poverty, bureaucracy, and budgetary shortfalls.

Notice what’s absent from this list: using computers, iPads, and so forth. Sure, those great teachers could use technology, but they don’t need to. And the technology is not going to automatically make an indifferent teacher set big goals or recruit families or maintain focus or plan. Used poorly, it’s just going to provide some flash and pizzazz and some distractions. Check out this Marginal Revolution discussion of a study which found that introducing computers in poor households actually decreased student grades, because students spent more time playing games on them than doing homework:

Not surprisingly, with all that game playing going on, the authors find that the voucher program actually resulted in a decline in grades although there was also some evidence for an increase in computer proficiency and perhaps some improvement in a cognitive test.

See also “Computers at Home: Educational Hope vs. Teenage Reality”: “Students posted significantly lower math test scores after the first broadband service provider showed up in their neighborhood, and significantly lower reading scores as well when the number of broadband providers passed four.” These reports should give technology cheerleaders pause: you aren’t going to get better results simply by lashing a computer to a teacher’s back and telling him to use it.*

To be a good teacher, you still need the weird skill set and mindset mentioned above. If you don’t have it or aren’t willing to develop it, I doubt anything else imposed on an individual teacher from the outside, like mandates to use technology, is going to do much for that teacher or for his or her students. If you want to really improve teaching, you’ll need to take an approach similar to the one Facebook and Google take to hiring hackers, which means a relentless focus not on degrees that offer dubious value in predicting achievement but on finding the best people and making sure they stay. Finding the best teachers is different from finding programmers—you probably can’t tell who’s going to be a good teacher before they hit the classroom—but you can at least acknowledge that you’re not going to get people who are good merely by saying, “use iPads in the classroom.” Steve Jobs and Bill Gates didn’t have iPads in their classrooms growing up, and maybe that’s part of what gave Jobs the vision necessary to Turn On, Geek Out, and make the iPad.


* I had a computer in middle school and early high school that I used to master StarCraft and various other computer games, until I somehow realized I was wasting my life and smashed my StarCraft disks in the driveway. I sometimes use this analogy when I explain the situation to friends: some people can handle snorting the occasional line of coke without getting addicted; it’s just a fun way of spending a Saturday night. Some people can handle computer games in the same way. I discovered, at the time, that I’m not one of them, and, worse, I’ll never get those three or so wasted years back. Now I tend to find video games boring on average and can’t play for longer than half an hour to an hour at a stretch, while I’ve trained myself up to being able to write effectively for three to six hours at a time. The first draft of this essay, for example, took me about two hours.

How to be a faster writer: Don't

There’s a Slate article making the rounds on “How to be a faster writer,” which has lots of good advice, including some that a lot of people don’t seem to appreciate (like: you don’t become a good writer overnight; you grow into it, like any other cognitively challenging skill). It’s also got some not-so-good advice:

The research verifies that taking notes makes writing easier—as long as you don’t look at them while you are writing the draft! Doing so causes a writer to jump into reviewing/evaluating mode instead of getting on with the business of getting words on the screen.

If research, outlining, and so forth are actually part of the writing process, I think they can be smoothly integrated with the art of writing itself (as I write this, the Slate article I’m responding to is on the right and the TextMate window is on the left, letting me look back and forth).

When I was writing completely unpublishable novels, I didn’t use outlines, and I ended up with piles of words that utterly lacked narrative tension and the many good qualities that stem from narrative tension. Such piles of words didn’t have much point, as more astute readers observed. One told me to think about writing a novel in which something happens.

So I went through a three-novel phase during which I’d heavily outline, and I’d usually have the outline on one screen (or one side of the screen) and the main document on the other. This prevented me from getting “stuck” or from writing off into nowhereville without the structure of a scene. A lot of amateur writers have trouble with plot: they think their novels should resemble the famous ones they’ve read in school, in which characters spend a lot of time talking about their feelings in a very deep way, or about the sense of being lost, or about the ennui imparted by the modern world. There’s nothing precisely wrong with this sort of writing, if done well, but most people seem to prefer reading (and writing) novels in which something happens in a series of scenes that build to a climax. Sure, a lot of the novels you’ve read in school don’t really do that, for various reasons, some very good, but if you imitate them, you’ll often be doing yourself and your reader a disservice. If you’re unconsciously imitating the boring novels you’ve read in school, that’s even worse, because you don’t have enough command over your craft to know what you’re doing.

These days, I still make a bit of an outline, but I can do a lot of the outlining in my head—the last novel I finished, One Step Into the Labyrinth, really needed an outline because the plot was complex; about half a dozen literary agents have the full manuscript or a piece of it, so you may yet see it in bookstores near you. The novel I’m working on now isn’t as complex, and although I’m not using an outline, I’m still writing in scenes that build up to something. In essence, I’ve learned how to write in scenes without necessarily needing an external structure to guide those scenes and make sure they work towards a whole. I suspect this to be a sign of growth, and, I hope, not a malignant sign, like cancer.

My Dad doesn’t write proposals using outlines. He’s internalized virtually everything he needs to know about delivering human services. When I gave technical writing students a proposal-writing assignment for the Department of Education’s Educational Opportunity Centers (EOC) Program, however, I knew I couldn’t expect them to write the way my Dad does, because what’s appropriate for experts isn’t appropriate for amateurs. I couldn’t just give them an RFP and let ’em rip—I had to get them to think about how services should actually be delivered and about real-world constraints; many had a charmingly strong vision of the power and competence of volunteers. Others wanted to hire 30 staff people on an RFP that offered a $230,000 / year grant. Virtually all had to be taught to read between the lines. My Dad—and, these days, I—do that automatically.
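To make the budget mismatch concrete, here is a minimal back-of-the-envelope sketch in Python. Only the $230,000 grant figure and the 30 proposed hires come from the anecdote above; the variable names and the per-person division are my own illustration, not anything in the RFP.

    # Rough check of the proposal budget described above (illustrative only).
    annual_grant = 230_000   # dollars per year, the EOC grant figure mentioned above
    proposed_staff = 30      # number of hires some students wanted to fund

    per_person = annual_grant / proposed_staff
    print(f"${per_person:,.0f} per staff member per year")
    # Prints: $7,667 per staff member per year, and that is before rent,
    # supplies, travel, or indirect costs, never mind an actual salary.

Even without the code, the division makes the point: the money available per proposed hire falls far short of even a minimal salary, which is exactly the kind of between-the-lines reading the assignment was meant to teach.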

Slate says that, during writing,

the writer’s brain is juggling three things: the actual text, what you plan to say next, and—most crucially—theories of how your imagined readership will interpret what’s being written. A highly skilled writer can simultaneously be a writer, editor, and audience.

That’s basically what I’m describing above. It’s something that took me a long time to grow into, but now that ability to be writer, editor, and audience simultaneously exists. Even before it did, however, I used notes, outlines, and a miscellaneous file, and I doodled; sometimes I had, and have, a chunk of text that I know will fit in a particular spot, as long as I can find it, usually by digging through a miscellaneous file. In the novel I’m working on now, I’m still using two screens, as shown in the screenshots to the right. Note: because this is a work in progress, try not to read the text; it’s not particularly important what it says, and the conversation I was working on last night doesn’t make sense or have the same resonance out of context.

Anyway, as you can see, one screenshot shows my main window: I’m trying a program named Scrivener for the first time; it has a somewhat steep learning curve but is probably very useful for a novel with multiple speakers. The other shows a second, 23″ Dell monitor with a list of characters and a miscellaneous file where I drop notes, phrases, ideas, and so forth. I’m using Word for that file at the moment, but I’ve used Mellel and all manner of other writing programs for the purpose. Nothing even remotely sophisticated is happening on those screens, so the word processor doesn’t matter much.

I can go for long stretches without referencing the second monitor, depending on the situation. But the second monitor, if anything, helps me stay in active writing mode. If I get an idea tangential to the main thread that’s developing, I don’t need to do a conditional jump and then try to find my way back to the main narrative. I hit the miscellaneous file, dump a couple sentences on the idea, and return to the main workspace. Sometimes I will read a lot of sentences on the second screen, comparing them with ones on the first. I don’t think this makes me move into strict “reviewing/evaluating mode,” because that’s part of the way I imagine “how [my] imagined readership will interpret what’s being written.” This might be something that comes from skill.

I’ve gone on long enough about a minor point of contention. I’d like to tremendously agree with some of the other points made in Slate, like this:

Second, read everything, all the time. That’s the only way to build the general knowledge that you can tuck away in long-term memory, only to one day have it magically surface when you’re searching for just the right turn of phrase. And, lastly, the trickiest part of writing—from a cognitive perspective—is getting outside of yourself, of seeing your writing through the eyes of others.

When people ask me what they should do to become good writers, I tell them to read a lot and write a lot. And, ideally, find a good editor. It’s nice to see that “science” agrees. If you pay enough attention to writers and would-be writers, I think it becomes apparent that a lot of them don’t quite have enough knowledge to pull off what they’re trying to do—yet. In her interview with James Franco, Terry Gross says, “I think that every young writer or painter actually goes through that […] putting out everything inside them, but there isn’t much inside them yet because they’re young and unformed.” And Franco agrees that he experienced the problems, or possibilities, Gross describes.

I should also explain why the last word of my post title is “Don’t.” I put it there because you don’t learn to become a faster writer through some kind of trick that will make you magically produce text faster. You become a better writer through experience and through reading. Those aren’t things you can do in a day or a week or a month. They’re things you do over years. The only way to start if you haven’t already is to start now, especially since the greatest value in writing isn’t always in writing for other people. I’ve been rereading Mihaly Csikszentmihalyi’s book Flow: The Psychology of Optimal Experience, which was even better the second time around than the first (probably because now I have the background knowledge to really grok it). He says:

[I]t is never a waste to write for intrinsic reasons. First of all, writing gives the mind a disciplined means of expression. It allows one to record events and experiences so that they can be easily recalled, and relived in the future. It is a way to analyze and understand experiences, a self-communication that brings order to them.

“A disciplined means of expression” is available to anyone, even someone with no readers. Csikszentmihalyi gets that writing isn’t just about writing: “If the only point to writing were to transmit information, then it would deserve to become obsolete. But the point of writing is to create information, not simply to pass it along. [. . .] It is the slow, organically growing process of thought involved in writing that lets the ideas emerge in the first place.” It’s about generating ideas that emerge through an attempt to express those ideas (Paul Graham says something similar in The Age of the Essay). Given that writing is about itself, we shouldn’t be as worried about how fast we’re writing; as demonstrated in Flow, when we’re really writing well we often won’t have a sense of time, because we’ll be in a moment-to-moment existence in which our task demands complete concentration and little else matters. Doesn’t that sound better than merely getting words on the page? It sure does to me.

By the way, you shouldn’t valorize writers and writing too much, because writing can have strange effects on the mind. In Michael Chabon’s Wonder Boys, Grady Tripp describes “the midnight disease” that writers suffer from,

[…] which started out as a simple feeling of disconnection from other people, an inability to ‘fit in’ by no means unique to writers, a sense of envy and of unbridgeable distance like that felt by someone tossing a restless pillow in a world full of sleepers. Very quickly, though, what happened with the midnight disease was that you began actually to crave this feeling of apartness, to cultivate and even flourish within it. You pushed yourself farther and farther and farther apart until one black day you woke to discover that you yourself had become the chief object of your hostile gaze.

And I don’t think this is unique to writers: programmers, hackers, engineers, scientists, and others probably feel it too: all the people who, like Gollum in The Lord of the Rings, still desire to walk free under the sun even as they are compelled to return to darkness and solitude. The solitude is what it takes to do the work. But writers are writing about people, and it’s odd that one needs to get away from people in order to describe them, but it’s nonetheless true for many of us.

By the way, most of those delicious quotes come from DevonThink Pro, but they’re still evidence that I’ve done a certain amount of reading and thinking about writing, which is what enabled me to write this post in an hour or so (back to Slate: “It’s obviously a huge help to write about a subject you know well”). If I were 19 and writing this post, I simply wouldn’t have been able to write it. Not like this, anyway. If you look at the blog archives—I discourage it, but if you must, you must—and compare the early posts to the posts I write these days, there simply is no comparison. That’s because I’ve learned how to write blog posts effectively, or somewhat effectively, anyway. I’m capable of doing things now that I simply couldn’t do then. Want to really write faster? You can teach yourself how in ten years.

Are Moleskines pretentious? Yup. Guildhall Notebooks and Rhodia Webbies are worse

Someone found this blog through the search query “are moleskines pretentious”. The answer is so obvious (“yes”) that it worries me someone had to search for it. On the other hand, if you’re going to be a writer / artist / thinker type (see, for example, Rands in Repose for a hacker’s view), they’re pretty handy and probably worth the derisive, deserved stares and commentary you’ll get. Keep using them because you don’t know when a sentence will turn into a book. Or a post. Or something else important.

But I’m getting off topic, the topic being how both moleskine notebooks (in the sense of the cover material) and Moleskine™ Notebooks (in the sense of the massive conglomerate that markets such notebooks) are pretentious. It might be even worse to post about your dissatisfaction with recent Moleskines, along with pictures of the stack you’ve acquired over the years. At the moment, I’ve started using a Guildhall pocket notebook, which is a pain in the ass to find because they’re apparently discontinued (or so says their distributor, Exaclair). If you’re looking for one, start here. But for me, the real question is how well it’ll hold up after six to nine months of rigorous scientific testing, which consists of travel in my pocket, backpack, and so forth. Maybe no notebook can, but the older Moleskines seemed to survive quite nicely. We’ll see if the Guildhall does.

One reason using a Moleskine can seem, or be, pretentious is simple: you appear to be more worried about appearances than about what you’re actually doing with the notebook, and writing blog posts, even recursively self-aware blog posts, compounds this problem. I don’t have a solution to this aspect of the issue beyond a suggestion that you actually produce something (posts, novels, paintings, patches to the Linux kernel, hedge funds, etc.) that your notebook habit contributes to.

By the way: after an exhaustive study of notebooks, I’ve discovered that the Rhodia Webbie is optimal. It even beats a $70 handmade Japanese notebook that’s lovely but has overly thin paper. So if you’re looking for the right notebook, skip my persnickety, endless testing and go straight to the right one.
