Is there an actual Facebook crisis, or media narrative about Facebook crisis?

“Delay, Deny and Deflect: How Facebook’s Leaders Fought Through Crisis” just appeared in the New York Times, but the “crisis” seems imaginary: is there an actual crisis, outside the media narrative? Has Facebook seen a fall in monthly, weekly, or daily active users? That data would support a crisis narrative, but the best the article can do is, “its pell-mell growth has slowed.” Slowing growth makes sense for a company with two billion people using it; all companies eventually reach market saturation.

To me, the story reads a lot like a media narrative that has very little to do with users’ actual lives; I’ve been reading variations on “Why Facebook sucks” and “Why Facebook is doomed” for a very long time. It’s like the “Why this is the year of Linux on the desktop” genre, but for media companies.

Don’t get me wrong: I’m barely a Facebook user and I agree with much of the criticism. You can argue that Facebook is bad for reasons x, y, z, and I may even agree—but what I do, anecdotally, is less significant than what users do and want to do. As always, “revealed preferences” are useful: every time someone uses Facebook, that person is implicitly showing that they like Facebook more than not and find it valuable more than not. Aggregate those decisions together, and we see that there is no crisis. Facebook continues to grow. I personally think people should read more books and spend less time on Facebook, but I’m a literary boffin type person who would say the same of television. Lots of literary boffin type persons have had the same view of TV since TV came out—you should read more books and watch less TV—but, in the data, people didn’t watch less TV until quite recently, when Facebook started to replace TV.

The conventional media sources, including the NYT, don’t want to confront their own role in the 2016 election—the relentless focus on Clinton’s email server was insane. What should have been a footnote, at best, instead saw nearly wall-to-wall coverage. We don’t want to acknowledge that most people’s epistemological skill is low. Why look at ourselves, when we have this handy scapegoat right… over… there?

Facebook is a Girardian scapegoat for a media ecosystem that is unable or unwilling to consider its own role in the 2016 fiasco. With any media story, there are at least two stories: the story itself and the decision behind working on and publishing and positioning that particular story. The second story is very seldom discussed by journalists and media companies themselves, but it’s an important issue in itself.

In a tweet, Kara Swisher wrote that Zuckerberg is “unkillable, unfireable and untouchable.” I disagree: users can fire him whenever they want. Swisher had a good retort: “Remember aol.” Still, large, mature markets behave differently than small, immature markets: in 1900, there were many car companies. By 1950, only a few were left. Market size and market age both matter. Facebook reportedly has two billion users, a substantial fraction of the entire human population. It has survived Google+ and its users have demonstrated that they love spending time online. Maybe they’ll find an alternate way to do it (again, I’m not personally a big Facebook user), but if they do, I don’t think it’ll be because of the 5000th media scare story about Facebook. So far, I’ve read zero media stories that cite René Girard and the scapegoating mechanism: I don’t think the media understands itself right now.

Concern trolling, competition, and “Facebook Made Me Do It”

In “Facebook Made Me Do It,” Jenna Wortham says that she was innocently browsing Instagram and saw

a photo of my friend in a hotel room, wearing lime green thong underwear and very little else. It was scandalous, arguably over the top for a photo posted in public where, in theory, anyone who wanted would be able to see it. But people loved it. It had dozens of likes as well as some encouraging comments.

Of course it had dozens of likes and some encouraging comments: as should be obvious, a lot of men like seeing nude and semi-nude women. So do a lot of women; I read the quoted section to my fiancée and she said, “they like it because it’s hot.”

No shit.

So why does Wortham use language that lightly chastises the anonymous thong-wearer-and-poster? What do “arguably over the top” and “scandalous” mean here? Perhaps in 1890 it was scandalous to see women in their underwear. Today one sees women effectively in their underwear on beaches, in catalogs, on billboards, and, not uncommonly, on the Internet.

Since it’s not actually a scandal to see a woman in a thong and “arguably over the top” doesn’t really say anything, I think there are separate, unstated reasons related to competition and to a term coined by the Internet: “concern trolling.”

A concern troll is

A person who lurks, then posts, on a site or blog, expressing concern for policies, comments, attitudes of others on the site. It is viewed as insincere, manipulative, condescending.

In this case, it happens on the Internet, and Wortham is expressing faux concern about a friend, when she’s really saying that a) she doesn’t like that the friend can take a shortcut to Instagram fame and attention through posting hot lingerie shots and b) she doesn’t like the friend as a sexual competitor. A friend who does or says something more sexually adventurous than the observer or writer is “over the top” because she’s a competitor; a friend who is less adventurous is uptight. Those kinds of words and phrases only make sense relative to the person using them, and they’re both used to derogate rivals, just in different ways.

Wortham doesn’t want to say as much, however, for an innocuous reason—she only has so many words available, since she writes in the New York Times instead of on a blog—and for a less salubrious reason: she wants readers to believe that she’s writing from the voice of God, or the arbiter of culture, or something like that, and that she holds widely shared views on community standards that the friend in the hotel room should uphold. If she admitted that the views she’s espousing are really her own, and that they reflect sexual and attention competition in the form of concern trolling, she’d lose that authority.

There’s a term of art that describes Wortham’s problem: “Don’t hate the player—hate the game.” Wortham is, in a highbrow and subtle way, hating the player.

The concern trolling continues later in the article, when Wortham quotes a professor saying, “The fact that the world is going to see you increases the risks you are willing to take.” But there’s no evidence cited for this claim, and, moreover, in the context of the article it’s possible to substitute “fun you’re going to have” for “risks you are willing to take.” Given a choice between inviting Wortham or her friend who posts herself to Instagram in a green thong to a party, I know whom I’m going to invite.

The Facebook Eye and the artist’s eye

“We are increasingly aware of how our lives will look as a Facebook photo, status update or check-in,” according to Nathan Jurgenson in “The Facebook Eye,” and the quote stood out not only because I think it’s true, but because this kind of double awareness has long been characteristic of writers, photographers, artists, and professional videographers. Now it’s simply being disseminated through the population at large.

I’m especially aware of this tendency among writers, and in my own life I even encourage and cultivate it by carrying around a notebook. Now, a notebook obviously doesn’t have the connectivity of a cell phone, but it does still encourage a certain performative aspect, and a readiness to harvest the material of everyday life in order to turn it into art. Facebook probably isn’t art—at least to me it isn’t, although I can imagine some people arguing that it is—and I think that’s the key difference between the Facebook Eye and what artists are doing and have been doing for a very long time. I’ve actually been contemplating and taking notes on a novel about a photographer who lives behind his (potentially magic) camera instead of in the moment, and that might be part of the reason why I’m more cognizant of the feeling being expressed.

Anyway, Michael Lewis recently gave an NPR interview about his Obama article (which is worth reading on its own merits, and, like Tucker Max’s “What it’s like to play basketball with Obama,” uses the sport as a way of drawing larger conclusions about Obama’s personality and presidency). In the interview, Lewis sees Obama as having that writer’s temperament, and even says that “he really is, at bottom, a writer,” and goes on to say Obama is “in a moment, and not in a moment at the same time.” Lewis says Obama can be “in a room, but detach himself at the same time,” and he calls it “a curious inside-outside thing.” As I indicated, I don’t think this is unique to writers, although it may be more prevalent or pronounced in writers. Perhaps that’s why writers love great art and, in some ways, sex, more than normal people: both offer a way into living in the present. If writers are more predisposed towards alcoholism—I’m not sure if they are or not, though many salient examples spring to mind—getting out of the double perspective might be part of the reason why.

I think the key differences between what I do, with a notebook, and what Facebook enables via phones, are distance and perspective. My goal isn’t to have an instantaneous audience for the fact that I just did Cool Activity X. Whatever may emerge from what I’m observing is only going to emerge in a wholly different context that obscures its origins as a conversation, a snatch of overheard dialogue, a thing read in a magazine, or an observation from a friend. The lack of immediacy means that I don’t think I’m as immediately performative in most circumstances.

But the similarities remain: Jurgenson writes that “my concern is that the ultimate power of social media is how it burrows into us, our minds, our consciousness, changing how we consciously experience the world even when logged off.” And I think writing and other forms of art do the same thing: they “burrow into us,” like parasites that we welcome, and change the way we experience the world.

Still, the way we experience the world has probably been changing continuously throughout human history. The very idea of “human history” is relatively recent: most hunter-gatherers didn’t have it, for example. The changes Facebook (and its analogues; I’m only using Facebook as a placeholder for a broader swath of technologies) is bringing seem new, weird, and different because they are, obviously, new. For all I know, most of my students already have the Facebook Eye more than any other kind of eye or way of being. This has its problems, as William Deresiewicz points out in “Solitude and Leadership,” but presumably people who watch with the Facebook Eye are getting something—even a very cheap kind of fame—out of what they do. And writers generally want fame too, regardless of what they say—if they didn’t, they’d be silent.

I think the real problem is that artists become aware of their double consciousness, while most normal people probably aren’t—they just think of it as “normal.” But then again, very few of us probably contemplate how “normal” changes by time and place in general.


Thanks to Elena for sending me “The Facebook Eye”.

Facebook and cellphones might be really bad for relationships

There’s some possibly bogus research about “How your cell phone wrecks your relationships — even when you’re not using it.” I say “possibly bogus” because these kinds of social science studies are notoriously unreliable and unreproducible.* Nonetheless, this one reinforces some of my pre-existing biases and is congruent with things that I’ve observed in my own life and the lives of friends, so I’m going to not be too skeptical of its premises and will instead jump into uninformed speculation.

It seems like cell phones and Facebook cordon a large part of your life from your significant other (assuming you have one or aspire to have one) and encourage benign-seeming secrecy in that other part of your life. In the “old days,” developing friendships or quasi-friendships with new people required face-to-face time, or talking on the phone (which, at home, was easily enough overheard), or writing letters (which are slow, and which many people aren’t good at or don’t enjoy). Now, you can be developing new relationships with other people while your significant other is in the same room, and the significant other won’t know about the relationship happening via text message. You can also solicit instant attention, especially by posting provocative pictures or insinuating song lyrics, while simultaneously lying to yourself about what you’re doing in a way that would be much harder without Facebook and cell phones.

Those new relationships start out innocently, only to evolve, out of sight, into something more. Another dubious study made the rounds of the Internet a couple months ago, claiming that Facebook was mentioned in a third of British divorce petitions. Now, it’s hard to distinguish correlation from causation here—people with bad relationships might be more attached to their phones and Facebook profiles—but it does seem like Facebook and cellphones enable behavior that would have been much more difficult before they became ubiquitous.

I don’t wish to pine for a mythical golden age, which never existed anyway. But it is striking, how many of my friends’ and peers’ relationships seem to founder on the shoals of technology. Technology seems to be enabling a bunch of behaviors that undermine real relationships, and, if so, then some forms of technology might be pushing us towards shorter, faster relationships; it might also be encouraging us to simply hop into the next boat if we’re having trouble, rather than trying to right the boat we’re already in. Facebook also seems to encourage a “perpetual past,” by letting people from the past instantly and quietly “re-connect.” Sometimes this is good. Sometimes less so. How many married people want their husband or wife chatting again with a high school first love? With a summer college flame? With a co-worker discussing intimate details of her own failing relationship?

Perhaps relationship norms will evolve to discourage the use of online media (“Are we serious enough to deactivate each other’s Facebook accounts?” If the answer is “no,” then we’re not serious and, if I’m looking for something serious, I should move on). Incidentally, I don’t think blogs have the same kind of effect; this blog, for instance, is reasonably popular by the standards of occasional bloggers, and has generated a non-zero number of groupies, but the overall anonymity of readers (and the kind of content I tend to post) in relation to me probably put a damper on the kinds of relationship problems that may plague Facebook and cell phones.

EDIT: See also “I’m cheating on you right now: An admiring like on your Facebook page. A flirty late-night text. All while my partner’s right there next to me,” which mentions, unsurprisingly:

A study in 2013 at the University of Missouri surveyed 205 Facebook users aged 18–82 and found that “a high level of Facebook usage is associated with negative relationship outcomes” such as “breakup/divorce, emotional cheating, and physical cheating.”

Again, I want to maintain some skepticism and am curious about studies that don’t find a difference and thus aren’t published. But some research does match my own anecdotal impressions.


* If you’d like to read more, “Scientific Utopia: II – Restructuring Incentives and Practices to Promote Truth Over Publishability” is a good place to start, though it will strike horror in the epistemologist in you. Or, alternately, as Clay Shirky points out in Cognitive Surplus: “[…] our behavior contributes to an environment that encourages some opportunities and hinders others.” In the case of cell phones and Facebook, I think the kinds of behaviors encouraged are pretty obvious.

Facebook, go away—if I want to log in, I know where to find you

Facebook keeps sending me e-mails about how much I’m missing on Facebook; see the image at the right for one example. But I’m not convinced I’m missing anything, no matter how much Facebook wants me to imagine I am.

In “Practical Tips on Writing a Book from 23 Brilliant Authors,” Ben Casnocha says that writers need to “Develop a very serious plan for dealing with internet distractions. I use an app called Self-Control on my Mac.” Many other writers echo him. We have, all of us, a myriad of choices every day. We can choose to do something that might provide some lasting meaning or value. Or we can choose to tell people who are often effectively strangers what we ate for dinner, or that we’re listening to Lynyrd Skynyrd and Lil’ Wayne, or our ill-considered, inchoate opinions about the political or social scandal of the day, which will be forgotten by everybody except Wikipedia within a decade, if not a year.

Or we can choose to do something better—which increasingly means we have to control distractions—or, as Paul Graham puts it, “disconnect” them. Facebook and other entities that make money from providing distractions are, perhaps not surprisingly, very interested in getting you more interested in their distractions. That’s the purpose of their e-mails. But I’ve become increasingly convinced that Facebook offers something closer to simulacra than real life, and that the people who are going to do something really substantial are, increasingly, going to be the people who can master Facebook—just as the people who did really substantial things in the 1960 – 2005 period learned to master TV.

Other writers in the “Practical Tips” essay discuss the importance of setting work times (presumably distinct from Facebook times) or developing schedules or similar techniques to make sure you don’t let, say, six hours pass, then wonder what happened during those six hours—probable answers might include news, e-mail, social networks, TV, dabbling, rearranging your furniture, cleaning, whatever. All things that might be worthwhile, but only in their place. And Facebook’s place should be small, no matter how much the site itself encourages you to make it big. I’ll probably log on to Facebook again, and I’m not saying you should never use Facebook, or that you should always avoid the Internet. But you should be cognizant of what you’re doing, and Facebook is making it increasingly easy not to be cognizant. And that’s a danger.

I was talking to my Dad, who recently got on Facebook—along with Curtis Sittenfeld joining, this is a sure sign Facebook is over—and he was creeped out by having Pandora find his Facebook account with no active effort on his part; the same thing happened when he was posting to TripAdvisor under what he thought was a pseudonym. On the phone, he said that everyone is living in Neuromancer. And he’s right. Facebook is trying to connect you in more and more places, even places you might not necessarily want to be connected. This isn’t a phenomenon unique to Facebook, of course, but my Dad’s experience shows what’s happening in the background of your online life: companies are gathering data from you that will reappear in unpredictable places.

There are defenses against the creeping power of master databases. I’ve begun using Ghostery, a brilliant extension for Firefox, Safari, and Chrome that lets one see web bugs, beacons, and third-party sites that follow your movements around the Internet. Here’s an example of the stuff Salon.com, a relatively innocuous news site, loads every time a person visits:

What is all that stuff? It’s like the mystery ingredients in so much prepackaged food: you wonder what all those polysyllabic substances are but still know, on some level, they can’t be good for you. In the case of Salon.com’s third-party tracking software, Ghostery can at least tell you what’s going on. It also gives you a way to block a lot of the tracking—hence the strikethroughs on the sites I’ve blocked. The more astute among you will note that I’m something of a hypocrite when it comes to a data trail—I still buy stuff from Amazon.com, which keeps your purchase history forever—but at least one can, to some extent, fight back against the companies who are tracking everything you do.

But fighting back technologically, through means like Ghostery, is only part of the battle. After I began writing this essay, I began to notice things like this, via a Savage Love letter writer:

I was briefly dating someone until he was a huge asshole to me. I have since not had any contact with him. However, I have been Facebook stalking him and obsessing over pictures of the guys I assume he’s dating now. Why am I having such a hard time getting over him? Our relationship was so brief! He’s a major asshole!

I don’t think Facebook is making it easier for the writer to get over him or improve her life. It wouldn’t be a great stretch to think Facebook is making the process harder. So maybe the solution is to get rid of Facebook, or at least limit one’s use, or unfriend the ex, or some combination thereof. Go to a bar, find someone else, reconnect with the real world, find a hobby, start a blog, realize that you’re not the first person with these problems. Optimal revenge, if you’re the sort of person who goes in that direction, is a life well-lived. Facebook stalking is the opposite: it’s a life lived through the lives of others, without even the transformative power of language that media like the novel offer.

Obviously, obsessive behavior predated the Internet. But the Internet and Facebook make it so much easier to engage in obsessive behavior—you don’t even have to leave your house!—that the lower friction costs make the behavior easier to indulge. One solution: remove the tool by which you engage in said obsessive behavior. Dan Savage observes, “But it sounds like you might still have feelings for this guy! Just a hunch!” And if those feelings aren’t reciprocated, being exposed to the source of those feelings on a routine basis, even in digital form, isn’t going to help. What is going to help? Finding an authentic way of spending your time; learning to get in a state of flow; building or making stuff that other people find useful. Notice that Facebook is not on that list.

Some of you might legitimately ask why I keep a Facebook account, given my ambivalence, verging on antipathy. There are several answers: the most honest is probably that I’m a hypocrite. The next-most honest is that, if / when my novels start coming out, Facebook might be useful as an ad tool. And some people use Facebook and only Facebook to send out messages about events and parties. It’s also useful for figuring out, when I’m going to a random city, who might’ve moved there. Those people you lost touch with back in college suddenly become much closer when you’re both strangers somewhere.

But those are rare needs. The common needs that Facebook fulfills—to quasi-live through someone else’s life, to waste time, to feel like you’re on an anhedonic treadmill of envy—shouldn’t be needs at all. Facebook is encouraging you to make them needs. I’m encouraging you to realize that the real answers to life aren’t likely to be found on Facebook, no matter how badly Facebook wants to lure you to that login screen—they’re likely going to be found within.


By the way, I love “Practical Tips on Writing a Book from 23 Brilliant Authors.” I’ve read it a couple times and still love it. It’s got a lot of surface area for such a short post, which is why I keep linking to it in various contexts.

From the Department of “No Shit:” technology and computers are not silver bullets for education

File this New York Times article under “no shit:”

But to many education experts, something is not adding up — here and across the country. In a nutshell: schools are spending billions on technology, even as they cut budgets and lay off teachers, with little proof that this approach is improving basic learning.

This conundrum calls into question one of the most significant contemporary educational movements.

There is no silver bullet and no New Jesus for education. There never will be, but the search goes on, because it’s easier to search for the magic methodology that will solve seemingly intractable problems than it is to admit the thing Google, Facebook, and Microsoft have realized about software engineers: talent, motivation, and tenacity vary greatly among individuals, and you can’t merely take someone who lacks all three, put them through some kind of system or give them some kind of tool, and expect everyone to be equally good on the other side. That kind of thing works passably well if you’re building widgets on an assembly line, but it works terribly for any kind of creative or intellectual work.

Nonetheless, it’s much easier to search for that magic methodology to improve very old skills that are surprisingly resistant to methodology: reading, writing, and math. None of those fundamental skills has changed much in the last century. Yet we keep searching for that formula that doesn’t exist, because teaching and learning are inherently hard, like software engineering, math, writing, and any number of other complex but vital skills. Technologies—now we have special blackboards! They blink! They light up! They’re new!—might mask the essential difficulty of the task, but they can’t remove it, much as some programming IDEs try to hide some of the essential difficulty of coding. But fields that are essentially difficult can’t be mastered by sleight-of-hand or new whizzy gadgets.

They can only be mastered by people who are dedicated to the craft and to continuous self-improvement. The people who, because they believe they can, can. The ones whose tenacity is boundless and who aren’t willing to blame external circumstances.

You need a weird set of skills to teach effectively: you need to empathize with your students—to be able to see from their point of view—without becoming mired in their point of view. You need to master your field, but not in such a way that you lose the beginner’s mindset required to master the field in the first place. You need the stoic’s attitude of realizing you can’t control everything while still having the achiever’s mindset that you must strive to do the best you can, no matter what. You need to be willing to try new things and ideas while not leaving behind the old ones that work. You need to remember not everyone is interested in the things you’re interested in, and you need to do whatever it takes to make subjects more interesting than they would be otherwise. You need to find that profitable zone of challenge for most students—something hard enough to make them struggle but not so hard that it’s impossible to accomplish. It’s reasonable to expect college freshmen to be able to read a story or article on their own, but it’s not reasonable to expect them to pick up and digest every nuance on their own. Some will. Most won’t. You need to be enthusiastic, because enthusiasm is as contagious as boredom, but your job isn’t to be a cheerleader and enthusiasm can’t substitute for knowledge. You need, in other words, a bunch of paradoxical traits that balance each other.

You also need to realize that students need things broken down in steps, and need to learn by example and through discussion. Last week I taught Neal Stephenson’s 2005 New York Times opinion piece, “Turn On, Tune In, Veg Out.” Whenever I do, I let the class talk for a while at the beginning; when discussion dies, I ask students to do a simple activity: write the essay’s main point in a sentence or two. Then I come around and look at the sentences.

It should be simple, right? Read the piece, find the main point. But it’s not simple. It’s actually quite hard, and most people are bad readers (myself included). When I go around and look at sentences, lots of students get caught up on the distinction between geeking and vegging out. Others think the piece is primarily about Star Wars. Only a few—usually around five of fifty—get the essential elements of the main point.

Stephenson basically says, twice, that he’s using Star Wars as a metaphor: once in the third paragraph: “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” and once more in the last paragraph: “If the “Star Wars” movies are remembered a century from now, it’ll be because they are such exact parables for this state of affairs” (emphasis added). But most students haven’t learned how to think metaphorically, as writers do. Metaphor is one of those essential ways of thinking that people need to be effective writers. In On Writing Stephen King says:

The use of simile and other figurative language is one of the chief delights of fiction—reading it and writing it, as well. When it’s on target, a simile delights us in much the same way meeting an old friend in a crowd of strangers does. By comparing two seemingly unrelated objects—a restaurant bar and a cave, a mirror and a mirage—we are sometimes able to see an old thing in a new and vivid way. Even if the result is mere clarity instead of beauty, I think writer and reader are participating together in a kind of miracle. Maybe that’s drawing it a little strong, but yeah—it’s what I believe.

In How Fiction Works, James Wood says:

Metaphor is analogous to fiction, because it floats a rival reality. It is the entire imaginative process in one move. If I compare the slates on a roof to an armadillo’s back, or – as I did earlier – the bald patch on the top of my head to a crop circle (or on very bad days, to the kind of flattened ring of grass that a helicopter’s blades make when it lands in a field), I am asking you to do what Conrad said fiction should make you do – see. I am asking you to imagine another dimension, to picture likeness. Every metaphor or simile is a little explosion of fiction within the larger fiction of the novel or story.

Again: that’s hard. And technology isn’t going to make it any easier to start thinking about metaphors, which is probably a precursor to writing in a way that uses metaphor deftly. Before you can do that, you’re probably going to need to recognize when other writers are doing it, and yet, while Stephenson says that he is twice, most students don’t pick up on it. This isn’t to blame them, by the way—a lot of my graduate seminars are still about what the writer actually says. Some of you are probably getting caught up on this discussion of metaphor and think that I’m really writing about how it’s important for students to learn, when this is only a subsidiary point supporting my main point about the place of technology in classrooms. Here’s Wood again on the subject of learning to read:

You only have to teach literature to realise that most young readers are poor noticers. I know from my own old books, wantonly annotated twenty years ago when I was a student, that I routinely underlined for approval details and images and metaphors that strike me now as commonplace, while serenely missing things which now seem wonderful. We grow, as readers, and twenty-year-olds are relative virgins.

If he wasn’t a good noticer at 20, what hope is there for the rest of us? And how is having a laptop going to help someone become a better noticer? Consider too one other thing to notice: in “Turn On, Tune In, Veg Out,” Stephenson isn’t using any complex or convoluted vocabulary. His sentence structure isn’t very complex; there aren’t lots of nasty nested clauses you have to mentally sort out to figure out what’s being talked about, as there often are in abstruse literary theory and philosophy. His piece isn’t hard to read. But it’s still evidently hard for many freshmen to understand. So I spend a lot of time working towards understanding, towards reading for detail, towards asking, “Where do you see that?” Technology isn’t going to help that process very much. It may even hurt it by offering a proliferating number of distractions: if you interrupt your reading of “Turn On, Tune In, Veg Out” four times for text messages and once for an e-mail, are you going to remember how Stephenson said “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” by the end?

I’m teaching honors students, which is easier than teaching standard classes, which is in turn easier than teaching in tough inner-city schools. So I don’t face the same challenges as some of the teachers mentioned in the NYT article. But sometimes I think about Daniel Singal’s Atlantic article, “The Other Crisis in American Education: A college professor looks at the forgotten victims of our mediocre educational system–the potentially high achievers whose SAT scores have fallen, and who read less, understand less of what they read, and know less than the top students of a generation ago.” As the subtitle implies, he argues that the best students aren’t as challenged as they once were. I can’t tell if he’s right or if he’s hearkening back to a mythical golden age, but I do think about his work sometimes when I see what’s going on around me: other grad students and professors want to watch movies in class, or they aren’t even focused on imparting and enhancing basic reading and writing skills—the same ones pretty much everyone needs. Are the strongest students really getting something out of their classes? Is the technology really helping? If not, could it be part of what’s actually causing “our mediocre educational system?” I’m not saying it does, but I am saying it’s worth pondering.

Still, I think the strongest thinkers and learners—the ones who are now running Google and Facebook, the ones who are now partners in law firms and building their own businesses—are doing fine. Better than ever, maybe. Generation X was supposed to be the slacker generation, but its members built large blocks of the Internet—the same set of technologies you’re almost certainly using to read this. But I wonder if there’s not a growing bifurcation between the people who are doing really well and the ones who aren’t. In income terms, that’s certainly true, but I wonder if it’s happening in intellectual terms too. Stephenson thinks so: “Nothing is more seductive than to think that we, like the Jedi, could be masters of the most advanced technologies while living simple lives: to have a geek standard of living and spend our copious leisure time vegging out.” But the people who spend all that time vegging out aren’t going to create whatever the next iPod and Facebook will be. And they won’t reap those rewards, either. They’re the ones who might be “learning less,” as Singal has it. The people who make the next iPod and Facebook will be the one who are focused on “geeking out” regarding important topics. The ones who will, I hope, have teachers—whether in honors or not—who are focused on the essential questions that imparting knowledge involves.

By the way, I’m not trying to beat up college freshmen—if I were, I wouldn’t have the empathy necessary to be good. A lot of college seniors are little better than my freshmen, which I found out by working for Steven Klein at the Steven Klein LSAT Company. The LSAT is mostly a test of reading. If you can read effectively, you’ll do pretty well. But a lot of 22 – 24-year-old college graduates had a lot of trouble on reading comprehension because they couldn’t or hadn’t been trained to look at every word, evaluate it in relation to other words and in relation to the context of the passage, and understand what it means. I think back to those experiences when I read books like Richard Arum and Josipa Roksa’s Academically Adrift: Limited Learning on College Campuses or articles like the one about cool whizzy tech stuff in classrooms. The whizzy tech stuff isn’t going to help readers when they’re facing the LSAT.

A second “by the way” is in order: I’m neither trying to denigrate technology nor be a luddite—I say so as the guy typing on a fancy keyboard, ergonomic chair, and 27″ iMac, with a bunch of Textmate windows open. Computers make the mechanical process of writing easier, so that the hard stuff—the stuff that goes on in the mind—can dominate. Technology is great—in its place. The University of Arizona has computers and projectors and other neat stuff in many classrooms, and if that neat stuff is available, I use it.

But technology complements other skills; it doesn’t substitute for them. You can only use computers effectively to the extent you can read, write, and do simple math effectively—try programming without algebra. Or try to extract information from man pages without strong reading comprehension skills; hell, I like to imagine myself as being at least moderately literate, and I find some of them tough. So this is not one of those tedious essays in which Old Man Withers shakes his cane and complains about the kids with those damn beeping gizmos and sending those darned pictures of each others’ privates around and get off my damn lawn. Plus, I’m too young to shake my cane; I ran a modest but real number of miles yesterday. Even when I do have a cane someday, I hope that it 1) has a hidden sword, because that kind of thing is cool and 2) that I haven’t ossified to the point where I’m not willing to learn new things.

But this is an essay that points out how basic skills and the means of imparting those basic skills haven’t changed so much, as Amanda Ripley’s Atlantic article, “What Makes a Great Teacher?” makes clear in its discussion of what great teachers do:

First, great teachers tended to set big goals for their students. They were also perpetually looking for ways to improve their effectiveness. [. . .] Superstar teachers had four other tendencies in common: they avidly recruited students and their families into the process; they maintained focus, ensuring that everything they did contributed to student learning; they planned exhaustively and purposefully—for the next day or the year ahead—by working backward from the desired outcome; and they worked relentlessly, refusing to surrender to the combined menaces of poverty, bureaucracy, and budgetary shortfalls.

Notice the thing absent from this list: use computers, iPads, and so forth. Sure, those great teachers could use technology, but they don’t need to. And the technology is not going to automatically make an indifferent teacher set big goals or recruit families or maintain focus or plan. Used poorly, it’s just going to provide some flash and pizazz and some distractions. Check out this Marginal Revolution discussion of a study looking at how introducing computers in poor households actually decreased student grades because students spent more time playing games on them than doing homework:

Not surprisingly, with all that game playing going on, the authors find that the voucher program actually resulted in a decline in grades although there was also some evidence for an increase in computer proficiency and perhaps some improvement in a cognitive test.

See also “Computers at Home: Educational Hope vs. Teenage Reality:” “Students posted significantly lower math test scores after the first broadband service provider showed up in their neighborhood, and significantly lower reading scores as well when the number of broadband providers passed four.” These reports should give technology cheerleaders pause: you aren’t going to get better results simply by lashing a computer on a teacher’s back and telling him to use it.*

To be a good teacher, you still need that weird skill- and mindset mentioned above. If you don’t have it or aren’t willing to develop it, I doubt anything else imposed on an individual teacher from the outside, like mandates to use technology, are going to do much for that teacher or for his or her students. If you want to really improve teaching, you’ll need to take an approach similar to the one Facebook and Google take to hiring hackers, which means a relentless focus not on degrees that offer dubious value in predicting achievement but on finding the best people and making sure they stay. Finding the best teachers is different from finding programmers—you probably can’t tell who’s going to be a good teacher before they hit the classroom—but you can at least acknowledge that you’re not going to get people who are good merely by saying, “use iPads in the classroom.” Steve Jobs and Bill Gates didn’t have iPads in their classrooms growing up, and maybe that’s part of what made Jobs able to have the vision necessary to Turn On, Geek Out, and make the iPad.


* I had a computer in middle and early high school that I used to master Starcraft and various other computer games, until I somehow realized I was wasting my life and smashed my Starcraft disks in the driveway. I sometimes use this analogy when I explain the situation to friends: some people can handle snorting the occasional line of coke without getting addicted; it’s just a fun way of spending a Saturday night. Some people can handle computer games in the same way. I discovered, at the time, that I’m not one of them, and, worse, I’ll never get those three or so wasted years back. Now I tend to find video games boring on average and can’t play for longer than half an hour to an hour at a stretch, while I’ve trained myself up to being able to write effectively for three to six hours at a time. The first draft of this essay, for example, took me about two hours.

From the Department of "No Shit:" technology and computers are not silver bullets for education

File this New York Times article under “no shit”:

But to many education experts, something is not adding up — here and across the country. In a nutshell: schools are spending billions on technology, even as they cut budgets and lay off teachers, with little proof that this approach is improving basic learning.

This conundrum calls into question one of the most significant contemporary educational movements.

There is no silver bullet and no New Jesus for education. There never will be, but the search goes on, because it’s easier to search for the magic methodology that will solve seemingly intractable problems than it is to admit the thing Google, Facebook, and Microsoft have realized about software engineers: talent, motivation, and tenacity vary greatly among individuals, and you can’t merely take someone who lacks all three, put them through some kind of system or give them some kind of tool, and expect everyone to be equally good on the other side. That kind of thing works passably well if you’re building widgets on an assembly line, but it works terribly for any kind of creative or intellectual work.

Nonetheless, it’s much easier to search for that magic methodology than to improve very old skills that are surprisingly resistant to methodology: reading, writing, and math. None of those fundamental skills has changed much in the last century. Yet we keep searching for a formula that doesn’t exist, because teaching and learning are inherently hard, like software engineering, math, writing, and any number of other complex but vital skills. Technologies—now we have special blackboards! They blink! They light up! They’re new!—might mask the essential difficulty of the task, but they can’t remove it, much as some programming IDEs try to hide some of the essential difficulty of coding. But fields that are essentially difficult can’t be mastered by sleight-of-hand or whizzy new gadgets.

They can only be mastered by people who are dedicated to the craft and to continuous self-improvement. The people who, because they believe they can, can. The ones whose tenacity is boundless and who aren’t willing to blame external circumstances.

You need a weird set of skills to teach effectively: you need to empathize with your students—to be able to see from their point of view—without becoming mired in their point of view. You need to master your field, but not in such a way that you lose the beginner’s mindset required to master the field in the first place. You need the stoic’s attitude of realizing you can’t control everything while still having the achiever’s mindset that you must strive to do the best you can, no matter what. You need to be willing to try new things and ideas while not leaving behind the old ones that work. You need to remember not everyone is interested in the things you’re interested in, and you need to do whatever it takes to make subjects more interesting than they would be otherwise. You need to find that profitable zone of challenge for most students—something hard enough to make them struggle but not so hard that it’s impossible to accomplish. It’s reasonable to expect college freshmen to be able to read a story or article on their own, but it’s not reasonable to expect them to pick up and digest every nuance on their own. Some will. Most won’t. You need to be enthusiastic, because enthusiasm is as contagious as boredom, but your job isn’t to be a cheerleader, and enthusiasm can’t substitute for knowledge. You need, in other words, a bunch of paradoxical traits that balance each other.

You also need to realize that students need things broken down in steps, and need to learn by example and through discussion. Last week I taught Neal Stephenson’s 2005 New York Times opinion piece, “Turn On, Tune In, Veg Out.” Whenever I do, I let the class talk for a while at the beginning; when discussion dies, I ask students to do a simple activity: write the essay’s main point in a sentence or two. Then I come around and look at the sentences.

It should be simple, right? Read the piece, find the main point. But it’s not simple. It’s actually quite hard, and most people are bad readers (myself included). When I go around and look at sentences, lots of students get caught up on the distinction between geeking and vegging out. Others think the piece is primarily about Star Wars. Only a few—usually around five of fifty—get the essential elements of the main point.

Stephenson basically says, twice, that he’s using Star Wars as a metaphor: once in the third paragraph: “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” and once more in the last paragraph: “If the “Star Wars” movies are remembered a century from now, it’ll be because they are such exact parables for this state of affairs” (emphasis added). But most students haven’t learned how to think metaphorically, as writers do. Metaphor is one of those essential ways of thinking that people need to be effective writers. In On Writing Stephen King says:

The use of simile and other figurative language is one of the chief delights of fiction—reading it and writing it, as well. When it’s on target, a simile delights us in much the same way meeting an old friend in a crowd of strangers does. By comparing two seemingly unrelated objects—a restaurant bar and a cave, a mirror and a mirage—we are sometimes able to see an old thing in a new and vivid way. Even if the result is mere clarity instead of beauty, I think writer and reader are participating together in a kind of miracle. Maybe that’s drawing it a little strong, but yeah—it’s what I believe.

In How Fiction Works, James Wood says:

Metaphor is analogous to fiction, because it floats a rival reality. It is the entire imaginative process in one move. If I compare the slates on a roof to an armadillo’s back, or – as I did earlier – the bald patch on the top of my head to a crop circle (or on very bad days, to the kind of flattened ring of grass that a helicopter’s blades make when it lands in a field), I am asking you to do what Conrad said fiction should make you do – see. I am asking you to imagine another dimension, to picture likeness. Every metaphor or simile is a little explosion of fiction within the larger fiction of the novel or story.

Again: that’s hard. And technology isn’t going to make it any easier to start thinking about metaphors, which is probably a precursor to writing in a way that uses metaphor deftly. Before you can do that, you’re probably going to need to recognize when other writers are doing it, and yet, although Stephenson says twice that he’s doing so, most students don’t pick up on it. This isn’t to blame them, by the way—a lot of my graduate seminars are still about what the writer actually says. Some of you are probably getting caught up on this discussion of metaphor and thinking that I’m really writing about how important it is for students to learn it, when this is only a subsidiary point supporting my main point about the place of technology in classrooms. Here’s Wood again on the subject of learning to read:

You only have to teach literature to realise that most young readers are poor noticers. I know from my own old books, wantonly annotated twenty years ago when I was a student, that I routinely underlined for approval details and images and metaphors that strike me now as commonplace, while serenely missing things which now seem wonderful. We grow, as readers, and twenty-year-olds are relative virgins.

If he wasn’t a good noticer at 20, what hope is there for the rest of us? And how is having a laptop going to help someone become a better noticer? Consider too one other thing to notice: in “Turn On, Tune In, Veg Out,” Stephenson isn’t using any complex or convoluted vocabulary. His sentence structure isn’t very complex; there aren’t lots of nasty nested clauses you have to mentally sort out to figure out what’s being talked about, as there often are in abstruse literary theory and philosophy. His piece isn’t hard to read. But it’s still evidently hard for many freshmen to understand. So I spend a lot of time working towards understanding, towards reading for detail, towards asking, “Where do you see that?” Technology isn’t going to help that process very much. It may even hurt it by offering a proliferating number of distractions: if you interrupt your reading of “Turn On, Tune In, Veg Out” four times for text messages and once for an e-mail, are you going to remember by the end that Stephenson said, “Twenty-eight years later, the vast corpus of ‘Star Wars’ movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them”?

I’m teaching honors students, which is easier than teaching standard classes, which is in turn easier than teaching in tough inner-city schools. So I don’t face the same challenges as some of the teachers mentioned in the NYT article. But sometimes I think about Daniel Singal’s Atlantic article, “The Other Crisis in American Education: A college professor looks at the forgotten victims of our mediocre educational system–the potentially high achievers whose SAT scores have fallen, and who read less, understand less of what they read, and know less than the top students of a generation ago.” As the subtitle implies, he argues that the best students aren’t as challenged as they once were. I can’t tell if he’s right or if he’s hearkening back to a mythical golden age, but I do think about his work sometimes when I see what’s going on around me: other grad students and professors want to watch movies in class, or they aren’t even focused on imparting and enhancing basic reading and writing skills—the same ones pretty much everyone needs. Are the strongest students really getting something out of their classes? Is the technology really helping? If not, could it be part of what’s actually causing “our mediocre educational system”? I’m not saying it does, but I am saying it’s worth pondering.

Still, I think the strongest thinkers and learners—the ones who are now running Google and Facebook, the ones who are now partners in law firms and building their own businesses—are doing fine. Better than ever, maybe. Generation X was supposed to be the slacker generation, but its members built large blocks of the Internet—the same set of technologies you’re almost certainly using to read this. But I wonder if there’s not a growing bifurcation between the people who are doing really well and the ones who aren’t. In income terms, that’s certainly true, but I wonder if it’s happening in intellectual terms too. Stephenson thinks so: “Nothing is more seductive than to think that we, like the Jedi, could be masters of the most advanced technologies while living simple lives: to have a geek standard of living and spend our copious leisure time vegging out.” But the people who spend all that time vegging out aren’t going to create whatever the next iPod and Facebook will be. And they won’t reap those rewards, either. They’re the ones who might be “learning less,” as Singal has it. The people who make the next iPod and Facebook will be the ones who are focused on “geeking out” regarding important topics. The ones who will, I hope, have teachers—whether in honors or not—who are focused on the essential questions that imparting knowledge involves.

By the way, I’m not trying to beat up college freshmen—if I were, I wouldn’t have the empathy necessary to be good. A lot of college seniors are little better than my freshmen, which I found out by working for Steven Klein at the Steven Klein LSAT Company. The LSAT is mostly a test of reading. If you can read effectively, you’ll do pretty well. But many 22- to 24-year-old college graduates had a lot of trouble with reading comprehension because they couldn’t, or hadn’t been trained to, look at every word, evaluate it in relation to other words and to the context of the passage, and understand what it means. I think back to those experiences when I read books like Richard Arum and Josipa Roksa’s Academically Adrift: Limited Learning on College Campuses or articles like the one about cool whizzy tech stuff in classrooms. The whizzy tech stuff isn’t going to help readers when they’re facing the LSAT.

A second “by the way” is in order: I’m neither trying to denigrate technology nor be a Luddite—I say so as the guy typing at a fancy keyboard, in an ergonomic chair, in front of a 27″ iMac with a bunch of TextMate windows open. Computers make the mechanical process of writing easier, so that the hard stuff—the stuff that goes on in the mind—can dominate. Technology is great—in its place. The University of Arizona has computers and projectors and other neat stuff in many classrooms, and if that neat stuff is available, I use it.

But technology complements other skills; it doesn’t substitute for them. You can only use computers effectively to the extent you can read, write, and do simple math effectively—try programming without algebra. Or try to extract information from man pages without strong reading comprehension skills; hell, I like to imagine myself as being at least moderately literate, and I find some of them tough. So this is not one of those tedious essays in which Old Man Withers shakes his cane and complains about the kids with those damn beeping gizmos, sending those darned pictures of each other’s privates around, and get off my damn lawn. Plus, I’m too young to shake my cane; I ran a modest but real number of miles yesterday. Even when I do have a cane someday, I hope that 1) it has a hidden sword, because that kind of thing is cool, and 2) I haven’t ossified to the point where I’m not willing to learn new things.

But this is an essay that points out how basic skills and the means of imparting those basic skills haven’t changed so much, as Amanda Ripley’s Atlantic article, “What Makes a Great Teacher?” makes clear in its discussion of what great teachers do:

First, great teachers tended to set big goals for their students. They were also perpetually looking for ways to improve their effectiveness. [. . .] Superstar teachers had four other tendencies in common: they avidly recruited students and their families into the process; they maintained focus, ensuring that everything they did contributed to student learning; they planned exhaustively and purposefully—for the next day or the year ahead—by working backward from the desired outcome; and they worked relentlessly, refusing to surrender to the combined menaces of poverty, bureaucracy, and budgetary shortfalls.

Notice the thing absent from this list: using computers, iPads, and so forth. Sure, those great teachers could use technology, but they don’t need to. And the technology is not going to automatically make an indifferent teacher set big goals or recruit families or maintain focus or plan. Used poorly, it’s just going to provide some flash and pizazz and some distractions. Check out this Marginal Revolution discussion of a study looking at how introducing computers into poor households actually decreased student grades because students spent more time playing games on them than doing homework:

Not surprisingly, with all that game playing going on, the authors find that the voucher program actually resulted in a decline in grades although there was also some evidence for an increase in computer proficiency and perhaps some improvement in a cognitive test.

See also “Computers at Home: Educational Hope vs. Teenage Reality”: “Students posted significantly lower math test scores after the first broadband service provider showed up in their neighborhood, and significantly lower reading scores as well when the number of broadband providers passed four.” These reports should give technology cheerleaders pause: you aren’t going to get better results simply by lashing a computer to a teacher’s back and telling him to use it.*

To be a good teacher, you still need that weird set of skills and mindsets mentioned above. If you don’t have it or aren’t willing to develop it, I doubt anything else imposed on an individual teacher from the outside, like mandates to use technology, is going to do much for that teacher or for his or her students. If you want to really improve teaching, you’ll need to take an approach similar to the one Facebook and Google take to hiring hackers, which means a relentless focus not on degrees that offer dubious value in predicting achievement but on finding the best people and making sure they stay. Finding the best teachers is different from finding programmers—you probably can’t tell who’s going to be a good teacher before they hit the classroom—but you can at least acknowledge that you’re not going to get people who are good merely by saying, “use iPads in the classroom.” Steve Jobs and Bill Gates didn’t have iPads in their classrooms growing up, and maybe that’s part of what made Jobs able to have the vision necessary to Turn On, Geek Out, and make the iPad.


* I had a computer in middle and early high school that I used to master Starcraft and various other computer games, until I somehow realized I was wasting my life and smashed my Starcraft disks in the driveway. I sometimes use this analogy when I explain the situation to friends: some people can handle snorting the occasional line of coke without getting addicted; it’s just a fun way of spending a Saturday night. Some people can handle computer games in the same way. I discovered, at the time, that I’m not one of them, and, worse, I’ll never get those three or so wasted years back. Now I tend to find video games boring on average and can’t play for longer than half an hour to an hour at a stretch, while I’ve trained myself up to being able to write effectively for three to six hours at a time. The first draft of this essay, for example, took me about two hours.
