Is there an actual Facebook crisis, or media narrative about Facebook crisis?

“Delay, Deny and Deflect: How Facebook’s Leaders Fought Through Crisis” just appeared in the New York Times, but the “crisis” seems imaginary: is there an actual crisis, outside the media narrative? Has Facebook seen a fall in monthly, weekly, or daily active users? That data would support a crisis narrative, but the best the article can do is, “its pell-mell growth has slowed.” Slowing growth makes sense for a company with two billion people using it; all companies eventually reach market saturation.
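To make that distinction concrete, here’s a minimal sketch in Python, with invented numbers rather than Facebook’s actual figures: a user count whose growth rate collapses toward zero while the count itself never falls describes saturation, not decline.

```python
# Hypothetical yearly monthly-active-user counts, in billions (NOT real data).
mau_billions = [1.2, 1.5, 1.7, 1.9, 2.0, 2.05]

for prev, curr in zip(mau_billions, mau_billions[1:]):
    growth = (curr - prev) / prev * 100
    # The growth rate shrinks every year, but the level never drops:
    # "pell-mell growth has slowed" is not the same claim as "users left."
    print(f"{curr:.2f}B users, {growth:+.1f}% year-over-year")
```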

To me, the story reads a lot like a media narrative that has very little to do with users’ actual lives; I’ve been reading variations on “Why Facebook sucks” and “Why Facebook is doomed” for a very long time. It’s like “Why this is the year of Linux on the desktop,” but for media companies.

Don’t get me wrong: I’m barely a Facebook user and I agree with much of the criticism. You can argue that Facebook is bad for reasons x, y, z, and I will likely agree—but what I do, individually and anecdotally, is less significant than what users do and want to do. “Revealed preferences” matter: every time someone uses Facebook, that person shows they like Facebook more than not—and find it valuable more than not.

Aggregate those decisions together, and we see that there is no crisis. Facebook continues to grow; if its growth is slowing, it is because virtually everyone with an Internet connection is already on Facebook. I personally think people should read more books and spend less time on Facebook, but I’m a literary boffin type who would also say the same of television. Lots of literary boffin types have had the same view of TV since TV came out—you should read more books and watch less TV—but, in the data, people didn’t watch less TV until quite recently, when Facebook started to replace TV.

So why is the media so vociferously anti-Facebook right now? The conventional media sources, including the NYT, don’t want to confront their own role in the 2016 election—the relentless focus on Clinton’s email server was insane. What should have been a footnote, at best, instead saw nearly wall-to-wall coverage. We don’t want to acknowledge that most people’s epistemological skill is low. Why look at ourselves, when we have this handy scapegoat right… over… there?

Facebook is a Girardian scapegoat for a media ecosystem that is unable or unwilling to consider its own role in the 2016 fiasco. With any media story, there are at least two stories: the story itself and the decision behind working on and publishing and positioning that particular story. The second story is very seldom discussed by journalists and media companies themselves, but it’s an important issue in itself.

In a tweet, Kara Swisher wrote that Zuckerberg is “unkillable, unfireable and untouchable.” I disagree: users can fire him whenever they want. Swisher had a good retort: “Remember aol.” Still, large, mature markets behave differently than small, immature ones: in 1900, there were many car companies; by 1950, only a few were left. Market size and market age both matter. Facebook reportedly has two billion users, a substantial fraction of the entire human population. It has survived Google+, and its users have demonstrated that they love spending, and wasting, time online. Maybe current Facebook users will find an alternate way to spend or waste time online (again, I’m not personally a big Facebook user), but if they do, I don’t think it’ll be because of the 5,000th media scare story about Facebook.

So far, I’ve read zero media stories that cite René Girard and the scapegoating mechanism: I don’t think the media understands itself right now.

Why you really can’t trust the media: Claire Cain Miller and Farhad Manjoo get things wrong in the New York Times

In “The Next Mark Zuckerberg Is Not Who You Might Think,” the New York Times’s Claire Cain Miller repeats an unfortunate quote that is a joke taken out of context: “‘I can be tricked by anyone who looks like Mark Zuckerberg,’ Paul Graham, co-founder of the seed investor Y Combinator, once said.”* But Graham has already observed publicly, at the link, that the line was a joke. Thousands of people have already read the column, but yesterday morning I thought it wasn’t too late to correct it for those yet to come, so I wrote to both Miller and to the corrections email address with a variant of this paragraph.

In response I got this:

Thanks for your email. I’m confident that most readers will understand that the line was tongue in cheek, however. The idea that a co-founder of Y Combinator could be persuaded to part with seed funding simply by dint of the solicitor’s wearing a hooded sweatshirt is, of course, preposterous. At any rate, there is nothing to “correct,” so to speak, as Mr. Graham did in fact say those words.

Best regards,

Louis Lucero II
Assistant to the Senior Editor for Standards
The New York Times

But that’s not really satisfying either: nothing in the original article indicates that Miller meant the line tongue-in-cheek. Based on the surrounding material, it seems she took it seriously. Here is the full paragraph:

Yet if someone like that came to a top venture capitalist’s office, he or she could very well be turned away. Start-up investors often accept pitches only from people they know, and rely heavily on gut feelings, intuition and what’s worked before. “I can be tricked by anyone who looks like Mark Zuckerberg,” Paul Graham, co-founder of the seed investor Y Combinator, once said.

I wrote back:

Thanks for your response, but it’s pernicious because Graham, as he explains at the link, does not actually think he can be tricked by anyone who looks like Mark Zuckerberg, and his statement is part of the reason why he can’t, and why he doesn’t necessarily expect the next tech titan to look like Zuckerberg. One of the epistemological roles of humor is to say something but mean the opposite: have you read Umberto Eco’s The Name of the Rose? In addition to being a fantastic book, many of its sections deal with precisely this aspect of humor and the role it plays in human discourse.

There’s actually a Wikipedia article on quoting out of context that’s both relevant here and helps explain why some reasonably famous people are becoming more cagey about speaking in public, in uncontrolled circumstances, or to the press.

Anyone even slightly familiar with Graham’s thought or writing—which is available publicly, for free, to anyone with an Internet connection (as most New York Times reporters have)—will understand that the quote, read straight, is absurd. Graham has probably done more to promote women in technology than anyone else. He wrote an entire essay, “Female Founders,” on this subject, which arose in part because he was “accused recently of believing things I don’t believe about women as programmers and startup founders. So I thought I’d explain what I actually do believe.” Miller didn’t bother reading that. She got it wrong, and it goes uncorrected, so this bogus quote that says the opposite of what Graham means is still going around.

Meanwhile, Farhad Manjoo wrote “As More Tech Start-Ups Stay Private, So Does the Money,” in which he cites various reasons why startups may stay private (“rooted in part in Wall Street’s skepticism of new tech stocks”) but misses a big one: Sarbanes-Oxley.** It’s almost impossible to read anything about the IPO market for tech companies without seeing a discussion of the costs of compliance (millions of dollars a year) and the other burdens that come with it.

I tweeted as much to him and he replied, “@seligerj a whole article about a complex issue and no mention of my pet interest that is just of many factors in the discussion!!!!??” Except it’s not a pet interest. It’s a major issue. Manjoo could have spent 30 seconds searching Google Scholar and an hour reading, and he’d conclude that Sarbanes-Oxley is really bad for the IPO market (and that it encourages companies to stay private). But why bother when a snarky tweet will do? A snarky tweet takes 10 seconds; real knowledge takes many hours. The law’s general problems are well known. Not surprisingly, Paul Graham has written about those too. So has Peter Thiel in Zero to One. Ignoring it is not a minor omission: it’s like ignoring the role of hydrogen in water.

Manjoo’s article is at least a little better than Miller’s because his error is a misleading omission instead of an overt misquotation. But it’s still striking, not just for missing a vital issue in the first place but for the response to having that issue pointed out.

If the articles were posted to random blogs or splogs I’d of course just ignore them, because the standards to which random blogs are held are quite low. But they were posted to the New York Times, which is actually much better than the rest of the media. That two writers could get so much so wrong in so short a space is distressing because of what it says not only about the Times but about the rest of the media. I’m not even a domain expert here: I don’t work in the area and find it primarily a matter of intellectual curiosity.

This post is important because the Times is a huge megaphone. Policymakers who don’t know a lot about specific issues related to tech read and (mostly) trust it. While sophisticated readers or people who have been reading Graham for years might know the truth, most people don’t. A huge megaphone should be wielded carefully. Too often it isn’t.

Oddly, one of my earliest posts was about another howler in the New York Times. I’ve seen some since but yesterday’s batch was particularly notable. There are many good accounts of why you can’t trust the media—James Fallows gives one in Breaking the News and Ryan Holiday another in Trust Me, I’m Lying—but I’ve rarely seen two back-to-back examples as good as these. So good, in fact, that I want to post about them publicly both to inform others and for archive purposes: next time someone says, “What do you mean, you can’t trust even the New York Times?”, I’ll have examples of why ready to go.


* I’m not linking to the article because it’s terrible for many reasons, and I’d like to focus solely on the one cited, which is provably wrong.

** I’m not linking directly to this article either; the Hacker News thread about it is more informative than the article itself.

Hipsters haven’t ruined Paris; Parisian voters have

Last week’s New York Times has a somewhat dumb article by Thomas Chatterton Williams called “How Hipsters Ruined Paris,” which describes how Paris is changing:

Today, the neighborhood has been rechristened “South Pigalle” or, in a disheartening aping of New York, SoPi. Organic grocers, tasteful bistros and an influx of upscale American cocktail bars are quietly displacing the pharmacies, dry cleaners and scores of seedy bar à hôtesses that for decades have defined the neighborhood.

Elsewhere, the usual complaints appear: “Our neighborhood, though safe and well on its way to gentrification…” But demand to live in Paris is rising while the supply of housing remains constant, or close to constant—which means prices rise, and richer people move into once-poorer neighborhoods, and bring with them their predilections for high-end coffee and fancy bars and all the similar stuff I and my ilk like. If you want more diverse neighborhoods, you have to get lower rents, and the only effective way to accomplish that is through taller buildings—which, quelle horreur, destroy the character of the neighborhood!

Matt Yglesias wrote about this basic problem in The Rent Is Too Damn High, which continues to go unread and uncited by people writing about neighborhoods, whose work would be improved by that knowledge.

Concern trolling, competition, and “Facebook Made Me Do It”

In “Facebook Made Me Do It,” Jenna Wortham says that she was innocently browsing Instagram and saw

a photo of my friend in a hotel room, wearing lime green thong underwear and very little else. It was scandalous, arguably over the top for a photo posted in public where, in theory, anyone who wanted would be able to see it. But people loved it. It had dozens of likes as well as some encouraging comments.

Of course it had dozens of likes and some encouraging comments: as should be obvious, a lot of men like seeing nude and semi-nude women. So do a lot of women; I read the quoted section to my fiancée and she said, “they like it because it’s hot.”

No shit.

So why does Wortham use language that lightly chastises the anonymous thong-wearer-and-poster? What do “arguably over the top” and “scandalous” mean here? Perhaps in 1890 it was scandalous to see women in their underwear. Today one sees women effectively in their underwear on beaches, in catalogs, on billboards, and, not uncommonly, on the Internet.

Since it’s not actually a scandal to see a woman in a thong and “arguably over the top” doesn’t really say anything, I think there are separate, unstated reasons related to competition and to a term coined by the Internet: “concern trolling.”

A concern troll, by one common Internet definition, is:

A person who lurks, then posts, on a site or blog, expressing concern for policies, comments, attitudes of others on the site. It is viewed as insincere, manipulative, condescending.

In this case, it happens on the Internet, and Wortham is expressing faux concern about a friend, when she’s really saying that a) she doesn’t like that the friend can take a shortcut to Instagram fame and attention through posting hot lingerie shots and b) she doesn’t like the friend as a sexual competitor. A friend who does or says something more sexually adventurous than the observer or writer is “over the top” because she’s a competitor; a friend who is less adventurous is uptight. Those kinds of words and phrases only make sense relative to the person using them, and they’re both used to derogate rivals, just in different ways.

Wortham doesn’t want to say as much, however, for an innocuous reason (she only has so many words available, since she writes in the New York Times instead of on a blog) and for a less salubrious one: she wants readers to believe that she’s writing with the voice of God, or of the arbiter of culture, or something like that, and that she’s voicing widely shared community standards that the friend in the hotel room should uphold. If she admitted that the views she’s espousing are really her own, she’d also have to admit that they reflect sexual and attention competition in the form of concern trolling.

There’s a term of art that describes Wortham’s problem: “Don’t hate the player—hate the game.” Wortham is, in a highbrow and subtle way, hating the player.

The concern trolling continues later in the article, when Wortham quotes a professor saying, “The fact that the world is going to see you increases the risks you are willing to take.” But no evidence is cited for this claim, and, moreover, in the context of the article it’s possible to substitute “fun you’re going to have” for “risks you are willing to take.” Given a choice between inviting Wortham or her friend who posts herself to Instagram in a green thong to a party, I know whom I’m going to invite.

How not to choose a college: Frank Bruni ignores the really important stuff

Frank Bruni wrote an essay called “How to Choose a College” without mentioning the most important fact about college for the life outcomes of many students: debt. That’s like writing about the Titanic and ignoring the whole iceberg thing.

In How to Win at the Sport of Business, Mark Cuban writes, “financial debt is the ultimate dream killer. Your first house, car, whatever you might want to buy, is going to be the primary reason you stop looking for what makes you the happiest.” He’s right about debt often being “the ultimate dream killer,” but he should add student loans to his roster of “whatever you might want to buy,” especially because student loans are effectively impossible to discharge in bankruptcy. I don’t think most 18-year-olds really understand what tens or hundreds of thousands of dollars of debt will mean to them five, ten, or twenty years after they graduate.

To me, the most interesting numbers a university could offer these days are the mean, median, and mode debt of its students at graduation.
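Those three statistics diverge in instructive ways on skewed data like student debt. A minimal sketch in Python, with invented figures rather than any real university’s, shows why no single one of them is enough:

```python
import statistics

# Hypothetical debt-at-graduation figures for a tiny class (NOT real data).
debts = [0, 0, 0, 15_000, 22_000, 22_000, 27_000, 35_000, 180_000]

print(f"mean:   ${statistics.mean(debts):,.0f}")    # ~$33,444: one huge loan drags it up
print(f"median: ${statistics.median(debts):,.0f}")  # $22,000: the middle graduate
print(f"mode:   ${statistics.mode(debts):,.0f}")    # $0: the single most common outcome
```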

Money shouldn’t be the only factor in choosing a college, but it should be a major one, unless one has uncommonly wealthy parents.

From the Department of “No Shit:” technology and computers are not silver bullets for education

File this New York Times article under “no shit:”

But to many education experts, something is not adding up — here and across the country. In a nutshell: schools are spending billions on technology, even as they cut budgets and lay off teachers, with little proof that this approach is improving basic learning.

This conundrum calls into question one of the most significant contemporary educational movements.

There is no silver bullet and no New Jesus for education. There never will be, but the search goes on, because it’s easier to search for the magic methodology that will solve seemingly intractable problems than it is to admit the thing Google, Facebook, and Microsoft have realized about software engineers: talent, motivation, and tenacity vary greatly among individuals, and you can’t merely take someone who lacks all three, put them through some kind of system or give them some kind of tool, and expect everyone to be equally good on the other side. That kind of thing works passably well if you’re building widgets on an assembly line, but it works terribly for any kind of creative or intellectual work.

Nonetheless, it’s much easier to search for that magic methodology to improve very old skills that are surprisingly resistant to methodology: reading, writing, and math. None of those fundamental skills has changed much in the last century. Yet we keep searching for a formula that doesn’t exist, because teaching and learning are inherently hard, like software engineering, math, writing, and any number of other complex but vital skills. Technologies—now we have special blackboards! They blink! They light up! They’re new!—might mask the essential difficulty of the task, but they can’t remove it, much as some programming IDEs try to hide some of the essential difficulty of coding. But fields that are essentially difficult can’t be mastered by sleight of hand or whizzy new gadgets.

They can only be mastered by people who are dedicated to the craft and to continuous self-improvement. The people who, because they believe they can, can. The ones whose tenacity is boundless and who aren’t willing to blame external circumstances.

You need a weird set of skills to teach effectively: you need to empathize with your students—to be able to see from their point of view—without becoming mired in their point of view. You need to master your field, but not in such a way that you lose the beginner’s mindset required to master the field in the first place. You need the stoic’s attitude of realizing you can’t control everything while still having the achiever’s mindset that you must strive to do the best you can, no matter what. You need to be willing to try new things and ideas while not leaving behind the old ones that work. You need to remember not everyone is interested in the things you’re interested in, and you need to do whatever it takes to make subjects more interesting than they would be otherwise. You need to find that profitable zone of challenge for most students—something hard enough to make them struggle but not so hard that it’s impossible to accomplish. It’s reasonable to expect college freshmen to be able to read a story or article on their own, but it’s not reasonable to expect them to pick up and digest every nuance on their own. Some will. Most won’t. You need to be enthusiastic, because enthusiasm is as contagious as boredom, but your job isn’t to be a cheerleader and enthusiasm can’t substitute for knowledge. You need, in other words, a bunch of paradoxical traits that balance each other.

You also need to realize that students need things broken down in steps, and need to learn by example and through discussion. Last week I taught Neal Stephenson’s 2005 New York Times opinion piece, “Turn On, Tune In, Veg Out.” Whenever I do, I let the class talk for a while at the beginning; when discussion dies, I ask students to do a simple activity: write the essay’s main point in a sentence or two. Then I come around and look at the sentences.

It should be simple, right? Read the piece, find the main point. But it’s not simple. It’s actually quite hard, and most people are bad readers (myself included). When I go around and look at sentences, lots of students get caught up on the distinction between geeking and vegging out. Others think the piece is primarily about Star Wars. Only a few—usually around five of fifty—get the essential elements of the main point.

Stephenson basically says, twice, that he’s using Star Wars as a metaphor: once in the third paragraph: “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” and once more in the last paragraph: “If the “Star Wars” movies are remembered a century from now, it’ll be because they are such exact parables for this state of affairs” (emphasis added). But most students haven’t learned how to think metaphorically, as writers do. Metaphor is one of those essential ways of thinking that people need to be effective writers. In On Writing Stephen King says:

The use of simile and other figurative language is one of the chief delights of fiction—reading it and writing it, as well. When it’s on target, a simile delights us in much the same way meeting an old friend in a crowd of strangers does. By comparing two seemingly unrelated objects—a restaurant bar and a cave, a mirror and a mirage—we are sometimes able to see an old thing in a new and vivid way. Even if the result is mere clarity instead of beauty, I think writer and reader are participating together in a kind of miracle. Maybe that’s drawing it a little strong, but yeah—it’s what I believe.

In How Fiction Works, James Wood says:

Metaphor is analogous to fiction, because it floats a rival reality. It is the entire imaginative process in one move. If I compare the slates on a roof to an armadillo’s back, or – as I did earlier – the bald patch on the top of my head to a crop circle (or on very bad days, to the kind of flattened ring of grass that a helicopter’s blades make when it lands in a field), I am asking you to do what Conrad said fiction should make you do – see. I am asking you to imagine another dimension, to picture likeness. Every metaphor or simile is a little explosion of fiction within the larger fiction of the novel or story.

Again: that’s hard. And technology isn’t going to make it any easier to start thinking about metaphors, which is probably a precursor to writing in a way that uses metaphor deftly. Before you can do that, you probably need to recognize when other writers are doing it, and yet, even though Stephenson says twice that he is, most students don’t pick up on it. This isn’t to blame them, by the way—a lot of my graduate seminars are still about what the writer actually says. Some of you are probably getting caught up in this discussion of metaphor and think that I’m really writing about how important it is for students to learn, when this is only a subsidiary point supporting my main point about the place of technology in classrooms. Here’s Wood again on the subject of learning to read:

You only have to teach literature to realise that most young readers are poor noticers. I know from my own old books, wantonly annotated twenty years ago when I was a student, that I routinely underlined for approval details and images and metaphors that strike me now as commonplace, while serenely missing things which now seem wonderful. We grow, as readers, and twenty-year-olds are relative virgins.

If he wasn’t a good noticer at 20, what hope is there for the rest of us? And how is having a laptop going to help someone become a better noticer? Consider too one other thing to notice: in “Turn On, Tune In, Veg Out,” Stephenson isn’t using any complex or convoluted vocabulary. His sentence structure isn’t very complex; there aren’t lots of nasty nested clauses you have to mentally sort out to figure out what’s being talked about, as there often are in abstruse literary theory and philosophy. His piece isn’t hard to read. But it’s still evidently hard for many freshmen to understand. So I spend a lot of time working towards understanding, towards reading for detail, towards asking, “Where do you see that?” Technology isn’t going to help that process very much. It may even hurt it by offering a proliferating number of distractions: if you interrupt your reading of “Turn On, Tune In, Veg Out” four times for text messages and once for an e-mail, are you going to remember how Stephenson said “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” by the end?

I’m teaching honors students, which is easier than teaching standard classes, which is in turn easier than teaching in tough inner-city schools. So I don’t face the same challenges as some of the teachers mentioned in the NYT article. But sometimes I think about Daniel Singal’s Atlantic article, “The Other Crisis in American Education: A college professor looks at the forgotten victims of our mediocre educational system–the potentially high achievers whose SAT scores have fallen, and who read less, understand less of what they read, and know less than the top students of a generation ago.” As the subtitle implies, he argues that the best students aren’t as challenged as they once were. I can’t tell if he’s right or if he’s hearkening back to a mythical golden age, but I do think about his work sometimes when I see what’s going on around me: other grad students and professors want to watch movies in class, or they aren’t even focused on imparting and enhancing basic reading and writing skills—the same ones pretty much everyone needs. Are the strongest students really getting something out of their classes? Is the technology really helping? If not, could it be part of what’s actually causing “our mediocre educational system?” I’m not saying it does, but I am saying it’s worth pondering.

Still, I think the strongest thinkers and learners—the ones who are now running Google and Facebook, the ones who are now partners in law firms and building their own businesses—are doing fine. Better than ever, maybe. Generation X was supposed to be the slacker generation, but its members built large blocks of the Internet—the same set of technologies you’re almost certainly using to read this. But I wonder if there’s not a growing bifurcation between the people who are doing really well and the ones who aren’t. In income terms, that’s certainly true, but I wonder if it’s happening in intellectual terms too. Stephenson thinks so: “Nothing is more seductive than to think that we, like the Jedi, could be masters of the most advanced technologies while living simple lives: to have a geek standard of living and spend our copious leisure time vegging out.” But the people who spend all that time vegging out aren’t going to create whatever the next iPod and Facebook will be. And they won’t reap those rewards, either. They’re the ones who might be “learning less,” as Singal has it. The people who make the next iPod and Facebook will be the ones who are focused on “geeking out” regarding important topics. The ones who will, I hope, have teachers—whether in honors classes or not—who are focused on the essential questions that imparting knowledge involves.

By the way, I’m not trying to beat up on college freshmen—if I were, I wouldn’t have the empathy necessary to be a good teacher. A lot of college seniors are little better than my freshmen, which I found out by working for Steven Klein at the Steven Klein LSAT Company. The LSAT is mostly a test of reading. If you can read effectively, you’ll do pretty well. But a lot of 22-to-24-year-old college graduates had a lot of trouble with reading comprehension because they couldn’t, or hadn’t been trained to, look at every word, evaluate it in relation to other words and to the context of the passage, and understand what it means. I think back to those experiences when I read books like Richard Arum and Josipa Roksa’s Academically Adrift: Limited Learning on College Campuses or articles like the one about cool whizzy tech stuff in classrooms. The whizzy tech stuff isn’t going to help readers when they’re facing the LSAT.

A second “by the way” is in order: I’m neither trying to denigrate technology nor be a Luddite—I say so as the guy typing on a fancy keyboard, in an ergonomic chair, at a 27″ iMac, with a bunch of TextMate windows open. Computers make the mechanical process of writing easier, so that the hard stuff—the stuff that goes on in the mind—can dominate. Technology is great—in its place. The University of Arizona has computers and projectors and other neat stuff in many classrooms, and if that neat stuff is available, I use it.

But technology complements other skills; it doesn’t substitute for them. You can only use computers effectively to the extent you can read, write, and do simple math effectively—try programming without algebra. Or try to extract information from man pages without strong reading comprehension skills; hell, I like to imagine myself as at least moderately literate, and I find some of them tough. So this is not one of those tedious essays in which Old Man Withers shakes his cane and complains about the kids with those damn beeping gizmos who send those darned pictures of each other’s privates around and won’t get off my damn lawn. Plus, I’m too young to shake my cane; I ran a modest but real number of miles yesterday. Even when I do have a cane someday, I hope 1) that it has a hidden sword, because that kind of thing is cool, and 2) that I haven’t ossified to the point where I’m not willing to learn new things.

But this is an essay that points out how basic skills and the means of imparting those basic skills haven’t changed so much, as Amanda Ripley’s Atlantic article, “What Makes a Great Teacher?” makes clear in its discussion of what great teachers do:

First, great teachers tended to set big goals for their students. They were also perpetually looking for ways to improve their effectiveness. [. . .] Superstar teachers had four other tendencies in common: they avidly recruited students and their families into the process; they maintained focus, ensuring that everything they did contributed to student learning; they planned exhaustively and purposefully—for the next day or the year ahead—by working backward from the desired outcome; and they worked relentlessly, refusing to surrender to the combined menaces of poverty, bureaucracy, and budgetary shortfalls.

Notice what’s absent from this list: using computers, iPads, and so forth. Sure, those great teachers could use technology, but they don’t need to. And technology is not going to automatically make an indifferent teacher set big goals or recruit families or maintain focus or plan. Used poorly, it’s just going to provide some flash and pizazz and some distractions. Check out this Marginal Revolution discussion of a study looking at how introducing computers into poor households actually decreased student grades because students spent more time playing games on them than doing homework:

Not surprisingly, with all that game playing going on, the authors find that the voucher program actually resulted in a decline in grades although there was also some evidence for an increase in computer proficiency and perhaps some improvement in a cognitive test.

See also “Computers at Home: Educational Hope vs. Teenage Reality”: “Students posted significantly lower math test scores after the first broadband service provider showed up in their neighborhood, and significantly lower reading scores as well when the number of broadband providers passed four.” These reports should give technology cheerleaders pause: you aren’t going to get better results simply by lashing a computer to a teacher’s back and telling him to use it.*

To be a good teacher, you still need that weird set of skills and mindsets mentioned above. If you don’t have it or aren’t willing to develop it, I doubt anything imposed on an individual teacher from the outside, like a mandate to use technology, is going to do much for that teacher or for his or her students. If you want to really improve teaching, you’ll need to take an approach similar to the one Facebook and Google take to hiring hackers: a relentless focus not on degrees, which offer dubious value in predicting achievement, but on finding the best people and making sure they stay. Finding the best teachers is different from finding programmers—you probably can’t tell who’s going to be a good teacher before they hit the classroom—but you can at least acknowledge that you’re not going to get people who are good merely by saying, “use iPads in the classroom.” Steve Jobs and Bill Gates didn’t have iPads in their classrooms growing up, and maybe that’s part of what gave Jobs the vision necessary to Turn On, Geek Out, and make the iPad.


* I had a computer in middle and early high school that I used to master Starcraft and various other computer games, until I somehow realized I was wasting my life and smashed my Starcraft disks in the driveway. I sometimes use this analogy when I explain the situation to friends: some people can handle snorting the occasional line of coke without getting addicted; it’s just a fun way of spending a Saturday night. Some people can handle computer games in the same way. I discovered, at the time, that I’m not one of them, and, worse, I’ll never get those three or so wasted years back. Now I tend to find video games boring on average and can’t play for longer than half an hour to an hour at a stretch, while I’ve trained myself up to being able to write effectively for three to six hours at a time. The first draft of this essay, for example, took me about two hours.
