Is there an actual Facebook crisis, or just a media narrative about a Facebook “crisis”?

“Delay, Deny and Deflect: How Facebook’s Leaders Fought Through Crisis” uses the word “crisis” in the headline, but the “crisis” cited seems imaginary: is there an actual crisis, outside the media narrative? Has Facebook seen a fall in monthly, weekly, or daily active users? That data would support a crisis narrative, but the best the article can do is, “its pell-mell growth has slowed.” Slowing growth makes sense for a company with two billion people using it; all companies eventually reach market saturation.

“Delay, Deny and Deflect” reads like a media narrative that has very little to do with users’ actual lives; I’ve been reading variations on “Why Facebook sucks” and “Why Facebook is doomed” for at least a decade, along with predictions of Facebook’s decline. These kinds of stories are like “Why this is the year of Linux on the desktop,” but for media companies.

Don’t get me wrong: I’m barely a Facebook user and I agree with much of the criticism. You can argue that Facebook is bad for reasons x, y, z, and I will likely nod along—but what I do, individually and anecdotally, is less significant than what users as a whole do and want to do. “Revealed preferences” matter: every time someone uses Facebook, that person shows they like Facebook more than not—and find it valuable more than not.

Aggregate those decisions together, and we see that there is no crisis, because Facebook continues to grow by most metrics; if their growth is slowing, it is because virtually everyone with an Internet connection is already on Facebook. I personally think people should read more books and spend less time on Facebook, but I’m a literary boffin type person who would also say the same of television. Lots of literary boffin type persons have had the same view of TV since TV came out—you should read more books and watch less TV—but, in the data, people didn’t watch less TV until quite recently, when Facebook started to replace TV.

Why is the media so vociferously anti-Facebook right now? The conventional media sources, including the NYT, don’t want to confront their own role in the 2016 election—the relentless focus on Clinton’s email server, for example, was insane. That minor story only got the relentless play it did because most big media sources worried about being accused of “bias” and wanted to find a “both sides” narrative. What should have been a footnote, at best, instead saw ceaseless wall-to-wall coverage. Bias concerns meant media sources felt they had to keep pushing the email server as a story. At the same time, we don’t want to acknowledge that most people’s epistemological skill is low. Why look at ourselves, when we have this handy scapegoat right… over… there?

Facebook is a Girardian scapegoat for a media ecosystem that is unable or unwilling to consider its own role in the 2016 fiasco. With any media story, there are at least two stories: the story itself and the decision behind working on and publishing and positioning that particular story. The second story is very seldom discussed by journalists and media companies themselves, but it’s an important issue in itself.

In a tweet, Kara Swisher wrote that Zuckerberg is “unkillable, unfireable and untouchable.” I disagree: users can fire him whenever they want, but they haven’t, or haven’t yet. Swisher had a good retort: “Remember aol,” although that’s a retort that rebuts her original tweet. While she has a point about Facebook conceivably falling into senescence, as AOL did, large, mature markets behave differently than small, immature markets: in 1900, there were many car companies. By 1950, only a few were left. Market size and market age both matter; as mentioned elsewhere in this post, a substantial fraction of the entire human population uses Facebook. Facebook has survived Google+ and its users have demonstrated that they love spending (or wasting) time online. Maybe current Facebook users will find an alternate way to spend/waste time online (again, I’m not personally a big Facebook user), but if they do, I don’t think it’ll be because of the 5000th media scare story about Facebook.

So far, I’ve read zero media stories that cite René Girard and the scapegoating mechanism: I don’t think the media understands itself right now.

Way back when, I read the tech nerd site Slashdot, which for many years declared that year x would be the year of Linux on the desktop—the year people would get tired of paying for Microsoft operating systems and embrace freedom. Normal people didn’t care, and Microsoft was 100 times more monopolistic than Facebook. Today, most desktop machines still overwhelmingly run Windows, Linux remains around 1% of the desktop population, and macOS has grown some in popularity but is still too expensive for most people. What tech nerds and journalists desire is not necessarily what normal people care about.

EDIT: Former newspaper editor Andrew Potter explains succinctly how the media works in “Why everyone hates the mainstream media: Judgements about status are embedded in almost every aspect of the news. To read the news is to be insulted — which is why people are fleeing the mainstream media in droves.” Since November 2016, the media has been ceaselessly working to lower Facebook’s status. It seems to have succeeded in lowering Facebook’s status among journalists and media pundits, but it seems to have failed to change mass behavior, much as the thousands of essays about how TV is bad failed to change TV habits. Most media pieces attempting to lower Facebook’s status use every kind of rhetoric conceivable except the numbers Facebook cites in its quarterly reports.

Concern trolling, competition, and “Facebook Made Me Do It”

In “Facebook Made Me Do It,” Jenna Wortham says that she was innocently browsing Instagram and saw

a photo of my friend in a hotel room, wearing lime green thong underwear and very little else. It was scandalous, arguably over the top for a photo posted in public where, in theory, anyone who wanted would be able to see it. But people loved it. It had dozens of likes as well as some encouraging comments.

Of course it had dozens of likes and some encouraging comments: as should be obvious, a lot of men like seeing nude and semi-nude women. So do a lot of women; I read the quoted section to my fiancée and she said, “they like it because it’s hot.”

No shit.

So why does Wortham use language that lightly chastises the anonymous thong-wearer-and-poster? What do “arguably over the top” and “scandalous” mean here? Perhaps in 1890 it was scandalous to see women in their underwear. Today one sees women effectively in their underwear on beaches, in catalogs, on billboards, and, not uncommonly, on the Internet.

Since it’s not actually a scandal to see a woman in a thong and “arguably over the top” doesn’t really say anything, I think there are separate, unstated reasons related to competition and to a term coined by the Internet: “concern trolling.”

A concern troll, according to one Internet definition, is:

A person who lurks, then posts, on a site or blog, expressing concern for policies, comments, attitudes of others on the site. It is viewed as insincere, manipulative, condescending.

In this case, the concern trolling happens in a New York Times piece rather than a blog comment thread: Wortham is expressing faux concern about a friend, when she’s really saying that a) she doesn’t like that the friend can take a shortcut to Instagram fame and attention by posting hot lingerie shots and b) she doesn’t like the friend as a sexual competitor. A friend who does or says something more sexually adventurous than the observer or writer is “over the top” because she’s a competitor; a friend who is less adventurous is uptight. Those kinds of words and phrases only make sense relative to the person using them, and both are used to derogate rivals, just in different ways.

Wortham doesn’t want to say as much, however, for an innocuous reason—she only has so many words available, since she writes in the New York Times instead of on a blog—and for a less salubrious reason: she wants readers to believe that she’s writing in the voice of God, or as the arbiter of culture, or something like that, and has widely shared views on community standards that the friend in the hotel room should uphold. If she admitted that the views she’s espousing are really her own, and that they reflect sexual and attention competition in the form of concern trolling, the voice-of-God pose would collapse.

There’s a term of art that describes Wortham’s problem: “Don’t hate the player—hate the game.” Wortham is, in a highbrow and subtle way, hating the player.

The concern trolling continues later in the article, when Wortham quotes a professor saying, “The fact that the world is going to see you increases the risks you are willing to take.” But there’s no evidence cited for this claim, and, moreover, in the context of the article it’s possible to substitute “fun you’re going to have” for “risks you are willing to take.” Given a choice between inviting Wortham or her friend who posts herself to Instagram in a green thong to a party, I know who I’m going to invite.

The Facebook Eye and the artist’s eye

“We are increasingly aware of how our lives will look as a Facebook photo, status update or check-in,” according to Nathan Jurgenson in “The Facebook Eye,” and the quote stood out not only because I think it’s true, but because this kind of double awareness has long been characteristic of writers, photographers, artists, and professional videographers. Now it’s simply being disseminated through the population at large.

I’m especially aware of this tendency among writers, and in my own life I even encourage and cultivate it by carrying around a notebook. Now, a notebook obviously doesn’t have the connectivity of a cell phone, but it does still encourage a certain performative aspect, and a readiness to harvest the material of everyday life in order to turn it into art. Facebook probably isn’t art—at least to me it isn’t, although I can imagine some people arguing that it is—and I think that’s the key difference between the Facebook Eye and what artists are doing and have been doing for a very long time. I’ve actually been contemplating and taking notes on a novel about a photographer who lives behind his (potentially magic) camera instead of in the moment, and that might be part of the reason why I’m more cognizant of the feeling being expressed.

Anyway, Michael Lewis recently gave an NPR interview about his Obama article (which is worth reading on its own merits, and, like Tucker Max’s “What it’s like to play basketball with Obama,” uses the sport as a way of drawing larger conclusions about Obama’s personality and presidency). In the interview, Lewis sees Obama as having that writer’s temperament, and even says that “he really is, at bottom, a writer,” and goes on to say Obama is “in a moment, and not in a moment at the same time.” Lewis says Obama can be “in a room, but detach himself at the same time,” and he calls it “a curious inside-outside thing.” As I indicated, I don’t think this is unique to writers, although it may be more prevalent or pronounced in writers. Perhaps that’s why writers love great art and, in some ways, sex, more than normal people: both offer a way into living in the present. If writers are more predisposed towards alcoholism—I’m not sure if they are or not, though many salient examples spring to mind—getting out of the double perspective might be part of the reason why.

I think the key differences between what I do, with a notebook, and what Facebook enables via phones, are distance and perspective. My goal isn’t to have an instantaneous audience for the fact that I just did Cool Activity X. Whatever may emerge from what I’m observing is only going to emerge in a wholly different context that obscures its origins as a conversation, a snatch of overheard dialogue, a thing read in a magazine, or an observation from a friend. The lack of immediacy means that I don’t think I’m as immediately performative in most circumstances.

But the similarities remain: Jurgenson writes that “my concern is that the ultimate power of social media is how it burrows into us, our minds, our consciousness, changing how we consciously experience the world even when logged off.” And I think writing and other forms of art do the same thing: they “burrow into us,” like parasites that we welcome, and change the way we experience the world.

Still, the way we experience the world has probably been changing continuously throughout human history. The idea of “human history” is itself relatively recent: most hunter-gatherers didn’t have it, for example. The changes Facebook (and its analogues; I’m only using Facebook as a placeholder for a broader swath of technologies) is bringing seem new, weird, and different because they are, obviously, new. For all I know, most of my students already have the Facebook Eye more than any other kind of eye or way of being. This has its problems, as William Deresiewicz points out in “Solitude and Leadership,” but presumably people who watch with the Facebook Eye are getting something—even a very cheap kind of fame—out of what they do. And writers generally want fame too, regardless of what they say—if they didn’t, they’d be silent.

I think the real problem is that artists become aware of their double consciousness, while most normal people probably aren’t—they just think of it as “normal.” But then again, very few of us probably contemplate how “normal” changes by time and place in general.


Thanks to Elena for sending me “The Facebook Eye”.

Facebook and cellphones might be really bad for relationships

There’s some possibly bogus research about “How your cell phone wrecks your relationships — even when you’re not using it.” I say “possibly bogus” because these kinds of social science studies are notoriously unreliable and unreproducible.* Nonetheless, this one reinforces some of my pre-existing biases and is congruent with things that I’ve observed in my own life and the lives of friends, so I’m not going to be too skeptical of its premises and will instead jump into uninformed speculation.

It seems like cell phones and Facebook cordon off a large part of your life from your significant other (assuming you have one or aspire to have one) and encourage benign-seeming secrecy in that other part of your life. In the “old days,” developing friendships or quasi-friendships with new people required face-to-face time, talking on the phone (which, at home, was easily enough overheard), or writing letters (which are slow, and which a lot of people aren’t very good at or don’t like to do). Now, you can be developing new relationships with other people while your significant other is in the same room, and the significant other won’t know about the relationship happening via text message. You can also solicit instant attention, especially by posting provocative pictures or insinuating song lyrics, while simultaneously lying to yourself about what you’re doing in a way that would be much harder without Facebook and cell phones.

Those new relationships start out innocently, only to evolve, out of sight, into something more. Another dubious study made the rounds of the Internet a couple months ago, claiming that Facebook was mentioned in a third of British divorce petitions. Now, it’s hard to distinguish correlation from causation here—people with bad relationships might be more attached to their phones and Facebook profiles—but it does seem like Facebook and cellphones enable behavior that would have been much more difficult before they became ubiquitous.

I don’t wish to pine for a mythical golden age, which never existed anyway. But it is striking how many of my friends’ and peers’ relationships seem to founder on the shoals of technology. Technology seems to be enabling a bunch of behaviors that undermine real relationships, and, if so, then some forms of technology might be pushing us towards shorter, faster relationships; it might also be encouraging us to simply hop into the next boat if we’re having trouble, rather than trying to right the boat we’re already in. Facebook also seems to encourage a “perpetual past,” by letting people from the past instantly and quietly “re-connect.” Sometimes this is good. Sometimes less so. How many married people want their husband or wife chatting again with a high school first love? With a summer college flame? With a co-worker discussing intimate details of her own failing relationship?

Perhaps relationship norms will evolve to discourage the use of online media (“Are we serious enough to deactivate each other’s Facebook accounts?” If the answer is “no,” then we’re not serious and, if I’m looking for something serious, I should move on). Incidentally, I don’t think blogs have the same kind of effect; this blog, for instance, is reasonably popular by the standards of occasional bloggers, and has generated a non-zero number of groupies, but the overall anonymity of readers (and the kind of content I tend to post) in relation to me probably puts a damper on the kinds of relationship problems that may plague Facebook and cell phones.

EDIT: See also “I’m cheating on you right now: An admiring like on your Facebook page. A flirty late-night text. All while my partner’s right there next to me,” which mentions, unsurprisingly:

A study in 2013 at the University of Missouri surveyed 205 Facebook users aged 18–82 and found that “a high level of Facebook usage is associated with negative relationship outcomes” such as “breakup/divorce, emotional cheating, and physical cheating.”

Again, I want to maintain some skepticism and am curious about studies that don’t find a difference and thus aren’t published. But some research does match my own anecdotal impressions.


* If you’d like to read more, “Scientific Utopia: II – Restructuring Incentives and Practices to Promote Truth Over Publishability” is a good place to start, though it will strike horror in the epistemologist in you. Or, alternately, as Clay Shirky points out in Cognitive Surplus, “[…] our behavior contributes to an environment that encourages some opportunities and hinders others.” In the case of cell phones and Facebook, I think the kinds of behaviors encouraged are pretty obvious.

Facebook, go away—if I want to log in, I know where to find you

Facebook keeps sending me e-mails about how much I’m missing on Facebook; see the image at the right for one example. But I’m not convinced I’m missing anything, no matter how much Facebook wants me to imagine I am.

In “Practical Tips on Writing a Book from 23 Brilliant Authors,” Ben Casnocha says that writers need to “Develop a very serious plan for dealing with internet distractions. I use an app called Self-Control on my Mac.” Many other writers echo him. We have, all of us, a myriad of choices every day. We can choose to do something that might provide some lasting meaning or value. Or we can choose to tell people who are often effectively strangers what we ate for dinner, or that we’re listening to Lynyrd Skynyrd and Lil’ Wayne, or our ill-considered, inchoate opinions about the political or social scandal of the day, which will be forgotten by everybody except Wikipedia within a decade, if not a year.

Or we can choose to do something better—which increasingly means we have to control distractions—or, as Paul Graham puts it, “disconnect” them. Facebook and other entities that make money from providing distractions are, perhaps not surprisingly, very interested in getting you more interested in their distractions. That’s the purpose of their e-mails. But I’ve become increasingly convinced that Facebook offers something closer to simulacra than real life, and that the people who are going to do something really substantial are, increasingly, going to be the people who can master Facebook—just as the people who did really substantial things in the 1960 – 2005 period learned to master TV.
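(An aside for the technically inclined: the core of what Self-Control-style blockers do can be approximated with temporary hosts-file redirection during a work window. Below is a rough, hypothetical Python sketch of that idea, not a description of how the actual app works; the blocked domains and the one-hour window are illustrative, and it has to run with root privileges on macOS or Linux.)

    # Crude sketch of a Self-Control-style blocker: redirect a few distracting
    # domains to localhost via /etc/hosts for a fixed interval, then restore.
    # Hypothetical example only; run with sudo on macOS/Linux.
    import shutil
    import time

    HOSTS = "/etc/hosts"
    BACKUP = "/etc/hosts.distraction-block.bak"
    BLOCKED = ["facebook.com", "www.facebook.com", "twitter.com"]

    def block(minutes):
        shutil.copy(HOSTS, BACKUP)                # keep a copy to restore later
        with open(HOSTS, "a") as f:
            for domain in BLOCKED:
                f.write("\n127.0.0.1 " + domain)  # send the domain nowhere
        try:
            time.sleep(minutes * 60)              # stay blocked for the interval
        finally:
            shutil.move(BACKUP, HOSTS)            # restore the original file

    if __name__ == "__main__":
        block(60)                                 # one hour of enforced focus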

Other writers in the “Practical Tips” essay discuss the importance of setting work times (presumably distinct from Facebook times) or developing schedules or similar techniques to make sure you don’t let, say, six hours pass, then wonder what happened during those six hours—probable answers might include news, e-mail, social networks, TV, dabbling, rearranging your furniture, cleaning, whatever. All things that might be worthwhile, but only in their place. And Facebook’s place should be small, no matter how much the site itself encourages you to make it big. I’ll probably log on to Facebook again, and I’m not saying you should never use Facebook, or that you should always avoid the Internet. But you should be cognizant of what you’re doing, and Facebook is making it increasingly easy not to be cognizant. And that’s a danger.

I was talking to my Dad, who recently got on Facebook—along with Curtis Sittenfeld joining, this is a sure sign Facebook is over—and he was creeped out by having Pandora find his Facebook account with no active effort on his part; the same thing happened when he was posting to TripAdvisor under what he thought was a pseudonym. On the phone, he said that everyone is living in Neuromancer. And he’s right. Facebook is trying to connect you in more and more places, even places you might not necessarily want to be connected. This isn’t a phenomenon unique to Facebook, of course, but my Dad’s experience shows what’s happening in the background of your online life: companies are gathering data from you that will reappear in unpredictable places.

There are defenses against the creeping power of master databases. I’ve begun using Ghostery, a brilliant extension for Firefox, Safari, and Chrome that lets you see the web bugs, beacons, and third-party sites that follow your movements around the Internet. Here’s an example of the stuff Salon.com, a relatively innocuous news site, loads every time a person visits:

What is all that stuff? It’s like the mystery ingredients in so much prepackaged food: you wonder what all those polysyllabic substances are but still know, on some level, they can’t be good for you. In the case of Salon.com’s third-party tracking software, Ghostery can at least tell you what’s going on. It also gives you a way to block a lot of the tracking—hence the strikethroughs on the sites I’ve blocked. The more astute among you will note that I’m something of a hypocrite when it comes to a data trail—I still buy stuff from Amazon.com, which keeps your purchase history forever—but at least you can, to some extent, fight back against the companies that are tracking everything you do.
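(If you want to see a sliver of this for yourself, here is a minimal, hypothetical sketch, not how Ghostery actually works, that fetches a page and lists the third-party hosts its static script tags pull in. It misses trackers injected dynamically by JavaScript, which is many of them; it assumes the requests and beautifulsoup4 packages are installed, and the Salon URL is just an example.)

    # List third-party script hosts embedded in a page's HTML.
    # Rough illustration only; dynamically injected trackers won't show up here.
    from urllib.parse import urlparse

    import requests
    from bs4 import BeautifulSoup

    def third_party_script_hosts(url):
        page_host = urlparse(url).netloc
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        hosts = set()
        for tag in soup.find_all("script", src=True):
            host = urlparse(tag["src"]).netloc  # empty for same-site relative paths
            if host and host != page_host:
                hosts.add(host)
        return hosts

    if __name__ == "__main__":
        for host in sorted(third_party_script_hosts("https://www.salon.com/")):
            print(host)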

But fighting back technologically, through means like Ghostery, is only part of the battle. After I started writing this essay, I began to notice things like this, via a Savage Love letter writer:

I was briefly dating someone until he was a huge asshole to me. I have since not had any contact with him. However, I have been Facebook stalking him and obsessing over pictures of the guys I assume he’s dating now. Why am I having such a hard time getting over him? Our relationship was so brief! He’s a major asshole!

I don’t think Facebook is making it easier for the letter writer to get over him or improve her life. It wouldn’t be a great stretch to think Facebook is making the process harder. So maybe the solution is to get rid of Facebook, or at least limit one’s use, or unfriend the ex, or some combination thereof. Go to a bar, find someone else, reconnect with the real world, find a hobby, start a blog, realize that you’re not the first person with these problems. Optimal revenge, if you’re the sort of person who goes in that direction, is a life well-lived. Facebook stalking is the opposite: it’s a life lived through the lives of others, without even the transformative power of language that media like the novel offer.

Obviously, obsessive behavior predated the Internet. But the Internet and Facebook make obsessive behavior so much easier to engage in—you don’t even have to leave your house!—that the lower friction costs practically invite it. One solution: remove the tool by which you engage in said obsessive behavior. Dan Savage observes, “But it sounds like you might still have feelings for this guy! Just a hunch!” And if those feelings aren’t reciprocated, being exposed to the source of those feelings on a routine basis, even in digital form, isn’t going to help. What is going to help? Finding an authentic way of spending your time; learning to get in a state of flow; building or making stuff that other people find useful. Notice that Facebook is not on that list.

Some of you might legitimately ask why I keep a Facebook account, given my ambivalence, verging on antipathy. The answers are severalfold: the most honest is probably that I’m a hypocrite. The next-most honest is that, if / when my novels start coming out, Facebook might be useful as an ad tool. And some people use Facebook and only Facebook to send out messages about events and parties. It’s also useful, when I’m going to a random city, for figuring out who might’ve moved there. Those people you lost touch with back in college suddenly become much closer when you’re both strangers somewhere.

But those are rare needs. The common needs that Facebook fulfills—to quasi-live through someone else’s life, to waste time, to feel like you’re on an anhedonic treadmill of envy—shouldn’t be needs at all. Facebook is encouraging you to make them needs. I’m encouraging you to realize that the real answers to life aren’t likely to be found on Facebook, no matter how badly Facebook wants to lure you to that login screen—they’re likely going to be found within.


By the way, I love “Practical Tips on Writing a Book from 23 Brilliant Authors.” I’ve read it a couple times and still love it. It’s got a lot of surface area for such a short post, which is why I keep linking to it in various contexts.

From the Department of “No Shit:” technology and computers are not silver bullets for education

File this New York Times article under “no shit:”

But to many education experts, something is not adding up — here and across the country. In a nutshell: schools are spending billions on technology, even as they cut budgets and lay off teachers, with little proof that this approach is improving basic learning.

This conundrum calls into question one of the most significant contemporary educational movements.

There is no silver bullet and no New Jesus for education. There never will be, but the search goes on, because it’s easier to search for the magic methodology that will solve seemingly intractable problems than it is to admit the thing Google, Facebook, and Microsoft have realized about software engineers: talent, motivation, and tenacity vary greatly among individuals, and you can’t merely take someone who lacks all three, put them through some kind of system or give them some kind of tool, and expect everyone to be equally good on the other side. That kind of thing works passably well if you’re building widgets on an assembly line, but it works terribly for any kind of creative or intellectual work.

Nonetheless, it’s much easier to search for that magic methodology to improve very old skills that are surprisingly resistant to methodology: reading, writing, and math. None of those fundamental skills has changed much in the last century. Yet we keep searching for that formula that doesn’t exist, because teaching and learning are inherently hard, like software engineering, math, writing, and any number of other complex but vital skills. Technologies—now we have special blackboards! They blink! They light up! They’re new!—might mask the essential difficulty of the task, but they can’t remove it, much as some programming IDEs try to hide some of the essential difficulty of coding. But fields that are essentially difficult can’t be mastered by sleight-of-hand or new whizzy gadgets.

They can only be mastered by people who are dedicated to the craft and to continuous self-improvement. The people who, because they believe they can, can. The ones whose tenacity is boundless and who aren’t willing to blame external circumstances.

You need a weird set of skills to teach effectively: you need to empathize with your students—to be able to see from their point of view—without becoming mired in their point of view. You need to master your field, but not in such a way that you lose the beginner’s mindset required to master the field in the first place. You need the stoic’s attitude of realizing you can’t control everything while still having the achiever’s mindset that you must strive to do the best you can, no matter what. You need to be willing to try new things and ideas while not leaving behind the old ones that work. You need to remember not everyone is interested in the things you’re interested in, and you need to do whatever it takes to make subjects more interesting than they would be otherwise. You need to find that profitable zone of challenge for most students—something hard enough to make them struggle but not so hard that it’s impossible to accomplish. It’s reasonable to expect college freshmen to be able to read a story or article on their own, but it’s not reasonable to expect them to pick up and digest every nuance on their own. Some will. Most won’t. You need to be enthusiastic, because enthusiasm is as contagious as boredom, but your job isn’t to be a cheerleader and enthusiasm can’t substitute for knowledge. You need, in other words, a bunch of paradoxical traits that balance each other.

You also need to realize that students need things broken down in steps, and need to learn by example and through discussion. Last week I taught Neal Stephenson’s 2005 New York Times opinion piece, “Turn On, Tune In, Veg Out.” Whenever I do, I let the class talk for a while at the beginning; when discussion dies, I ask students to do a simple activity: write the essay’s main point in a sentence or two. Then I come around and look at the sentences.

It should be simple, right? Read the piece, find the main point. But it’s not simple. It’s actually quite hard, and most people are bad readers (myself included). When I go around and look at sentences, lots of students get caught up on the distinction between geeking and vegging out. Others think the piece is primarily about Star Wars. Only a few—usually around five of fifty—get the essential elements of the main point.

Stephenson basically says, twice, that he’s using Star Wars as a metaphor: once in the third paragraph: “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” and once more in the last paragraph: “If the “Star Wars” movies are remembered a century from now, it’ll be because they are such exact parables for this state of affairs” (emphasis added). But most students haven’t learned how to think metaphorically, as writers do. Metaphor is one of those essential ways of thinking that people need to be effective writers. In On Writing Stephen King says:

The use of simile and other figurative language is one of the chief delights of fiction—reading it and writing it, as well. When it’s on target, a simile delights us in much the same way meeting an old friend in a crowd of strangers does. By comparing two seemingly unrelated objects—a restaurant bar and a cave, a mirror and a mirage—we are sometimes able to see an old thing in a new and vivid way. Even if the result is mere clarity instead of beauty, I think writer and reader are participating together in a kind of miracle. Maybe that’s drawing it a little strong, but yeah—it’s what I believe.

In How Fiction Works, James Wood says:

Metaphor is analogous to fiction, because it floats a rival reality. It is the entire imaginative process in one move. If I compare the slates on a roof to an armadillo’s back, or – as I did earlier – the bald patch on the top of my head to a crop circle (or on very bad days, to the kind of flattened ring of grass that a helicopter’s blades make when it lands in a field), I am asking you to do what Conrad said fiction should make you do – see. I am asking you to imagine another dimension, to picture likeness. Every metaphor or simile is a little explosion of fiction within the larger fiction of the novel or story.

Again: that’s hard. And technology isn’t going to make it any easier to start thinking about metaphors, which is probably a precursor to writing in a way that uses metaphor deftly. Before you can do that, you’re probably going to need to recognize when other writers are doing it, and yet, although Stephenson says twice that he’s doing it, most students don’t pick up on it. This isn’t to blame them, by the way—a lot of my graduate seminars are still about what the writer actually says. Some of you are probably getting caught up on this discussion of metaphor and think that I’m really writing about how it’s important for students to learn metaphor, when this is only a subsidiary point supporting my main point about the place of technology in classrooms. Here’s Wood again on the subject of learning to read:

You only have to teach literature to realise that most young readers are poor noticers. I know from my own old books, wantonly annotated twenty years ago when I was a student, that I routinely underlined for approval details and images and metaphors that strike me now as commonplace, while serenely missing things which now seem wonderful. We grow, as readers, and twenty-year-olds are relative virgins.

If he wasn’t a good noticer at 20, what hope is there for the rest of us? And how is having a laptop going to help someone become a better noticer? Consider too one other thing to notice: in “Turn On, Tune In, Veg Out,” Stephenson isn’t using any complex or convoluted vocabulary. His sentence structure isn’t very complex; there aren’t lots of nasty nested clauses you have to mentally sort out to figure out what’s being talked about, as there often are in abstruse literary theory and philosophy. His piece isn’t hard to read. But it’s still evidently hard for many freshmen to understand. So I spend a lot of time working towards understanding, towards reading for detail, towards asking, “Where do you see that?” Technology isn’t going to help that process very much. It may even hurt it by offering a proliferating number of distractions: if you interrupt your reading of “Turn On, Tune In, Veg Out” four times for text messages and once for an e-mail, are you going to remember how Stephenson said “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” by the end?

I’m teaching honors students, which is easier than teaching standard classes, which is in turn easier than teaching in tough inner-city schools. So I don’t face the same challenges as some of the teachers mentioned in the NYT article. But sometimes I think about Daniel Singal’s Atlantic article, “The Other Crisis in American Education: A college professor looks at the forgotten victims of our mediocre educational system–the potentially high achievers whose SAT scores have fallen, and who read less, understand less of what they read, and know less than the top students of a generation ago.” As the subtitle implies, he argues that the best students aren’t as challenged as they once were. I can’t tell if he’s right or if he’s hearkening back to a mythical golden age, but I do think about his work sometimes when I see what’s going on around me: other grad students and professors want to watch movies in class, or they aren’t even focused on imparting and enhancing basic reading and writing skills—the same ones pretty much everyone needs. Are the strongest students really getting something out of their classes? Is the technology really helping? If not, could it be part of what’s actually causing “our mediocre educational system?” I’m not saying it does, but I am saying it’s worth pondering.

Still, I think the strongest thinkers and learners—the ones who are now running Google and Facebook, the ones who are now partners in law firms and building their own businesses—are doing fine. Better than ever, maybe. Generation X was supposed to be the slacker generation, but its members built large blocks of the Internet—the same set of technologies you’re almost certainly using to read this. But I wonder if there’s not a growing bifurcation between the people who are doing really well and the ones who aren’t. In income terms, that’s certainly true, but I wonder if it’s happening in intellectual terms too. Stephenson thinks so: “Nothing is more seductive than to think that we, like the Jedi, could be masters of the most advanced technologies while living simple lives: to have a geek standard of living and spend our copious leisure time vegging out.” But the people who spend all that time vegging out aren’t going to create whatever the next iPod and Facebook will be. And they won’t reap those rewards, either. They’re the ones who might be “learning less,” as Singal has it. The people who make the next iPod and Facebook will be the ones who are focused on “geeking out” regarding important topics. The ones who will, I hope, have teachers—whether in honors or not—who are focused on the essential questions that imparting knowledge involves.

By the way, I’m not trying to beat up college freshmen—if I were, I wouldn’t have the empathy necessary to be good. A lot of college seniors are little better than my freshmen, which I found out by working for Steven Klein at the Steven Klein LSAT Company. The LSAT is mostly a test of reading. If you can read effectively, you’ll do pretty well. But a lot of 22 – 24-year-old college graduates had a lot of trouble on reading comprehension because they couldn’t or hadn’t been trained to look at every word, evaluate it in relation to other words and in relation to the context of the passage, and understand what it means. I think back to those experiences when I read books like Richard Arum and Josipa Roksa’s Academically Adrift: Limited Learning on College Campuses or articles like the one about cool whizzy tech stuff in classrooms. The whizzy tech stuff isn’t going to help readers when they’re facing the LSAT.

A second “by the way” is in order: I’m neither trying to denigrate technology nor be a luddite—I say so as the guy typing at a fancy keyboard, sitting in an ergonomic chair, in front of a 27″ iMac, with a bunch of TextMate windows open. Computers make the mechanical process of writing easier, so that the hard stuff—the stuff that goes on in the mind—can dominate. Technology is great—in its place. The University of Arizona has computers and projectors and other neat stuff in many classrooms, and if that neat stuff is available, I use it.

But technology complements other skills; it doesn’t substitute for them. You can only use computers effectively to the extent you can read, write, and do simple math effectively—try programming without algebra. Or try to extract information from man pages without strong reading comprehension skills; hell, I like to imagine myself as being at least moderately literate, and I find some of them tough. So this is not one of those tedious essays in which Old Man Withers shakes his cane and complains about the kids with those damn beeping gizmos, sending those darned pictures of each other’s privates around, and get off my damn lawn. Plus, I’m too young to shake my cane; I ran a modest but real number of miles yesterday. Even when I do have a cane someday, I hope 1) that it has a hidden sword, because that kind of thing is cool, and 2) that I haven’t ossified to the point where I’m not willing to learn new things.

But this is an essay that points out how basic skills and the means of imparting those basic skills haven’t changed so much, as Amanda Ripley’s Atlantic article, “What Makes a Great Teacher?” makes clear in its discussion of what great teachers do:

First, great teachers tended to set big goals for their students. They were also perpetually looking for ways to improve their effectiveness. [. . .] Superstar teachers had four other tendencies in common: they avidly recruited students and their families into the process; they maintained focus, ensuring that everything they did contributed to student learning; they planned exhaustively and purposefully—for the next day or the year ahead—by working backward from the desired outcome; and they worked relentlessly, refusing to surrender to the combined menaces of poverty, bureaucracy, and budgetary shortfalls.

Notice the thing absent from this list: using computers, iPads, and so forth. Sure, those great teachers could use technology, but they don’t need to. And the technology is not going to automatically make an indifferent teacher set big goals or recruit families or maintain focus or plan. Used poorly, it’s just going to provide some flash and pizazz and some distractions. Check out this Marginal Revolution discussion of a study looking at how introducing computers in poor households actually decreased student grades because students spent more time playing games on them than doing homework:

Not surprisingly, with all that game playing going on, the authors find that the voucher program actually resulted in a decline in grades although there was also some evidence for an increase in computer proficiency and perhaps some improvement in a cognitive test.

See also “Computers at Home: Educational Hope vs. Teenage Reality:” “Students posted significantly lower math test scores after the first broadband service provider showed up in their neighborhood, and significantly lower reading scores as well when the number of broadband providers passed four.” These reports should give technology cheerleaders pause: you aren’t going to get better results simply by lashing a computer on a teacher’s back and telling him to use it.*

To be a good teacher, you still need that weird skill set and mindset mentioned above. If you don’t have it or aren’t willing to develop it, I doubt anything else imposed on an individual teacher from the outside, like mandates to use technology, is going to do much for that teacher or for his or her students. If you want to really improve teaching, you’ll need to take an approach similar to the one Facebook and Google take to hiring hackers, which means a relentless focus not on degrees that offer dubious value in predicting achievement but on finding the best people and making sure they stay. Finding the best teachers is different from finding programmers—you probably can’t tell who’s going to be a good teacher before they hit the classroom—but you can at least acknowledge that you’re not going to get people who are good merely by saying, “use iPads in the classroom.” Steve Jobs and Bill Gates didn’t have iPads in their classrooms growing up, and maybe that’s part of what made Jobs able to have the vision necessary to Turn On, Geek Out, and make the iPad.


* I had a computer in middle and early high school that I used to master Starcraft and various other computer games, until I somehow realized I was wasting my life and smashed my Starcraft disks in the driveway. I sometimes use this analogy when I explain the situation to friends: some people can handle snorting the occasional line of coke without getting addicted; it’s just a fun way of spending a Saturday night. Some people can handle computer games in the same way. I discovered, at the time, that I’m not one of them, and, worse, I’ll never get those three or so wasted years back. Now I tend to find video games boring on average and can’t play for longer than half an hour to an hour at a stretch, while I’ve trained myself up to being able to write effectively for three to six hours at a time. The first draft of this essay, for example, took me about two hours.

From the Department of "No Shit:" technology and computers are not silver bullets for education

File this New York Times article under “no shit:”

But to many education experts, something is not adding up — here and across the country. In a nutshell: schools are spending billions on technology, even as they cut budgets and lay off teachers, with little proof that this approach is improving basic learning.

This conundrum calls into question one of the most significant contemporary educational movements.

There is no silver bullet and no New Jesus for education. There never will be, but the search goes on, because it’s easier to search for the magic methodology that will solve seemingly intractable problems than it is to admit the thing Google, Facebook, and Microsoft have realized about software engineers: talent, motivation, and tenacity vary greatly among individuals and that you can’t merely take someone who lacks all three, put them through some kind of system or give them some kind of tool, and expect everyone to be equally good on the other side. That kind of thing works passable well if you’re building widgets on an assembly line, but it works terribly for any kind of creative or intellectual work.

Nonetheless, it’s much easier to search for that magic methodology to improve very old skills that are surprisingly resistant to methodology: reading, writing, and math. None of those fundamental skills has changed much in the last century. Yet we keep searching for that formula that doesn’t exist, because teaching and learning are inherently hard, like software engineering, math, writing, and any number of other complex but vital skills. Technologies—now we have special blackboards! They blink! They light up! They’re new!—might mask the essential difficulty of task, but they can’t remove it, much as some programming IDEs try to hide some of the essential difficulty of coding. But fields that are essential difficult can’t be mastered by sleight-of-hand or new whizzy gadgets.

They can only be mastered by people who are dedicated to the craft and to continuous self-improvement. The people who, because they believe they can, can. The ones whose tenacity is boundless and who aren’t willing to blame external circumstances.

You need a weird set of skills to teach effectively: you need to empathize with your students—to be able to see from their point of view—without becoming mired in their point of view. You need to master your field, but not in such a way that you lose the beginner’s mindset required to master the field in the first place. You need the stoic’s attitude of realizing you can’t control everything while still having the achiever’s mindset that you must strive to do the best you can, no matter what. You need to be willing to try new things and ideas while not leaving behind the old ones that work. You need to remember not everyone is interested in the things you’re interested in, and you need to do whatever it takes to make subjects more interesting than they would be otherwise. You need to find that profitable zone of challenge for most students—something hard enough to make them struggle but not so hard that it’s impossible to accomplish—it’s reasonable to expect college freshmen to be able to read a story or article on their own, but it’s not reasonable to expect them to pick up and digest every nuance on their own. Some will. Most won’t. You need to be enthusiastic, because enthusiasm is as contagious as boredom, but your job isn’t to be a cheerleader and enthusiasm can’t substitute for knowledge. You need, in other words, a bunch of paradoxical traits that balance each other.

You also need to realize that students need things broken down in steps, and need to learn by example and through discussion. Last week I taught Neal Stephenson’s 2005 New York Times opinion piece, “Turn On, Tune In, Veg Out.” Whenever I do, I let the class talk for a while at the beginning; when discussion dies, I ask students to do a simple activity: write the essay’s main point in a sentence or two. Then I come around and look at the sentences.

It should be simple, right? Read the piece, find the main point. But it’s not simple. It’s actually quite hard, and most people are bad readers (myself included). When I go around and look at sentences, lots of students get caught up on the distinction between geeking and vegging out. Others think the piece is primarily about Star Wars. Only a few—usually around five of fifty—get the essential elements of the main point.

Stephenson basically says, twice, that he’s using Star Wars as a metaphor: once in the third paragraph: “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” and once more in the last paragraph: “If the “Star Wars” movies are remembered a century from now, it’ll be because they are such exact parables for this state of affairs” (emphasis added). But most students haven’t learned how to think metaphorically, as writers do. Metaphor is one of those essential ways of thinking that people need to be effective writers. In On Writing Stephen King says:

The use of simile and other figurative language is one of the chief delights of fiction—reading it and writing it, as well. When it’s on target, a simile delights us in much the same way meeting an old friend in a crowd of strangers does. By comparing two seemingly unrelated objects—a restaurant bar and a cave, a mirror and a mirage—we are sometimes able to see an old thing in a new and vivid way. Even if the result is mere clarity instead of beauty, I think writer and reader are participating together in a kind of miracle. Maybe that’s drawing it a little strong, but yeah—it’s what I believe.

In How Fiction Works, James Wood says:

“Metaphor is analogous to fiction, because it floats a rival reality. It is the entire imaginative process in one move. If I compare the slates on a roof to an armadillo’s back, or – as I did earlier – the bald patch on the top of my head to a crop circle (or on very bad days, to the kind of flattened ring of grass that a helicopter’s blades make when it lands in a field), I am asking you to do what Conrad said fiction should make you do – see. I am asking you to imagine another dimension, to picture likeness. Every metaphor or simile is a little explosion of fiction within the lager fiction of the novel or story.

Again: that’s hard. And technology isn’t going to make it any easier to start thinking about metaphors, which is probably a precursor to writing in a way that uses metaphor deftly. Before you can do that, you’re probably going to need to recognize when other writers are doing it, and yet, while Stephenson says that he is twice, most students don’t pick up on it. This isn’t to blame them, by the way—a lot of my graduate seminars are still about what the writer actually says. Some of you are probably getting caught up on this discussion of metaphor and think that I’m really writing about how it’s important for students to learn, when this is only a subsidiary point supporting my main point about the place of technology in classrooms. Here’s Wood again on the subject of learning to read:

You only have to teach literature to realise that most young readers are poor noticiers. I know from my own old books, wantonly annotated twenty years ago when I was a student, that I routinely underlined for approval details and images and metaphors that strike me now as commonplace, while serenely missing things which now seem wonderful. We grow, as readers, and twenty-year-olds are relative virgins.

If he wasn’t a good noticer at 20, what hope is there for the rest of us? And how is having a laptop going to help someone become a better noticer? Consider too one other thing to notice: in “Turn On, Tune In, Veg Out,” Stephenson isn’t using any complex or convoluted vocabulary. His sentence structure isn’t very complex; there aren’t lots of nasty nested clauses you have to mentally sort out to figure out what’s being talked about, as there are often are in abstruse literary theory and philosophy. His piece isn’t hard to read. But it’s still evidently hard for many freshmen to understand. So I spend a lot of time working towards understanding, towards reading for detail, towards asking, “Where do you see that?” Technology isn’t going to help that process very much. It may even hurt it by offering a proliferating number of distractions: if you interrupt your reading of “Turn On, Tune In, Veg Out” four times for text messages and once for an e-mail, are you going to remember how Stephenson said “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” by the end?

I’m teaching honors students, which is easier than teaching standard classes, which is in turn easier than teaching in tough inner-city schools. So I don’t face the same challenges as some of the teachers mentioned in the NYT article. But sometimes I think about Daniel Singal’s Atlantic article, “The Other Crisis in American Education: A college professor looks at the forgotten victims of our mediocre educational system–the potentially high achievers whose SAT scores have fallen, and who read less, understand less of what they read, and know less than the top students of a generation ago.” As the subtitle implies, he argues that the best students aren’t as challenged as they once were. I can’t tell if he’s right or if he’s hearkening back to a mythical golden age, but I do think about his work sometimes when I see what’s going on around me: other grad students and professors want to watch movies in class, or they aren’t even focused on imparting and enhancing basic reading and writing skills—the same ones pretty much everyone needs. Are the strongest students really getting something out of their classes? Is the technology really helping? If not, could it be part of what’s actually causing “our mediocre educational system?” I’m not saying it does, but I am saying it’s worth pondering.

Still, I think the strongest thinkers and learners—the ones who are now running Google and Facebook, the ones who are now partners in law firms and building their own businesses—are doing fine. Better than ever, maybe. Generation X was supposed to be the slacker generation, but its members built large blocks of the Internet—the same set of technologies you’re almost certainly using to read this. But I wonder if there’s not a growing bifurcation between the people who are doing really well and the ones who aren’t. In income terms, that’s certainly true, but I wonder if it’s happening in intellectual terms too. Stephenson thinks so: “Nothing is more seductive than to think that we, like the Jedi, could be masters of the most advanced technologies while living simple lives: to have a geek standard of living and spend our copious leisure time vegging out.” But the people who spend all that time vegging out aren’t going to create whatever the next iPod and Facebook will be. And they won’t reap those rewards, either. They’re the ones who might be “learning less,” as Singal has it. The people who make the next iPod and Facebook will be the one who are focused on “geeking out” regarding important topics. The ones who will, I hope, have teachers—whether in honors or not—who are focused on the essential questions that imparting knowledge involves.

By the way, I’m not trying to beat up college freshmen—if I were, I wouldn’t have the empathy necessary to be good. A lot of college seniors are little better than my freshmen, which I found out by working for Steven Klein at the Steven Klein LSAT Company. The LSAT is mostly a test of reading. If you can read effectively, you’ll do pretty well. But a lot of 22 – 24-year-old college graduates had a lot of trouble on reading comprehension because they couldn’t or hadn’t been trained to look at every word, evaluate it in relation to other words and in relation to the context of the passage, and understand what it means. I think back to those experiences when I read books like Richard Arum and Josipa Roksa’s Academically Adrift: Limited Learning on College Campuses or articles like the one about cool whizzy tech stuff in classrooms. The whizzy tech stuff isn’t going to help readers when they’re facing the LSAT.

A second “by the way” is in order: I’m neither trying to denigrate technology nor be a luddite—I say so as the guy typing on a fancy keyboard, ergonomic chair, and 27″ iMac, with a bunch of Textmate windows open. Computers make the mechanical process of writing easier, so that the hard stuff—the stuff that goes on in the mind—can dominate. Technology is great—in its place. The University of Arizona has computers and projectors and other neat stuff in many classrooms, and if that neat stuff is available, I use it.

But technology complements other skills; it doesn't substitute for them. You can only use computers effectively to the extent you can read, write, and do simple math effectively—try programming without algebra. Or try to extract information from man pages without strong reading comprehension skills; hell, I like to imagine myself as at least moderately literate, and I find some of them tough. So this is not one of those tedious essays in which Old Man Withers shakes his cane, complains about the kids with those damn beeping gizmos sending those darned pictures of each other's privates around, and tells everyone to get off his damn lawn. Plus, I'm too young to shake my cane; I ran a modest but real number of miles yesterday. Even when I do have a cane someday, I hope 1) that it has a hidden sword, because that kind of thing is cool, and 2) that I haven't ossified to the point where I'm not willing to learn new things.

But this is an essay that points out how basic skills, and the means of imparting those basic skills, haven't changed much, as Amanda Ripley's Atlantic article "What Makes a Great Teacher?" makes clear in its discussion of what great teachers do:

First, great teachers tended to set big goals for their students. They were also perpetually looking for ways to improve their effectiveness. [. . .] Superstar teachers had four other tendencies in common: they avidly recruited students and their families into the process; they maintained focus, ensuring that everything they did contributed to student learning; they planned exhaustively and purposefully—for the next day or the year ahead—by working backward from the desired outcome; and they worked relentlessly, refusing to surrender to the combined menaces of poverty, bureaucracy, and budgetary shortfalls.

Notice what's absent from this list: using computers, iPads, and so forth. Sure, those great teachers could use technology, but they don't need to. And the technology is not going to automatically make an indifferent teacher set big goals or recruit families or maintain focus or plan. Used poorly, it's just going to provide some flash and pizzazz and some distractions. Check out this Marginal Revolution discussion of a study finding that introducing computers into poor households actually decreased student grades, because students spent more time playing games on them than doing homework:

Not surprisingly, with all that game playing going on, the authors find that the voucher program actually resulted in a decline in grades although there was also some evidence for an increase in computer proficiency and perhaps some improvement in a cognitive test.

See also "Computers at Home: Educational Hope vs. Teenage Reality": "Students posted significantly lower math test scores after the first broadband service provider showed up in their neighborhood, and significantly lower reading scores as well when the number of broadband providers passed four." These reports should give technology cheerleaders pause: you aren't going to get better results simply by lashing a computer to a teacher's back and telling him to use it.*

To be a good teacher, you still need the particular combination of skills and mindset mentioned above. If you don't have it or aren't willing to develop it, I doubt anything else imposed on an individual teacher from the outside, like mandates to use technology, is going to do much for that teacher or for his or her students. If you want to really improve teaching, you'll need to take an approach similar to the one Facebook and Google take to hiring hackers, which means a relentless focus not on degrees that offer dubious value in predicting achievement but on finding the best people and making sure they stay. Finding the best teachers is different from finding programmers—you probably can't tell who's going to be a good teacher before they hit the classroom—but you can at least acknowledge that you're not going to get good people merely by saying, "use iPads in the classroom." Steve Jobs and Bill Gates didn't have iPads in their classrooms growing up, and maybe that's part of what gave Jobs the vision necessary to Turn On, Geek Out, and make the iPad.


* I had a computer in middle and early high school that I used to master Starcraft and various other computer games, until I somehow realized I was wasting my life and smashed my Starcraft disks in the driveway. I sometimes use this analogy when I explain the situation to friends: some people can handle snorting the occasional line of coke without getting addicted; it’s just a fun way of spending a Saturday night. Some people can handle computer games in the same way. I discovered, at the time, that I’m not one of them, and, worse, I’ll never get those three or so wasted years back. Now I tend to find video games boring on average and can’t play for longer than half an hour to an hour at a stretch, while I’ve trained myself up to being able to write effectively for three to six hours at a time. The first draft of this essay, for example, took me about two hours.

Week 36 Links: A Jane Austen Education, what Facebook is like, The Longform.org Guide to the Porn Industry, A Game of Thrones as comedy, and more

Who put a bikini on this poor statue?

* Reading this review of William Deresiewicz’s A Jane Austen Education is bizarre because it’s like reading about myself, right down to the love for Madame Bovary:

In 1990, William Deresiewicz was on his way to gaining a Ph.D. in English literature at Columbia University. Describing that time in the opening pages of his sharp, endearingly self-effacing new book, “A Jane Austen Education,” Deresiewicz explains that he faced one crucial obstacle. He loathed not just Jane Austen but the entire gang of 19th-century British novelists: Hardy, Dickens, Eliot . . . the lot.

At 26, Deresiewicz wasn’t experiencing the hatred born of surfeit that Mark Twain described when he told a friend, “Every time I read ‘Pride and Prejudice’ I want to dig her up and hit her over the skull with her own shinbone.” What Deresiewicz (who has considerable fun at the expense of his pompous younger self) was going through was the rebel phase in which Dostoyevsky rules Planet Gloom, that stage during which the best available image of marriage is a prison gate.

Sardonic students do not, as Deresiewicz points out, make suitable shrine-tenders for a female novelist whose books, while short on wedding scenes, never skimp on proposals. Emma Bovary fulfilled all the young scholar's expectations of literary culture at its finest; Emma Woodhouse left him cold. "Her life," he lamented, "was impossibly narrow." Her story, such as it was, "seemed to consist of nothing more than a lot of chitchat among a bunch of commonplace characters in a country village." Hypochondriacal Mr. Woodhouse, garrulous Miss Bates — weren't these just the sort of bores Deresiewicz had spent his college years struggling to avoid? Maybe, he describes himself conceding, the sole redeeming feature of smug Miss Woodhouse was that she seemed to share his distaste for the dull society of Highbury.

The major difference is that I’m 27 and he describes himself at 26.

* A description of Facebook: “It seemed too much like tv, in reverse. Everybody transmits and nobody watches.” This is why I read Hacker News comments.

* Slate.com posted "The Longform.org Guide to the Porn Industry," which has a bunch of fascinating essays that are safe for work in the sense that they don't have explicit photos. None are quite as good as David Foster Wallace's "Big Red Son" in Consider the Lobster, but that's like accusing a basketball player of not being Michael Jordan. As I read the essays, I kept thinking of Philip K. Dick in some inchoate fashion—perhaps because he's done so much to shape my thinking about reality and unreality.

Anyway, the next couple of links stem from the Slate guide:

* “Larry Flynt used to defend Hustler by calling the nude photo layouts “art.” I would come to joke that the porn video is indigenous Southern California folk art. The cheesy aesthetic — shag-carpet backdrops, tanning-salon chic, bad music, worse hairdos — and the everyman approach to exhibitionism are honest expressions of life in the land of mini-malls, vanity plates and instant stardom.” Evan Wright.

* “Those who enjoy whatever private pleasure is to be gained from receiving physical pain publicly would appear not to overlap at all with those who enjoy whatever private pleasure is to be gained from inflicting shame collectively.” From an article nominally about Sasha Grey and the porn industry, but really about expectations in cultural narratives of shame and redemption.

* "I'd call you right now, but I think you're attending a retrograde ceremony for the artificial binding of two people in a legal contract regarding their sexual and financial behavior. I hope said ceremony at least has an open bar."

* “Publishing—at least in general, and at least below the very top echelons of management—is not a fast-paced business, and the sense of urgency and desire for efficiency you might find in the offices of an investment bank or law firm don’t generally exist, simply because publishing doesn’t generally attract the sorts of people you often find in those fields.” This may bode ill for the future of the industry as it exists now.

* “I don’t know exactly what the future [of publishing] will look like, but I’m not too worried about it. This sort of change tends to create as many good things as it kills. Indeed, the really interesting question is not what will happen to existing forms, but what new forms will appear.”

* This is pretty funny: “A Game of Thrones” (the TV show) as a buddy comedy.

On blogging altruistically or narcissistically and why Facebook is simply easier

The New York Times has an article, light on data and heavy on conjecture, claiming "Blogs Wane as the Young Drift to Sites Like Twitter." A sample: "Former bloggers said they were too busy to write lengthy posts and were uninspired by a lack of readers." This Hacker News comment describes the blogging situation well:

I think there are two ways to blog: altruistically or narcissistically. If you’re blogging altruistically you’re blogging for others primarily and yourself secondarily. If you’re blogging narcissistically you’re mostly blogging for yourself.

Most of the great blogs that I visit are all done altruistically. They are well maintained, post useful information, and very rarely waste my time. They also require a huge amount of effort on the part of the blogger because they really have to do work to gather and present interesting and useful information for their readers.

What a lot of the press has referred to as blogging is “narcissistic.” Instead of coming up with interesting information and vetting it for their readers they mostly just spew whatever thoughts they had that day onto the page. It doesn’t take a huge amount of effort, but the signal to noise ratio is also very low.

It’s really hard to write stuff that will be interesting to people who don’t know you and have no real connection to you. I know because I’ve been writing The Story’s Story for three years and change. Over that time, it became obvious that producing at least one meaningful post a week is difficult. If writing in such a way that other people actually want to read your work weren’t so difficult, we wouldn’t have nearly as many professional writers as we do.

If your goal is mostly to bask in the relative adulation of others, you can probably do it more efficiently (and narcissistically) via Facebook. Look at the large number of girls who post bikini or MySpace shots and wait for the comments to roll in (note: they are doing this rationally). If your goal is mostly to communicate something substantive, you’re going to find that it’s not five or ten times harder than posting a 140-character message on FB or Twitter—it’s 50 or 100 times harder. Twitter is easier than “A list of N things” and “A list of N things” is easier than a blog post and a blog post is easier than an essay.

People who want to be real writers (or filmmakers, or whatever), in the sense that strangers with no prior connection to them will find their work useful, will probably still blog or use equivalent outlets. But most of those who think they want to be real writers will probably find out precisely how hard it is to come up with useful and interesting stuff regularly. Then they'll quit, and the people who remain will be the ones who have the energy and skill to keep it up and write things people want to read.

I’m not against Twitter, but a while ago I posted this: “What can be said in 140 characters is either trivial or abridged; in the first case it would be better not to say it at all, and in the second case it would be better to give it the space it deserves.” The first part of that sentence can fit on Twitter, but the second part clarifies and reinforces the first.

Furthermore, real life can get in the way of substantive posts. At the moment, I'm recovering from the reading for my M.A. oral exam, which was Friday (I passed). As a result, I haven't written a lot of deep, detailed posts about books over the last month. I haven't written that many in general this year because the thing that used to be primarily my hobby—writing about books—has now been professionalized in the form of graduate school. So the energy that used to go into those posts now more often goes into my papers. Writing academic articles "counts" toward my career and toward eventually getting people to pay me money. Writing blog posts doesn't. I don't think the two are pure complements or pure substitutes, and I doubt I will ever stop writing a blog altogether, because a blog is an excellent outlet for ideas too short or underdeveloped for an article but still worth developing.

Plus, did I mention that good posts are hard to write? I think so, but I’ll mention it again here because I don’t think most people really appreciate that. Perhaps it’s best they don’t: if they did, they’d probably be less inclined to start a blog in the first place. The people who keep it up and keep doing it well have a mysterious habit of finding ways to get paid for it, either by writing books of their own or by finding an organizational umbrella (think of Megan McArdle or Matt Yglesias).

The number of people out there who have the inner drive to keep writing in the absence of external gratification is probably relatively small. I've made tens of dollars from "The Story's Story." The number of groupies who've flocked to me as a result of writing this blog is not notably large. Perhaps not surprisingly, most people will gravitate toward something easier, and I don't think I'm writing this solely to raise my own status or show people how hardcore or nice I am. I think I'm mostly writing it because it's true.

Early January links: Renting, leasing, and owning books, measuring teachers, sex and female success, Facebook, Borders, and more

* Why you should never, ever use two spaces after a period.

* Books owned and leased.

* The Problem of Measurement in evaluating teachers, with these problems still being better than the status quo of no measurement at all.

* Touching Your Junk: An Ontological Complaint.

* The sexual cost of female success. My favorite line:

Hookups happen just as much outside of college as in, if not more. But at colleges that have Greek systems, people are more likely to hook up. I mean, fraternities exist for this purpose — this is a cartel of men who have covenanted together to try to help the brothers access sex cheaply and without strings.

* And Now, For No Particular Reason, a Rant About Facebook, which is basically how I feel. Especially this, regarding why Scalzi uses Facebook: “Because other folks do, and they’re happy with it and I don’t mind making it easy for them to get in touch with me.” In college I also used it, like every other college student, to efficiently figure out which girls (in my case) or boys (in the case of some others) were single.

* The [Possible] Future of China? Look at Mexico.

* I had never considered the idea of moving to Latvia prior to reading this, from Marginal Revolution.

* Have we reached peak travel? (Here’s another view.)

* Apparently, the Nissan Leaf is pretty good.

* Borders may be about to die, and Megan McArdle precisely captures my feelings and practices regarding its demise. Like her, I like the idea of there being more bookstores, even as I order most of my books from Amazon and Abe Books because doing so is cheaper and more convenient.

* This is not good but, regardless of whether it’s good, may simply be the new state of things: “In essence, we have seen the rise of a large class of “zero marginal product workers,” to coin a term. Their productivity may not be literally zero, but it is lower than the cost of training, employing, and insuring them.”

* Is law school a losing game? Implied answer: yes. Actual answer for most people: also yes.

* What went wrong at Borders.

* Reading the book.

* Southwest Airlines pilot holds plane for murder victim’s family. Wow.

* Speaking of Slate, someone wrote in asking, "Is it legal to booby trap my house?" I can answer this: no, at least not lethally. You can read some discussion of a South Carolina statute here. Or see Wikipedia here.
