Facebook, go away—if I want to log in, I know where to find you

Facebook keeps sending me e-mails about how much I’m missing on Facebook; see the image at the right for one example. But I’m not convinced I’m missing anything, no matter how much Facebook wants me to imagine I am.

In “Practical Tips on Writing a Book from 23 Brilliant Authors,” Ben Casnocha says that writers need to “Develop a very serious plan for dealing with internet distractions. I use an app called Self-Control on my Mac.” Many other writers echo him. We have, all of us, a myriad of choices every day. We can choose to do something that might provide some lasting meaning or value. Or we can choose to tell people who are often effectively strangers what we ate for dinner, or that we’re listening to Lynyrd Skynyrd and Lil’ Wayne, or our ill-considered, inchoate opinions about the political or social scandal of the day, which will be forgotten by everybody except Wikipedia within a decade, if not a year.

Or we can choose to do something better—which increasingly means we have to control distractions—or, as Paul Graham puts it, “disconnect” them. Facebook and other entities that make money from providing distractions are, perhaps not surprisingly, very interested in getting you more interested in their distractions. That’s the purpose of their e-mails. But I’ve become increasingly convinced that Facebook offers something closer to simulacra than real life, and that the people who are going to do something really substantial are, increasingly, going to be the people who can master Facebook—just as the people who did really substantial things in the 1960–2005 period learned to master TV.

Other writers in the “Practical Tips” essay discuss the importance of setting work times (presumably distinct from Facebook times) or developing schedules or similar techniques to make sure you don’t let, say, six hours pass, then wonder what happened during those six hours—probable answers might include news, e-mail, social networks, TV, dabbling, rearranging your furniture, cleaning, whatever. All things that might be worthwhile, but only in their place. And Facebook’s place should be small, no matter how much the site itself encourages you to make it big. I’ll probably log on to Facebook again, and I’m not saying you should never use Facebook, or that you should always avoid the Internet. But you should be cognizant of what you’re doing, and Facebook is making it increasingly easy not to be cognizant. And that’s a danger.

I was talking to my Dad, who recently got on Facebook—along with Curtis Sittenfeld joining, this is a sure sign Facebook is over—and he was creeped out by having Pandora find his Facebook account with no active effort on his part; the same thing happened when he was posting to TripAdvisor under what he thought was a pseudonym. On the phone, he said that everyone is living in Neuromancer. And he’s right. Facebook is trying to connect you in more and more places, even places you might not necessarily want to be connected. This isn’t a phenomenon unique to Facebook, of course, but my Dad’s experience shows what’s happening in the background of your online life: companies are gathering data from you that will reappear in unpredictable places.

There are defenses against the creeping power of master databases. I’ve begun using Ghostery, a brilliant extension for Firefox, Safari, and Chrome that lets you see the web bugs, beacons, and third-party sites that follow your movements around the Internet. Here’s an example of the stuff Salon.com, a relatively innocuous news site, loads every time a person visits:

What is all that stuff? It’s like the mystery ingredients in so much prepackaged food: you wonder what all those polysyllabic substances are but still know, on some level, they can’t be good for you. In the case of Salon.com’s third-party tracking software, Ghostery can at least tell you what’s going on. It also gives you a way to block a lot of the tracking—hence the strikethroughs on the sites I’ve blocked. The more astute among you will note that I’m something of a hypocrite when it comes to a data trail—I still buy stuff from Amazon.com, which keeps your purchase history forever—but at least you can, to some extent, fight back against the companies that are tracking everything you do.
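For the curious, here is a minimal sketch, in Python, of the kind of thing Ghostery surfaces: it fetches a page and lists the third-party domains its script tags load from. This is illustrative only, not how Ghostery actually works; Ghostery runs as a browser extension and matches requests against a curated database of known trackers, and the Salon.com URL below is merely the example from above.

```python
# Illustrative sketch: list the third-party domains a page's <script> tags
# load from. This approximates what a tracker detector surfaces; it is NOT
# Ghostery's actual mechanism (a browser extension with a tracker database).
from html.parser import HTMLParser
from urllib.parse import urlparse
from urllib.request import urlopen

class ScriptSrcParser(HTMLParser):
    """Collect the src attribute of every <script> tag on a page."""
    def __init__(self):
        super().__init__()
        self.srcs = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            for name, value in attrs:
                if name == "src" and value:
                    self.srcs.append(value)

def third_party_script_hosts(url):
    """Return script-hosting domains that differ from the page's own domain."""
    page_host = urlparse(url).netloc
    html = urlopen(url).read().decode("utf-8", errors="replace")
    parser = ScriptSrcParser()
    parser.feed(html)
    hosts = {urlparse(src).netloc for src in parser.srcs}
    # Relative script paths have no netloc; drop them along with same-host scripts.
    return {h for h in hosts if h and h != page_host}

if __name__ == "__main__":
    # Example page; any news site's front page would do.
    for host in sorted(third_party_script_hosts("http://www.salon.com/")):
        print(host)
```

Even this crude approach reveals how many outside hosts a single page invites in; Ghostery goes further by naming the companies behind them and letting you block their requests.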

But fighting back technologically, through means like Ghostery, is only part of the battle. After I began writing this essay, I began to notice things like this, via a Savage Love letter writer:

I was briefly dating someone until he was a huge asshole to me. I have since not had any contact with him. However, I have been Facebook stalking him and obsessing over pictures of the guys I assume he’s dating now. Why am I having such a hard time getting over him? Our relationship was so brief! He’s a major asshole!

I don’t think Facebook is making it easier for the writer to get over him or improve her life. It wouldn’t be a great stretch to think Facebook is making the process harder. So maybe the solution is to get rid of Facebook, or at least limit one’s use, or unfriend the ex, or some combination thereof. Go to a bar, find someone else, reconnect with the real world, find a hobby, start a blog, realize that you’re not the first person with these problems. Optimal revenge, if you’re the sort of person who goes in that direction, is a life well-lived. Facebook stalking is the opposite: it’s a life lived through the lives of others, without even the transformative power of language that media like the novel offer.

Obviously, obsessive behavior predated the Internet. But the Internet and Facebook lower the friction costs of obsessive behavior—you don’t even have to leave your house!—and thereby make it much easier to indulge. One solution: remove the tool by which you engage in said obsessive behavior. Dan Savage observes, “But it sounds like you might still have feelings for this guy! Just a hunch!” And if those feelings aren’t reciprocated, being exposed to the source of those feelings on a routine basis, even in digital form, isn’t going to help. What is going to help? Finding an authentic way of spending your time; learning to get in a state of flow; building or making stuff that other people find useful. Notice that Facebook is not on that list.

Some of you might legitimately ask why I keep a Facebook account, given my ambivalence, verging on antipathy. The answers are severalfold: the most honest is probably that I’m a hypocrite. The next-most honest is that, if and when my novels start coming out, Facebook might be useful as an ad tool. And some people use Facebook and only Facebook to send out messages about events and parties. It’s also useful for figuring out, when I’m going to a random city, who might’ve moved there. Those people you lost touch with back in college suddenly become much closer when you’re both strangers somewhere.

But those are rare needs. The common needs that Facebook fulfills—to quasi-live through someone else’s life, to waste time, to feel like you’re on an anhedonic treadmill of envy—shouldn’t be needs at all. Facebook is encouraging you to make them needs. I’m encouraging you to realize that the real answers to life aren’t likely to be found on Facebook, no matter how badly Facebook wants to lure you to that login screen—they’re likely going to be found within.


By the way, I love “Practical Tips on Writing a Book from 23 Brilliant Authors.” I’ve read it a couple of times and still love it. It’s got a lot of surface area for such a short post, which is why I keep linking to it in various contexts.

Why these assignment sheets: The world isn't going to be a routine place, and writing projects shouldn't be either

Phil Bowermaster writes:

Increasingly, perhaps, a job is something that we each have to create. We can’t count on someone else to create one for us. That model is disappearing. We have to carve something out for ourselves, something that the machines won’t immediately grab.

Bowermaster is describing on a macro scale what I try to do on a micro scale with the papers I assign to students. The important parts of my assignment sheets for freshman composition papers are only two paragraphs long, and students sometimes find them frustrating, but I do them this way because the world is headed in a direction that offers less direction and more power to do the right or wrong thing. Here’s an example of an assignment sheet:

Purpose: To explain and interpret a possible message or messages suggested by a) a text or texts we have read for class, b) a text or texts in Writing as Revision, or c) a book of your own choosing. If you write on a book of your own, you must clear your selection with me first. Your goal should be to persuade readers of your interpretation using the texts studied and outside reading material.

You should construct a thesis that is specific and defensible and then explicate it through points, illustrations, and explanation. See Chapters 8 and 9 of A Student’s Guide To First-Year Writing for more information on the nature of textual analysis.

That’s it. Students can read more about the assignment if they want to, and they have a lot of freedom in picking a topic. Students often want more direction, which I give to some extent, but I don’t give step-by-step instructions because a) step-by-step instructions yield boring papers and b) in their real-life writing, the real challenge isn’t the writing. It’s deciding what to write about and how to write it once you’ve decided to start. The writing assignment often isn’t given; the writing assignment is made.

It’s a big leap to go from “write-a-good-paper” assignment sheets to conceptualizing “a job [as] something that we each have to create.” Maybe too big a leap. But the thinking and rationale behind my decision is clear: jobs that can be easily codified and described as a series of steps—jobs that are easily explained, in other words—are increasingly going away, either to off-shoring or automation. The ones that persist will be the ones that don’t exist now because no one has thought to do them. But a lot of school still appears to consist of a person in front of the room saying, “Follow these steps,” having the students follow the steps, and then moving on.

That model isn’t totally wrong—you can’t create something from nothing—but maybe we should more often be saying, “Here’s the kind of thing you should be doing. What steps should you take? How should you take them? Do something and then come talk to me about it.” That kind of model might be more time consuming and less easily planned, and I wouldn’t want to use it in every hour of every day. Many basic skills still need to be taught along the lines of “This is how you use a comma,” or “this is how an array works.” But we should be collectively moving towards saying, “Here are some constraints. Show me you can think. Show me you can make something from this.” And class isn’t totally devoid of support: unlike the real world, class has mandatory drafts due, lots of discussion about what makes strong writing strong, and the chance to see other people’s work. The imposed, artificial deadlines are particularly important. It’s not like I hand out assignment sheets and shove students out to sea, to flounder or float completely on their own.

Still, from what I can see, the world is increasingly rewarding adaptability and flexibility. I don’t see that trend changing; if anything, it seems likely to accelerate. If schools are going to (collectively) do a better job, they probably need to work on learning how to teach adaptability in the process of teaching subject-specific material. Offering the kinds of assignments I do is a microscopically small step in that direction, but big changes usually consist of a series of small steps. The assignments are one such step. This post is another.

In “A Welcome Call to Greatness,” John Hagel discusses That Used to Be Us, a book by Tom Friedman and Michael Mandelbaum about what Hagel calls “creative creators” – “people who do their nonroutine work in distinctively nonroutine ways.” And that’s what I’m trying to do above: train students to do nonroutine writing of a sort that will be distinctive, interesting, and well-done, but without a great deal of obvious managerial oversight from someone else. Great writing seldom springs from someone micromanaging: it springs from discussions, ideas, unexpected metaphors, connections, seeing old things in new ways, and from a plenitude of other places that can’t be easily described.

In “The Age of the Essay,” Paul Graham says:

Anyone can publish an essay on the Web, and it gets judged, as any writing should, by what it says, not who wrote it. Who are you to write about x? You are whatever you wrote.

Popular magazines made the period between the spread of literacy and the arrival of TV the golden age of the short story. The Web may well make this the golden age of the essay. And that’s certainly not something I realized when I started writing this.

He’s right. The most challenging writing most of my students will do isn’t even going to have the opportunity for someone else to micromanage it. The writing will increasingly be online. It will increasingly be their own decision to write or not write. As Penelope Trunk says, it will increasingly be essential for a good career. It won’t be routine. As I said above, routine work that can be codified and described in a series of steps will be exported to the lowest bidder. Valuable work will be the work nobody has dreamt up. Jobs will be “something that we each have to create.” I’m sure a lot of people will be unhappy with the change, but the secular forces moving in this direction look too great to be overcome by any individual. You surf the waves life and society throw at you, or you fall off the board and struggle. The worst cases never get back on the board and drown. I want students to have the best possible shot at staying on the board, and that means learning they can’t assume someone else is going to create a job—or an assignment—for them. They have to learn to do it themselves. They need to be creative. As Hagel quotes Mandelbaum and Friedman as saying, “Continuous innovation is not a luxury anymore – it is becoming a necessity.” I worry that too few students are getting the message.

I think of some of my friends who are unemployed, and when I ask them what they do all day, they say they spend time searching for a job, hanging out, watching TV. To me, this is crazy. If I were unemployed, I’d be writing, or learning Python, or posting on Craigslist with offers to work doing whatever I can imagine doing. The last thing I’d be doing is watching TV. In other words, I’d be doing something similar to what I’m doing now, while employed: building skills, trying new things, and not merely sitting around waiting for good things to come to me. They won’t. Good things are the things one makes. Most of my employed friends seem to get this on some level, or have found their way into protected niches like teaching or nursing. I wonder if my unemployed friends had teachers and professors who forced them to think for themselves, or if they had teachers and professors who were content to hand them well-defined assignments that required little thought about the “how,” as opposed to the “what.”

In Praise of William Deresiewicz

I’ve read three long, fascinating essays by English professor William Deresiewicz over the last two days: Solitude and Leadership: If you want others to follow, learn to be alone with your thoughts; Love on Campus: Why we should understand, and even encourage, a certain sort of erotic intensity between student and professor (and he’s not talking about the bed-shaking kind, unless one’s partner is in paper form); and The Disadvantages of an Elite Education: Our best universities have forgotten that the reason they exist is to make minds, not careers.

I don’t agree with everything he’s written in those pieces, but their scope and unexpectedness are refreshing: in all three cases, he takes potentially tired themes (people are distracted a lot today; a great deal of film and fiction depicts randy professors sleeping with students; and elite colleges are training too many hoop jumpers instead of thinkers) and goes with them to unexpected places: how Heart of Darkness depicts bureaucracy and finding yourself; the erotic intensity of ideas and how they can be mingled with erotic intensity of the more conventional variety; and the entitlement complex that paradoxically can scare people into hewing to the narrow path. Even my summaries of a small portion of where he goes in each essay are hopelessly inadequate, which is part of what makes those essays so good.

The three are not all that separate: they all deal with conformity, individuality, college life, and the place of the university in society. Read together, they have more cohesiveness than many entire books. Most importantly, however, they go places I haven’t even thought about going, which is their most useful and unusual feature of all.

Jeff Sypeck pointed me to one and Robert Nagle to another; I only know both through e-mail, which is a very small but real demonstration of the Internet’s true power to make connections. All three essays might play into my eventual dissertation; at the very least, they’ve changed the way I think about many of the issues discussed, which to me is more valuable still.

More words of advice for the writer of a negative review

Nigel Beale quotes Helen Gardner:

“Critics are wise to leave alone those works which they feel a crusading itch to attack and writers whose reputations they feel a call to deflate. Only too often it is not the writer who suffers ultimately but the critic…”

Beale asks: “Which is great and poetic and all, however, is silence enough?”

To me, the chief function of the critic ought to be to explore a work as honestly as possible and to illuminate it to the best of her abilities. This means openness, and it means being willing to say that a work is weak (and why), as well as showing how it is weak. In other words, you should be able to answer the who, what, where, when, why, and how of it, with an emphasis on the last two.

One should squelch the “crusading itch to attack” and the “call to deflate” writers’ reputations when one is attacking merely to attack, or merely because someone’s balloon is overinflated. For example, Tom Wolfe seems a frequent and, to my mind, unfair object of ridicule among critics. But if you’re rendering a knowledgeable opinion that happens to be negative, you’re doing what you should be doing, and what I strive to do. Often this means writing about why a book fails—perhaps too frequently.

Good reviews and Updike

Every attempt at review and criticism ought to be good—but that doesn’t mean positive. A review that is “good” in the sense of well-done and engaging might still be a negative one. In an ideal world, the book should decide that as much as the critic.

John Updike’s rules for reviewing are worth following to the extent possible. I would emphasize three of them:

1. Try to understand what the author wished to do, and do not blame him for not achieving what he did not attempt.

2. Give him enough direct quotation–at least one extended passage–of the book’s prose so the review’s reader can form his own impression, can get his own taste.

5. If the book is judged deficient, cite a successful example along the same lines, from the author’s oeuvre or elsewhere. Try to understand the failure. Sure it’s his and not yours?

In the end, I think such rules are designed to keep the reviewer as honest as the reviewer can be. I keep coming back to the word “honesty” because it so well encapsulates the issues raised by Beale, Updike, Orwell, and others.

I especially like the “direct quotation” comment because there are no artificial word limits on web servers, meaning that you should give the reader a chance to disagree with your assessment through direct experience. Quoting a sufficient amount of material will give others a chance to make their own judgments. Merit can be argued but not proven: thus, a critic can avoid silence and unfair attack.

As the above shows, I like Beale’s answer—”no”—which seems so obvious as to barely need stating. I’d rephrase Gardner’s assertion to this: “beware of relentlessly and thoughtlessly attacking.”

The Aeron, The Rite of Spring, and Critics

In Malcolm Gladwell’s book Blink: The Power of Thinking Without Thinking, he quotes Bill Dowell, who was the lead researcher for Herman Miller during the development and release of the now-famous Aeron in the early 1990s; I’m sitting in one as I type this. The Aeron eventually sold fantastically well and became a symbol of boom-era excess, aesthetic taste, ergonomic control, excessive time at computers, and probably other things as well. But Dowell says that the initial users hated the chair and expressed their displeasure in focus groups and testing sites. According to him, “Maybe the word ‘ugly’ was just a proxy for ‘different.’ ”

That’s a long wind-up for an analogy that explains how Helen Gardner might be telling us that when we instinctively dislike something, we might be reacting against its novelty rather than its real merit, as critics and listeners notoriously did during Stravinsky’s The Rite of Spring. She’s wise to warn us about that danger, because it’s how people who pride themselves on taste and knowledge become conservative, stuffy critics. If we’re saying something is “bad” merely because it’s “different,” then we’ve already effectively died aesthetically because we’re no longer able to expand what “good” means. One thing I like about Terry Teachout’s criticism and his blog, About Last Night, is that he has strong opinions but still very much seems to have aesthetic suppleness.

But the Aerons and Ulysses of the world are exceedingly rare. Dune and Harry Potter aren’t among them. Joseph O’Neill’s Netherland at least might be, which I concede obliquely in my post about it.

Most works of art are, by definition, average.

The question is: to what extent is that a bad thing? Maybe none at all: an average novel doesn’t cause the death or disfigurement of children, or propagate social inequality, or do any number of other pernicious things. Its chief ill is that it wastes time for the person who reads it and perceives it as average (as opposed to the person who reads it and judges it extraordinary, which many Harry Potter readers have evidently done).

Milan Kundera thinks otherwise—in The Curtain, he writes, “… a mediocre plumber may be useful to people, but a mediocre novelist who consciously produces books that are ephemeral, commonplace, conventional—thus not useful, thus burdensome, thus noxious—is contemptible.” He gives himself a key out here: the word “consciously.” I doubt many writers consciously set out to produce commonplace books, or do so with that intent, and so may be rescued from the burden of Kundera’s scorn. Like the criminal justice system, Kundera separates those who knowingly commit a crime from those who do so accidentally.

You need to have read widely, however, to be capable of telling the average from the incredible, and those whose effusive praise for Harry Potter and Dan Brown splatters the web show they haven’t. Hence, perhaps, the hesitance many Amazon reviewers show toward low scores, which one of Beale’s commenters observes.

The Aerons of Art

I now look at the Aeron as beautiful, and to me the over-stuffed office chairs that used to symbolize lawyerly and corporate status look as quaint as black and white photos of Harvard graduation classes without women or minorities. If we’re open to seeing the new, I think we’ll be safe enough in condemning the indifferent and pointing towards the genuinely astonishing works that are very much out there.

Edit: The Virginia Quarterly Review weighs in.

The chronic fear of reading’s demise set against its benefits

As if you needed more on reading and its benefits (as I discuss here, here, here, and here), see People of the Screen from the New Atlantis. It’s a long article worth reading in full, but these paragraphs stand out:

Whether one agrees with the NEA or with Bloom, no one can deny that our new communications technologies have irrevocably altered the reading culture. In 2005, Northwestern University sociologists Wendy Griswold, Terry McDonnell, and Nathan Wright identified the emergence of a new “reading class,” one “restricted in size but disproportionate in influence.” Their research, conducted largely in the 1990s, found that the heaviest readers were also the heaviest users of the Internet, a result that many enthusiasts of digital literacy took as evidence that print literacy and screen literacy might be complementary capacities instead of just competitors for precious time.

[…]

Just as Griswold and her colleagues suggested the impending rise of a “reading class,” British neuroscientist Susan Greenfield argues that the time we spend in front of the computer and television is creating a two-class society: people of the screen and people of the book. The former, according to new neurological research, are exposing themselves to excessive amounts of dopamine, the natural chemical neurotransmitter produced by the brain. This in turn can lead to the suppression of activity in the prefrontal cortex, which controls functions such as measuring risk and considering the consequences of one’s actions.

Writing in The New Republic in 2005, Johns Hopkins University historian David A. Bell described the often arduous process of reading a scholarly book in digital rather than print format: “I scroll back and forth, search for keywords, and interrupt myself even more often than usual to refill my coffee cup, check my e-mail, check the news, rearrange files in my desk drawer. Eventually I get through the book, and am glad to have done so. But a week later I find it remarkably hard to remember what I have read.”

[…]

But the Northwestern sociologists also predicted, “as Internet use moves into less-advantaged segments of the population, the picture may change. For these groups, it may be that leisure time is more limited, the reading habit is less firmly established, and the competition between going online and reading is more intense.” This prediction is now coming to pass: A University of Michigan study published in the Harvard Educational Review in 2008 reported that the Web is now the primary source of reading material for low-income high school students in Detroit. And yet, the study notes, “only reading novels on a regular basis outside of school is shown to have a positive relationship to academic achievement.”

I realize the irony of sharing this on the Internet, where it’s probably being read on the same screens criticized by the study, and perhaps demonstrating the allegedly rising divide between screen readers and book readers.

Compare the section above to my post on Reading: Wheaties, marijuana, or boring? You decide, which discusses the innumerable articles on reading’s decline (or maybe not). Alan Jacobs has an excellent post on Frum and Literature in which he observes that reading, especially real books, has probably always been a minority taste and probably always will be. Orwell opens his 1936 essay “In Defence of the Novel” by saying “It hardly needs pointing out that at this moment the prestige of the novel is extremely low, so low that the words ‘I never read novels,’ which even a dozen years ago were generally uttered with a hint of apology, are now always uttered in a tone of conscious pride.” The whole piece is available in the collection Essays.

Finally, consider From Books, New President Found Voice in the New York Times, which I’m sure every book/lit blogger has already linked to by now:

Much has been made of Mr. Obama’s eloquence — his ability to use words in his speeches to persuade and uplift and inspire. But his appreciation of the magic of language and his ardent love of reading have not only endowed him with a rare ability to communicate his ideas to millions of Americans while contextualizing complex ideas about race and religion, they have also shaped his sense of who he is and his apprehension of the world.

Mr. Obama’s first book, “Dreams From My Father” (which surely stands as the most evocative, lyrical and candid autobiography written by a future president), suggests that throughout his life he has turned to books as a way of acquiring insights and information from others — as a means of breaking out of the bubble of self-hood and, more recently, the bubble of power and fame. He recalls that he read James Baldwin, Ralph Ellison, Langston Hughes, Richard Wright and W. E. B. Du Bois when he was an adolescent in an effort to come to terms with his racial identity and that later, during an ascetic phase in college, he immersed himself in the works of thinkers like Nietzsche and St. Augustine in a spiritual-intellectual search to figure out what he truly believed.

Without his experience in books, Obama probably wouldn’t be where he is, and millions of others must silently share the same condition of achieving what they have thanks largely due to their learning. But they seldom get a voice in the pronouncements about reading’s decline, and those articles seldom acknowledge that, while society might lose a great deal from the allegedly decreasing literacy of its members, those members will lose vastly more on an individual level, and few will even realize what they’ve lost.

(Hat tip Andrew Sullivan.)

John Barth’s Further Fridays is Recommended

I’m about halfway through Further Fridays, John Barth’s second “essay, lecture, and other nonfiction” collection, and find it as pleasurable and intelligent as The Friday Book, his first. Perhaps my favorite essay thus far is “A Few Words About Minimalism,” which is anything but minimalist and contains this gem:

But at least among those of our aspiring writers promising enough to be admitted into good graduate writing programs… the general decline in basic language skills over the past two decades is inarguable enough to make me worry in some instances about their teaching undergraduates. Rarely in their own writing, whatever its considerable other merits, will one find a sentence of any syntactic complexity, for example, inasmuch as a language’s repertoire of other-than-basic syntactical devices permits its users to articulate other-than-basic thoughts and feelings, Dick-and-Jane prose tends to be emotionally and intellectually poorer than Henry James prose.

(Link (obviously) added by me.)

That second sentence is delicious: perhaps Barth overindulges in other-than-basic syntax to make a point, but the way the structure of the sentence helps make the point that its content conveys is what makes it so impressive. Not only that, but it makes a case without over-making it: that key word “tends” gives Barth enough wiggle room to concede that one can find emotionally and intellectually powerful writing in relatively simple prose, and he never states that complex prose must be emotionally and intellectually more powerful.

That virtue of statement and qualification is present throughout; Further Fridays is the rare collection that doesn’t overstate its claims (I’m still thinking of you, John Armstrong), although it does so at the cost of necessary complexity. You can’t make nuanced arguments about the nature of literary categorization, or movements, or literature, in soundbites and slogans, and it’s also hard to do so from a dogmatic political or philosophical position. Fortunately, Barth seems to occupy none—or, as he might say, his lack of position is his position—and the result is a feeling, no doubt illusory, that I read from the perspective of someone who simply likes to read and likes stories. And I learn from him: I can throw in that “no doubt illusory” comment to protect myself from obvious criticisms while still making the overall point about the nature of criticism.

Expect more on Barth shortly.

Life: dreaming edition

“[…] just because you have stopped believing in something you once were promised does not mean that the promise itself was a lie.”

—Michael Chabon, Maps and Legends

The Best Software Writing — Joel Spolsky

Well-written, insightful books on subjects I know nothing about often impart some lasting and surprising ideas. The biggest problem is finding them, since you don’t know they’re well-written or insightful till it’s too late. Pleasant surprises have abounded recently, one being The Devil’s Candy: The Bonfire of the Vanities Goes to Hollywood. Another comes from Joel Spolsky, who writes a popular blog on software called Joel on Software and edited The Best Software Writing I. In an industry where books date so fast as to be almost pointless, like the hardware that runs software, one astonishing aspect is how The Best Software Writing, published in 2005 and composed of many essays written earlier, is still relevant and fascinating—and will probably be so for a long time yet.

Take Danah Boyd’s “Autistic Social Software,” which, like most of The Best Software Writing, explains how computers and people interact. It was published around 2004, which represented a societal turning point not widely recognized at the time, as virtually everyone my age hopped on what we now call “social networking sites.”* She observes that those sites weren’t very good because they weren’t focused on users, even drawing a not entirely apt analogy similar to the one I made in Science Fiction, literature, and the haters:

While many science fiction writers try to convey the nuances of human behavior, their emphasis is on the storyline, and they often convey the social issues around a technology as it affects that story. Building universal assumptions based on the limited scenarios set forth by sci-fi is problematic; doing so fails to capture the rich diversity of human behavior.

Her comments about science fiction are accurate regarding much, but not all of it, just like her comments about the focus of programmers on computers and their limitations, forcing us to adapt to them rather than vice-versa. The market has a knack for giving people what they want, however, and that focus is changing over time as iterative generations of software improve and people move to sites that work better. Boyd says, “[…] there is a value in understanding social life and figuring out how to interact with people on shared terms.” Right: and those who figure out what that means will be rewarded. I’m reminded of a programmer friend whose e-mail signature says “Computers aren’t the future; people are,” and I suspect he would approve of the lessons in this essay and larger book.

That’s a single example of how you take an offline phenomenon—how people congregate—and apply it to an online context. Other essays reverse that dynamic. Clay Shirky’s “A Group Is Its Own Worst Enemy” explains how online groups form and break apart in much the same fashion as offline groups. You could look at this in terms of clubs, families, countries, or jobs, all of which have similar cohesive and destructive forces assailing them over different time periods. One thing the military has going for it is hundreds of years of experience in taking people and forcing them to work together toward a common goal. Many sports accomplish the same thing. But in both cases, the tasks—destroying things and killing people, moving a ball down a field—are narrow and well-defined compared to the wide-open field of artistic creation. Granted, both the military and sports have their wider, macro possibilities—what do we destroy and who do we kill and why? (this question is more often known as politics), or what rules should the game have and why?—but they’re not intrinsically undefined like software, or other forms of intellectual endeavor (Paul Graham wrote about this in “Great Hackers”). The incentives are easier to get right. In software, as in life, they’re not. Compensation becomes harder to get right when goals are less easily defined, which is a major subject in one essay and subsidiary in others. I wrote about it as applied to grant writing, using Spolsky as a launching pad, and if more people realized what he’s already discovered, we might not waste so much effort trying to reinvent the wheel or invent futile algorithms for what is inherently a tricky subject.

The Best Software Writing is, yes, about software, but it’s about more, including the future. Those interested in seeing it, and the inside of the most transformative industry of recent times, would do well to read it. It contains more thought than Literacy Debate: Online, R U Really Reading?, a New York Times article published yesterday (read it, or the rest of the paragraph won’t make much sense). Why hasn’t the reporter done enough background research? I wish I could say. The article contrasts with Shirky’s other essay, “Group as User,” which demonstrates much of what’s right about the new mediums without questioning the medium’s utility—something that the New York Times article utterly misses. Furthermore, on the individual level, the individual is going to suffer the pain of insufficient literacy or numeracy in the form of inferior jobs and a less intense life. Many seem happy to make such trade-offs, and we go on telling them to eat their Wheaties. If they don’t, they won’t be able to write at the level of skill and detail in The Best Software Writing, which would make the world a poorer place, but those involved don’t seem to care as a group. Oh well. Not reading Spolsky or Fred Brooks will chiefly harm the individual, but it will also cause splash damage to others who have to work with them. To the extent reading online ameliorates those problems, as Shirky implies, we’ve made improvements. He, Spolsky, and Brooks write about programming only to the extent that you’re unwilling to see programming as a metaphor.

The major fear articles like “Literacy Debate: Online, R U Really Reading?” express, I suspect, is that many people are getting along without books and stories. On a societal basis, this probably isn’t a good thing, since democracies depend on educated citizens with historical knowledge—but on a personal level, if you’re a mid-level account manager at some large company, how much does your familiarity with Tolstoy and Norman Rush really help or hurt you? On the other hand, if you want to be at the top of virtually any field, you need to read and understand the world. In software, that means books like The Best Software Writing, which, though it consists almost entirely of pieces that originally appeared online, is a physical, dead-tree book that I liked reading on paper far more than I would’ve on the screen, where I already spend entirely too much of my face time. I want what I find convenient, as do most people, and many of the essays point toward defining what that means. It’s got more about how to fulfill human desires than most books, fiction or nonfiction. Volume II of The Best Software Writing might never appear. Given the strength of the first, I wish it would.


* I hope future readers find this strange phrase an anachronism showing how primitive we are, because it’s ugly and imprecise. If a phrase must be one, it at least shouldn’t be the other.

Oops, perhaps, and several points on The Logic of Life

* Carrie Frye quotes Neil Gaiman, who writes: “I think that rule number one for book reviewers should probably be Don’t Spend The First Paragraph Slagging Off The Genre.” I try not to but occasionally do, as with The Logic of Life. But maybe Gaiman and Frye are only carving out their rule for fiction, as with nonfiction it seems more appropriate to survey existing work to ascertain whether an author is merely duplicating what already exists. I’m also on the record agreeing with the gist of what they say.

* Two readers wrote to ask in effect why, if I didn’t like the idea behind The Logic of Life, I bought and read it. Several answers:

1) I haven’t read all the econ-for-dummies books I listed and so thought I would still benefit from another one.

2) I didn’t realize the problems with The Logic of Life until after I read it, at which point they became more apparent.

3) Tim Harford was visiting Seattle, and I wanted to have the background for his discussion before he arrived.

4) Some of the chapters are also helpful professionally because some topics Harford discusses are perennials in grant writing.

Without number three, I probably wouldn’t have bought it. Number four is probably just a post-purchase justification.

* A friend who edited my post on Logic of Life said apropos of it, “Your beginnings are always very abstruse and hard to follow.” Really?

If I accept the premise that they’re harder-than-some-kind-of-average to follow, I would say that it’s because they often set up important context for what’s to follow. I’ll be more cognizant of this, especially because I began keeping a list a while ago of things reviewers often do that can annoy me. Number one was, naturally:

1) Reviewing the author’s preceding oeuvre before getting to whatever the reviewer is supposed to be reviewing, or discussing the genre/similar books more generally. I did it in my discussion of Alan Furst’s Night Soldiers. This is essentially what Frye and Gaiman were discussing.

2) Developing grand theories: I found myself writing about what makes a good history book when I really wanted to deal with The Pursuit of Glory.

3) Tangentially discussing a book while instead focusing on political or social commentary. This essentially describes The New York Review of Books, to the extent they still write about books, as opposed to galleries, political essays, movies, the universe, pornography, navel gazing, etc. And yes, I’m a subscriber.

I’m sure other patterns exist, and I might start pointing out examples as I see them. All three have happened in The New York Times Book Review and elsewhere, I’m sure.

* Overall, the issue of context for reviews makes me think about why trusted criticism and publishing gatekeepers are so important: you’re more likely to read a book or review about a subject if you have preexisting indicators that you aren’t wasting your time and that someone has vetted whatever you’re reading. This could be generalized to the chicken-and-egg problem of blogs more generally: you don’t have credibility until you have enough fame to generate credibility.