The stupidity of what I’m doing and the meaning of real work: Reading for PhD comprehensive exams

Last weekend, I wrote a flurry of posts after months of relative silence because I needed to do real work.

This might sound strange: I am doing a lot of things, especially reading, but all of it is make-believe, pretend work. That’s because the primary thing I’m doing is studying for PhD comprehensive exams in English lit. The exam is structured in four parts: three four-hour written segments and a single oral exam, on topics that aren’t very important to me and probably aren’t very important to most people. The exams also aren’t very relevant to being an English professor, because the key skill English professors possess and practice is writing long-form essays and articles that are published in peer-reviewed journals. The tests I’m taking don’t, as far as I can tell, map very effectively to that skill.

As a consequence, the tests, although very time consuming, aren’t very good proxies for what the job market actually wants me to do.*

Consequently, PhD exams—at least in English—aren’t real work. They’re pretend work—another hoop to be jumped through on the way to getting a union card. Paul Graham makes a useful distinction in “Good and Bad Procrastination” when he says that “Good procrastination is avoiding errands to do real work.” That’s what I’ve done through most of grad school, and that’s part of the reason why I have a fairly large body of work on this blog, which you can obviously read, and a fairly large body of fiction, which you can’t (at the moment, but that’s going to change in the coming months). To Graham, the kind of small stuff that represents bad procrastination is “Roughly, work that has zero chance of being mentioned in your obituary.” Passing exams has zero chance of being mentioned in my obituary. Writing books or articles does.** PhD exams feel like bad procrastination because they’re not really examining anything useful.

They’re also hard, but hard in the wrong way, like picking patterns out of noise. Hard in the right way feels like the soreness you get after working out, or the moment when a challenging math problem suddenly clicks. The quasi-work I’m doing is intellectually unsatisfying—the mental equivalent of eating ice cream and candy all day, every day. Sure, they’re technically food, but you’re going to develop some serious problems if you persist in the ice cream and candy diet. The same is true of grad school, which might be why so many people emerge from it with a lugubrious, unpalatable writing style. Grad school doesn’t select or train for style; it selects and trains for a kind of strange anti-style, in which saying less in more words is rewarded. It’s the kind of style I’m consciously trying to un-cultivate, however hard the process might be, and this blog is one outlet for keeping the real writer alive in the face of excessive doses of tedious but canonized work and literary theory. Exams, if anything, reinforce this bogus hardness. If I’m ever in a position of power in an English department with a grad program, I’m going to try to offer an alternative to conventional exams: four to six publishable, high-quality papers can or should take their place. That, at least, mirrors the skills valued by the job market.

The bogosity of exams relates to a separate problem in English academia, which I started noticing when I was an undergrad and have really noticed lately: the English curriculum is focused on the wrong thing. The problem can be stated concisely: Should English departments teach content (like, say, Medieval poetry, or Modernist writers), or skills (like writing coherently and close reading)? Louis Menand describes the issue in The Marketplace of Ideas:

[C]ompare the English departments at two otherwise quite similar schools, Amherst and Wellesley. English majors at Wellesley are required to take ten English department courses [. . .] All English majors must take a core course called ‘Critical Interpretations’; one course on Shakespeare; and at least two courses on literature written before 1900 [. . .] The course listing reflects attention to every traditional historical period in English and American literature. Down the turnpike at Amherst, on the other hand, majors have only to take ten courses ‘offered or approved by the department’—in other words, apparently, they may be courses in any department. Majors have no core requirement and no period requirements. (Menand 89-90)

Most departments right now appear to answer “content.” Mine does. But I increasingly think that’s the wrong answer. I’m not convinced that it’s insanely important for undergrads to know Chaucer, or to have read Sister Carrie and Maggie: A Girl of the Streets, or to have read any particular body of work. I do think it’s insanely important for them to have very strong close reading skills and exceptional writing skills. Unfortunately, I appear to be in the minority of professional Englishers in this respect. And I’m in grad school, where the answer mostly appears to be “content,” and relatively few people appear to be focusing on skills; those are mostly left to individuals to develop on their own. I don’t think I’ve heard anyone discuss what makes good writing at conferences, in seminars, or in peer-reviewed papers (MFA programs appear to be very interested in this subject, however, which might explain some of their rise since 1945).

As Menand points out, no one is sure what an “‘English’ department or degree is supposed to be.” That’s part of the field’s problem. I think it’s also part of the reason many students are drawn to creative writing classes: in those, at least the better ones, writing gets taught; the reading is more contemporary; and I think many people are doing things that matter. When I read the Romantic Poets, I mostly want to do anything but read the Romantic Poets. Again, I have nothing against the Romantic Poets or against other people reading the Romantic Poets—I just don’t want to do it. Yet English undergrad and grad school forces the reading of them. Maybe it should. But if so, it should temper the reading of them with a stronger focus on writing, and what makes good writing.

Then again, if English departments really wanted to do more to reward the producing of real content, they’d probably structure the publishing of peer-reviewed articles better. Contrary to what some readers have said in e-mails to me, or inferred from what I’ve written, I’m actually not at all opposed to peer review or peer-reviewed publications. But the important thing these days isn’t a medium for publishing—pretty much anyone with an Internet connection can get that for free—but the imprimatur of peer review, which says, “This guy [or gal] knows what he’s talking about.” A more intellectually honest way to go about peer review would be for every academic to have a blog or website. When an article is ready to go, the author posts it, sends a link to an editor, and asks the editor to kick it out to a peer reviewer. The reviewer’s comments, whether anonymous or not, would be appended to the article. If the article is accepted, it gets a link, and perhaps the full text copied onto the “journal’s” main page. If it isn’t, readers can judge its merits or lack thereof for themselves.

The sciences arguably already have this, because important papers appear on arXiv.org before they’re officially “published.” But papers in the sciences appear to be less status-based and more content-based than papers in the humanities.

I think this change will happen in the humanities, very slowly, over time; it won’t be fast because there’s no reason for it to be fast, and the profession’s gatekeepers are entrenched and have zero incentive to change. If anything, they have a strong incentive to maintain the system, because doing so raises their own status and increases their own power within the profession. So I don’t foresee this happening, even if it would be valuable. But then again, academics are almost always behind the important thing: the important thing is happening in some marginal, liminal space, and academics inhabit a much more central area, where it’s easy to ignore stuff at the margins. I don’t see that changing either, especially in a world where many people compete for few academic slots. In that world, pointless hoop-jumping is going to remain.


* There’s a vast literature in industrial-organizational psychology on the subject of hiring practices, and most of that literature finds that the most effective way to hire workers is to give them an IQ test and a work-skills or work-practice test. The former is effectively illegal in the U.S., so the best bet is to give workers a test of the thing they’ll actually be called on to do.

** I also consciously ask myself this question set:

In his famous essay “You and Your Research” (which I recommend to anyone ambitious, no matter what they’re working on), Richard Hamming suggests that you ask yourself three questions:

1. What are the most important problems in your field?

2. Are you working on one of them?

3. Why not?

I have an answer to number three, but it doesn’t seem like a very good one.

How to do ersatz Ezra Pound:

I’ve had the misfortune of reading Ezra Pound’s The Cantos, and, while I was complaining about them to a friend, she explained how to produce ersatz Pound:

Start with something nautical. Then hyphenate two words that are only tangentially related to each other. Then use that word to describe something that it has nothing to do with, making it seem like a brisk, obscure metaphor. For extra points add a second line containing the name of a Greek hero and invoke an out-of-fashion god in the middle of a mundane line about the blandness of the morning gruel/porridge. If it reminds you of Barth meets Game of Thrones, you’re doing it right. If not, add an additional grizzled old man and a reference to the Iliad. (The Odyssey is almost right, but still a miss.)

I wish I’d written down her spontaneous production of mock-Poundian verse.

The Facebook Eye and the artist’s eye

“We are increasingly aware of how our lives will look as a Facebook photo, status update or check-in,” according to Nathan Jurgenson in “The Facebook Eye,” and the quote stood out not only because I think it’s true, but because this kind of double awareness has long been characteristic of writers, photographers, artists, and professional videographers. Now it’s simply being disseminated through the population at large.

I’m especially aware of this tendency among writers, and in my own life I even encourage and cultivate it by carrying around a notebook. A notebook obviously doesn’t have the connectivity of a cell phone, but it still encourages a certain performative aspect, and a readiness to harvest the material of everyday life in order to turn it into art. Facebook probably isn’t art—at least to me it isn’t, although I can imagine some people arguing that it is—and I think that’s the key difference between the Facebook Eye and what artists are doing and have been doing for a very long time. I’ve actually been contemplating and taking notes on a novel about a photographer who lives behind his (potentially magic) camera instead of in the moment, and that might be part of the reason why I’m more cognizant of the feeling being expressed.

Anyway, Michael Lewis recently gave an NPR interview about his Obama article (which is worth reading on its own merits, and, like Tucker Max’s “What it’s like to play basketball with Obama,” uses basketball as a way of drawing larger conclusions about Obama’s personality and presidency). In the interview, Lewis sees Obama as having that writer’s temperament, even saying that “he really is, at bottom, a writer,” and that Obama is “in a moment, and not in a moment at the same time.” Lewis says Obama can be “in a room, but detach himself at the same time,” and he calls it “a curious inside-outside thing.” As I indicated, I don’t think this is unique to writers, although it may be more prevalent or pronounced in them. Perhaps that’s why writers love great art and, in some ways, sex, more than normal people: both offer a way into living in the present. If writers are more predisposed toward alcoholism—I’m not sure if they are or not, though many salient examples spring to mind—getting out of the double perspective might be part of the reason why.

I think the key differences between what I do, with a notebook, and what Facebook enables via phones, are distance and perspective. My goal isn’t to have an instantaneous audience for the fact that I just did Cool Activity X. Whatever may emerge from what I’m observing is only going to emerge in a wholly different context that obscures its origins as a conversation, a snatch of overheard dialogue, a thing read in a magazine, or an observation from a friend. The lack of immediacy means that I don’t think I’m as immediately performative in most circumstances.

But the similarities remain: Jurgenson writes that “my concern is that the ultimate power of social media is how it burrows into us, our minds, our consciousness, changing how we consciously experience the world even when logged off.” And I think writing and other forms of art do the same thing: they “burrow into us,” like parasites that we welcome, and change the way we experience the world.

Still, the way we experience the world has probably been changing continuously throughout human history. The very idea of “human history” is relatively recent: most hunter-gatherers didn’t have it, for example. The changes Facebook (and its analogues; I’m only using Facebook as a placeholder for a broader swath of technologies) is bringing seem new, weird, and different because they are, obviously, new. For all I know, most of my students already have the Facebook Eye more than any other kind of eye or way of being. This has its problems, as William Deresiewicz points out in “Solitude and Leadership,” but presumably people who watch with the Facebook Eye are getting something—even a very cheap kind of fame—out of what they do. And writers generally want fame too, regardless of what they say—if they didn’t, they’d be silent.

I think the real problem is that artists become aware of their double consciousness, while most normal people probably aren’t—they just think of it as “normal.” But then again, very few of us probably contemplate how “normal” changes by time and place in general.


Thanks to Elena for sending me “The Facebook Eye”.

The future of the city: the L.A. and New York models

Matt Yglesias wrote an implausible-sounding story about “How Los Angeles—Yes, Los Angeles—Is Becoming America’s Next Great Mass-Transit City.” It sounds like L.A. is (slowly) becoming a more palatable place to live, and the city’s mass-transit strategy makes sense to me because driving pretty much anywhere in L.A. right now is a hellacious, grinding experience, and that experience is only getting worse over time. Which means L.A. and its residents only really have two choices: accept the hellacious driving experience and accept that it’s going to get continually worse, or attempt to build some kind of alternative system, presumably modeled on New York.

At the moment, we only really have two “models” of cities: the New York-style, walking and public transit version, or the L.A. style of car-based transport. Most cities over the last 75 years have followed the L.A. model, but L.A. is now demonstrating the limits of that very model.* When Southern California first began growing in earnest in the 1920s, cars were just getting started, and for each marginal driver getting behind the wheel made a lot of sense. But we’re now at the point where each marginal driver makes the situation that much worse, and the net effect of all that driving is an awful lot of misery. The only real alternative is allowing much denser construction patterns and building mass-transit around those very dense developments. I just didn’t expect that L.A.’s politicians and bureaucrats—and, by extension, its voters—would actually embrace, or at least tolerate, this solution.


* I’ve written a little bit about this topic before, most notably in Cars and generational shift.

Why do Europeans dress better than Americans?

In “Europeans Dress Better Than Americans: Fact,” bangsandabun writes that “I stated on Twitter the other day that North Americans dress badly and it ruffled a few feathers. I don’t even see this as a debatable point. The evidence speaks for itself.” I think the author is right, despite the danger of generalizing about American and European differences from casual observation.

Bob Unwin speculates about why Americans might dress worse and thinks the difference is historical, except for this: “different signaling aims (more internal cultural diversity and weaker class distinctions; male clothing needing to be less ‘gay’ and more conventional).”

I haven’t spent enough time in Europe to judge fashion. But I do have a certain amount of “fashion blindness” that I used to take pride in (more on that later), and even now I don’t care about the subject enough to notice fashion most of the time. Plus, living in Arizona makes fashion blindness much, much worse: it’s too damn hot to wear anything more than shorts and a t-shirt for five months of the year, which obviates much of the fashion impetus, especially for men. For women, “fashion” tends to mean “clingy and revealing” in the heat, which I like but don’t think is especially fashionable; it’s more of a physical fitness signal. Still, when I lived in Seattle I didn’t see a much stronger fashion impetus than I do here in Arizona, so I don’t think weather is the whole explanation.

My friend Derek Huang just got back from a year in Germany and mentioned wanting to learn how to tailor clothes a couple weeks ago, so I sent him the above links and asked for his thoughts about whether Europeans are more stylish:

Definitely. Especially in Paris, I felt really out of place. Part of it was just going to a bunch of museums, seeing vibrant colors and designs that pop out, and then seeing that in the stuff people wear. (“That scarf is so Monet”). It’s still a pretty big status symbol over there (Europe). Less so in Spain, where they dress more American. I think American fashion is very much centered around casual-individualism. And that can be pulled off well, but it is also a license to just DGAF. The fashion in Europe is much more constrained. There are certain things you just don’t do, so I feel attention to detail is so much more important as a way of standing out.

And I’m at the stage right now where I can talk enthusiastically about this stuff, but every day I wear a T-shirt and walk around in flip flops. I want to tailor my own stuff so I can learn what works for me, what doesn’t. Fashion is like applied art + social theory which is potentially interesting.

But then again there are only so many hours in a day, and if I’m in a culture where time otherwise spent worrying about clothing can be spent working on physics, then that’s an advantage.

Also, there was a good point in one comment about greater stylistic diversity in the U.S. If there are lots of social groups, then it’s more important to signal that you’re in a certain group than it is to try to be “better dressed.” It’s analogous to the diversity of contemporary music. Back when music was all just white Europeans, we could talk a lot about who was clearly better, who was a genius composer, etc. but now music is so broad, that we have less collective effort funneled into trying to improve things along a narrow subset of possible styles. I have noticed that you see more creative, wacky stuff around the University of Arizona than around Heidelberg, but there’s also more general mediocrity and some truly hideous fashion choices.

I like the analogy about music: it’s hard to evaluate a rapper, an indie rocker, a DJ, and a classical composer because, while they’re all doing something that stimulates the ear in an aesthetically pleasing fashion, that’s about all they have in common. Car analogies are apt too: a truck is not automatically better than a sports car, but it may be appropriate in a different situation.

The point about clothes that actually fit is also a good one. I don’t even really know enough to know what good fit means for me, or for most people, although it might be one of these “I-know-it-when-I-see-it” kinds of things, in the same way most people will instinctively recognize good writing over bad writing, without necessarily being able to explain which is better.

In addition, in the U.S. I think there’s a West Coast, technical-elite culture that disdains fashion as fakery and valorizes the cult of you-are-what-you-do. I can imagine my more technically minded friends saying that the compiler doesn’t care whether your socks and shoes match. Technical accomplishments are transparent in their effectiveness, while fashion is closer to sales or persuasion, in that it’s much more interpretive and less binary. Therefore your clothes shouldn’t matter. (In my imagination, Apple employees are better dressed than their counterparts elsewhere, because of the firm’s focus on design; I have no idea if that’s accurate, however.) I don’t know of any technical people who are renowned for their fashion sense; if anything, fashion becomes anti-fashion: Bill Gates is or was notoriously indifferent to clothes, and Mark Zuckerberg is famous for wearing hoodies and flip-flops. Yet both are among the most important people of their generations.

It’s also possible that most people only have a certain amount of energy to devote to taste or aesthetic matters, and Americans allocate their energy elsewhere; in the case of tech people, the “elsewhere” might be the beauty of their code. Personally, I’m most concerned with writing prose that sounds like faint wind chimes being caressed on a cool autumn evening.* Derek wants to spend his limited mental energy and attention thinking about physics. I want to spend mine thinking about literature and writing. Fashion may be a distraction from these pursuits, rather than a complement to them. But I wouldn’t mind having a small number of simple, comfortable things to wear that don’t make me look like I rolled out of bed in the morning. I may have found those things—the perfect, perfect-fitting black T-shirt, and size-30 Lucky Jeans—but they were found mostly through serendipity.

Anyway, I used to think fashion wasn’t important at all, but now I’m less sure: signaling is much more powerful than I once imagined, and my priorities have shifted in important ways since I developed a strong anti-fashion bias as a teenager living outside of Seattle. Now I wonder if that anti-fashion bias is holding me back, because people are very quick to evaluate each other based on all kinds of things, including clothes.

When I began learning how to salsa dance, I wore rubber-soled shoes (this is not a good idea). Eventually a friend showed me how to pick better shoes, and I bought very nice dress shoes that were three to four times as expensive as any shoe I’d ever worn before. Women started complimenting me on my shoes. I assume that, for any person who vocalizes an opinion about fashion or related matters, at least another ten people probably notice but don’t say anything. But I also had to worry more about shoe maintenance: they need wooden inserts to maintain their shape when I’m not wearing them. They’re less comfortable than running shoes or Vibram Five-Fingers. They need to be polished regularly. Nonetheless, the difference in how people perceived me was and is undeniable. If something as simple as shoes can cause this much change, I wonder what else I’m missing by simply not being observant enough.

Still, I can’t deny the culture in which I live. In “The New Pants Revue,” Bruce Sterling points out that “Jeans and tactical pants are the same school of garment. They’re both repurposed American Western gear. I’m an American and it’s common for us to re-adapt our frontier inventions.” By way of full disclosure, I now wear “Tactical Shorts” regularly, as silly as I feel writing that out, and so far their durability has been excellent. Sterling is obviously pointing in the direction of hidden historical factors, and I think there is an American suspicion of highfalutin apparel that couldn’t be worn on farms or in factories, despite the fact that most of us, myself included, spend much more time in Office Space-style offices with climate control, ergonomic chairs, and limited exposure to sharp objects.

Note that I don’t wear tactical shorts with the nice shoes.

I wonder if, over time, Europe is becoming more influenced by U.S. fashion indifference, or if the U.S. is becoming more influenced by European fashion, or if we’re in a fairly steady state of mutual difference. I also wonder if people in the U.S. would be better served by paying more attention to fashion, at least at the margin.

See also “Why Americans dress so casually.”


* Present sentence excluded.

EDIT: I wonder if technology will lower the cost of fashion sufficiently to encourage Americans to become more fashionable (or “dress better” in American parlance). Articles like “Hot Collars: I got three custom shirts online. I’ll never buy off the rack again,” about the online custom-clothing companies Indochino, J. Hilburn, and Blank Label, make me think this is at least possible.

EDIT 2: As commenter Marcus points out, I’m incorrectly conflating “fashion” and “aesthetics,” which are really separate issues. He’s correct, and his comment is worth reading.

To most people, reading and writing are boring and unimportant

Robin Hanson says: “… folks, late in life, almost never write essays, or books, on ‘what I’ve learned about life.’ It would only take a few pages, and would seem to offer great value to others early in their lives. Why the silence?”

He offers various explanations, like “People don’t want to hear the truth, and they won’t find lies useful, so why bother,” “Young folks already think they know all the answers, so won’t listen,” and “Few care what people will think of them after they are dead.” But he also says, “None of these explanations seem especially satisfactory. What’s going on?” I offered my theory in the comments section but will elaborate on it here: Most people don’t give a shit about writing or ideas. You can observe this from their behavior. People do things that are important to them (like watching TV, making money, or having sex) and don’t do things that aren’t important to them.

Let’s change the question a little: Why doesn’t Robin build furniture, or write vital open-source software, or feed the hungry in his spare time? Those would seem to offer great value to others. Actually, he might do some of this stuff—Robin seems like the sort of fellow with a lot of unusual hobbies and habits—but even if he does some of that stuff, the question becomes why he does that and not some other valuable thing. Maybe he’s doing the value maximizing thing for him, in which case he enjoys it, in which case he keeps doing it. The question and answers become circular and tautological very quickly, but in this case I don’t think “circular” is “wrong.”

To return to the original question about “What I’ve learned about life,” I think that, for most people, writing life lessons, or whatever, would be completely unimportant. Plus, as a corollary to that, writing is really hard for most people. It’s really hard for me, and I do it every day! So we probably shouldn’t be surprised that most people don’t bother doing hard, meaningless things. Starting from scratch in any skill is a challenge. I’d like to learn how to sew, but I don’t even really know how to start (outside of a Google search), and I don’t really have time to begin learning a complex new skill until October 5. So although I’d like clothes that fit better, I don’t want them badly enough to really do something about it and build domain knowledge in that field.

So, given that most people find new skills hard to learn, and find writing unimportant and boring, the better question is: Why do people write, especially blogs? Robin is the outlier, not the hypothetical old person imparting life lessons. You could reduce this question to “Why aren’t reading and writing important to most people?”, and beyond the obvious answer—they can survive and reproduce without them—I don’t have much.

Gwern’s answer in Robin’s comments seems sound to me: “Differing incentives and realities. Old adults give advice to teens which basically assume they can act like old adults; they forget just how painful things like waiting were, and wish away even the most transparently biological realities like shifts in circadian rhythms.” I would add that teens also don’t think they’ll ever be old. They live in the present.

Thinking back over my own life, I’m struck by how few old people have had useful advice for me. For adolescents and young adults, sex is tremendously important, yet few old people give real advice about it, or gave real advice to me; many of them also don’t seem to understand what the modern dating environment is like. In addition, old people might be worried about coming across as lascivious or inappropriate, when they’re really just trying to impart knowledge—I know that I seldom tell my students, for example, what the dating world is actually like.

I’m also really interested in being a writer, and have been for a long time, but very few adults know anything about being a writer. Those who do often don’t know anything about the Internet, which is now inextricably linked with most writers’ writing lives. So the limited advice that old people can offer often doesn’t seem applicable to me.

Perhaps some old people sense this, and sense that many younger people won’t listen to them anyway.

Facebook and cellphones might be really bad for relationships

There’s some possibly bogus research about “How your cell phone wrecks your relationships — even when you’re not using it.” I say “possibly bogus” because these kinds of social science studies are notoriously unreliable and unreproducible.* Nonetheless, this one reinforces some of my pre-existing biases and is congruent with things that I’ve observed in my own life and the lives of friends, so I’m going to not be too skeptical of its premises and will instead jump into uninformed speculation.

It seems like cell phones and Facebook cordon off a large part of your life from your significant other (assuming you have one or aspire to have one) and encourage benign-seeming secrecy in that other part of your life. In the “old days,” developing friendships or quasi-friendships with new people required face-to-face time, talking on the phone (which, at home, was easily enough overheard), or writing letters (which are slow, and which a lot of people aren’t very good at or don’t like doing). Now you can be developing new relationships with other people while your significant other is in the same room, and the significant other won’t know about the relationship happening via text message. You can also solicit instant attention, especially by posting provocative pictures or insinuating song lyrics, while simultaneously lying to yourself about what you’re doing in a way that would be much harder without Facebook and cell phones.

Those new relationships start out innocently, only to evolve, out of sight, into something more. Another dubious study made the rounds of the Internet a couple months ago, claiming that Facebook was mentioned in a third of British divorce petitions. Now, it’s hard to distinguish correlation from causation here—people with bad relationships might be more attached to their phones and Facebook profiles—but it does seem like Facebook and cellphones enable behavior that would have been much more difficult before they became ubiquitous.

I don’t wish to pine for a mythical golden age, which never existed anyway. But it is striking how many of my friends’ and peers’ relationships seem to founder on the shoals of technology. Technology seems to be enabling a bunch of behaviors that undermine real relationships, and, if so, then some forms of technology might be pushing us towards shorter, faster relationships; it might also be encouraging us to simply hop into the next boat if we’re having trouble, rather than trying to right the boat we’re already in. Facebook also seems to encourage a “perpetual past,” by letting people from the past instantly and quietly “re-connect.” Sometimes this is good. Sometimes less so. How many married people want their husband or wife chatting again with a high school first love? With a summer college flame? With a co-worker discussing intimate details of her own failing relationship?

Perhaps relationship norms will evolve to discourage the use of online media (“Are we serious enough to deactivate each other’s Facebook accounts?” If the answer is “no,” then we’re not serious and, if I’m looking for something serious, I should move on). Incidentally, I don’t think blogs have the same kind of effect; this blog, for instance, is reasonably popular by the standards of occasional bloggers, and has generated a non-zero number of groupies, but the overall anonymity of readers (and the kind of content I tend to post) in relation to me probably puts a damper on the kinds of relationship problems that may plague Facebook and cell phones.

EDIT: See also “I’m cheating on you right now: An admiring like on your Facebook page. A flirty late-night text. All while my partner’s right there next to me,” which mentions, unsurprisingly:

A study in 2013 at the University of Missouri surveyed 205 Facebook users aged 18–82 and found that “a high level of Facebook usage is associated with negative relationship outcomes” such as “breakup/divorce, emotional cheating, and physical cheating.”

Again, I want to maintain some skepticism and am curious about studies that don’t find a difference and thus aren’t published. But some research does match my own anecdotal impressions.


* If you’d like to read more, “Scientific Utopia: II – Restructuring Incentives and Practices to Promote Truth Over Publishability” is a good place to start, though it will strike horror in the epistemologist in you. Or, alternately, as Clay Shirky points out in Cognitive Surplus: “[…] our behavior contributes to an environment that encourages some opportunities and hinders others.” In the case of cell phones and Facebook, I think the kinds of behaviors encouraged are pretty obvious.

An economic model of paid sex: Coase’s “The Nature of the Firm,” gains from trade, and the gift economy

In Roosh’s “Orgasm or Money” story, he describes encountering yet another semi-pro prostitute in Latvia,* and he ends by wondering about sexual cultures around the world:

Then I thought about what she had said, how it was stupid for American girls not to ask for money before sex. Was it possible that the sexual culture in America and other Western countries is fantasy, and that the best move for women was to get as much as she could out of a guy?

It’s possible that getting “as much as she could out of a guy” in terms of money is optimal for an individual woman in some circumstances, but if she plays that game she’s likely to find guys who are unwilling to make long-term investments in her. A guy who pays for sex expects to dump the provider: as the philosopher Charlie Sheen once supposedly said regarding prostitutes, “I don’t pay them for sex. I pay them to leave.”

But there are deeper problems.

Moving unpaid “labor” into “paid” labor can have the nasty, unintended effect of monetizing a lot of activity that’s better left outside the conventional economy. What Roosh is really describing is the difference between a gift economy and a market economy, which Lewis Hyde describes in The Gift. Moving all or a great deal of sexual activity to a market economy will result in fewer people forming mutually beneficial relationships in which both reap gains from trade and specialization. Monetizing such relationships increases transaction costs for both buyers (men, usually) and sellers (women, usually), which can leave both sides worse off.

Ronald Coase’s famous essay “The Nature of the Firm” (alternately, here’s Wikipedia on it) points out that firms exist to reduce the friction / transaction costs that arise from alternate arrangements—like having a large number of consultants work together. When individuals try to gain every last monetary or other advantage at the margin, they aren’t working towards the good of the whole. It’s cheaper and better for large groups of people to get lump-sum payments and then work together, to the best of their abilities, to further the total enterprise. That’s true of firms and of marriages.
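The gains-from-trade logic can be made concrete with a toy calculation (the names, chores, and hours below are invented for illustration, not drawn from any study):

```python
# Toy comparative-advantage sketch: two partners, two weekly chores,
# and the hours each chore costs each person (hypothetical numbers).
hours = {
    "alex": {"cooking": 4, "errands": 8},
    "sam":  {"cooking": 9, "errands": 6},
}

# Each partner needs one unit of cooking and one unit of errands per week.
# If each person handles both chores alone, total hours spent:
solo = sum(person[chore] for person in hours.values() for chore in person)

# If each specializes in the chore they do fastest (Alex covers both
# units of cooking, Sam covers both units of errands):
specialized = 2 * hours["alex"]["cooking"] + 2 * hours["sam"]["errands"]

print(solo, specialized)  # 27 vs. 20: specialization saves seven hours a week
```

The same hours of household need get covered with less total effort once each person sticks to what they do relatively well, which is Coase’s point about why firms (and marriages) beat spot-market haggling at the margin.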

A relationship can be conceptualized as a very small firm, as people continually rediscover—in “Marry Him! The case for settling for Mr. Good Enough,” Lori Gottlieb realizes that often “Marriage isn’t a passion-fest; it’s more like a partnership formed to run a very small, mundane, and often boring nonprofit business. And I mean this in a good way.” She realizes this after having a child on her own, however. Instead of figuring out that she can reap major gains from finding a decent guy and marrying him, she decides to date around and has a child via a sperm seller, only to discover what should be obvious to anyone who hasn’t been taught never to “compromise.”

As noted above, both parties reap gains from specialization in a firm or marriage: one person might like to cook, for example, while the other person does dishes, or runs errands, or builds stuff. Granted, this only works if both parties actually have useful skills: one time I was in a bar, listening to a nasty-sounding woman complain to two friends, one male and one female, about the guy she was divorcing, and the woman was going on about how she wouldn’t cook, or run errands, or acquiesce to any number of normal-seeming things. Finally I asked her, “What do you bring to the relationship beyond your vagina?” The guy started laughing, hard, and the other woman began chuckling, and the complaining woman didn’t know what to say, so she told me I was rude (true, although I prefer to call it “honest”) and asked how I dare ask her that sort of thing—I dare many things. But she didn’t and probably couldn’t answer the obvious question, perhaps because she already knew the answer in her heart.

The larger point contained in that anecdote, however, is that both parties gain, or should gain, from not having to search for sex partners or pay for sex. In addition, long-term plans, like offspring, are easier to make.

Most of us don’t want to live in a purely market economy with every potential transaction: when our significant others come over, we don’t charge them for dinner, and, if we did, the charge would create very different expectations. Dan Ariely describes the expectation issue in Predictably Irrational and elsewhere. Once market norms, as opposed to gift-based norms, are activated, they’re very hard, and perhaps impossible, to de-activate.

Plus, to return to Coase, it’s very inefficient to price everything, and to continually think of pricing. It’s better for individuals and the economy as a whole if people trust each other and create value for each other without (always) charging for it. Relationships are, in part, a movement from market economies to gift economies, and in the process they create a lot of value, along with love, trust, and assorted other positive feelings. If there’s a large-scale culture shift away from non-paying relationships and towards women trying “to get as much as [they] can” out of guys, as Roosh describes, both sides lose, including the woman trading sex for money.

Granted, if a woman isn’t looking for long-term relationships or real help, it can make sense to move to a mercantile economy. This might be why a fair number of college girls get into stripping or even hooking, only to quit at or near graduation: they’re shifting from short-term expectations to long-term ones, and they know that violating social taboos can have a (major) economic payoff, but it’s easy enough for many of them to shift back into the “normal” relationship economy when their interests turn to long-term relationships.


* “Semi-pro” meaning someone who doesn’t explicitly advertise their wares but does eventually demand money for sex, or simply tries to drain guys through overpriced bar drinks and the like.

Comment when you have something to say

By now it’s well-known that most Internet forums devolve over time, even when the people running the forum take concrete steps to avoid devolution. But the main problem is not necessarily the trolls who deliberately attempt to degrade the quality of the conversation. It’s low-quality comments that aren’t necessarily malicious or even mean-spirited but do reflect shallow knowledge. Not only that, but such comments are often designed to appeal to groupish belief or to raise the status of the commenter, rather than sharing information and asking genuine questions.

Kens offered this insightful observation on HN:

My theory (based on many years of Usenet) is that there are three basic types of online participants: “cocktail party”, “scientific conference”, and “debate team”. In “cocktail party”, the participants are having an entertaining conversation and sharing anecdotes. In “scientific conference”, the participants are trying to increase knowledge and solve problems. In “debate team”, the participants are trying to prove their point is right.

Unfortunately, the people in scientific conference mode attract the cocktail people, but the latter don’t tend to attract the former. Debate team-types tend to be attracted to both—they’re the people exhibiting groupish and status-based behavior. In HN land, I’m probably closer to cocktail mode people than the scientific conference mode people, though I want to act more like a conference person.

Still, it’s worth looking more carefully at what scientific conference mode means. I don’t think scientific conference means a literal presentation of new results, but I think it does mean that the people commenting are deeply informed, deeply curious, reasonably respectful, and work to speak from a position of knowledge, rather than ignorance, about a subject. In this sense I fit the scientific conference mode when I discuss a small but real number of issues related to teaching, urban planning / development, and grant writing / government practices. The second one relates least to my day-to-day life but is a personal interest about which I’ve read a fair amount. Towards this end, I suspect a lot of people could improve the quality of the conversation simply by not commenting.

I distill this general idea to a simple behavior heuristic that might be valuable to others: don’t comment unless you have a special, unusual, or well-informed viewpoint. Many of my comments link to books and/or articles I’ve read that elaborate on whatever point I’m making or trying to make (here’s one example, linking to Bryan Caplan’s Selfish Reasons to Have More Kids, and here’s another, citing Edward Glaeser’s The Triumph of the City; in response to the second, someone even said, “I got these books simply because of this recommendation,” which makes me feel warm and fuzzy inside). Some of the ideas contained in the books or articles I cite might be wrong or badly argued, but at least I’m basing my comments on something specific rather than some general philosophical point. Too many people argue based on first principles or unsourced speculation. The latter isn’t always bad; for example, someone might work in a field and know something deep and important about it without having a link to a specific discussion of the idea being discussed.

Many of my comments that don’t link to books or articles still deal with specific issues in which I have above average expertise, knowledge, or experience. This comment discusses how I deal with a student who asks a question relating solely to his or her individual issue in a large group without sounding like a jerk (I think, anyway), this comment is about a specific product I’ve used (the Unicomp Customizer), and this comment is about specificity in writing and thinking. Again, I might be wrong, but in each case I’m writing based on experience.

You can find some exceptions to the principles I’ve discussed above. You should ask logical or reasonable follow-up questions, especially if you’d like more information (here’s one sample; here’s another). Succinct is often beautiful. Focus on genuine questions, rather than challenging people because their beliefs don’t match yours.

You don’t always have to follow these rules—I don’t—but if you’re debating about whether you should post a comment, you should probably err on the side of silence and not intruding on other people’s time. Unfortunately, the kind of people who most need such internal self-restraint are probably also the ones least likely to use it, and I doubt anything can be done to solve this problem, which seems like a variant of the Dunning-Kruger Effect.

Despite all this, I don’t see a solution to the fundamental problems, at least beyond the person at the margin who might read this and change his or her behavior slightly.

In other words, one can appeal to community rules and norms, or resort to meta-posts (like this one).* Such an appeal shouldn’t be done too often, or a community will spend more time discussing its own rules and norms than it does discussing and reading the material that should be the purpose of its existence (I last wrote a post like this in January; that January post still seems relevant, but I feel like enough time has passed and that I’ve observed enough behavior to make this post relevant too). But perhaps the occasional reminder will, as I said, help at the margins.

Oh, and the other rule for commenting? When you’re done with substantive content, stop.


* Granted, these kinds of posts and comments can make a community deteriorate. For example, there are a set of overly long comments by jsprink_banned and josteink that fail to distinguish between an argument and how the argument is presented: I suspect they’re unhappy with the mod functions mostly because they haven’t focused on how an argument is delivered. Civility counts for a lot, at all levels of debate; see, for example, Tyler Cowen’s comments on civility and Paul Krugman (and his comments on the limits of binary, good versus evil thinking in general).

There are also comments like this, in which the poster argues from nothing, attempts to activate an anti-corporate ideology, and ignores the obvious, abundant evidence of the continued importance of firms. Alex-C, fortunately, did reply: “I almost can’t tell if this comment came from some sort of Markov text generator.” HN used to have many fewer of those kinds of comments, and when it did get those kinds of comments, they were much less likely to rise. It takes more effort than it should for me not to respond to them directly. Incidentally, the comment Alex-C was replying to meant to say something like this.

Martin Amis, the essay, the novel, and how to have fun in fiction

There’s an unusually interesting interview with Martin Amis in New York Magazine, where he says:

I think what has happened in fiction is that fiction has responded to the fact that the rate of history has accelerated in this last generation, and will continue to accelerate, with more sort of light-speed kind of communications. Those huge, leisurely, digressive, essayistic, meditative novels of the postwar era—some of which were on the best-seller lists for months—don’t have an audience anymore. [. . .]

No one is writing that kind of novel now. Well [. . . ] David Foster Wallace—that posthumous one looks sort of Joycean and huge and very left-field. But most novelists I think are much more aware than they used to be of the need for forward motion, for propulsion in a novel. Novelists are people too, and they’re responding to this just as the reader is.

I think people aren’t reading the “essayistic, meditative novels” because “essayistic, meditative novels” reads like code-words for boring. In addition, we’re living in “The Age of the Essay.” We don’t need novelists to write essays disguised as novels when we can get the real thing in damn near infinite supply.

The discovery mechanisms for essays are getting steadily better. Think of Marginal Revolution, Paul Graham’s essays, Hacker News, The Feature, and others I’m not aware of. Every Saturday, Slate releases a link collection of 5–10 essays in its Longform series. Recent collections include the Olympics, startups, madness in Mexico, and disease. The pieces selected tend to be deep, simultaneously intro- and extrospective, substantive, and engaging. They also feel like narrative, and nonfiction writers routinely deploy the narrative tricks and voice that fiction pioneered. The best essay writers have the writing skill of all but perhaps the very best novelists.

As a result, both professional (in the sense of getting paid) and non-professional (in the sense of being good but not earning money directly from the job) writers have an easy means of publishing what they produce. Aggregators help disseminate that writing. A lot of academics who are experts in a particular subject have fairly readable blogs (many have no blogs, or unreadable blogs, but we’ll focus on the readable ones), and the academics who once would have been consigned to journals now have an outlet—assuming they can write well (many can’t).

We don’t need to wait two to five years for a novelist to decide to write a Big Novel on a topic. We often have the raw materials at hand, and the raw material is shaped and written by someone with more respect for the reader and the reader’s time than many “essayistic” novelists. I’ve read many of those, chiefly because they’ve been assigned at various levels of my academic career. They’re not incredibly engaging.

This is not a swansong about how the novel is dead; you can find those all over the Internet, and, before the Internet, in innumerable essays and books (an awful lot of novels are read and sold, which at the very least gives the form the appearance of life). But it is a description of how the novel is, or should be, changing. Too many novels are self-involved and boring. Too many pay too little attention to narrative pacing—in other words, to their readers. Too many novels aren’t about stuff. Too many are obsessed with themselves.

Novels might have gotten away with these problems before the Internet. For the most part, they can’t any more, except perhaps among people who read or pretend to read novels in order to derive status from being readers. But being holier-than-thou via literary achievement, if it ever worked all that well, seems pretty silly today. I suppose you could write novels about how hard it is to write novels in this condition—the Zuckerman books have this quality at times, but who is the modern Zuckerman?—but I don’t think anyone beyond other writers will be much interested.

If they’re not going to be essayistic and meditative, what are novels to be? “Fun” is an obvious answer. The “forward motion” and “propulsion” that Amis mentions are good places to start. That’s how novels differ, ideally, from nonfiction.

Novels also used to have a near-monopoly on erotic material and commentary. No more. If you want to read something weird, perverse, and compelling, Reddit does a fine job of providing it (threads like “What’s your secret that could literally ruin your life if it came out?” provide what novels used to).

Stylistically, there’s still the question of how weird and attenuated a writer can make individual sentences before the work as a whole becomes unreadable or boring or both. For at least a century and change, writers could go further and further in breaking grammar, syntax, and point of view rules while still being comprehensible. By the time you get to late Joyce or Samuel Beckett’s novels, however, you start to see the limits of incomprehensibility and rule breaking regarding sentence structure, grammar, or both.

Break enough rules and you have word salad instead of language.

Most of us don’t want to read word salad, though, so Finnegans Wake and Malone Dies remain the province of specialists writing papers to impress other specialists. We want “forward motion” and “propulsion.” A novel must delight in terms of the plot and the language used. Many, many novels don’t. Amis is aware of this—he says, “I’m not interested in making a diagnostic novel. I’m 100 percent committed in fiction to the pleasure principle—that’s what fiction is, and should be.” But I’m not sure his own work shows this—House of Meetings and Koba the Dread suggest otherwise. Nonetheless, I’m with him in principle, and, I hope, practice.