Philip Zimbardo and the ever-changing dynamics of sexual politics

A friend sent me a link to Philip Zimbardo’s talk, “The demise of guys?”, which recapitulates and shortens Hanna Rosin’s long Atlantic article, “The End of Men.” Based on the video and on reading lots of material on similar subjects recently (like Baumeister’s Is There Anything Good About Men?, although I do not find all of it compelling), I replied to my (female) friend:

1) There is still a very strong preference for males in much of the developing world, including India and China.

2) Barring unpredictable improvements in reproductive technology that bring us closer to Brave New World, I do not see substantial numbers of women wanting to live without men. There are some, have always been some, and will always be some, but they’re in the minority and probably will be for a long time.

3) I wouldn’t be surprised if what’s actually happening is that we’re seeing an increasing bifurcation in male behavior, as we’re seeing in many aspects of society, where the winners win more and the losers lose more than they once did. I suspect you can see more guys getting a larger number of women—à la Strauss in The Game, guys in frats, and guys who want to play the field in major cities—but also more guys who substitute video games and porn for real women, or who are incarcerated, or who are otherwise unable to enter or compete in mating markets. This makes women unhappy because they have to compete for a smaller number of “eligible” guys, “eligible” being a word women love to use without wanting to define. Women on average aren’t punishing men as much as one might expect for playing the field—see, e.g., this Slate article. Notice how Baumeister is cited there too.

4) Guys are more likely to drop out of high school, but they’re also more likely to be in the top 1% of the income distribution. They’re overrepresented in software, engineering, novel writing, and lots of other high-octane fields. They’re also overrepresented in prisons, special ed classes, and so forth. If you concentrate on the far reaches of either end of the bell curve, you’ll find guys disproportionately represented. Feminists like to focus on the right side, Zimbardo is focusing on the left. Both might be right, and we’re just seeing or noticing more extreme variation than we used to.

5) I’m not convinced the conclusions drawn by Zimbardo follow from the research, although it’s hard to tell without citations.

6) If guys are playing 10,000 hours of video games before age 21, no wonder they’re not great at attracting women and women are on average less attracted to them. This may reinforce the dynamic in number 3, in which those guys who are “eligible” can more easily find available women.

7) Most women under the age of 30 will not answer phone calls anymore and will only communicate with men via text. If I were on the market, I would find this profoundly annoying, but it’s true. Many women, at least in college, make themselves chiefly available for sex after drinking heavily at parties; this contributes to the problems Zimbardo notes instead of alleviating them. If women will mostly sleep with guys after drinking and at parties, that’s what guys will do, and guys who follow alternate strategies will not succeed as well. Despite this behavior, many women also say they want more than just a “hookup,” but their stated and revealed preferences diverge (in many instances, though not all). In other words, I’m not sure males are uniquely anti-social, at least from my perspective. When stated and revealed preferences diverge, I tend to accept the evidence of revealed preferences.

EDIT: At the gym, I was telling a friend about this post, and our conversation reminded me of a student who was a sorority girl. The student and I were talking and she mentioned how her sorority was holding an early morning event with a frat, but a lot of the girls didn’t want to go if there wasn’t going to be alcohol because they didn’t know how to talk to boys without it. Point is, atrophied social skills are not limited to one sex.

8) For more on number 7, see Bogle, Hooking Up: Sex, Dating, and Relationships on Campus; I read the interviews and thought, “A lot of these people, especially the women, must experience extreme cognitive dissonance.” But people on average do not appear to care much about consistency and hypocrisy, at least in themselves.

9) In “Marry Him!”, Lori Gottlieb argues that women are too picky about long-term partners and can drive themselves out of the reproductive market altogether by waiting too long. This conflicts somewhat with Zimbardo’s claims; maybe we’re all too picky and not picky enough at the same time? She’s also mostly addressing women in their 30s and 40s, while Zimbardo appears to be dealing with people in their teens and 20s.

10) If Zimbardo wrote an entire book on the subject, I would read it, although very skeptically.

The Time Paradox — Philip Zimbardo and John Boyd

As with many great works of nonfiction, Philip Zimbardo and John Boyd’s The Time Paradox: The New Psychology of Time That Will Change Your Life has that paradoxical quality of being incredibly profound and yet, in retrospect, blindingly obvious. It encompasses philosophical debates that occur at all levels of art; fiction often represents our feelings about time, and The Time Paradox lists a few dozen pop songs that contain messages about forms of time orientation. Last weekend I saw Woody Allen’s new movie, Vicky Cristina Barcelona, in which one character, Vicky, lives oriented toward the stable future: a nice house, a boring but wealthy husband, and a life that is unlikely to end in a crater but also unlikely to offer stimulating adventures. Cristina, played by the luscious and perfectly cast Scarlett Johansson, is a sensual hedonist who pursues novelty and risk-taking. Their contrasting ways of life begin the story, with the two women balanced against Juan Antonio, their shared foil.

The movie is more sophisticated than this, as any art that can be accurately captured in summary is not worth experiencing. Nonetheless, just as The Hero With A Thousand Faces explicitly analyzes the scaffolding of many adventure stories, The Time Paradox implicitly discusses the dominant time views of many works of art. Some, like The Great Gatsby, show opposing characters who see time, and hence one another, in different ways; in such a reading, Nick Carraway is a present-oriented fatalist with little personality of his own, while Jay Gatsby combines a past-positive perspective of Daisy with a future-oriented work ethic that he thinks will win her back. Gatsby on a larger level criticizes both views: in bending all his time orientations toward a particular person, Gatsby’s obsession ultimately leads to a ruinous car crash, destroying himself in crime, like the crime that his wealth is built on, while Nick, without the focus of his attention, seems to drift without learning. The novel’s last line, one of my favorites in all literature, soothes or terrifies the reader by reminding us of how life will continue for others even when it does not for us:

Gatsby believed in the green light, the orgastic future that year by year recedes before us. It eluded us then, but that’s no matter—tomorrow we will run faster, stretch out our arms farther. . . . And one fine morning—
So we beat on, boats against the current, borne back ceaselessly into the past.

Whether we are terrified by this receding light depends on our reaction to it and how we handle that past.

Zimbardo also wrote The Lucifer Effect: Understanding How Good People Turn Evil, which, together with Dan Ariely’s Predictably Irrational, pokes holes in traditional economic thinking concerning man as a rational actor. All three books argue that things are not so simple. In Zimbardo and Boyd’s case, the problem is that we don’t consciously realize how we tend to think about past, present, and future, or, if we do, we aren’t able to step outside ourselves to see how we’re thinking. What is “rational” in the context of past, present, and future? To enjoy the moment, or to work toward a future moment? Zimbardo and Boyd implicitly argue neither, and they point to the poorly understood trade-offs we make regarding how we orient ourselves chronologically. That I use the language of economics to present this parallels Zimbardo and Boyd, who discuss “The Economics of Time” along with the nature of opportunity costs—another well-known issue too little referenced in everyday discourse.

Learning about opportunity costs, including those of being oriented toward the present, past, or future, gives one more information and hopefully leads to better decision making. This meta-critical force is powerful, if poorly understood, and what I like so much about Zimbardo’s books is their ability to take on this meta-critical function and put it to paper—like a good therapist or friend—pointing to the blind spots we don’t realize exist. Self-help books should do this but often don’t, or, if they do—like Marti Olsen Laney’s The Introvert Advantage: How to Thrive in an Extrovert World*—they’re filled with clichés or otherwise poorly written. The Introvert Advantage is especially painful because it conveys a useful message to both introverts and extroverts but is marred by stylistic problems. The Time Paradox’s promise as a self-help book is slightly deceptive: it is really a book discussing research that happens to dress in self-help clothing. And aren’t all books, or all art, on some level designed to provide “self-help”? But no matter: the genre, if any, is transcended by the content, as happens here.

The Time Paradox is also clever in its examples of the traps each kind of person creates for themselves, whether those focused on the past to the detriment of their daily lives, those focused on the present to the detriment of their belief in their own ability to change the future, or those focused on the future who lose their sense of joy. Regarding the latter, for example, the authors write that “[…] future-oriented workaholics who do not cultivate sensuality and sexuality have little interest in making friends or ‘wasting’ time in playful activities—a recipe for sexual deprivation.” In contrast, the present-oriented might be too focused on such aspects, resulting in pregnancy, disease, or awkward pictures on the Internet.

Elsewhere, regarding those who are oriented toward the future, Zimbardo and Boyd say “[…] they do not spend time ruminating on negative past experiences. They focus on tomorrow, not yesterday.” This has advantages, especially in societies that reward delayed gratification, but also problems, as such “futures” can appear callous, or uninterested in the past, or less capable of building friendships based on experiences—perhaps leading them to feel emotionally isolated, or even held back in work. Futures might succeed through plotting and the aforementioned delayed gratification, but they might also miss some aspects of creativity. For example, Zimbardo and Boyd describe a maze game in which futures tended to outperform presents in navigating a mouse through a maze. But, as the authors write:

Many of the presents who failed got frustrated at not finding the right path and ended up making a straight line to the goal, bursting through the cul-de-sac barriers.

Perhaps some measure of conventional success is due not to following rules and accepting constraints, but to redefining problems and solutions. As one character says to another in The Matrix, some rules can be bent; others, broken. Technological and artistic progress** often stems from such unconventionality. That isn’t to make a logic error and say that unconventionality automatically equates with progress, but, channeled in the right area, it might be necessary if not sufficient.

The Sept. 1 issue of The New Yorker has a cartoon in which a man says, “I’m not losing my memory. I’m living in the now,” implying that age has collapsed his past orientation into the present. Mental faculties create our impressions of time, and physical changes, including drugs, can alter them—and not necessarily for the worse. In a section on how to become more present-oriented, for example, Zimbardo and Boyd offer the recommendation “drink alcohol in moderation,” which is the sort of self-help I’m only too happy to indulge. Perhaps so many writers and artists are alcoholics because they need to get out of the past (Faulkner) or the future.

In suggesting this, however, I’m succumbing to the book’s major potential weakness: presenting time disorders as a major source of anxiety and in turn diagnosing time perspectives as a cause of maladies, rather than perhaps an effect. For example, Zimbardo and Boyd come perilously close to implying that correlation is causation when they discuss the outcomes of the time scales they developed to measure one’s attitudes; in an early section, they attribute perhaps too much to a focus on immediate gratification, self-stimulation, and short-term payoffs.

Other sections should be qualified, as when Zimbardo and Boyd write that “Our scarcest resource, time, is actually much more valuable than money.” That depends on, for example, how much money we have; if I had no food, I would very readily trade some time for money, and almost every day I engage in some transaction designed to turn time into money. For, say, billionaires, time is more scarce than money or virtually any other resource, and it’s worth noting here what economists call the backward-bending labor supply curve—that is to say, as a person’s earnings increase, they tend to work more hours, but past a certain point, they tend to cut back in order to enjoy the results of those earnings. An extreme example of that tendency can open up between generations: hard-working parents provide so plentifully for their offspring that the offspring adopt a hedonistic, present-oriented lifestyle that ultimately destroys the future-oriented values of work and thrift that created the fortune in the first place. Today it’s Paris Hilton, or the ceaseless articles about how we damn kids lack the work ethic of the old days; yesterday it was the Vanderbilts and Astors, whose descendants are now mostly middle-class; and tomorrow it will be the tech titans’ legacy.

Yet even where I don’t entirely agree with sections or where I nit-pick, merely raising these issues leads us to consider them, our own behavior, and, most importantly, how best to lead our lives and allocate a resource Zimbardo and Boyd imply many people barely consider. At the end of the last paragraph, I analogized time perspectives to family and social dynamics—an idea I wouldn’t have considered prior to reading The Time Paradox.

Zimbardo and Boyd rightly caution readers not to assume that a person is entirely one orientation, since all people have some level of every orientation within them. Instead, the reader should try applying their own (past, presumably) behavior to the models in order to evaluate it within the framework the authors offer. Perhaps their most powerful recommendation is one that echoes Viktor Frankl’s Man’s Search for Meaning and the Stoic philosophers: although we can’t always control events, we can control our reactions and try to influence events. Zimbardo and Boyd write:

[…] psychological principles are elastic: They bend and change according to the situation and frame of reference […] We have no control over the laws of physics, but we do have some control over the frames of reference in which we view time. Recognizing how and when these frames of reference are advantageous may allow you to get more out of life and help you recognize those occasions when time perspectives hinder and impede you.

The most valuable sections of the book can get buried: they come not later in that quote but earlier, when Zimbardo and Boyd discuss how much our perceptions count and how they can change the way we feel. Their biggest purpose is first to increase our sense of agency and our belief in our own influence, limited as it might be. Call this the difference between science and The Secret, a book I won’t dignify with a link: the former sees self-empowerment as the first step of many to come, while the latter treats the first step as an excuse to stop in a myopic haze of wishful thinking.

Finally, if the book has an overarching, abstract message, it is that we should, like a character from a Hermann Hesse novel, ask what we want from life and how to find it. The Time Paradox provides guidance in finding the answer by, for example, discouraging “a kind of learned helplessness,” but the actual journey belongs to the reader, not the authors.

* For decent coverage of the same idea, see Jonathan Rauch’s “Caring for Your Introvert” in The Atlantic.

** Assuming these aren’t simply two sides of the same coin.

The Lucifer Effect — Philip Zimbardo

Philip Zimbardo’s The Lucifer Effect: Understanding How Good People Turn Evil will probably have the misfortune of being an extremely important book that does not find the larger audience it deserves. Its author is most famous for conducting the Stanford Prison Experiment (SPE) in the 1970s, in which he divided normal Stanford students into “prisoners” and “guards” and observed the students assuming their respective roles with frightening quickness and, on the part of the guards, alacrity. The Lucifer Effect is the first time Zimbardo has detailed exactly what happened in the SPE, and he links it to the recent scandal at Abu Ghraib. To judge from recent events, it will not be the last time scandals like Abu Ghraib happen.

If I could sum up The Lucifer Effect, I’d change a quote I recently posted from Robert Heinlein, “secrecy begets tyranny,” to “bad systems beget bad results.” Zimbardo’s argument, made in meticulous detail about the SPE and then paralleled with Abu Ghraib, holds that in some situations normally healthy people can quickly take on roles leading them toward brutality, and that our personalities may play less of a role than many of us would like to think in the extent to which we fight injustice. These claims are extraordinary, and The Lucifer Effect must be read in full to understand them and the situations, which usually involve lax oversight by supposed authorities and arbitrary rules, that allow abuse to occur.

Some details from The Lucifer Effect haunt, as when Zimbardo says that when prisoners in the SPE were “released” early, the other prisoners and guards often said nothing and made no mention of those who had come or gone, as though they were the trapped rabbits in the bizarre warren from Watership Down. The world the prison creates seems almost independent of the world prior to the prison, bringing to mind Kafka or Arthur Koestler’s Darkness at Noon. The latter’s portrayal of psychological torture is political in nature, but the parallels with the SPE are there: the uncertainty, the apparent lack of thought on the part of guards, the sense of timelessness, and the extent to which people become the role rather than vice versa.

Despite these grim parallels, Zimbardo’s last and too-short section deals with how to combat bad systems. He writes: “Heroism often requires social support. We typically celebrate heroic deeds of courageous individuals, but we do not do so if their actions have tangible immediate cost to the rest of us and we can’t understand their motives.” Such was the case with rabble-rousing prisoners, and such is often the case with political reformers. Passages like this remind us of the larger ideas implicit in particular actions, and Zimbardo skillfully generalizes from specific incidents and then brings the generalizations back to concrete examples, zooming in and out with the precision of a philosopher and the writing talent of a novelist. In this last and perhaps most important section, Zimbardo discusses further research concerning how people disengage their moral senses, conform to communal norms, and the like—in particular, dehumanization as it affects those in positions of power compared to those who are not.

Only occasionally does Zimbardo go too far afield with his theories, as in the long description of burnout inventories and the Abu Ghraib scandal. His puns sometimes elicit groans even when they’re appropriate, as when he has a headline asking, “A Bad Apple or a Chip off the Best Block?” concerning a guard named Chip. Yet the section’s content is so solemn that letting in the joke, even a bad one, prevents reader fatigue—a fascinating strategy in a section concerning how people suffer burnout as a result of stress. While the stress of the reader is nothing like the stress of a prison guard in Iraq, Zimbardo’s reminder of how principles remain the same even as their order of magnitude changes is reinforced by his use of the very techniques he describes. That and his tendency to drift into academic language (I will argue x, and then I will argue y…) are the only weaknesses in what is otherwise an excellent book, one that contributes greatly to understanding how social and bureaucratic systems work and how they can dehumanize both those involved and those controlled.

EDIT: Zimbardo’s next book, The Time Paradox, is probably also of great interest to readers of The Lucifer Effect.

Two visions for the future, inadvertently juxtaposed: Nell Zink and Marc Andreessen

Last week’s New Yorker inadvertently offers two visions for the future: one in a profile of the writer Nell Zink and the other in a profile of the venture capitalist Marc Andreessen. Both profiles are excellent. One of their subjects, however, is mired in a fixed, contemporary mindset, while the other subject looks to a better future.

This is Kathryn Schulz’s description of Zink: “Zink writes about the big stuff: the travesty of American apartheid; the sexual, economic, and intellectual status of women; the ephemerality of desire and its enduring consequences.” Is any of that stuff really big? Does it matter? Or is it just a list of somewhat transitory issues that obsess modern intellectuals who are talking to each other through magazines like The New Yorker? The material well-being of virtually any American is so much higher than it was in, say, 1900 as to diminish the relative importance of many of the ideas Zink or Schulz considers “big.” At one point Zink “delivered a short lecture on income stagnation: a bird ridiculing its fellow-bird for stupidity.” But global inequality is falling, and, moreover, the more interesting question may be absolute material conditions rather than relative ones. One gets the sense that Zink is a more parochial thinker than she thinks. I sense from The Wallcreeper that she writes about the motiveless and the pathless.

Here, by contrast, is Andreessen as described by Tad Friend:

Andreessen is tomorrow’s advance man, routinely laying out “what will happen in the next ten, twenty, thirty years,” as if he were glancing at his Google calendar. He views his acuity as a matter of careful observation and extrapolation, and often invokes William Gibson’s observation “The future is already here—it’s just not very evenly distributed.” Jet packs have been around for half a century, but you still can’t buy them at Target.


The game in Silicon Valley, while it remains part of California, is not ferocious intelligence or a contrarian investment thesis: everyone has that. It’s not even wealth [. . . .] It’s prescience. And then it’s removing every obstacle to the ferocious clarity of your vision: incumbents, regulations, folkways, people. Can you not just see the future but summon it?

Having a real vision counts, and it seems that too few people have a vision for the future. Andreessen is thinking not of today but of what can be made better tomorrow. I would not deny the impact of slavery on contemporary culture or the importance of desire in life, but I would ask Zink: if the U.S. is doing things poorly, who is doing them better? And if the U.S. is doing things poorly, why is Silicon Valley the center of the future?

One of these people reads as an optimist, the other as a pessimist. One reads as someone who makes things happen and the other as someone who complains about things that other people do. One reads as a person with a path. The other doesn’t.

Don’t get me wrong: I liked The Wallcreeper when I read it a couple of months ago. I didn’t have much to say about it on this blog because it seemed kind of interesting but left me without much feeling. But I can’t help thinking that Andreessen’s vision of the future is big, while Zink’s vision of the present is small.

As a bonus, check out “All Hail the Grumbler! Abiding Karl Kraus,” which is poorly titled but describes Jonathan Franzen’s relationship to art, technology, and other matters. He’s in the Zink school; perhaps something about studying German inculcates an anti-technology backlash among writers, since Germany and the U.S. are both among the most technophilic societies in the world (for good reasons, I would argue). From the article:

Kraus’s savage criticism of popular newspapers, suspicion of technology, and defense of art all appeal to Franzen, whose nonfiction essays strike similar notes. For instance, in the spirit of Kraus, Franzen has attacked the intrusiveness of cellphones and the loss of private space as people bark out the dreck of their lives.

But even “privacy” is a relatively new idea: being alone to read books only really got going in the 18th century, when books got cheap enough for normal people to borrow them from libraries. The luddites of the day lamented the withdrawal from the public sphere into onanistic privacy. They asked: why wrap yourself in some imaginary world when the big real world is out there?

As you may imagine, I’m more neutral toward these developments. Like many literary types, I think the world would be a better place with more reading and less reality TV, but I’ll also observe that the kind of people who share that view are likely to read this blog, and the kind of people who don’t aren’t likely to give a shit about what I or anyone like me says.

Much later in the essay its author, Russell Jacoby, writes: “Denouncing capitalist technology has rarely flourished on the left, which, in general, believes in progress.” I get what he’s saying, but denouncing technology in general has always been a fool’s game, because a) pretty much everyone uses it and b) to the extent that one generation (or a member of a generation) refuses a given technology, the next generation takes it up entirely. Franzen may not like technology circa 2015, but he is very fond of the technology of the printing press. At what point does Franzen think “good” technology stopped?

I’m reminded, unfairly perhaps, of the many noisy environmentalists I’ve known who do things like bring reusable bags to the grocery store but then fly on planes at least a couple of times a year. But flying pollutes more than pretty much anything else most people do. A lot of SUV-drivers living in exurbs actually create less pollution than urban cosmopolitans who fly every two months. By the same token, the same people who denounce one set of technical innovations are often dependent on, or in love with, some other set of technical innovations.

Almost no one really wants to go backward in time, technologically speaking. Look at behaviors rather than words. I believe that Franzen doesn’t use Facebook or write a blog or whatever, but he probably uses other stuff, and, if he has kids, they probably want smartphones and video games because all their friends have smartphones and video games.

I’m not saying smartphones and video games are good—quite the opposite, really—and I’m sympathetic to Zimbardo’s claim that “video games and porn are destroying men.” But I am saying that claims about modern technology doing terrible things to people or culture go back centuries and have rarely if ever proven true, and that the people making such claims are usually, when viewed in the correct light, hypocrites on some level. Jacoby does hit a related point: “Presumably, if enough people like SUVs, reality TV, and over-priced athletic footwear, little more may be said. The majority has spoken.” But I want to emphasize the point and say more, not about banal cultural stuff like bad TV (and obviously there is interesting TV) but about the deeper stuff, like technology itself.

The Andreessens of the world are right. There is no way back. The only way is forward, whether we want to admit it or not. The real problem with our cultural relationship to technology—and this is a Peter Thielian point—is that we’re in denial about our dependence on it and about the need to build the future.

Why we like characters who battle institutions

David Brin’s “Our Favorite Cliché: A World Filled With Idiots… or Why Film and Fiction Routinely Depict Society and its Citizens as Fools” is great, and you should read it. He observes that novels, TV shows, and movies routinely depict heroic individuals standing up to corrupt or evil institutions or organizations. This tendency is “a reflex shared by left and right” to associate villainy with organization. Moreover, “Even when they aren’t portrayed as evil, bureaucrats are stupid and public officials short-sighted.” Brin notes some exceptions (Contagion is another, and the link goes to an article titled “Bureaucratic Heroism”), but those exceptions are exceptionally exceptional.

Nonetheless, I’d like to posit a reason why institutions and organizations are so often portrayed as evil: they behave in ways that are evil, or close to it, with shocking regularity, and few of us have the means or fortitude to resist broken, evil, or corrupt institutions. The most obvious and salient example, much taught in schools, is Nazi Germany; while some individuals fought against the state murder apparatus, the vast majority went along with it, leading pretty much everyone who learned afterward about the Holocaust to ask, “What would I do in that situation?” Most of us want to think we’d be heroic resisters, but the push to conform is strong—as the Milgram experiment and Philip Zimbardo’s research show. The Soviet Union, meanwhile, murdered tens of millions of its own citizens.

Other examples exist closer to home: the Civil Rights movement fought corrupt institutions in the U.S. All the President’s Men exposed criminal actions, cruelty, and simple mendacity at the heart of the White House. The Vietnam War escalated on the basis of the partly invented Gulf of Tonkin incident. More recently, the Bush Administration made up evidence (or incompetently accepted made-up evidence) to justify the Iraq War. On a smaller scale, many of us have gotten caught in various nasty government bureaucracies in schools, universities, or elsewhere; here’s one example from Megan McArdle’s struggles with the DMV.

Brin observes:

Now imagine that your typical film director ever found herself in real trouble, or the novelist fell afoul of deadly peril. What would they do? They would dial 9-1-1! They’d call for help and expect — demand — swift-competent intervention by skilled professionals who are tax-paid, to deal with urgent matters skillfully and well.

He’s right. I called the cops when a random asshole pounded on my door and threatened to kill me. They did show up (albeit later than I would’ve hoped!) and did arrest the guy. I’m grateful to them and for the police in that circumstance. But a lot of us are less grateful to cops, as reading Alice Goffman’s amazing book On the Run: Fugitive Life in an American City makes clear. Police as an institution have largely failed inner cities. Ask black people about their interactions with the police, and you’ll get a very different view of policing than that held by many white Americans.

So we may be getting stories of (exaggerated) institutional incompetence both due to history and due to everyday experience with institutions that (sometimes) don’t work well. Nonetheless it’s worthwhile for those of us who write stories to contemplate the truth of Brin’s observations about cliché on the level of plot, because we should try to be aware of our own dependence on cliché and to break that dependence whenever possible.

Anytime someone describes sexual behavior as “dumb,” ask: Dumb in what timeframe?

In writing about the David Petraeus non-scandal, Adam Gopnik says, correctly, that “Benghazi is a tragedy in search of a scandal; the Petraeus affair is a scandal in search of a tragedy,” and, perhaps less correctly, this:

The point of lust, not to put too fine a point on it, is that it lures us to do dumb stuff, and the fact that the dumb stuff gets done is continuing proof of its power. As Roth’s Alexander Portnoy tells us, “Ven der putz shteht, ligt der sechel in drerd”—a Yiddish saying that means, more or less, that when desire comes in the door judgment jumps out the window and cracks its skull on the pavement.

But whether lust “lures us to do dumb stuff” depends on the timeframe we’re looking at: if we do “dumb stuff” that results in our genes still existing, say, 200 years from now, then what’s dumb in the context of the next month may be “smart” in the context of a couple centuries from now. We’re evolutionarily primed to propagate our genes—that’s Richard Dawkins’ point in The Selfish Gene.

We also have to ask what happens in the very short term: presumably, in the minutes to hours that Petraeus and Broadwell were doing it (or anyone is “doing it”), they were making a very smart decision for themselves over those few minutes. One might evaluate the quality of their decision making in terms of Philip Zimbardo and John Boyd’s The Time Paradox: very good for the immediate present when they were doing it, not very good in the months or years after the scandal came to light, and, depending on conception, very good over the very long term.

Don’t read this post and the books linked, then go out and cheat on your significant other only to say that your selfish genes and hedonistic time perspective “made” you do it. But do think about the intellectual context in which Portnoy’s claim exists, and how desire can function in the very long and short run.

Time preferences, character, and The Novel (in my novel)

A friend was reading a novel I wrote called The Hook and asked: “I’m curious. . . Do you believe this?” of this passage, in which the speaker is a teenage girl describing her teacher:*

But Scott sometimes said that if we do something, it shows that we wanted to at that time, even if we regret it later. So other people can’t really “make” us do anything. He said that people want different things over different courses of time—so in the short term, you might want one thing, in the long term, something else, and when you’re in the heat of the moment, the short term is pretty sweet.

The answer to my friend’s question is: mostly but not entirely. Zimbardo and Boyd wrote The Time Paradox, which describes how some people default to “past,” “present,” or “future” orientations or dispositions; hedonic people tend to be present-oriented, high achievers (probably a lot of engineers) tend to be future-oriented, and nostalgic, content, family-centered people tend to be past-oriented. These categories obviously aren’t hard and fast, and everyone has some of all of them, but I think the overall idea stands. And people who have one central orientation probably don’t understand others well, just like extroverts tend not to understand introverts; I think reading helps people better understand others not like themselves.

People are also pretty strongly biased by random emotions, feelings, and environments; for example, in Dan Ariely’s book Predictably Irrational, he describes how people in a sexually aroused state make very different decisions or predictions from those in a “cold” state—one might say they become much more present-oriented, which is probably obvious to those of us who have been in that state and are willing to think consciously and rationally about it afterwards. Most of us have probably been in that state, but relatively few of us want to admit what it’s like when we’re not in it. On a separate note, Ariely speculates that this may apply to hunger and other states too.

Daniel Kahneman’s book Thinking, Fast and Slow describes the numerous biases that we’re prone to, including a bias towards present consumption in lieu of future consumption. So if we’re in the moment being offered the pleasures of alcohol, drugs, sex, gambling, spending, or whatever, the “future” might seem very far away and uncertain (that’s what Karl Smith gets in “If I Were A Poor Black Kid” that so many other commenters miss). So people are inclined to do things they say they “regret” or that “weren’t them,” even when it probably was them: it’s just that the person who gave into their craving was thinking in a different frame of mind, and the person in a “cold” frame of mind probably wants to present themselves differently than a person in a “hot” frame of mind acts. You may notice that a lot of people say, “I was drunk,” as if that means they had no control over what they were doing, but their rational self decided to take the first drink. It seems that many people go through a two-step process to get what they really want: they drink, which gives them an excuse to decry their actions while drunk at a future date while achieving their hedonic ends—which are often sexual.

This is how you get people suffused with regret for acts they very much enjoyed previously. Sex is the most obvious example here, but there are others. What a lot of people call “attraction” or “chemistry” looks to me more like people being attracted to specific behavioral or physical traits they then cloak in other words. This, basically, is what Neil Strauss explains in The Game and other self-proclaimed pickup people discuss in different venues. But it only works if women are attracted to the kind of show that such guys put on; many women in clubs / bars appear to be, at least to some extent, because if they weren’t then “game” wouldn’t work. I find this stuff more intellectually interesting than immediately applicable to my day-to-day life, but it nonetheless shows that a lot of social life happens below the level of consciousness and in ways that I didn’t appreciate when I was younger.

As I said earlier, people who tend to be highly logical and future oriented (I’m somewhat like this; you seem like you are too, although I obviously can’t speak for you and am not totally sure) often don’t “get” or understand people who aren’t. And vice-versa. People who are hedonically oriented in one moment and disavow their hedonism the next seem like hypocrites—and they are. But most people seem to be hypocrites and don’t take the time to deeply analyze what their “feelings” are telling them. Kahneman develops the idea of two “systems” that people use: the first is a fast, heuristic system that guides us to make instant, snap decisions; the second slows us down to analyze situations, but it’s much more laborious and harder to engage. Most people live in system one most of the time, including us. It takes a lot of effort to motivate system two. So we get a lot of biases from system one that sometimes make our system two self unhappy later.

I think one problem intellectuals like me have is an unwillingness to be sufficiently present-oriented, to slip out of our eggheads and into the now. A lot of cultures and societies have festivals or rituals that encourage this sort of thing; you can see a contemporary example in Brazil’s Carnival and numerous examples in older cultures (Donna Tartt’s excellent novel The Secret History exploits this interest for its plot). But ours doesn’t, which might in part be a function of our wacky religious heritage. We don’t have a lot of space for ritual; the closest we get is something like Halloween and extreme drinking parties, where people get to release or transcend the self in ways that may produce great pleasure. But, again, what is pleasure? Merely neurochemical? Or something else? I don’t have good answers, though I’m very curious.

So: do I believe what Stacy says Scott asserts? Somewhat. I think Scott’s mistake is assuming there’s a single, unified person in there somewhere. Either that, or Stacy, who’s speaking in the section you marked, misunderstands Scott, or can’t apply what he says because she doesn’t have the background to do so.

As you can probably tell from the above, I don’t really know what I believe; I’m guided in my thinking by some of the things I’ve read and observed, but the issue is complex enough that I don’t think they tell the whole story. When I was younger, I believed in a unified self; if someone did one particular thing at one particular time, that was a revealed preference, that’s who they were, and that’s the end of the story. Now, a lot of the work of behavioral and evolutionary psychologists and economists has forced me to rethink those ideas, and consciousness is much stranger than I really appreciated!

If you want to judge for yourself, the books I cited above are a good and lucid place to start. But I don’t think they’re the end of the story; maybe the story has no end. That’s not a real satisfying statement, but it’s what I’ve got and where I’ve gotten with my own imperfect thinking. Deep, much-debated issues often are that way because there isn’t a “right” answer per se—only a range of possibilities that are continually deepened over time through research, observation, and writing.

Note: The next paragraph has some material germane to the novel but that won’t make a lot of sense outside the context of the novel.

I mostly wish someone had explained a lot of this to me when I was younger. But they didn’t, which might be why Stacy repeats what Scott says to her (there’s so much I try to convey to people who’re younger than me, but I suspect most of them don’t really have the framework necessary to situate what I’m telling them, and thus they can’t really deploy it in behavioral changes). In the context of The Hook, I think Stacy and Arianna make their video at Sheldon’s coaxing because they’re caught up in the moment, and they’re obviously unhappy when the video gets shown to the whole school. So is Stacy the girl who is willing to bare her stuff for the camera when she’s sexually excited and not really thinking about what comes next, or the girl who can stand up in front of the whole assembly and walk nobly down and out, transcending the moment and trying to show herself beyond high school bullshit?

Both and neither. Which is, I hope, what makes her interesting as a character, and why I suspect narrative fiction will continue to enchant us even when research has surpassed many of the nonfiction writers on whom I’m drawing when I’m drawing characters.

This post started life as an e-mail to my friend, and I’ve edited it some before publishing it here.

College graduate earning and learning: more on student choice

There’s been a lot of talk among economists and others lately about declining wages for college graduates as a group (for example: Arnold Kling, Michael Mandel, and Tyler Cowen) and males in particular. Mandel says:

Real earnings for young male college grads are down 19% since their peak in 2000.
Real earnings for young female college grads are down 16% since their peak in 2003.

See the pretty graphs at the links. These accounts are interesting but don’t emphasize, or don’t emphasize as much as they should, student choice in college majors and how that affects earnings. In “Student choice, employment skills, and grade inflation,” I said that colleges and universities are, to some extent, responding to student demand for easier classes and majors that probably end up imparting fewer skills and paying less. I’ve linked to this salary data chart before, and I’ll do it again; the majors at the top of the income scale are really, really hard and have brutal weed-out classes for freshmen and sophomores, while those at the bottom aren’t that tough.

It appears that students are, on average, opting for majors that don’t require all that much effort.

From what I’ve observed, even naive undergrads “know” somehow that engineering, finance, econ, and a couple other majors produce graduates who get paid more, yet many end up majoring in simple business (notice the linked NYT article: “Business majors spend less time preparing for class than do students in any other broad field, according to the most recent National Survey of Student Engagement [. . .]”), comm, and other fields not noted for their rigor. As such, I wonder how much of the earnings picture in your graph is about declining wages as such and how much of it is really about students choosing majors that don’t impart job skills or knowledge (cf Academically Adrift, etc.) but do leave plenty of time to hit the bars on Thursday night. Notice too what Philip Babcock and Mindy Marks found in “The Falling Time Cost of College: Evidence from Half a Century of Time Use Data”: “Full-time students allocated 40 hours per week toward class and studying in 1961, whereas by 2004 they were investing about 26 to 28 hours per week. Declines were extremely broad-based, and are not easily accounted for by compositional changes or framing effects.”

If students are studying less, maybe we shouldn’t be surprised that their earnings decline when they graduate. I can imagine a system in which students are told that “college” is the key to financial, economic, and social success, so they go to “college” but don’t want to study very hard or learn much. They want beer and circus. So they choose majors in which they don’t have to study. Schools, in the meantime, like the tuition dollars such students bring—especially when freshmen and sophomores are often crammed in 300 – 1,000-person lecture halls that are extraordinarily cheap to operate because students are charged the same amount per credit hour for a class of 1,000 as they are for a seminar of 10. Some disciplines increasingly weaken their offerings in response to student demand.

Business appears to be one of those majors. It’s in the broad middle of the salary data, which is interesting given how business majors presumably go into their discipline in part hoping to make money—but notice too just how many generic business majors there are. The New York Times article says “The family of majors under the business umbrella — including finance, accounting, marketing, management and “general business” — accounts for just over 20 percent [. . .] of all bachelor’s degrees awarded annually in the United States, making it the most popular field of study.” That’s close to what Louis Menand reports in The Marketplace of Ideas: “The biggest undergraduate major by far in the United States is business. Twenty-two percent of all bachelor’s degrees are awarded in that field. Ten percent of all bachelor’s degrees are awarded in education.” If all these business majors graduate without any job skills, maybe we shouldn’t be all that surprised at their inability to command high wages when they graduate.

I’d like to know: has the composition of majors changed over the years Mandel documents? If so, from what to what? Menand has some coarse data:

There are almost twice as many bachelor’s degrees conferred every year in social work as there are in all foreign languages and literatures combined. Only 4 percent of college graduates major in English. Just 2 percent major in history. In fact, the proportion of undergraduate degrees awarded annually in the liberal arts and sciences has been declining for a hundred years, apart from a brief rise between 1955 and 1970, which was a period of rapidly increasing enrollments and national economic growth. Except for those fifteen unusual years, the more American higher education has expanded, the more the liberal arts sector has shrunk in proportion to the whole.

But he’s not trying to answer questions about wages. Note too that my question about composition is a genuine one: I have no idea of what the answer is.

One other major point: if Bryan Caplan is right about college being about signaling, then there might also be a larger composition issue than the one I’ve already raised: people who aren’t skilled learners and who don’t have the willingness or capacity to succeed after college may be increasingly attending college. In that case, the signal of a college degree isn’t as valuable because the people themselves going through college aren’t as good—they’re on the margins, and the improvement to their skillset is limited. Furthermore, colleges and universities aren’t doing all that much to improve that skillset—see again Academically Adrift.

I don’t know what, if anything, can be done to improve this dynamic. Information problems about which college majors pay the most don’t seem to be a major issue, at least anecdotally; students know that comm degrees are easy and other, more lucrative degrees are hard. There may be Zimbardo / Boyd-style time preference issues going on, where students want to consume present pleasure in the form of parties and “hanging out” now at the expense of earnings later, and universities are abetting this in the form of easy majors.

This is the part where I’m supposed to posit how the issues described above might be improved. I don’t have top-down, pragmatic solutions to this problem—nor do I see strong incentives on the part of any major actors to solve it. Actually, I don’t see any solutions, whether top-down or bottom-up, because I don’t think the information asymmetry is all that great and consumption preferences mean that, even with better information, students might still choose comm and generic business.

Mandel ends his post by saying, “Finally, if we were going to design some economic policies to help young college grads, what would they be?” The answer might be something like, “make university disciplines harder, so students have to learn something by the end,” but I don’t see that happening. That he asks the question indicates to me he doesn’t have an answer either. If there were one, we wouldn’t have a set of interrelated problems regarding education, earnings, globalization, and economics, which aren’t easy to disentangle.

Although I don’t have solutions, I will say this post is a call to pay more attention to how student choices and preferences affect education and earnings discussions.

EDIT: See also College has been oversold, and pay special attention to the data on arts versus science majors. I say this as someone who majored in English and now is in grad school in the same subject, but by anecdotal observation I would guess about 75% of people in humanities grad schools are pointlessly delaying real life.

Sexting and society: How do writers respond?

In a post on the relative quality of fiction and nonfiction, I mentioned that fiction should be affected by how society and social life changes. That doesn’t mean writers should read the news du jour and immediately copy plot points, but it does mean paying attention to what’s different in contemporary attitudes and expression. I got to thinking about “sexting,” an unfortunate but useful portmanteau, because it’s an example of a widespread, relatively fast cultural change enabled by technology. (Over a somewhat longer term, “From shame to game in one hundred years: An economic model of the rise in premarital sex and its de-stigmatisation” describes “a revolution in sexual behaviour,” which may explain why a lot of contemporary students find a lot of nineteenth century literature dealing with sexual mores to be tedious.)

Laws that cover sexting haven’t really caught up with what’s happening on the ground. Penelope Trunk wrote an article called “The Joys of Adult Sexting,” in which she does it and thinks:

And what will his friends think of me? Probably nothing. Because they have women sending nude photos of themselves. It’s not that big a deal. You know how I know? Because the state of Vermont, (and other states as well) is trying to pass a law that decriminalizes sending nude photos of oneself if you are underage. That’s right: For years, even though kids were sending nude photos of themselves to someone they wanted to show it to, the act was illegal—an act of trafficking in child pornography.

But sending nude photos is so common today that lawmakers are forced to treat it as a mainstream courting ritual and legalize it for all ages.

Sending a naked photo of yourself is an emotionally intimate act because of the implied trust you have in the recipient. When you act in a trusting way—like trusting the recipient of the photo to handle it with care and respect—you benefit because being a generally trusting person is an emotionally sound thing to do; people who are trusting are better judges of character.

Trunk’s last paragraph explains why, despite all the PSAs and education and whatever in the world, people are going to keep doing it: because it shows trust, and we want significant others to prove their trust and we want to show significant others we trust them. You can already imagine the dialogue in a novel: “Why won’t you send me one? Don’t you trust me?” If the answer is yes, send them; if the answer is no, then why bother continuing to date? The test isn’t fair, of course, but since when are any tests in love and lust fair?

Over time, as enough kids of legislators and so forth get caught up in sexting scandals and as people who’ve lived with cell phone cameras grow up, I think we’ll see larger change. For now, the gap between laws / customs and reality makes a fruitful space for novels, even those that don’t exploit present circumstances well, like Helen Schulman’s This Beautiful Life. Incorporating these kinds of social changes in literature is a challenge and will probably remain so; as I said above, that doesn’t mean novelists should automatically say, “Ah ha! Here’s today’s headlines; I’m going to write a novel based on the latest sex scandal/shark attack/celebrity bullshit,” but novelists need to be aware of what’s going on. I wrote a novel called Asking Alice that got lots of bites from agents but no representation, and the query letter started like this:

Maybe marriage would be like a tumor: something that grows on you with time. At least that’s what Steven Deutsch thinks as he fingers the ring in his pocket, trying to decide whether he should ask Alice Sherman to marry him. Steven is almost thirty, going on twenty, and the future still feels like something that happens to other people. Still, he knows Alice won’t simply agree to be his long-term girlfriend forever.

When Steven flies to Seattle for what should be a routine medical follow up, he brings Alice and hits on a plan: he’ll introduce her to his friends from home and poll them about whether, based on their immediate judgment, he should ask Alice. But the plan goes awry when old lovers resurface, along with the cancer Steven thought he’d beaten, and the simple scheme he hoped would solve his problem does everything but.

Asking Alice is asking questions about changes in dating and marriage; if you write a novel today about the agonies of deciding who to marry with the metaphysical angst such a choice engendered in the nineteenth century, most people would find that absurd and untrue: if you get married to a Casaubon, you divorce him and end up in about the same circumstance as you were six months before you started. But a lot of people still get married or want to get married, and the question is still important even if it can’t drive the plot of a novel very well. It can, however, provide a lot of humor, and that’s what Asking Alice does.

A lot of literature, like a lot of laws, is also based on the premise that women don’t like sex as much as men, don’t or won’t seek it out, and are automatically harmed by it or by wanting it. This is a much more tenuous assertion than it used to be, especially as women write directly about sex. A novel like Anita Shreve’s Testimony, discussed extensively by Caitlin Flanagan here and by me here, engages that idea and finds it somewhat wanting. So does the work of Belle de Jour (now revealed as Dr. Brooke Magnanti), who basically says, “I worked as a hooker for a long time, didn’t mind it, and made a shit ton of money because I made a rational economic decision.” A lot of academic fiction premised on professors having sex with students examines the idea that female students can want/use sex just as much as men; this is how Francine Prose’s Blue Angel works, and Prose is a canny observer of what’s going on and how it connects to the past.

Note that women wrote all these examples, which I don’t think is an accident, since they’re probably less likely to put other women on pedestals than men are. I’ve been reading a lot of sex memoirs / novels written by women (Never the Face; Nine and a Half Weeks; two of Mary Karr’s memoirs, which are good but overrated; Abby Lee (British sex blogger); Elisabeth Eaves’ Bare) in part because I want to write better female characters. After reading a lot of this stuff, I’m even less convinced than I was that there are stereotypically “male” or “female” ways of thinking or writing about the world, but knowledge itself never hurts and I don’t regret the time spent. On a similar note, Janice Radway’s Reading the Romance is totally fascinating, even when Radway tries to explain away retrograde features of romances or how women are often attracted to high-powered, high-status men.

She wrote in a time before sexting, but I wonder if she’s thought about doing a Young Adult version using similar methodology today. For writers and others, sexting shows that teenagers can make their own decisions as people too, even if those are arguably bad decisions. To me, this is another generational gap issue, and one that will probably close naturally over time. One older agent said on the phone that maybe I needed a younger agent, because her assistant loved Asking Alice but she didn’t want to rep it.


I’m old enough to have lived through a couple medium-scale social changes: when I was in high school, people still mostly talked to each other on the phone. In college, people called using cell phones and often communicated via IM. After college, I kept using phones primarily for voice, especially to arrange drinks / quasi-dates, until I realized that most girls have no ability to talk on the phone anymore (as also described in “Philip Zimbardo and the ever-changing dynamics of sexual politics”). As a result, I’d now use text messages if I were arranging drinks and so forth. Around the time I was 23, I realized that even if I did call, women would text back. That doesn’t mean one should race out and change every phone conversation in a novel that features a contemporary 19-year-old to a text conversation (which would be tedious in and of itself; in fiction I write, I tend not to quote texts very often), but it’s the kind of change that I register. Things changed between the time I was 16 and 23.

I’m in the McLuhan camp of “the medium changes what can be said,” which means that texting is probably changing things in ways not immediately obvious or evident. Sexting is one such way; it lowers the cost of transmitting nude pictures to the extent that you can now do so almost instantly. Laws are predicated on the idea that balding, cigar-chomping, lecherous 40-year-old men will try to coerce 16-year-old girls outside cheer practice, not on ubiquitous cell phone cameras. Most parents will instinctively hate the cigar-chomping 40-year-old. They will not hate their own 14-year-old. So you get all sorts of amusement where laws, putative morals, conventional wisdom, technology, and desire meet. Still, when pragmatics meet parents, expect parental anger / protectiveness to win for the moment but not for all time. Nineteenth and twentieth century American culture is not the only kind out there. As Melvin Konner wrote in The Evolution of Childhood:

Contrary to some claims of cultural historians, anthropologists find that liberal premarital sex mores are not new for a large proportion of the cultures of the ethnological record and that liberal sexual mores and even active sexual lives among adolescents do not necessarily produce pregnancies. In fact, a great many cultures permit or at least tolerate sex play in childhood (Frayser 1994). Children in these cultures do not play ‘doctor’ to satisfy their anatomical curiosity—they play ‘sex.’ They do play ‘house’ as Western children do, but the game often includes pretend-sex, including simulated intercourse. Most children in non-industrial cultures have opportunities to see and hear adult sex, and they mimic and often mock it.

Perhaps our modern aversion to sex among adolescents stems in part from the likelihood of pregnancy, economic factors, and other considerations. Given the slow but real outcry from places like the Economist and elsewhere, this might eventually change. That’s pretty optimistic, however. A lot of social and legal structures merely work “good enough,” and the justice system is certainly one of those: we’ve all heard by now about cases where DNA evidence resulted in exoneration of people accused of murder or rape. So maybe we’re now heading towards a world in which laws about sexting are unfair, especially given current practice, but the laws remain anyway because the law doesn’t have to be optimal: it has to be good enough, and most people over 18 probably don’t care much about it unless it happens to be their son or daughter who gets enmeshed in a legal nightmare for behavior that doesn’t result in tangible harm.

Something like a quarter to a third of American adults have smoked pot, but we still have anti-pot laws. America can easily afford moral hypocrisy, at least for now, and maybe sexting will be something like weed: widely indulged in, a rite of passage, and something not likely to result in arrest unless you happen to be unlucky or in the wrong situation at the wrong time. The force generating the prohibition—that is, parents engaging in daughter-guarding—might be much stronger than the force of individual rights, utilitarianism, or pragmatic observations about the enforcement of laws against victimless crimes that do not result in physical harm.

There’s more on the legal challenges around this in Ars Technica’s article “14-year old child pornographers? Sexting lawsuits get serious,” which should replace “serious” with “ridiculous.” In the case, a 14-year-old girl sent a 14-year-old boy a video of herself masturbating, and then her family sued his. But how can a 14-year-old be guilty of “the sexual exploitation of children,” as the girl’s family claims? If a 14-year-old can’t consent to this kind of activity, then a 14-year-old also can’t have the state of mind necessary to exploit another one. Paradoxes pile up, of the sort described in Regulating Sex: The Politics of Intimacy and Identity, where the writers show how the age of consent has been rising as the age of being tried as an adult has been falling. Somewhere inside that fact, or pair of facts, there’s a novel waiting to be written.

Questions like “What happens when people do things sexually that they’re not supposed to? How does the community respond? How do they respond?” are the stuff novelists feed on. They motivate innumerable plots, ranging from the beginnings of the English novel at Pamela and Clarissa all the way to the present. When Rose and Pinkie are first talking to each other in Brighton Rock, Rose lies about her age: ” ‘I’m seventeen,’ she said defiantly; there was a law which said a man couldn’t go with you before you were seventeen.” Brighton Rock was published in 1938. People have probably been evading age-of-consent laws for as long as there have been such laws, and they will probably continue to do so—whether those laws affect sex or depictions of the body.

Adults have probably been reinforcing prohibitions for as long as they’ve existed. Consider this quote, from the Caitlin Flanagan article about Testimony linked above:

Written by a bona fide grown-up (the author turned 63 last fall), Testimony gives us not just the lurid description of what a teen sex party looks like, but also an exploration of the ways that extremely casual sex can shape and even define an adolescent’s emotional life. One-night stands may be perfectly enjoyable exercises for two consenting adults, but teenagers aren’t adults; in many respects, they are closer to their childhoods than to the adult lives they will eventually lead. Their understanding of affection and friendship, and most of all their innocent belief, so carefully nurtured by parents and teachers, that the world rewards kindness and fairness, that there is always someone in authority to appeal to if you are being treated cruelly or not included in something—all of these forces are very much at play in their minds as they begin their sexual lives.

In Testimony, the sex party occurs at the fictional Avery Academy; Shreve imagines Siena, the girl at the center of the event, as a grifter, eager to exploit her new status as victim so that she can write a killer college essay about it, or perhaps even appear on Oprah. For the most part, the boys are callous and self-serving.

Flanagan has no evidence whatsoever that “teenagers aren’t adults” other than bald assertion. That “they are closer to their childhoods than to the adult lives they will eventually lead” has more to do with culture than with biology, as Robert Epstein argues in The Case Against Adolescence: Rediscovering the Adult in Every Teen and as Alice Schlegel and Herbert Barry argue in Adolescence: An Anthropological Inquiry; even then, it depends on when a particular person hits puberty, how they react, and how old they are: nineteen-year-olds are probably closer to their adult selves than thirteen-year-olds are. Saying that teenagers believe, according to an ethos created by teachers, that “the world rewards kindness and fairness” indicates that Flanagan must have had a very different school experience than I did, or than a lot of other people did (for more, see “Why Nerds are Unpopular”). As I recall, school was capricious, arbitrary, and often stupid; the real world rewards fulfilling the desires of others, whether artistically, financially, sexually, or otherwise, while the school world rewards jumping through hoops and mindless conformity. If I don’t like the college I go to, I can transfer; if I don’t like my job, I can quit; if I don’t like some other milieu, I can leave it. In contrast, school clumps everyone together based on an accident of geography.

In Testimony, Shreve misses or chooses not to emphasize that Sienna enjoys the attention, and she hasn’t got much beyond that. She says, “I’m going to start a new life. I can be, like, Sienna. I can be whoever I want” {Shreve “Testimony”@27}. In Rob’s voice, Sienna is described this way:

I remember that Sienna started moving to the beat, a beer in her hand, as if she were in a world of her own, just slowly turning this way and that, and moving her hips to the music, and little by little the raucous laughter started to die down, and we were all just watching her. She was the music, she was the beat. Her whole little body had become this pure animal thing. She might have been dancing alone in her room. She didn’t look at any of us, even as she seemed to be looking at all of us. There was no smile on her face. If it was a performance, it was an incredible one. I don’t think anyone in the room had ever seen anything like it. She was in this light-blue halter top with these tight jeans. The heels and her little jacket were gone already. You just knew. Looking at her, you just knew.

She took off her own clothes, and “We watched as she untied her halter top at the neck. The blue cloth fell to reveal her breasts. They were beautiful and firm and rounded like her face. You knew at that moment you were in for good [. . .]” Later, he says, “It was group seduction of the most powerful kind.” Given how Mike, the headmaster, describes the video in the first section, it’s hard to see Sienna as lacking agency, or as someone coerced into her actions. That, in the end, is what I think makes the Caitlin Flanagans of the world so unhappy: if the Siennas will perform their dances and give it up freely and happily, does that mean other girls will have to chase the market leader? Will they have to acknowledge that a reasonably large minority of girls like the action, like the hooking up, like the exploring? If so, a lot of Western narratives about femininity go away, if they haven’t already. If you’re a novelist, you have to look at the diversity of people out there and the diversity of their desires. Shreve does this quite well. So does Francine Prose in Blue Angel. If you’re writing essays or polemics, though, you can make the questionable claim that teenagers are closer to their childhood selves all you want.

I like Flanagan’s writing because she’s good at interrogating what’s going on out there, but I’m not the first to notice her problems with politics; William Deresiewicz is more concise than I am in “Two Girls, True and False,” but the point is similar. Flanagan wants to imply that all people, or all girls, are the same. They aren’t. The ones unhappy with the hookup culture are certainly out there, and they might be the majority. But the Siennas are out there too. To deny them agency because they’re 14 is foolish. Matthew, J. Dot’s father, says, “The irony was that if a few kids had done something similar at the college, they’d be calling it an art film.” He’s right. Things don’t magically change at 18. Our culture and legal system are designed around the fiction that everything changes at 18, when much of it actually changes earlier. The gap between puberty and 18, however, is fertile ground for novelists looking for cultural contradictions.

Thinking and doing: Procrastination and the life of the mind

I finally got around to reading James Surowiecki’s “What does procrastination tell us about ourselves?” (answer: maybe nothing; maybe a lot), which has been going around the Internet like herpes for a very good reason: almost all of us procrastinate, almost all of us hate ourselves for procrastinating, and almost all of us go back to procrastinating without really asking ourselves what it means to procrastinate.

According to Surowiecki, time preferences help explain procrastination. For a good introduction to the topic, see Philip Zimbardo and John Boyd’s The Time Paradox. The short, non-technical version: some people tend to value present consumption more than future consumption, while others do the opposite. And it’s not just time preferences that change who we are; as Dan Ariely documents in Predictably Irrational, we also change our stated behaviors based on whether, for example, we’re aroused. We also sometimes prefer to bind ourselves through commitments to deadlines or to external structures that will “force” us to behave a certain way. How many dissertations would be completed without the social stigma that comes from working on a project for years and failing to complete it, coupled with the threat of funding removal?

The basic issue is that we have more than one “self,” and the self closest to the specious present (which lasts about three seconds) might be the “truest.” This comes out in the form of procrastination. To quote at length from Surowiecki, who is nominally reviewing The Thief of Time: Philosophical Essays on Procrastination:

Most of the contributors to the new book agree that this peculiar irrationality stems from our relationship to time—in particular, from a tendency that economists call “hyperbolic discounting.” A two-stage experiment provides a classic illustration: In the first stage, people are offered the choice between a hundred dollars today or a hundred and ten dollars tomorrow; in the second stage, they choose between a hundred dollars a month from now or a hundred and ten dollars a month and a day from now. In substance, the two choices are identical: wait an extra day, get an extra ten bucks. Yet, in the first stage many people choose to take the smaller sum immediately, whereas in the second they prefer to wait one more day and get the extra ten bucks.

In other words, hyperbolic discounters are able to make the rational choice when they’re thinking about the future, but, as the present gets closer, short-term considerations overwhelm their long-term goals. A similar phenomenon is at work in an experiment run by a group including the economist George Loewenstein, in which people were asked to pick one movie to watch that night and one to watch at a later date. Not surprisingly, for the movie they wanted to watch immediately, people tended to pick lowbrow comedies and blockbusters, but when asked what movie they wanted to watch later they were more likely to pick serious, important films. The problem, of course, is that when the time comes to watch the serious movie, another frothy one will often seem more appealing. This is why Netflix queues are filled with movies that never get watched: our responsible selves put “Hotel Rwanda” and “The Seventh Seal” in our queue, but when the time comes we end up in front of a rerun of “The Hangover.”

The lesson of these experiments is not that people are shortsighted or shallow but that their preferences aren’t consistent over time. We want to watch the Bergman masterpiece, to give ourselves enough time to write the report properly, to set aside money for retirement. But our desires shift as the long run becomes the short run.
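The preference reversal in the hundred-dollar experiment falls out of the arithmetic of hyperbolic discounting. A minimal sketch: the standard hyperbolic discount curve is V = A / (1 + kD), where A is the amount, D the delay, and k a discount parameter. The value k = 0.2 below is purely illustrative (it isn’t from Surowiecki’s review); any sufficiently steep k produces the same reversal:

```python
def hyperbolic_value(amount, delay_days, k=0.2):
    """Perceived present value of `amount` received after `delay_days`,
    using the standard hyperbolic discount curve V = A / (1 + k*D).
    k = 0.2 is an illustrative discount parameter, not an empirical one."""
    return amount / (1 + k * delay_days)

# Stage 1: $100 today vs. $110 tomorrow.
# 100 / (1 + 0) = 100.00 beats 110 / (1 + 0.2) ≈ 91.67,
# so the smaller, immediate sum feels more valuable.
stage1_now = hyperbolic_value(100, 0)
stage1_wait = hyperbolic_value(110, 1)

# Stage 2: $100 in 30 days vs. $110 in 31 days.
# 100 / 7 ≈ 14.29 loses to 110 / 7.2 ≈ 15.28,
# so from a distance, waiting the extra day looks rational.
stage2_now = hyperbolic_value(100, 30)
stage2_wait = hyperbolic_value(110, 31)
```

Because the hyperbolic curve is steepest near zero delay, pushing both options a month out flattens the difference between them, and the extra ten dollars wins; as the payoff date approaches, the curve steepens again and the preference flips.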

This probably explains why you have to like the daily process of whatever you’re becoming skilled at (writing, researching, law, programming) in order to get good at it: if you have a very long term goal (“Write a great novel” or “Write an entire operating system”), you’ll probably never get there because it’s very easy to defer that until tomorrow. But if you break the task down (I’m going to write 500 words today; I’m going to work on memory management) and fundamentally like the task, you might actually do it. If your short-term desires roughly align with your long-term desires, you’re doing something right. If they don’t, and if you can’t find a way to harmonize them, you’re going to be the kind of person who looks back in 20 years and says, “Where did the time go?”

The answer is obvious: minute by minute and second by second, into activities that don’t pass what Paul Graham calls “The obituary test” in “Good and Bad Procrastination” (like many topics others pass over, he’s already thought about the issue). Are you doing something that will be mentioned in your obituary? If so, then you’re doing something right. Most of us aren’t: we’re watching TV, hanging out on Facebook, thinking that we really should clean the house, waiting for 5:00 to roll around when we get off work, thinking we should go shopping for that essential household item. As Graham says, “The most impressive people I know are all terrible procrastinators. So could it be that procrastination isn’t always bad?” It isn’t, as long as we’re deferring something unimportant for something important, and as long as we have appropriate values for “important.”

So how do we work against bad procrastination and towards doing something useful? The question has been on my mind lately, because a friend who’s an undergrad recently wrote:

A lot of my motivation comes from a fantasy of myself-as-_____, where the role that fills the blank tends to change erratically. Past examples include: writer, poet, monk, philosopher, womanizer. How long will the physicist/professor fantasy last?

I replied:

This is true of a lot of people. One question worth asking: Do you enjoy the day-to-day activities involved with whatever the fantasy is? For me, the “myself-as-novelist” fantasy continues to be closer to fantasy than reality, although “myself-as-writer” is definitely here. But I basically like the work of being a novelist: I like writing, I like inventing stories, I like coming up with characters, plot, etc. Do I like it every single day? No. Are there some days when it’s a chore to drag myself to the keyboard? Absolutely. And I hate query letters, dealing with agents, close calls, etc. But I like most of the stuff and think that’s what you need if you’re going to sustain something over the long term. Most people who are famous or successful for something aren’t good at the something because they want to be famous or successful; they like the something, which eventually leads to fame or success or whatever.

If you essentially like the day-to-day time in the lab, in running experiments, in fixing the equipment, etc., then being a prof might be for you.

One other note: writer, poet, and philosopher all have some aspect of money involved, and so does physicist/professor. Unless you’re Neil Strauss or Tucker Max, “womanizer” is probably a hobby more than a profession. And think of Richard Feynman as an example: he sounds like he got a lot of play, but that wasn’t his main focus; it’s just something he did on the side, so to speak. (“You mean, you just ask them?!”) The more you have some other skill (being a writer, a rock star, whatever), the easier it seems to be to find members of your preferred sex who are interested in you. In Assholes Finish First, Max notes that women started coming to him after his website became successful (note that I have not had the same experience writing about books and lit).

As for the physicist/prof fantasy, I have no idea how long it will last. You sound like you’re staying upwind, per Paul Graham’s essay “What You’ll Wish You’d Known“, which is important because it will let you re-deploy as time goes on. To my mind, reading/writing and math are upwind of almost everything else; if you work on those two or three subjects, you’ll probably be okay.

One nice thing about grad school in physics is that you can apparently leverage that to do a lot of other things: programming; becoming a Wall Street quant; doing various kinds of business analysis; etc. It’s probably a better fantasy than monk, poet, or philosopher for that reason. The “philosopher” thing is also (relatively) easy to do on the side, and I would guess it’s probably more fun writing a philosophy blog than writing peer-reviewed philosophy papers, which sounds eminently tedious, at least to me.

Oh: and I have a pile of unposted, half-written blog posts in my TextMate project drawer:

You can see a pile of them on the left. Most will eventually get written. Some will eventually be deleted. All were started with good intentions. Some have been sitting there for a depressingly long time. In fact, this post might have found its way among them: if I hadn’t decided to write it in a single blaze of activity, and if I weren’t writing about procrastination, it might have gone the way of many others: half-finished and eventually abandoned.

One reason I’ve had staying power with this blog, while so many of my friends have written a blog for a few months and then quit, is because I basically like blogging for its own sake. Blogging hasn’t brought me fame, power, money, groupies, or other markers of conventional success (so far, anyway!), and it appears unlikely to do so in the short- to medium-term (the long term is anyone’s guess). Sometimes I worry that blogging keeps me from more important work, like writing fiction, but I keep doing it because I like it and because blogging teaches me a lot about the subject I’m writing about and is an excellent forum for small ideas that might one day grow into much larger ones. This is basically the issue that “Signaling, status, blogging, academia, and ideas” discusses.

If the small projects lead to the big projects, you’re doing something right. If the small projects supplant, instead of supplement, the big projects, you’re doing something wrong. But if you don’t like the small increments of whatever you’re working on, you’re not likely to get to the big project. You’re likely to procrastinate. You’re likely to skip from fantasy to fantasy instead of finding your place. You’re not likely to do the right kind of procrastinating. I wish I’d realized all this when I was younger. Of course, I wish I’d learned a lot of things when I was younger, but I didn’t have Surowiecki, Graham, Zimbardo, Max, and Feynman. Now I do, which enables me to say, “this blog post itself is a form of procrastination, but a productive one, and it’s therefore one I’m going to finish because I like writing it.” That sure beats improbable resolutions.
