Cloverfield

Warning: spoilers ahead.

Normally this blog focuses on books, but Cloverfield is the rare film with sufficient depth and impact to make it worth a full post, with the second viewing more profound than the first. Cloverfield speaks to modern anxieties about fear, terrorism, and response more effectively than most movies, full stop, let alone horror movies.

The monster itself in Cloverfield is unexplained, much as 9/11 took the vast majority of Americans by surprise—even those who were nominally supposed to guard against such events. The only hint regarding the title comes at the beginning, with a brief clip indicating that we’re about to watch a Department of Defense video related to “Cloverfield,” but with no other sign of the name’s meaning, if any. The shot functions like the false “translator’s preface” that opens many older novels to claim historical authenticity. Still, it reassures us that civilization—or at least the Department of Defense—has survived the attack long enough to create the video.

The first twenty minutes are a party like too many I’ve been to, except, this being Hollywood, with more attractive participants. Filmed chiefly by Hud, a character notable mainly for his passivity and lack of character, the movie really begins with reports of the monster and then the lights being extinguished. On the Manhattan streets, a wall of dust rolls toward people—like in videos of the World Trade Center’s collapse. The head of the Statue of Liberty rolls through the street, indicating that perhaps liberty itself has died, or at least has within the monster’s zone. A character says, “I saw it. It’s alive,” leaving the “it” floating in space, imagination filling in the details.

The monster’s purpose, if it has one beyond terror, is mysterious, and the response to the unnamed creature becomes steadily more draconian as the movie continues. So too have the responses to 9/11, especially regarding air traffic and civil rights, grown harsher over time, to the point that airports, flying, and foreign travel are now burdens that grow more onerous by the year (see here, here, and especially the discussion of the apt phrase “security theater” in Bruce Schneier’s philosophical book concerning the modern age, Beyond Fear, which is available free here). Books like The Lucifer Effect demonstrate the effects of systems designed to dehumanize people—and such books are, for the moment, mostly ignored, like distant shooting in a war zone. As Cloverfield continues, the constant drone of war in the background becomes like modern cable news. I recently started teaching college freshmen, and the other day a student’s offhand comment made me realize that, to him, we’ve virtually always been fighting wars in Afghanistan or Iraq.

In this atmosphere, movies are beginning to reflect the larger world, as art always does. Ross Douthat wrote an excellent piece on contemporary movies, “The Return of the Paranoid Style,” which analyzes movies as a rerun of the 70s:

Conservatives such as Noonan hoped that 9/11 would bring back the best of the 1940s and ’50s, playing Pearl Harbor to a new era of patriotism and solidarity. Many on the left feared that it would restore the worst of the same era, returning us to the shackles of censorship and conformism, jingoism and Joe McCarthy. But as far as Hollywood is concerned, another decade entirely seems to have slouched round again: the paranoid, cynical, end-of-empire 1970s.

We expected John Wayne; we got Jason Bourne instead.

The essay is not easily excerpted and is worth reading in full. Cloverfield doesn’t fit well into its thesis: the movie contains little in the way of overt politics, but, intentionally or not, it manifests current fears about monsters that don’t die when we attack them with airstrikes or even ground forces. Although Cloverfield is symbolic of fears regarding attack, one of its strengths is its refusal to be partisan. The military is depicted heroically, and little in Cloverfield indicates self-flagellation. It is all immediate reaction and fear, and, like terrorism, it tends to leave us with more questions than answers.

An essay in Terry Teachout’s Reader called “Beasts and Superbeasts” observes that “nothing thrills us more than stories implying that there are dark forces in the world too powerful to be tamed by human hands.” This was in 1999; he also wrote that “Of late […] cinematic horror has entered a decadent phase in which vampires have mostly given way to serial killers whose murderous frenzies are coolly explained away by psychiatrist-sleuths, while semi-satirical movies like Scream openly spoof the all-too-familiar conventions of the genre […]” Maybe 9/11 has allowed us to return to the mystery of devils walking among us, the unexplained or poorly explained, and the terrifying unknown. It’s not the monster that scares us in Alien, but the fact that we don’t know where the monster is, don’t know why it operates as it does, and can’t reason with it. In Cloverfield, the monster scares us because we can neither understand it nor stop it with bullets and bombs.

The impetus for “Beasts and Superbeasts” was The Blair Witch Project, a movie that, “[…] though hugely entertaining, is not especially scary, no doubt because it was all too clearly made by people who do not believe in the demons whose presence they have so cunningly implied.” Although Teachout overstates the case against The Blair Witch Project, which is scary to me in more than a “gotcha!” way, recalling as it does those times in the woods, his general principle holds. If The Blair Witch Project reflects the decadent 90s in that respect, Cloverfield aesthetically and artistically benefits from the opposite in the 2000s, as the idea of an attack against New York isn’t a fantasy or goblin any longer. That’s bad for the United States but can lend heft to movies. Cloverfield takes its subject seriously, as Teachout argues The Sixth Sense does. That’s not to say it has no jokes, usually relating to Hud’s obliviousness, but it has more emotional power thanks to its resonance with events.

Too many recent novels and movies take the first twenty minutes of Cloverfield and extend them onwards and upwards. The bored lassitude of 20-something partiers captured so well by Claire Messud in The Emperor’s Children is evident in the first fifth of Cloverfield, and its cameraman never escapes from the semi-hipster attitude of overgrown children. The characters are smaller-than-life, and their own motivations are barely more articulated than the monster’s—their inchoateness is itself a commentary on the kinds of unexamined lives that seem not uncommon. The difference between Cloverfield and its competitors, and one reason it passes Teachout’s “Beasts and Superbeasts” test, is that it is about something beyond itself, unlike, say, Garden State or London, the latter a smaller movie like Cloverfield but without the monster.

This essay has a central weakness built into its reading of horror and politics: those who flew planes into buildings were human, as are those who order bombs dropped on cities from 20,000 feet. The motivation for either may appear foreign to those on the receiving end, but it is not wholly un-understandable; Al-Qaeda regularly posts videos haranguing the West, however illogically or unfairly, and the toxic conditions of Afghanistan were a product of a long line of cultural and historical developments. As Charlie Wilson’s War observes, we did much to aid in the construction of our own Frankenstein’s monster, though we didn’t notice until after the fact. We blundered in Baghdad, as James Fallows argues, though Iraq might eventually become stable. We feel as if 9/11 came from nowhere, like the unnamed monster in Cloverfield, whose very lack of an identifier is appropriate: “9/11” has stuck to the event and day, but it’s an odd moniker, almost a default, especially compared to other infamous events that come with location signifiers (Pearl Harbor, Gulf of Tonkin). Still, it’s worth remembering the danger of creating an unknowable other who is easier to demonize, Lord of the Flies-style. The markers tying Cloverfield to terrorism are there nonetheless, and its warning of the dangers is worth remembering.

It’s presidential campaign season, and candidates in both parties are eagerly trying to avoid being associated with the foreign policy snafus of the last five years, the equivalent of firing missiles that don’t work. America veers dangerously between wanting to pull out altogether from our “adventure” in Iraq and the temptation to continue striding about the world without paying enough attention to whether we’re about to step on an unexpected landmine. Countries we should be paying more attention to, like many former Soviet republics, get short shrift, as Douthat says in a blog post, while Iraq and Afghanistan pull more than their weight thanks to the relative size of our commitments there. The worrying thing is that total focus on Al-Qaeda and Iraq might let another Cloverfield event occur, seemingly out of nowhere, in which a purely military response will be ineffective and we will be left confused and reacting instead of lifting our eyes from the collective party long enough to see the punch before we land, disoriented, on the floor.

In Cloverfield, to save us, we have to destroy Manhattan, and the ambiguous moral calculus remains just that: ambiguous. The most startling part of Cloverfield is its lack of conclusion or certainty. Characters constantly ask each other, “What was that?” and find no answers. The Brooklyn Bridge is destroyed by the monster, with an American flag falling with it. A TV monitor shows “Manhattan under attack,” followed by an image of military trucks responding to the carnage. But will the military be effective in this situation? At least using conventional, World War II-style tactics, the answer appears to be no. But the thing must be fought anyway, as it’s in Manhattan. Maybe if we can ask the right questions, we’ll eventually learn how to fight it—otherwise, we might have to destroy villages in order to save them.


While on the topic of movies, I was going to also pan The X-Files: I Want to Believe, but Slate provides such a solid hit that I’m left with nothing worth discussing:

The nefarious plot behind the agent’s abduction is so far-fetched I’m itching to spoil it. But I’ll limit myself to observing that, if ever I’m dying of a rare brain disease, I hope my surgeon won’t go home and frantically Google treatment options, as Scully does at one key moment. (Couldn’t she at least log on to Medscape?) The problem with the movie’s semisupernatural crime plot, though, isn’t that the resolution is completely outlandish; it’s that the outlandishness is insufficiently grounded in pseudoscience. If you’re going to posit stuff this crazy, you’d better have some solid-sounding bullshit to back it up.

[…]

I’m not quite of a mind with Slate’s Troy Patterson in finding the new movie “vomitously stupid”; rather, it’s a gorgeous, lulling, thoroughly unnecessary exercise in high-minded Anglophilia.

Renting Cloverfield and watching it even for the third, fourth, or fifth time is infinitely preferable to the second X-Files movie.

The Time Paradox — Philip Zimbardo and John Boyd

As with many great works of nonfiction, Philip Zimbardo and John Boyd’s The Time Paradox: The New Psychology of Time That Will Change Your Life has that paradoxical quality of being incredibly profound and yet, in retrospect, blindingly obvious. It encompasses philosophical debates that occur at all levels of art; fiction often represents our feelings about time, and The Time Paradox lists a few dozen pop songs that carry messages about forms of time orientation. Last weekend I saw Woody Allen’s new movie, Vicky Cristina Barcelona, in which one character, Vicky, lives oriented toward the stable future: a nice house, a boring but wealthy husband, and a life that is unlikely to end in a crater but also unlikely to offer stimulating adventures. Cristina, played by the luscious and perfectly cast Scarlett Johansson, is a sensual hedonist who pursues novelty and risk-taking. Their contrasting ways of life begin the story, with Juan Antonio as a foil to both.

The movie is more sophisticated than this, as any art that can be accurately captured in summary is not worth experiencing. Nonetheless, just as The Hero With A Thousand Faces explicitly analyzes the scaffolding of many adventure stories, The Time Paradox implicitly discusses the dominant time views of many works of art. Some, like The Great Gatsby, show opposing characters who see time, and hence one another, in different ways; in such a reading, Nick Carraway is a present-oriented fatalist with little personality of his own, while Jay Gatsby combines a past-positive perspective of Daisy with a future-oriented work ethic that he thinks will win her back. Gatsby on a larger level criticizes both views: in bending all his time orientations toward a particular person, Gatsby’s obsession ultimately leads to a ruinous car crash, destroying himself in crime, like the crime that his wealth is built on, while Nick, without the focus of his attention, seems to drift without learning. The novel’s last line, one of my favorites in all literature, soothes or terrifies the reader by reminding us of how life will continue for others even when it does not for us:

Gatsby believed in the green light, the orgastic future that year by year recedes before us. It eluded us then, but that’s no matter—tomorrow we will run faster, stretch out our arms farther. . . . And one fine morning—
So we beat on, boats against the current, borne back ceaselessly into the past.

Whether we are terrified by this receding light depends on our reaction to it and how we handle that past.

Zimbardo also wrote The Lucifer Effect: Understanding How Good People Turn Evil, which, together with Dan Ariely’s Predictably Irrational, pokes holes in traditional economic thinking concerning man as a rational actor. All three argue that things are not so simple. In Zimbardo and Boyd’s case, the problem is that we don’t consciously realize how we tend to think about past, present, and future, or, if we do, we aren’t able to step outside ourselves to realize how we’re thinking. What is “rational” in the context of past, present, and future? To enjoy the moment, or to work toward a future moment? Zimbardo and Boyd implicitly argue neither, and they point to the poorly understood trade-offs we make regarding how we orient ourselves chronologically. That I use the language of economics to present this parallels Zimbardo and Boyd, who discuss “The Economics of Time” along with the nature of opportunity costs—another well-known issue too little referenced in everyday discourse.

Learning about opportunity costs, including those of being oriented toward present, past, or future, gives one more information and hopefully leads to better decision making. This meta-critical force is powerful, if poorly understood, and what I like so much about Zimbardo’s books is their ability to take on this meta-critical function and put it to paper—like a good therapist or friend—pointing to the blind spots we don’t realize exist. Self-help books should do this but often don’t, or if they do—like Marti Olsen Laney’s The Introvert Advantage: How to Thrive in an Extrovert World*—they’re filled with clichés or otherwise poorly written. The Introvert Advantage is especially painful because it conveys a useful message to both introverts and extroverts but is marred by stylistic problems. The Time Paradox’s promise as a self-help book is slightly deceiving: it is more a book discussing research that happens to dress in self-help clothing. And aren’t all books, or all art, on some level designed to provide “self-help”? But no matter: the genre, if any, is transcended by the content, as happens here.

The Time Paradox is also clever in its examples of the traps each kind of person creates for themselves, whether those focused on the past to the detriment of their daily lives, those focused on the present to the detriment of their belief in their own ability to change the future, or those focused on the future who lose their sense of joy. Regarding the latter, for example, the authors write that “[…] future-oriented workaholics who do not cultivate sensuality and sexuality have little interest in making friends or “wasting” time in playful activities—a recipe for sexual deprivation.” In contrast, the present-oriented might be too focused on such aspects, resulting in pregnancy, disease, or awkward pictures on the Internet.

Elsewhere, regarding those who are oriented toward the future, Zimbardo and Boyd say “[…] they do not spend time ruminating on negative past experiences. They focus on tomorrow, not yesterday.” This has advantages, especially in societies that reward delayed gratification, but also problems, as such “futures” can appear callous, or uninterested in the past, or less capable of building friendships based on experiences—perhaps leading them to feel emotionally isolated, or even held back in work. Futures might succeed through plotting and the aforementioned delayed gratification, but they might also miss some aspects of creativity. For example, Zimbardo and Boyd describe a maze game in which futures tended to outperform presents in navigating a mouse through a maze. But, as the authors write:

Many of the presents who failed got frustrated at not finding the right path and ended up making a straight line to the goal, bursting through the cul-de-sac barriers.

Perhaps some measure of conventional success comes not from following rules and accepting constraints but from redefining problems and solutions. As one character says to another in The Matrix, some rules can be bent; others, broken. Technological and artistic progress** often stems from such unconventionality. That isn’t to commit a logical error and say that unconventionality automatically equates with progress, but channeled in the right area, it might be necessary if not sufficient.

The Sept. 1 issue of The New Yorker has a cartoon in which a man says, “I’m not losing my memory. I’m living in the now,” implying that age has pushed a past orientation into the present. Our mental faculties create our impressions of time, and physical changes, including drugs, can alter them—and not necessarily for the worse. In a section on how to become more present-oriented, for example, Zimbardo and Boyd offer the recommendation “drink alcohol in moderation,” which is the sort of self-help I’m only too happy to indulge. Perhaps so many writers and artists are alcoholics because they need to get out of the past (Faulkner) or the future.

In suggesting this, however, I’m succumbing to the book’s major potential weakness: presenting time disorders as an outsized source of anxiety and diagnosing time orientation as the cause of maladies rather than, perhaps, an effect. For example, Zimbardo and Boyd come perilously close to implying that correlation is causation when they discuss the outcomes of the time scales they developed to measure one’s attitudes; in an early section, they attribute perhaps too much to a focus on immediate gratification, self-stimulation, and short-term payoffs.

Other sections should be qualified, as when Zimbardo and Boyd write that “Our scarcest resource, time, is actually much more valuable than money.” That depends on, for example, how much money we have: if I had no food, I would very readily trade some time for money, and almost every day I engage in some transaction designed to turn time into money. For, say, billionaires, time is scarcer than money or virtually any other resource, and it’s worth noting here what economists call the backward-bending labor supply curve—that is, as a person’s earnings increase, they tend to work more hours, but past a certain point they cut back in order to enjoy the results of those earnings. An extreme example of that tendency can open up between generations: hard-working parents provide so plentifully for their offspring that the offspring adopt a hedonistic, present-oriented lifestyle, ultimately destroying the future-oriented values of work and thrift that created the fortune in the first place. Today it’s Paris Hilton, or the ceaseless articles about how we damn kids lack the work ethic of the old days; yesterday it was the Vanderbilts and Astors, whose descendants are now mostly middle-class; tomorrow it will be the tech titans’ legacies.

Yet even where I don’t entirely agree with sections, or where I nit-pick, merely raising these issues leads us to consider them, our own behavior, and, most importantly, how best to lead our lives and allocate a resource Zimbardo and Boyd imply many barely consider. At the end of the last paragraph, I analogized time perspectives to family and social dynamics—an idea I wouldn’t have considered prior to reading The Time Paradox.

Zimbardo and Boyd rightly caution readers not to assume that a person is entirely one orientation, since all people have some level of every orientation within them. Instead, the reader should try applying their own (past, presumably) behavior to the models in order to evaluate themselves within the framework the authors offer. Perhaps their most powerful recommendation is one that echoes Viktor Frankl’s Man’s Search for Meaning and the Stoic philosophers: although we can’t always control events, we can control our reactions to them. Zimbardo and Boyd write:

[…] psychological principles are elastic: They bend and change according to the situation and frame of reference […] We have no control over the laws of physics, but we do have some control over the frames of reference in which we view time. Recognizing how and when these frames of reference are advantageous may allow you to get more out of life and help you recognize those occasions when time perspectives hinder and impede you.

The most valuable sections of the book can get buried: they come not later in that quote but earlier, when Zimbardo and Boyd discuss how much our perceptions count and can change how we feel. Their biggest purpose is first to increase our sense of agency and our belief in our own influence, limited as it might be. Call this the difference between science and The Secret, a book I won’t dignify with a link: the former sees self-empowerment as the first step of many to come, while the latter treats the first step as an excuse to stop in a myopic haze of wishful thinking.

Finally, if the book has an overarching, abstract message, it is that we should, like a character from a Hermann Hesse novel, ask what we want from life and how to find it. The Time Paradox provides guidance in finding the answer by, for example, discouraging “a kind of learned helplessness,” but the actual journey belongs to the reader, not the authors.


* For decent coverage of the same idea, see Jonathan Rauch’s “Caring for Your Introvert” in The Atlantic.

** Assuming these aren’t simply two sides of the same coin.

Dan Ariely in Seattle

In addition to being an excellent economist and writer, Dan Ariely has one of the best syllable-to-letter ratios of any last name I’ve heard. I only learned how to pronounce AR-EE-el-EE on Feb. 27, when he visited Seattle to discuss Predictably Irrational. He warmed up the crowd with a visual illusion I fell for; this YouTube clip is a variation. Carefully count the number of one- versus two-handed passes in the video.

If you haven’t watched the clip, don’t read on. If you have, the question isn’t about passes: did you notice the guy with the cell phone walk up to the door behind the girls with the ball? Ariely’s video was more obvious: men in black and white shirts passed two basketballs, and a guy in a gorilla suit walked through. Like most of the rest of the crowd, I didn’t notice the gorilla because I was busy counting passes (18 in all, though it depends on whether one counts a pass at the very end). To judge from the self-conscious laughter when Ariely pointed this out and the few hands that went up when he asked how many of us saw the gorilla, many others were in my situation. And with that, we were primed with a metaphor for the brain’s ability to create mental illusions.

Ariely gave many examples of such illusions and preferences. For example, opt-in versus opt-out retirement systems have widely varying degrees of participation, as do countries’ organ donation rates, depending on whether people are enrolled by default or must opt in. It turns out that we have difficulty with multiple, complex choices and tend to fall back on defaults in the face of them. I’m reminded of Philip Zimbardo’s The Lucifer Effect: Understanding How Good People Turn Evil, which shows how otherwise normal people who receive arbitrary authority and limited oversight can do evil acts. That tendency might itself be a kind of default option: obeying perceived authority.

Both Zimbardo and, implicitly, Ariely argue that by becoming aware of such tendencies we can better correct or fight them. The pull of defaults, initial choices, and authority might also explain why change in societal attitudes often happens slowly: it takes generations for tides to shift and for first decisions to be made anew. Paul Graham says, “I suspect there is some speed limit to the evolution of an economy. Economies are made out of people, and attitudes can only change a certain amount per generation.” Ariely’s research supports that conclusion, but I can also see how and why change might be accelerating: as people become more accustomed to change as the norm and as the first choice, it becomes more natural for the individuals who make up societies to reorient themselves faster to new choices. This could also help explain some of the findings in Gregory Clark’s A Farewell to Alms, which argues that the Industrial Revolution took off more because of attitudes and culture in England than because of other conditions. England’s culture during the Industrial Revolution had finally reached a place where change and innovation became the norm, and where society could support that change rather than relying on defaults like superstition or religion to explain worldly phenomena. It’s an intriguing hypothesis, though I can’t immediately think of a clever way to test whether change becoming the default norm accelerates future change, which perhaps explains why I’m not a behavioral economics professor.

Ariely also showed how we constantly use imperfect and imprecise knowledge to make decisions, which gives first decisions their power to frame how we think about something. In one experiment, Ariely read poetry to students and then asked one group how much they would pay, and another how much they would have to be paid, to hear him recite poetry again. The first group offered money to hear Ariely read; the second demanded it. The framing of the question alone determined whether students offered or demanded money, and the amounts grew with the length of the proposed reading. I would also note that, although Ariely gave an excellent econ talk, I’m not sure I would go for his rendition of “Leaves of Grass”; the students who offered money for it did so only because of the way he framed the question.

Now that I know, I wouldn’t pay to hear him read poetry regardless of whether he asked. But if he’s in town for economics, I’d see him, and so should you. You’ll laugh and learn, and the former might be the optimal way to induce the latter.

The Lucifer Effect — Philip Zimbardo

Philip Zimbardo’s The Lucifer Effect: Understanding How Good People Turn Evil will probably have the misfortune of being an extremely important book that does not find the larger audience it deserves. Its author is most famous for conducting the Stanford Prison Experiment (SPE) in the 1970s, in which he divided a group of normal Stanford students into “prisoners” and “guards” and observed the students assuming their respective roles with frightening quickness and, on the part of the guards, alacrity. The Lucifer Effect is the first time Zimbardo has detailed exactly what happened in the SPE, and he links it to the recent scandal at Abu Ghraib. To judge from recent events, it will not be the last time scandals like Abu Ghraib happen.

If I could sum up The Lucifer Effect, I’d adapt a quote I recently posted from Robert Heinlein, “secrecy begets tyranny,” to “bad systems beget bad results.” Zimbardo’s argument, made in meticulous detail about the SPE and then paralleled with Abu Ghraib, holds that in some situations normally healthy people can quickly assume roles leading them toward brutality, and that our personalities may play less of a role in the extent to which we fight injustice than many of us would like to think. These claims are extraordinary, and The Lucifer Effect must be read in full to understand them and the situations, which usually involve lax oversight by supposed authorities and arbitrary rules, that allow abuse to occur.

Some details from The Lucifer Effect haunt, as when Zimbardo says that when prisoners in the SPE were “released” early, the other prisoners and guards often said nothing and made no mention of those who had come or gone, as though they were the trapped rabbits in the bizarre warren from Watership Down. The world the prison creates seems almost independent of the world prior to the prison, bringing to mind Kafka or Arthur Koestler’s Darkness at Noon. The latter’s portrayal of psychological torture is political in nature, but the parallels with the SPE are there: the uncertainty, the apparent thoughtlessness of the guards, the sense of timelessness, and the extent to which people become the role rather than vice versa.

After cataloging these horrors, Zimbardo devotes his last and too-short section to how to combat bad systems. He writes: “Heroism often requires social support. We typically celebrate heroic deeds of courageous individuals, but we do not do so if their actions have tangible immediate cost to the rest of us and we can’t understand their motives.” Such was the case with rabble-rousing prisoners, and such is often the case with political reformers. Passages like this remind us of the larger ideas implicit in particular actions, and Zimbardo skillfully generalizes from specific incidents and then brings the generalizations back to concrete examples, zooming in and out with the precision of a philosopher and the writing talent of a novelist. In this last and perhaps most important section, Zimbardo discusses further research concerning how people disengage their moral senses and conform to communal norms, and, in particular, how dehumanization affects those in positions of power compared to those who are not.

Only occasionally does Zimbardo go too far afield with his theories, as in the long description of burnout inventories and the Abu Ghraib scandal. His puns sometimes elicit groans even when they’re appropriate, as when a headline asks, “A Bad Apple or a Chip off the Best Block?” concerning a guard named Chip. Yet the section’s content is so solemn that letting in a joke, even a bad one, prevents reader fatigue—a fascinating strategy in a section about how people suffer burnout as a result of stress. While the stress of the reader is nothing like the stress of a prison guard in Iraq, Zimbardo reinforces his point that principles remain the same even as the orders of magnitude change by deploying the very techniques he describes. That, and his tendency to drift into academic language (“I will argue x, and then I will argue y…”), are the only weaknesses in what is otherwise an excellent book, one that contributes greatly to understanding how social and bureaucratic systems work and can dehumanize both those involved and those controlled.


EDIT: Zimbardo’s next book, The Time Paradox, is probably also of great interest to readers of The Lucifer Effect.
