The mandatory end-of-year post

In case you’re interested in pointless listmaking, the New York Times offers its 10 best books of 2008. Of them, I’ve read only Netherland, a novel I felt ambivalent about and still haven’t reread. Roberto Bolaño is on the list for 2666 and is highly praised by many good critics, but I didn’t like The Savage Detectives. The nonfiction side looks more worthwhile, especially given the books that delve into the unconstitutional, anti-democratic, and cruel things the United States is doing to people, but those things are already fairly well-known, and the books seem more destined to be cited than read.

Last year, I expressed skepticism about the top 10 and 100 lists at the New York Times, and this year I’ll reiterate it (although I’ve read fewer books on the list this time). Instead, I’ll link to a post from January 2008 that in turn linked to a number of my favorite (and much-recommended) books. To that list I’ll add The Name of the Rose and The Time Paradox.

No novels published this year enraptured me; if you think I missed one that should have, send an e-mail. Finally, if you’re going to read novels based on lists, you might try the Modern Library’s Top 100 instead, although it has some clunkers (Appointment in Samarra at 22? Someone must be sentimental about their youth).

Miguel de Cervantes, Don Quixote, and meaning in the novel

There are two distinct currents in the writing of novels that I would like to note. The first is the novel often described as “taut,” in which every word, sentence, paragraph, action, plot point, and utterance is essential to the meaning of the novel as a whole. Flaubert began this school in earnest, and it arose somewhat after the other school, which a professor of mine described as the “big bag of stuff”: novels containing a bit of everything, much of it extraneous and wandering, though interesting. Dickens wrote such novels. The “big bag of stuff” school has never been to my taste: 18th-century English novels like Clarissa and Pamela are a drag, and the hysterical realists who emulate them deserve the opprobrium they occasionally get. I generally prefer the taut method of Flaubert himself, Fitzgerald, Melville, and the like.

One novel that gets the balance nearly perfect is Neal Stephenson’s Cryptonomicon, which succeeds in being pointed and yet digressive; its meanderings always illustrate the characters and relate, somehow, to the central plot—or rather, idea, which in the case of Cryptonomicon I can’t explain without giving away the ending. It’s an exception; John Barth’s The Sot-Weed Factor is another successful hybrid I like, with the characteristics of the big-bag novels but few of their faults. The temptation toward big-bag novels is clear, especially because the novel lacks a required form—as the Penguin Dictionary of Literary Terms & Literary Theory says, the novel is “now applied to a wide variety of writings whose only common attribute is that they are extended pieces of prose fiction. But ‘extended’ begs a number of questions,” which it then goes on to enumerate. The problem with defining the novel is that the form itself arose as an original production, and one major criterion for greatness continues to be originality, which becomes steadily harder to achieve as more novels are published. One could call this “contamination,” as John Barth argues in The Friday Book.

If the novel is by its nature a contaminated genre, then one of its chief progenitors is a sterling example of the phenomenon. Don Quixote is a pastiche, not just of allusion, but of poems, tales that would, on their own, qualify as short stories, and perceived history. Its eponymous protagonist acts as if stories are histories, and vice versa; in Chapter VIII, a typical encounter, whose broad outlines are repeated throughout, occurs with traveling shepherds. Quixote assures them he is a knight, and though they assume him mad, their own reasoning processes aren’t so different from his. These shepherds make questionable assumptions and use false heuristics as well—one says, “ ‘I think, Senor Vivaldo, that we are going to be well repaid for the delay it will cost us to see this famous funeral; for famous it must surely be, judging by the strange things that these shepherds have told us of the dead man and the homicidal shepherdess.’ ” Are “strange things” enough to make a funeral worth attending, a film worth seeing, a text worth reading? Maybe, since the speaker implies that strange things can cause fame.

Fame itself lends some measure of reality to their perception, and their perception adds some measure of reality to the proceedings, since fame is an agglomeration of interested parties. I read once that a person is famous to the extent that more people know him or her than he or she knows. By that definition, Don Quixote (and, in italics, Don Quixote) has become very famous indeed; but even the funeral itself, within the text, becomes more important by way of its interest to the shepherds. The shepherds are astonished at Don Quixote, and “were likewise able to perceive the peculiar nature of his madness,” and yet his madness is like theirs, only to a greater degree. To be sure, quantity has a quality all its own, as is infamously attributed to Stalin, but the principle remains even when the order of magnitude changes.

So it is with all novels: their parts reflect the wholes, in a recursive loop, just as perception can lead to changes in reality. The process is not perfect and doesn’t have a 1:1 correspondence—perceiving my computer levitating doesn’t make it levitate, and Don Quixote’s perceiving King Arthur to be the figure made out in tales doesn’t mean he was. Yet when I perceive my computer levitating and use the idea in a story that in turn becomes widely read as a metaphor for how working in a field one loves can make one accomplish more, or when Don Quixote perceives King Arthur to be a historical figure and then acts accordingly, our perceptions have changed and interacted with the real world—as fiction itself does. Umberto Eco writes in Reflections on The Name of the Rose, “However you choose to look at it, I arrived at scholarship by crossing symbolic forests inhabited by unicorns and gryphons […]” In Alan Lightman’s Einstein’s Dreams, different views of the reality of time affect different worlds that might or might not exist; in one such universe, “The world will end on 26 September 1907. Everyone knows it,” and people react accordingly. In another: “Suppose that people live forever.” For Don Quixote, one could invent a false quote, a quote from a Don Quixote in a different universe: “Suppose that Don Quixote believed himself to be a knight-errant.” He does, naturally, and the narrator says that details mean little, “providing that in the telling of [the story] we do not depart one iota from the truth.” One can’t depart from the truth of a made-up story.

Don Quixote continually emphasizes the “truth” in a way that’s merely ostentatious rather than clever. The book contains a note referencing the fictional layers that Umberto Eco mocked at the beginning of The Name of the Rose, but aside from hyperbole, there is little if any strong sense of mockery here: “He who translated this great history from the original manuscript left by its author, Cid Hamete Benengeli, states that when he came to the chapter dealing with the adventure in the Cave of Montesinos, he found in the margin, in Hamete’s own handwriting, these words: […]” The novel lets us count the layers of narrative contamination: Don Quixote is the principal actor, who is contaminated by Cid Hamete Benengeli, who is contaminated by (potentially) the translator, who is contaminated by Cervantes himself. Given such uncertainty, drawing further attention to Hamete’s doubts does nothing to allay them, as with the plea that it is “impossible for me to believe that Don Quixote lied.” Rather, by calling attention to the possibly fictive nature of Quixote’s adventures, the narrator increases their uncertainty, like someone who guiltily overexplains an absence to a lover. Indeed, the very use of “contamination” so many times and in so many subtly different ways expands it to the point of near meaninglessness, like Don Quixote’s constant citation of Romance to defend his numerous acts of folly.

Furthermore, much of the nature of “truth” in Don Quixote depends on personal reputations rather than on any attempt at external verification. Don Quixote is believable “since he is the truest gentleman and noblest knight of his age and would not utter a falsehood if he were to be shot through with arrows.” In an age with no other gentlemen and no other knights, it isn’t difficult to be the truest and noblest—or the least true and least noble—especially without external checks and balances. If I pronounced myself a Ph.D. and proclaimed myself the truest doctor of the age, and by implication my work the most correct, others would rightly look askance at me: it generally takes the verification and seal of others who represent an institution, as well as a large body of work, to “prove” myself the finest doctor in the land. Conceivably, my work could still be the best even without external verification, but it would be harder for others to prove. Don Quixote lacks those proofs by others, and yet in his mind he is still following their examples—and at bottom, he is testifying for himself, and others believe him because of his self-created status, not because of a widely agreed-upon one. Cid Hamete Benengeli is one flimsy shield against such charges—so flimsy that he will not testify on Quixote’s behalf in Chapter XXIV, despite the myriad other events far more ridiculous than the relatively benign one described in a chapter concerning “A Thousand Trifling Matters”: “[…] I would state that if the episode has the appearance of being apocryphal, the fault is not mine, and so, without asserting that it is either false or true, I write it down.” But sophisticated readers should assume such contamination implicitly; it, like the many others of its kind, should go unstated. Instead, it’s used as a form of paralipsis, drawing attention to the fictionality of the world by arguing over and testifying about that fictionality.

Perhaps this is a reflection of the contamination of Don Quixote by history and by legend, and the standards of truth each implies, as well as the standards of translators and others, whose standards might be lower still. It is hard to believe Cid Hamete Benengeli if he has accepted Don Quixote’s account of himself on the strength of the account alone; such tautological reasoning is no more persuasive than Don Quixote’s reasoning about the truth of historical romance. Yet perhaps this is beside the point: in a contaminated narrative, what matters is that characters believe and what their beliefs cause them to do, not what they believe. Arguing over whether the ghosts are real or fake in Henry James’ The Turn of the Screw is less important than what those ghosts cause the Governess to do. Although I had not previously realized it, the same general principle animates the novel I spent most of this morning editing, which is tentatively titled A Winter-Seeming Summer’s Night.

Don Quixote still believes in the Romance narrative he lives, and he can only live it through misunderstanding the nature of fact and fiction. Cid Hamete Benengeli seems to believe Quixote. And yet all this is contained in a chapter entitled “A Thousand Trifling Matters,” in which Sancho Panza marvels, “ ‘Is it possible that a man who can say as many wise things as you have just said could have told the nonsensical and impossible tale that you did of the Cave of Montesinos? Well, well, we shall see.’ ” Given that Sancho Panza believes them nonsensical, as does Cid Hamete Benengeli (“in Hamete’s own handwriting”), we have bookends of disbelief around an event not so different from the many others. Such sections make literal the belief in Romance and demonstrate faulty reasoning more efficiently than the LSAT—for example, a group in white going to pray for rain causes “Don Quixote [to imagine] that this must be some adventure or other,” only to have him “strengthened in this belief” by further misinterpreting what he sees. In the second half, he becomes more deeply enmeshed both in the reality of his unreality and in the reality outside the novel, further straining the epistemological ropes pulling his arms in each direction. This is because Quixote doesn’t accept standard explanations for truth. Don Quixote and Don Quixote are both quite famous, and they’re famous for exemplifying and defying the epistemological models we have imposed on the past. In defying those models, they nonetheless have others apparently upholding them, but neither matters half so much as the end result: Quixote’s adventures, fueled by his belief, and the contaminated beliefs of others. Too bad they never infect me, as I find Quixote irritating above everything else.

November links and Success is Never Final: Empire, War, and Faith in Early Modern Europe — Geoffrey Parker

* Books Briefly Noted: Geoffrey Parker’s Success is Never Final: Empire, War, and Faith in Early Modern Europe is another book more likely to be cited than read, and one whose abstract generalizations are vastly more interesting than the particulars in which they’re mired. The important generalization is that success often carries within it decadence and decline through rigidity, over-extension, and arrogance, and such principles apply across a wide range of fields, from the national to the individual. One is reminded of Beowulf, a poem usually read as a tale about the eventual destruction of the mightiest warrior by the ravages of time and nature. Perhaps that is why we seek “happily ever after” in fiction: as a veil over, or an elision of, the inevitable.

The specifics regarding early modern and late medieval European machinations can verge on the scholastically tedious; this is a book best sampled like hors d’oeuvres rather than eaten as a full dinner. Learning about the spread of the artillery fortress is much less interesting than its effects on warfare and statecraft. But the last chapter, on the nature of law and its uses as seen through the prosecution of sexual crimes by sixteenth-century “Kirks,” or tribunals, in Scotland, says a great deal in a short space. It is not the only essay in Success is Never Final with little if anything to do with the putative topic, but such minor sins can be forgiven.

* Those of you who are following markets—or just paying any attention to any contemporary news media whatsoever—are probably aware that we’re in the middle of a financial crisis that might be the worst since the Great Depression. The best commentary so far, however, comes by way of Megan McArdle:

From a senior who majored in English:

“Is it wrong to feel schadenfreude about my classmates who majored in Economics to get ‘safe’ jobs at Lehman and Merrill Lynch?”

I heard about it secondhand, so I’m paraphrasing, but this gave me hope for America’s youth.

* XKCD represents graphically why you should avoid the Amazon Kindle.

* Richard Woodward at the Wall Street Journal attempts A Nobel Undertaking: Getting to Know Le Clézio, who won the latest Nobel Prize in literature. After reading Woodward, I feel pretty good about not getting to know Le Clézio well.

* American Journalism Review argues that “A smaller, less frequently published version packed with analysis and investigative reporting and aimed at well-educated news junkies may well be a smart survival strategy for the beleaguered old print product.” They should call such a beast a “magazine,” which could be a storehouse of useful or interesting information. Perhaps one based in New York would do well.

* Speaking of newspapers, see The New York Times recursively on Mourning Old Media’s Decline. A sample:

For readers, the drastic diminishment of print raises an obvious question: if more people are reading newspapers and magazines, why should we care whether they are printed on paper?

The answer is that paper is not just how news is delivered; it is how it is paid for.

More than 90 percent of the newspaper industry’s revenue still derives from the print product, a legacy technology that attracts fewer consumers and advertisers every single day. A single newspaper ad might cost many thousands of dollars while an online ad might only bring in $20 for each 1,000 customers who see it.

Ironically, by linking to this article I’m exemplifying the problem the article itself discusses. And the biggest issue actually gets saved until the end: “The blogosphere has had its share of news breaks, but absent a functioning mainstream media to annotate, it could be pretty darn quiet out there.”

The same is true of literary essays and analysis.

* Competent elites: happier and more alive? Maybe, but though I’m intrigued, I can’t help thinking about sample-size, cause-and-effect, and comparison problems. I might also title the article, “Competent elites: Happier, more alive, and more arrogant?”

* How to lose friends and alienate people, global edition, courtesy of Clive Crook:

There has not been another attack – and Edward Alden, a former Washington bureau chief for the FT and now a scholar at the Council on Foreign Relations, recognises that foreign terrorists find it much harder to get in. The trouble is, so does everybody else, including people that the US needs. On balance, Alden argues, the new regime has done more harm than good even in narrow security terms, to say nothing of the wider human and economic costs. Few who read his compellingly argued and meticulously researched book will be inclined to disagree.

The cure might be worse than the disease. For more on related topics, see Bruce Schneier.

* Although this has nothing to do with books, Freakonomics reports on the positive externalities of binge drinking for Social Security, among other unusual ideas.

The Irresponsible Self: On Laughter and the Novel — James Wood

James Wood’s The Irresponsible Self: On Laughter and the Novel is about comedy, yes, and the meaning that stands behind comedy, and the comedy that stands behind meaning, and so on in a potentially infinite loop. Like all his work, it is also about paradox: how words can become real, how the interior shows the exterior and vice versa, and others discussed below. At one point, he says, “What seems to be a fleeting triviality is actually very important—this is both Verga’s subject and his mode of writing: his banalities, like those of his characters, are never unimportant.” The seemingly trivial and banal become important, and the seemingly unimportant becomes exalted and majestic. Wood asks, and makes us ask, “why?”, searching for an answer that can never be had and yet never seems futile to pursue. It’s a neat trick—call it the paradox of criticism, to go along with the paradoxes of the novel. If what we read isn’t significant in and of itself, perhaps we imbue it with significance through the nature of our interaction with the word, the sentence, the paragraph, the character, the story. Wood does, and in the process he sees what is too often missed.

What I like about Wood is that he doesn’t feel researched—he feels organic, inevitable, so natural that by comparison most critics and academics sound closer to the harsh screams of heavy metal than to the Moonlight Sonata. Not even Amis’ Wagnerian bombast compares. This organic quality can only come, I suspect, from long and deep engagement with a narrow body of reference texts—for Wood, they seem to include Flaubert, Chekhov, Henry Green, Shakespeare, and a few others—complemented by wide breadth and an extraordinary comparative faculty. Once such conditions are in place, one has the potential for great criticism. Converting potential to actuality is hard. Few accomplish it, and few have the sight to discover what is so obviously there and yet what I have so often missed. It is a puzzle almost as significant as the many paradoxes of realism and idea in the novel itself, or in any form of representational art. The simultaneous merging with and yet standing outside a character, discussed in Wood’s introduction, is one such example, too long to quote at length and all the more incredible for one’s inability to slice a part out; this is a pie that can’t be cut without destroying the whole. It might be part of the organic effect I tried to describe above.

In contrast to Wood, consider a section from Geoffrey Hartman’s essay “Christopher Smart’s ‘Magnificat’: Toward a Theory of Representation,” which I began immediately after The Irresponsible Self. Hartman writes:

What if someone cannot be presented [from one person to another]? The sense of distance has been thrown out of balance: either the self feels defective vis-a-vis the other, or the other appears magnified, unapproachable. The someone can be a something: certain subjects may not be introduced into discourse, certain taboos restrict or delimit the kinds of words used.
I introduce the example of words early, because words commonly help present us.

The idea Hartman is trying to present is a reasonably good one: the psychology of social order, or interactions among people, and the individual voice addressing itself might be limited by our thoughts (incidentally, Paul Graham writes about both in What You Can’t Say). But the metaphor isn’t a very good one: how could a person not “be presented” to another real person? If I’m in a room with someone and wish to introduce them, there is no sense in which such a person “can’t” be presented. If the “someone” is a “something,” that makes more sense, as some forms of social convention discourage contentious topics, although it’s also worth noting that some settings, like graduate-student parties, encourage superficially contentious topics. And if we’re aware of taboo topics, or make an effort to become aware of them, then they’re no longer unmentionable to ourselves. Notice too Hartman’s use of the term “vis-a-vis,” which seems showy and ostentatious; it’s a struggle and brings his sentence to a halt. It feels like the slash of a sword instead of the stroke of a brush: forced, not inevitable. If Hartman’s essay hadn’t been assigned, I might’ve discarded it after that false note in the second paragraph, but I’ve continued, and though I might buy parts of his argument, the argument as a whole is so hard to follow that I mostly want to give up the attempt.

Now, back to Wood; in “How Shakespeare’s ‘Irresponsibility’ Saved Coleridge,” he writes:

Kant offered Coleridge a way of making the self both passive and active. On the one hand, the world was phenomenal: we gather and order the phenomena of perception. Coleridge called this the faculty of understanding, and in the Biographia it becomes, roughly, the “primary imagination.” On the other hand, said Kant, the world was noumenal: there were transcendent things-in-themselves, unknowable, and this domain is grasped by the practical reason or will. This practical reason asserts itself not by argument but by command and precept; it is how we believe in God. Coleridge bent and expanded Kant’s category, stripping it of its philosophical restraint and making it something closer to free will, and at other times closer to the decisive and controlling activity of the imagination.

Seldom have I read a better concise explanation of sophisticated, important ideas with as few sampling or compression errors. The passage moves according to its own logic, graceful as a dancer and yet purposeful, with an economy of precision that Orwell could envy. Ideas I hadn’t perceived as connected I suddenly do, and in that moment something happens—a sense of distance has been thrown out of balance, maybe, but if so, it’s only to be regained, better and stronger than before. And if it is a sense of distance, it is the distance between Coleridge, Shakespeare, and myself. I’ll happily be thrown out of balance by someone who knows how to pick me back up.

It’s not entirely fair to hold up these two passages, each on tremendously different topics, as comparisons, and yet I think they do demonstrate the difference between the two writers and the larger difference between Wood, who works so hard for intellectual depth and engagement, and many other critics, who sacrifice the latter in phantom pursuit of the former. Wood has a nearly perfect power curve, and even where I disagree, as with Tom Wolfe, I’m still dazzled by the clarity of his thinking and writing, to the extent those can be separated.

How Beautiful It Is and How Easily It Can Be Broken — Daniel Mendelsohn

After reading enough fiction—although how much constitutes “enough” probably varies by person—it seems natural to search for deeper meanings and connections in what you’ve read. Although I can’t pinpoint where I crossed that threshold, somewhere I did—hence Martin Amis’ The War Against Cliché, Stanislaw Lem’s Microworlds, most of James Wood’s books, including How Fiction Works, Milan Kundera’s criticism, and Francine Prose’s Reading Like a Writer. Add to that stack Daniel Mendelsohn’s How Beautiful It Is and How Easily It Can Be Broken. Most pieces hail from “The New York Review of Books,” and they reflect the trade-offs inherent in that magazine’s style, including introductions so lengthy and elliptical relative to the main point that one can sometimes start at the first paragraph break, which is often a couple of pages in, and miss little. It’s a bit like a politician whose great ideas never quite get heard because an overly long disquisition loses his audience. Willie Stark suffered from that malady, and Barack Obama was criticized for the same tendency. Readers of criticism should and probably do have considerably longer attention spans than voters, but even those can be stretched only so far. It’s not that any particular essay of Mendelsohn’s suffers excessively from this, but rather that the overall effect is one of such relentless prep work that one grows weary by the time dinner is actually served. That sense of weariness is what led me to let my subscription lapse. But keep going through those introductions: the digging brings intellectual gold, and that gold is worth the pursuit.

This is especially true because How Beautiful It Is is tied together better than the average issue of “The New York Review of Books,” and its consistent interest in the classics and their continuing interpretation and impact gives it a sense of building, of constructedness, that helps alleviate the occasional tedium. As Mendelsohn says of some of the first “9/11 movies,” “The problem with all this realness is that [United 93] itself—like reality—has no structure: and without structure, without shaping, the events can have no large meaning.” So too with criticism, and his larger structure rotates around the Greek and Latin classics. When Mendelsohn is on, he’s fantastic, and his impressive knowledge of the classics lets him bring seemingly disparate works together, like a metaphysical poet yoking two images that at first appear opposites. The classics obviously play into some of the sword-and-sandal epics he mentions, and less obviously into, say, Jeffrey Eugenides’ excellent The Virgin Suicides and Middlesex. I wish he’d written more about novels and less about theater, novels being my great interest, but what he does include is richer than many longer works of criticism and helps direct my own reading; Mendelsohn’s argument against The Lovely Bones, one briefly hot book, inspires me to avoid it with more diligence than I do Mitch Albom, another sentimental, schlocky, and vastly overrated writer who appeals to the Hallmark-card reader in all of us. The Hours, however, is now on the list; one danger of reading How Beautiful It Is and James Wood’s The Irresponsible Self: On Laughter and the Novel is the perpetual extension of one’s reading list, even as the books give you the tools to better perceive recent and ancient culture. And, perhaps more importantly, yourself.

Mendelsohn never abandons the critic’s ultimate purpose of judicious judgment, and one impressive thing is the way he manages to be unsparing but not mean, rooted in culture but not pedantic, all while conveying a sense of joy, history, and sagacity. The three together are not easy. Some of his pieces seem like overkill: so many words on the movies Troy, Alexander, 300, and Kill Bill seem wasted, as those films aren’t worth the skill Mendelsohn lavishes on them. A great critic can only reach his highest level when pitted against great works, and none of those films reveals much about much of anything, because they lack the depth necessary for the highest level of engagement. Still, Mendelsohn improves imperfect material, and he demonstrates what better material makes possible when he discusses writers, especially Virginia Woolf. The primary thing holding him back is the aforementioned habit of endless introduction, of circling needlessly around the main point before he hits it: with James Wood’s criticism, you get the sense that every idea is essential to the argument. With Mendelsohn, you sense that virtually every one is, but not quite every one: “Nailed!”, about the “Hatchet Jobs” of the writer Dale Peck, doesn’t nail the reader till three pages in. The habit isn’t fatal, and Mendelsohn is still worth reading, though he gets just a tad stuffy as he goes. Still, that is the worst thing I can say about Mendelsohn, and his essays convey so much insight that they’re worth reading even if you occasionally skim, because the wonderfully strong ones justify the others.

Mr. Playboy: Hugh Hefner and the American Dream — Steven Watts

The standard for general nonfiction books these days is Alex Ross’ The Rest is Noise: Listening to the Twentieth Century, which reaches astonishing depth in its use of music to explore history and culture as much as vice versa. A book need not be as sophisticated as that one to be worth reading, but less ambitious books still ought to strive toward that standard. Steven Watts’ Mr. Playboy: Hugh Hefner and the American Dream doesn’t, or at least doesn’t obviously. It starts with a promising enough subject—a cultural symbol for much of the last 50 years—and an equally promising premise—that one symbol can illuminate a society. Alas, neither promise is fulfilled, and we’re left with a book that does neither job particularly well.

The reasons a book that could be good isn’t are not always obvious, even if the symptoms of its problems are. I keep coming back to James Fallows’ comment:

Here is something that is common knowledge in the publishing business but that few “normal” readers know: that the average article in a good magazine is much, much more carefully edited than almost any book. Yes, books can last forever while magazines go away after a week or month. But in a high-end magazine – like, well, the Atlantic, or the New Yorker, or the New York Review of Books, or one of a dozen others that invest in good copy editors and fact checkers – you’re far less likely to find typos, grammar errors, careless repetitions and contradictions, or simple made-up facts than you’ll find in books.

I don’t think it’s an accident that Ross normally writes for the New Yorker, as his book is impeccably edited. Before I discuss the content of Mr. Playboy, its noxious style and innumerable mistakes have to be noted, because they so distract from the reading of it. In Charlie Wilson’s War, such problems were relatively minor but noticeable. In Mr. Playboy, they’re glaring and enormous. We learn that: “[…] Hefner also emerged as a serious shaper of, and commentator on, modern American values.” He was also a “serious, influential figure in modern culture,” who “played a key role in changing American values, ideas, and attitudes” (all on 3). Hefner and Playboy shaped rather than just reflecting “American values” (4). He also helped transform “sexual values” (4). He personified “the mass-culture overhaul of modern society” and “he was a child of popular culture” (both on 5). The magazine became a “cultural litmus test [… for ….] modern American culture” (6). Playboy became a “cultural trendsetter” (6, again). Hefner positioned himself “as a dissenter in modern America” but “expressed many of the deepest impulses of mainstream American culture [… appearing on] the cultural skyline […]” (7). And he “presented a compelling vision of the good life in modern America” (7). I don’t know how often “modern” is used and in how many different ways and contexts, but the author or editor should do a “find” using a word processor and figure it out.

Enough of the introduction. The first chapter tells us Hefner’s boyhood fantasies “mirrored larger patterns in America’s emerging culture of self-fulfillment […]” (12). “The popular culture milieu of Depression-era America” helped shape Hefner (18). The Hefner family was susceptible to “modernizing influences” and “American popular culture” (19). “In certain ways they had embraced modernity.” Hefner’s mother “displayed a modern side” (both 21). Her modernity is mentioned again on page 26, where we also learn “American popular culture molded Hugh Hefner’s boyhood character,” and it’s mentioned one more time on 32. On 27, we learn more about “Popular culture.” After college, “Hefner’s emotional and ideological maturation received an added boost from American popular culture” (56). “Playboy’s appeal was rooted more deeply in the broad social and cultural milieu of postwar America” (72). You don’t say? I had no idea popular culture affected Hefner or Playboy.

On page 35, Hefner was dating a girl but “met someone else.” Two lines down, he “met a young woman who had been a classmate.” On page 40, “He became roommates with Bob Preuss, established a fresh circle of friends, and threw himself into a new round of experiences.” Why not just describe the circle and experiences? Further, we find out that “Bob Preuss, a roommate at the Granada House, was struck by [Hefner’s] candor in talking about sex” (46). Really? I had no idea this Bob guy existed.

On the consumer end, he advocated “consumer efflorescence” and “consumer products” and gave a model for the “stylish consumer” (all on 4). The early 1900s saw “the explosive growth of a consumer economy” (this phrase combining a cliche and repetition on 19). Alfred Kinsey’s findings shocked a society “committed to consumer conformity” (45). We learn about “an economy of abundance” and “material abundance” (the latter twice, on 73). On 74 we find the Cold War “molded these elements of abundance […]”, and that Life magazine ran photos showing “consumer amenities.” And on 75, we hear more of “people intoxicated with abundance.” Playboy encouraged “young men into a fuller enjoyment of American abundance in all of its material and emotional dimensions” (80). On page 83, we learn of “a climate of […] widespread abundance.” On page 104, we learn that postwar America has “consumer abundance.” Chapter seven is titled “An Abundant Life.” Mr. Playboy has an abundance of abundance.

On page 86, Playboy begins with staffers “working in the small Superior Street town house in an atmosphere marked by common purpose and camaraderie […],” and we find out below that “A sense of closeness marked the office atmosphere.” At the top of the next page, “An early staffer observed, ‘There was a closeness there […]’” followed by, “Amid this warm atmosphere [….]”. Did anyone edit this book in a modestly serious fashion? As if that weren’t enough, cliches occur too frequently, as when Hefner and Playboy “had taken the country by storm” (3). His first wife “scarred him for life” (48). “Everything seemed possible” (61). Something “captures [Hefner’s] imagination” (62). “It helped drive the final nails into the coffin of traditional Victorian morality […]” (121).

Watts chronically makes the kinds of mistakes I mark in freshman papers. He says, “[Consumer society] was intimately connected to a larger ethos of pleasure, leisure and entertainment” (129). How is it connected? He says “important elements of fantasy went into the presentation of these ‘real’ young women,” a sentence that isn’t needed because he goes into those elements later in the paragraph. He says of one Playboy staffer who feels superior to the organization, “The reasons were complex” (92). Don’t say the reasons are complex—show why they are complex.

There’s more, but I don’t have the heart or, more importantly, the interest to observe every problem that could’ve come out of a student essay. Most of my examples came from the first half of the book because I didn’t read the second as carefully. Mr. Playboy also shows why magazines like The New Yorker and The Atlantic are so good, aside from their editing: either might’ve taken the 70,000 or so words in this book, compressed them into a 6,000-word article, and lost little if any meaning while gaining the virtues of compression. If Watts had hired me, many of these problems could’ve been avoided. The above barrage is free, however, and if anyone (like his publicist, for example) knows how to forward said advice to Watts before the paperback edition, I’d highly encourage you to do so. It might alleviate some of the book’s problems. There is an inherent danger in studying a person wittier and deeper than oneself: the subject’s quotes and jokes will upstage the writer. On page 106, surrounded by banal commentary, Watts quotes Hefner saying:

There’s nothing dirty in sex unless we make it dirty. A picture of a beautiful woman is something that a fellow of any age ought to be able to enjoy […] It is the sick mind that finds something loathsome and obscene in sex.

It’s the kind of elegant stylistic and intellectual formulation Watts seldom achieves. Perhaps the most self-referential part of Mr. Playboy comes amid a discussion of Hefner’s enormous and apparently misguided effort to write a piece called “the Playboy philosophy” every month. Watts says, “While [Hefner’s] unadorned prose could be crisp and illuminated with flashes of insight and passion, more often it was turgid and repetitive.” This sentence applies equally to Mr. Playboy, and Watts shows no sense of irony in committing the same sins he ascribes to Hefner.

Still, occasional passages, if not redemptive, do convey significance. Watts lapses into the amusingly sophomoric through phrases about how “a new commitment to pleasure penetrated [tee-hee] into the most intimate, personal realm of human life…” Bits have surprising pathos, like a quote from one of Hefner’s former girlfriends described on page 205. He also reveals an original thought about Playboy and its creator on page 53 when he says:

Hefner also struggled to shape his views of the world into some kind of cohesive form. In typical adolescent fashion, this bright young man had soaked up a mishmash of ideas and theories during his high school and college years, ranging from Hollywood movies to Freud, popular cartoons to Darwin, Protestant theology to Tarzan.

Such random influences can’t be so unusual given American pop culture, and this section helps show some of the internal contradictions of Playboy’s later philosophy, or faux-philosophy. Such moments are too rare in Mr. Playboy, and I don’t think their scarcity is the fault of the subject—it’s the fault of the writer. Had Watts better connected the facets of Hefner’s life to anything besides themselves, the book would have improved. As it is, the ten or so girlfriends listed through the latter half of the book only demonstrate that Hefner famously likes to date young; if there’s a better-known facet of his life, I’m not sure what it is. Perhaps one day a better biographer will come along and show us what’s really new.

Walter Scott’s Waverley, the intrusive narrator, and showing, not telling

In Walter Scott’s Waverley, a representative passage states:

Now I protest to thee, gentle reader […] and hold it the most useful quality of my pen, that it can speedily change from grave to gay, and from description and dialogue to narrative and character. So that, if my quill display no other properties of its mother-goose than her mutability, truly I shall be well pleased; and I conceive that you, my worthy friend, will have not occasion for discontent. From the jargon, therefore, of the Highland gillies, I pass to the character of their Chief. It is an appropriate examination, and therefore, like Dogberry, we must spare no wisdom.

I would have preferred to be spared much wisdom, and perhaps all of Scott’s wisdom regarding the character of Fergus Mac-Ivor save that which is imparted through action and dialog. Among fiction writers, the cliché goes, “Show, don’t tell,” and though, like all such rules, it should be broken when the need arises, Scott violates it doubly here: first he tells us that he’s going to tell us the character of Fergus, and then he tells us instead of showing us what that character is. We don’t need Scott to announce that he passes “From the jargon […] of the Highland gillies […]” to Fergus; we need him simply to do so, and his quill’s output has the attributes not of a goose but of whatever use its author puts it to. By protesting that I should have no reason for discontent, Scott makes me discontent; he cannot control my content or dis-, and as such, he need merely tell the story, not tell me of its telling. Such protests are not cause for me to be well pleased, but cause for my own displeasure. To quote the advertising slogan of a national athletic apparel company, he should “just do it.” Basketball players can speak of their skill on the court as much as they wish, but the results we care about are on the scorecard, and authors can trumpet what they’ll do as much as they wish, but the results we care about are the stories, not the explanations. In sparing no wisdom, Scott spares us much of the novelistic wisdom of the last two hundred years.

In How Fiction Works, James Wood attributes the small-m modern novel to Flaubert in a passage that is worth quoting at length:

Novelists should thank Flaubert the way poets thank spring: it all begins again with him. There really is a time before Flaubert and a time after him. Flaubert decisively established what most readers and writers think of as modern realist narration, and his influence is almost too familiar to be visible. We hardly remark of good prose that it favours the telling and brilliant detail; that it privileges a high degree of visual noticing; that it maintains an unsentimental composure and knows how to withdraw, like a good valet, from superfluous commentary […]

This is the standard by which Andrew Hook, who wrote the introduction to the Penguin edition of Waverley, probably judges the novel when he says that it “[…] may not be the best novel of the nineteenth century.” Scholarly introductions normally extol a book’s literary as well as social/political merit, but in this case the first point is conceded in order to strengthen the second. Perhaps this is in part because of the kind of thing quoted above, or because of what appears to be Scott’s direct address in the first chapter / introduction—the two have not been fully separated yet—when he writes that he tries to avoid writing what we would now call a period piece by “[…] throwing the force of my narrative upon the characters and passions of the actors; – those passions common to men in all stages of society […]” The same issue is debated today, but within criticism rather than novels. In a recent New Yorker article titled “Regrets Only: Lionel Trilling and his Discontents,” Louis Menand says that “[Lionel Trilling] was a humanist who believed that works of literature can speak to us across time […]” before describing Trilling’s steady abandonment of that position, or at least of its strongest form. But Trilling argued it in nonfiction, not in fiction, and Menand argues about Trilling in nonfiction too. Scott gives many of his theories within Waverley itself, in a way that seems paternalistic to this post-Flaubert reader.

Wood probably overstates the case for Flaubert, but my quarrel with him is one of degree rather than fundamental alignment. One thing Flaubert accomplished in his endless quest for realism, which is itself a kind of artificial representation no matter how real, is to at least somewhat relegate the most odious and intrusive passages in Waverley to books like How Fiction Works, or Kundera’s The Art of the Novel, or the innumerable other works by author/critics who save their explicit theorizing for nonfiction studying fiction, rather than fiction itself. Such theorizing can also be folded into fiction, as many contemporary authors do by using writers and critics as characters: Philip Roth did so in his Zuckerman novels and Michael Chabon does so in Wonder Boys, whose protagonist, Grady Tripp, reads a troubled student’s first novel and says that

… like most good first novels it possessed an imperturbable, mistaken confidence that all the shocking incidents and extremes of human behavior it dished up would strike new chords of outrage and amazement in the reader. It was a brazen, ridiculous, thrilling performance, with a ballast of genuine sadness that kept the whole thing from keeling over in the gale-force winds of melodrama.

Although I’m not certain I would agree with the generalities expressed in Tripp’s commentary, it does at the very least hearken back to older novels like Don Quixote or Waverley. The difference is that in Wonder Boys, the digression is organic and part of the characterization of the novel itself, rather than a cutting and intrusive aside. The action for the characters themselves doesn’t freeze as a lecture gets dropped in, and the literary theory expressed has some resonance for the novel’s story: Tripp and his agent, Terry Crabtree, are going to decide what to do with Leer based in part on his novel, which they evaluate by their own aesthetic criteria. In Waverley, the didactic tone interrupts the action instead of being part of it and focuses on the reader directly, not on the characters through the reader. Both are accomplished with layers—in Waverley, with the historian, and in Wonder Boys, with Tripp—but Wonder Boys has the additional facet of integration rather than separation.

Wonder Boys also assumes at least some familiarity with novels and novel theory; notice that Tripp is critical of “the gale-force winds of melodrama,” which simplifies and flattens characters in a way that strikes sophisticated readers as weak. The Penguin Dictionary of Literary Terms & Literary Theory, 4th Edition, sneers that 19th Century melodrama “[…] produced a kind of naively sensational entertainment in which the main characters were excessively virtuous or exceptionally evil.” Waverley succumbs to this trap in part, with characters like Edward and Colonel Talbot, but also escapes from it with Fergus and Flora, as the former is willing to turn against Edward while the latter doesn’t swoon like a stereotypical maiden as soon as the light hero arrives. In this respect, Scott is more modern than I might want to give him credit for, but he’s still a long way from the evolution of a novel like Wonder Boys, whether in terms of plot, characterization, or, as discussed here, knowledge of literary theory and ideas. Later in the same passage, Tripp approvingly notes that Leer has “largely abandoned his silly experiments with syntax and punctuation,” giving us further theory of what makes a good novel as stated by a character within the novel, who would probably not care for writers who are stylistically ostentatious for ostentation’s own sake, like Alain Robbe-Grillet or, in some novels, Percival Everett.

It’s possible that novels simply can’t avoid commenting on the form to some extent, just as novels can’t seem to avoid some aspect of epistemology and mystery—even the basic mystery of “what happens next,” though a similar drive might propel readers of essays or other nonfiction. Elmore Leonard is the literary novelist I’m familiar with who gets furthest from the recursive structure of novelists on novels within novels, but even he succumbs to the urge in The Hot Kid. Before starting this response, while these ideas were rolling around my mind, I began editing the novel I’m working on and found a passage that could be about the ability to read a character in the novel:

I looked over my notes from the previous night and found Cassie’s Facebook profile—she was the keg-stand girl—which had a note about her hangover. Otherwise, her profile contained a long list of favorite music and TV shows, but no books, and also had many semi-literate wall notes. Some from DGs empathized regarding hangovers.

The difference between Chabon, Leonard, and others, versus Scott, however, is the difference between a death metal band and Beethoven’s Moonlight Sonata: subtlety and composition.

None of this depreciates Scott’s historical importance to the development of the novel as a genre or of Romance, which was Hook’s initial point. Scott remains historically important chiefly because of his influence. Some of his sins might be the sins of a new form without boundaries, and thus Scott might have felt the need to explain the form so that it could be properly enjoyed. Two hundred years on, however, the form is well understood, and an ocean of reading exists beyond what any mortal, given present longevity, can expect to traverse. If Scott once caused the anxiety of influence, it has long been cast off, because he, like any pioneer, did not reach the maximum potential of the form he helped establish. Perhaps no artist does, but others have reached further than Scott, and now the dust of archives clings to his prose, which too often offers justification when it doesn’t need to: that his “pen can speedily change from grave to gay,” and many other passages I want to strike with my own pen, writing in the margins, “We know!”

The Stuff of Thought and Steven Pinker in Tucson

It’s sometimes harder to describe what comes naturally than what comes artificially. We learn to speak simply by being around adults who speak, yet analyzing the languages humans have developed, and what those languages represent, is far harder than it is for a toddler to learn them intuitively. Speaking develops with no schooling aside from the “school” of other humans—and yet its manifold distinctions are the subject of Steven Pinker’s The Stuff of Thought, a complex book that gives some answers leading toward still more questions as he tries to explain the paradoxical mysteries of consciousness and perception.

The subtext of The Stuff of Thought seems to be that language affects us more than we consciously realize and that our use of language tends to occur in previously unexamined patterns that, once perceived, can be turned to our advantage. Such bold statements take some explaining: language reveals a great deal about us, Pinker argues, including theories of causation embedded inflectionally in some languages and syntactically in English. Some examples are simple: “John threw the ball” indicates who acted on what, and in that model it is difficult to misinterpret what is being said and who is doing what to what. Throw in prepositions and other spatial features encoded in language, however, and it becomes steadily harder to grasp precisely why “A sad movie makes you sad, but a sad person is already sad,” even if we understand the difference without being told the rule. The Stuff of Thought is a guided tour through what we didn’t know that we know. “I am exploring my sexuality; you are promiscuous; she is a slut”: all three phrases might describe the same fundamental behavior, yet each has very different and apparent shades of meaning, from positive to pejorative.

This is an example of how we “flip frames,” or understand an event in multiple ways depending on its context. In Newsweek, Lynne Spears—the mother of children famous for celebrity and fecundity, in that order—said of one who recently gave birth at 17, “But [despite] a situation that has fallen in her lap, she’s doing exceptionally well[…]” Notice the phrase, “a situation that has fallen in her lap,” as if the person involved had no agency and was struck by a meteor on her way to school. Then again, maybe the girl in question didn’t have as much agency as classical economists would believe; in Dan Ariely’s excellent Predictably Irrational, he discusses an experiment in which students who were aroused admitted to considerably more risk taking in an inventory of potential sexual behaviors than those who were not.* The phrase Lynne Spears uses betrays at least some idea of her frame, but if we’re not paying attention to her statement, we’re likely to miss it. Furthermore, to be fair, Lynne Spears might be referring to her daughter’s choices long after conception, at which point it’s too late to remake the past and one must deal with the options at hand. Temporal ambiguity—a subject Pinker discusses in Chapter 4, “Cleaving the Air”—becomes essential, and nothing about what Lynne Spears said indicates the precise time period she meant. It turns out that such relativity is inherent in language, which applies imprecise spatial metaphors to time, leaving us with the uncertainty much celebrated by Deconstructionists.

Other chapters in The Stuff of Thought deal with metaphors, naming, and game theory, but to go into each would expand this post into a weak shadow of the book, rather than a pointer in its direction. Some extra discussion is warranted, though: Pinker also discusses swearing and how it changes over time in Chapter 7, “The Seven Words You Can’t Say on Television,” and especially why so much of it revolves around excretion, sex, and religion. The power of the last has declined in much of the West along with belief in a literal manifestation of God, and Pinker speculates that phrases like “go to hell” or “damnit,” now sufficiently innocuous to be broadcast on television, might have been more threatening when people believed they were Sinners in the Hands of an Angry God (sample: “The God that holds you over the pit of hell, much as one holds a spider, or some detestable insect, over the fire, detests you, and is dreadfully provoked”). Taboos around excretion and sex, however, still hold more power because both are literal vectors for disease, and the latter can also be a vector for emotional disease, as many pop songs and novels about jilted love attest.


The good news is that Pinker visited Tucson on his tour for the paperback edition of The Stuff of Thought. The bad news for readers is that he hewed so closely to the material in it as to render the talk itself redundant. The points were identical, the examples supporting his generalizations merely fewer and shallower. But he did expand slightly on issues of swearing and “how to identify and quantify the material world,” and perhaps the most interesting part of his talk was not the talk itself but the audience’s reaction to his discussion of swearing. It’s fairly unusual to hear an impeccably dressed professor speculate about the tabooness of words like “fuck” and “cunt,” and the audience tittered appropriately. Pinker can euphemize with the best, referring to “the gynecological-flagellative term for uxorial dominance” at one point in The Stuff of Thought. He leapt between high and low registers with relative ease, and I suppose after discussing the issues numerous times it becomes easier to keep one’s equanimity around swearing. At the end Pinker discussed using language, and knowledge of what others know, as a way to redefine relationships: one risks being too blunt or not blunt enough, and suffering the consequences in the form of missed opportunities or social blunders. One might avoid the kinds of problems from Chapter 8, “Games People Play,” by refusing to feel awkwardness or by reducing one’s susceptibility to social influence. But he never went that far, and some of the problems he presents leave us only with implied answers or ameliorations, like a coyer version of Machiavelli in The Prince.

The sense of Pinker giving only a small taste of his book was reflected in the question and answer period: someone would ask a question, Pinker would begin to elaborate, and then refer the questioner to the relevant chapter. Material as complex as his can’t easily be summarized and grokked, particularly because one of his book’s major virtues is the wealth of examples and metaphors he uses to describe the general principles he and others have derived from language itself. It’s also a drawback of this post: I’ve tried to give a general overview of Pinker’s ideas, but my own writing stays at such a surface level that it can do no more than point to the book. Call it the difference between something like Lily Koppel’s The Red Leather Diary, which would’ve been better left a newspaper article, and The Stuff of Thought, a book whose teachings are easier to describe than to apply. Pinker has accomplished a difficult task in synthesizing so much research, but his readers have the harder work of deciding what to do with what they’ve learned.


* I won’t give away the experiment design; for that, you’ll have to read Predictably Irrational.

Cloverfield

Warning: spoilers ahead.

Normally this blog focuses on books, but Cloverfield is the rare film with sufficient depth and impact to make it worth a full post, with the second viewing more profound than the first. Cloverfield speaks to modern anxieties about fear, terrorism, and response more effectively than most movies, full stop, let alone horror movies.

The monster itself in Cloverfield is unexplained, much as 9/11 took the vast majority of Americans by surprise—even those who were nominally supposed to guard against such events. The only hint regarding the title comes at the beginning, with a brief video indicating that we’re about to watch a Department of Defense video related to “Cloverfield,” but with no other sign of the name’s meaning, if any. The shot functions like the false “translator’s preface” at the beginning of many older novels, a device that claims historical authenticity for the story that follows. Still, it reassures us that civilization—or at least the Department of Defense—has survived the attack long enough to create the video.

The first twenty minutes are a party like too many I’ve been to, except, this being Hollywood, with more attractive participants. Filmed chiefly by Hud, a character notable mostly for his passivity and lack of character, the movie really begins with reports of the monster and then the lights being extinguished. On the Manhattan streets, a wall of dust rolls toward people—like in videos of the World Trade Center’s collapse. The head of the Statue of Liberty rolls through the street, indicating that perhaps liberty itself has died, or at least has died within the monster’s zone. A character says, “I saw it. It’s alive,” leaving the “it” floating in space, imagination filling in the details.

The monster’s purpose, if it has one beyond terror, is mysterious, and the response to the unnamed monster becomes steadily more draconian as the movie continues. The responses to 9/11, especially regarding air traffic and civil rights, have followed the same arc, to the point that airports, flying, and foreign travel are now burdens that grow more onerous every year (see here, here, and especially the discussion of the apt phrase “security theater” in Bruce Schneier’s philosophical book concerning the modern age, Beyond Fear, which is available free here). Books like The Lucifer Effect demonstrate the effects of systems designed to dehumanize people—and such books are, for the moment, mostly ignored, like distant shooting in a war zone. As Cloverfield continues, the constant drone of war in the background becomes like modern cable news. I recently started teaching college freshmen, and the other day one of them made an offhand comment that made me realize that, to him, we’ve virtually always been fighting wars in Afghanistan or Iraq.

In this atmosphere, movies are beginning to reflect the larger world, as art always does. Ross Douthat wrote an excellent piece on contemporary movies called The Return of the Paranoid Style, which analyzes movies as a rerun of the 70s:

Conservatives such as Noonan hoped that 9/11 would bring back the best of the 1940s and ’50s, playing Pearl Harbor to a new era of patriotism and solidarity. Many on the left feared that it would restore the worst of the same era, returning us to the shackles of censorship and conformism, jingoism and Joe McCarthy. But as far as Hollywood is concerned, another decade entirely seems to have slouched round again: the paranoid, cynical, end-of-empire 1970s.

We expected John Wayne; we got Jason Bourne instead.

The essay is not easily excerpted and is worth reading in full. Cloverfield doesn’t fit neatly into its thesis: the movie contains little in the way of overt politics, but, whether intentionally or not, it manifests current fears about monsters that don’t die when we attack them with airstrikes or even ground forces. Although Cloverfield is symbolic of fears regarding attack, one of its strengths is its refusal to be partisan. The military is depicted heroically, and there is little in Cloverfield that indicates self-flagellation. It is all immediate reaction and fear, and, like terrorism, it tends to leave us with more questions than answers.

An essay in Terry Teachout’s Reader called “Beasts and Superbeasts” observes that “nothing thrills us more than stories implying that there are dark forces in the world too powerful to be tamed by human hands.” This was in 1999; he also wrote that “Of late […] cinematic horror has entered a decadent phase in which vampires have mostly given way to serial killers whose murderous frenzies are coolly explained away by psychiatrist-sleuths, while semi-satirical movies like Scream openly spoof the all-too-familiar conventions of the genre […]” Maybe 9/11 has allowed us to return to the mystery of devils walking among us, the unexplained or poorly explained, and the terrifying unknown. It’s not the monster that scares us in Alien, but the fact that we don’t know where the monster is, don’t know why it operates as it does, and can’t reason with it. In Cloverfield, the monster scares us because we can neither understand it nor stop it with bullets and bombs.

The impetus for “Beasts and Superbeasts” was The Blair Witch Project, a movie that, “[…] though hugely entertaining, is not especially scary, no doubt because it was all too clearly made by people who do not believe in the demons whose presence they have so cunningly implied.” Although Teachout overstates the case against The Blair Witch Project, which to me is scary in more than a “gotcha!” way, recalling as it does those times in the woods, his general principle is true. If The Blair Witch Project reflects the decadent 90s in that respect, Cloverfield aesthetically and artistically benefits from the opposite condition in the 2000s, as the idea of an attack against New York is no longer a fantasy or a goblin. That’s bad for the United States but can lend heft to movies. Cloverfield takes its subject seriously, as Teachout argues The Sixth Sense does. That’s not to say it has no jokes, usually relating to Hud’s obliviousness, but it has more emotional power thanks to its resonance with events.

Too many recent novels and movies take the first twenty minutes of Cloverfield and extend them onwards and upwards. The bored lassitude of 20-something partiers captured so well by Claire Messud in The Emperor’s Children is evident in the first fifth of Cloverfield, and its cameraman never escapes from the semi-hipster attitude of overgrown children. The characters are smaller-than-life, and their own motivations are barely more articulated than the monster’s—their inchoateness is itself a commentary on the kinds of unexamined lives that seem not uncommon. The difference between Cloverfield and its competitors, and one reason it passes Teachout’s “Beasts and Superbeasts” test, is that it is about something beyond itself, unlike, say, Garden State or London, the latter a smaller movie like Cloverfield but without the monster.

This essay has a central weakness built into its reading of horror and politics: those who flew planes into buildings were human, as are those who order bombs dropped on cities from 20,000 feet. The motivation for either may appear foreign to those on the receiving end, but it is not wholly un-understandable; Al-Qaeda regularly posts videos haranguing the West, however illogically or unfairly, and the toxic conditions of Afghanistan were the product of a long line of cultural and historical developments. As Charlie Wilson’s War observes, we did much to aid in the construction of our Frankenstein’s monster, though we didn’t notice until after the fact. We blundered in Baghdad, as James Fallows argues, though Iraq might eventually become stable. We feel as if 9/11 came from nowhere, like the unnamed monster in Cloverfield, whose very lack of an identifier is appropriate: “9/11” has stuck to the event and the day, but it’s an odd moniker, arrived at almost by default, especially compared to other infamous events that come with location signifiers (Pearl Harbor, Gulf of Tonkin). It’s worth remembering the danger of creating an unknowable other who is easier to demonize in a Lord of the Flies style. Still, the markers tying Cloverfield to terrorism are there, and its warning about those dangers is worth heeding.

It’s presidential campaign season, and candidates in both parties are eagerly trying to avoid being associated with the foreign policy snafus of the last five years, the equivalent of firing missiles that don’t work, as America veers dangerously between wanting to pull out of our “adventure” in Iraq altogether and the temptation to keep striding about the world without paying enough attention to whether we’re about to step on an unexpected landmine. Countries we should be paying more attention to, like many former Soviet republics, get short shrift, as Douthat says in a blog post, while Iraq and Afghanistan pull more than their weight thanks to the relative size of our commitments there. The worrying thing is that the total focus on Al-Qaeda and Iraq might let another Cloverfield event occur, seemingly out of nowhere, one in which a purely military response will be ineffective and we’ll be left confused and reacting, not having lifted our eyes from the collective party long enough to see the punch coming before we land, disoriented, on the floor.

In Cloverfield, to save ourselves, we have to destroy Manhattan, and the ambiguous moral calculus remains just that: ambiguous. The most startling part of Cloverfield is its lack of conclusion or certainty. Characters constantly ask each other, “What was that?” and find no answers. The Brooklyn Bridge is destroyed by the monster, an American flag falling with it. A TV monitor shows “Manhattan under attack,” followed by an image of military trucks responding to the carnage. But will the military be effective in this situation? At least using conventional, World War II-style tactics, the answer appears to be no. But the thing must be fought anyway, as it’s in Manhattan. Maybe if we can ask the right questions, we’ll eventually learn how to fight it—otherwise, we might have to destroy villages in order to save them.


While on the topic of movies, I was also going to pan The X-Files: I Want to Believe, but Slate provides such a solid hit that I’m left with nothing worth discussing:

The nefarious plot behind the agent’s abduction is so far-fetched I’m itching to spoil it. But I’ll limit myself to observing that, if ever I’m dying of a rare brain disease, I hope my surgeon won’t go home and frantically Google treatment options, as Scully does at one key moment. (Couldn’t she at least log on to Medscape?) The problem with the movie’s semisupernatural crime plot, though, isn’t that the resolution is completely outlandish; it’s that the outlandishness is insufficiently grounded in pseudoscience. If you’re going to posit stuff this crazy, you’d better have some solid-sounding bullshit to back it up.

[…]

I’m not quite of a mind with Slate’s Troy Patterson in finding the new movie “vomitously stupid”; rather, it’s a gorgeous, lulling, thoroughly unnecessary exercise in high-minded Anglophilia.

Renting Cloverfield and watching it even for the third, fourth, or fifth time is infinitely preferable to sitting through the second X-Files movie once.