How I learned about assertiveness and reality from being a consultant

Some friends with a design consulting business say that, like many people with such businesses, they're getting jerked around by potential clients. They're worried about offending those clients and don't want to lose the business, but they also don't like being plied for free samples, and they don't like long conversations that aren't likely to go anywhere. In the course of talking to them, I realized that they're discovering that the lessons they've taken from school and everyday life are wrong, or at least not optimal. So I described my own experiences as a consultant and how those experiences taught me about reality and money.

A lot of us—including me—are told from an early age to be polite, take turns, be considerate of other people’s feelings, etc. This is good advice in many but not all circumstances. Among friends you do want to take turns, reciprocate interest, and be warm to other people who are warm. That’s how you build lasting friendship networks. In the business / consulting worlds, however, being overly polite and considerate often leads other people to take advantage of you. Consultants need one very important skill: they need to figure out who is going to give them money and who isn’t, and they need to do so relatively quickly. Clients often press to get as much free stuff—often in the form of time and opinions that should cost hundreds of dollars an hour—as they can. They lose nothing by dallying and often gain something. Consultants need to learn the killer instinct necessary to know when to stop and say “send me a contract and a check, or don’t call me until you want to.” Almost all successful consultants learn how to do this and learn when to say no.

(c) Victor WeFoto.com

“Talk is cheap” is a cliché for a reason: talk by itself doesn’t mean anything. Any talk that’s not a billable hour should be leading, rapidly, to a billable hour. At some point—a point sooner than most novices realize—it’s time to pay or go away. Money talks and isn’t cheap: I’ve been on numerous calls about “collaborations” and whatnot, when the real work, if it happens at all, happens through subcontracts. I learned to end vapid conversations about “collaboration” that don’t go anywhere. Show me the money, or it doesn’t exist.

Someone who wants to hire you knows relatively quickly whether they want to hire you. Anything other than “yes” means “no.” “Maybe” means no. “Later” means no. That’s a hard thing for many of us to accept. My parents founded Seliger + Associates 20 years ago and learned, the hard way, how potential clients dangle work that never arrives and waste a lot of valuable time and energy. That means consultants have to get to “no.”

Getting to “no” is actually quite useful and a big improvement over a nebulous maybe. Attention is often your most valuable resource. Don’t let it dissipate over weak leads.

Drawing a clear line can actually turn some “maybes” into “yeses.” Clients will respect you more if you eventually stop negotiating, talking, or communicating unless they pay.

Because of the issues described in the paragraphs above, anyone experienced learns when to stop talking and say “money or nothing.” That means continuing to flirt without cash in hand is also a signal of being inexperienced. The line between being brusque and being direct is thin, but when in doubt err on the side of directness rather than meekness.

Directness can actually be a kind of politeness. “Professional courtesy” has an adjective before “courtesy” because it’s different from regular courtesy. Professional courtesy indicates that there’s a different way of being courteous than the conventional way, and one aspect of professional courtesy is that it wards off people who waste your time.

That being said, it can be worth exploring new ventures even when those new ventures aren’t immediately remunerative. But money and contracts separate exploration from reality.

These lessons aren’t only applicable to consultants. They apply to almost any form of business and, for that matter, to dating: if she says “I like you but not in that way,” she means no. I think men tend to learn this faster than women do, in part because men usually conduct the initial approach to women for dating and sex. There are of course exceptions to this, but as a general principle it holds.[1]

(c) looking4poetry

My friends are women, and from what I’ve observed, guys in their teens have to learn to approach women and risk rejection if they’re going to get anywhere, while a lot of women wait for guys to approach them.

Consequently, guys who want to get anywhere have to get used to rejection in a way a lot of women don’t, and that socialization is probably part of the reason why women like Sheryl Sandberg write books like Lean In. Men figure out relatively early that they have to lean in—or suffer. Like a lot of guys I spent time suffering. I also learned, however, that with women too anything other than “yes” means “no” and that I should move on quickly. Sticking around to beg and plead only worsens the situation.

Disengagement is underrated. In many endeavors one important ingredient in success is fire and motion.


[1] See Tucker Max and Geoffrey Miller’s book Mate for a long description of how and why men tend to initially approach women (giving men the choice of who to approach), women tend to accept or decline sex (giving women the choice of saying yes or no), and men tend to accept or reject long-term relationships (giving men the choice of saying yes or no to becoming “official” or “married” or otherwise socially sanctioned).

You may think these principles are bogus or unfair, which is fine, and if you want to change society itself, I wish you luck, but you should at least know they exist. Even among my female friends who identify as hard-core feminists, very few will make the initial approach to men in a sex / dating context.

The Authenticity Hoax: How We Get Lost Finding Ourselves — Andrew Potter

A lot of us are searching for something “real” and “authentic” in the same way that Jake Barnes is searching, fruitlessly, in The Sun Also Rises:

We ate dinner at Madame Lecomte’s restaurant on the far side of the island. It was crowded with Americans and we had to stand up and wait for a place. Some one had put it on the American Women’s Club list as a quaint restaurant on the Paris quais as yet untouched by Americans, so we had to wait forty-five minutes for a table.

As soon as Americans arrive, the place is spoiled, but, more importantly, a paradox emerges: when something is identified as “untouched,” it immediately becomes the focus of attention and is touched. The same phenomenon occurs with bullfighting: a nominally pure activity becomes contaminated by Americans seeking authenticity. Notice, however, that no one is directly responsible for putting Madame Lecomte’s on the list: it just happens. “Someone” does it, with no effort to identify that someone: the action is as natural as the dawn and perhaps as inevitable. There is no sense in fighting. It just is, which is part of the small joke, and a rare one in The Sun Also Rises. The meal ends: “After the coffee and a fine we got the bill, chalked up the same as ever on a slate, that was doubtless one of the ‘quaint’ features, paid it [. . .]” The supposed authenticity is inauthentic, and made so by people who are seeking the authentic. This leads us into a paradox that we can’t really get out of.

Unless we acknowledge that authenticity itself is a pernicious desire. That’s Andrew Potter’s main point in The Authenticity Hoax: How We Get Lost Finding Ourselves, which is as authentic a book as I’ve read because it doesn’t strive to be authentic. He says:

In the end, authenticity is a positional good, which is valuable precisely because not everyone can have it. The upshot is that, like the earlier privilege given to the upper classes, or the later distinction gained from being cool, the search for the authentic is a form of status competition. Indeed, in recent years authenticity has established itself as the most rarified form of status competition in our society, attracting only the most discerning, well-heeled, and frankly competitive players to the game.
Any status hierarchy is socially pernicious when it is used to allocate scarce goods and resources on the basis of arbitrary or unearned qualities. It is good to be the king, and almost as good to be a prince, or a duke, or a count, and on down the aristocratic chain. But not all forms of status are illegitimate: higher education is a status hierarchy that helps allocate wealth and privileges, yet for many people, the fact that the education system is for the most part a meritocracy makes it a fair, just, and even democratic form of status competition.

Once it becomes positional, it becomes fake. Still, I would argue that not everyone can have authenticity in the same way, but everyone can probably have it in some way. Even the seemingly inauthentic can become authentic if pursued with sufficient vigor: think of the pop-culture bubbles of Paris Hilton or The Jersey Shore, in which crass commercialism becomes something like authenticity. Las Vegas exists by being inauthentic and appropriating the styles of other places—and the pastiche has become a style of its own. Once aware, you can never become unaware:

Authenticity is like authority or charisma: if you have to tell people you have it, then you probably don’t. […] authenticity has an uneasy relationship with the market economy. This is because authenticity is supposed to be something that is spontaneous, natural, innocent, and ‘unspun,’ and for most people, the cash nexus is none of these. Markets are the very definition of that which is planned, fake, calculating, and marketed. That is, selling authenticity is another way of making it self-conscious, which is again, self-defeating.

The best you can do is fight back by not using the language of authenticity, because once one uses it, the thing itself becomes its opposite. Potter is pointing to something like The Gift, which deals with how people tend to have two modes: a commercial mode and a gift mode. Authenticity is supposed to correspond mostly to the gift mode, in opposition to the commercial one, except that this often doesn’t work out.

In The Sun Also Rises, there are long passages about “aficion” that can’t be stated exactly but can be seen. Once seen, it is not spoken of as such; it is only felt, as in this scene with Jake Barnes describing his friend Montoya introducing Jake to other aficionados:

Somehow it was taken for granted that an American could not have aficion. He might simulate it or confuse it with excitement, but he could not really have it. When they saw that I had aficion, and there was no password, no set questions that could bring it out, rather it was a sort of oral spiritual examination with the questions always a little on the defensive and never apparent, there was this same embarrassed putting the hand on the shoulder, or a ‘Buen hombre.’ But nearly always there was the actual touching. It seemed as though they wanted to touch you to make it certain.

In a world where the language of authenticity has been stolen by advertisers and whoever else happens along to appropriate it, we’re stuck striving for “conspicuous authenticity,” a play on Thorstein Veblen’s term, “conspicuous consumption.” Instead of merely consuming goods, we’re consuming status, which might come in the form of goods, but might also come in the form of experiences, behaviors, acts, postures, and the like. Potter gets this, and he hopes that once we get that the authenticity game—and it is a game—is a phony one, we’ll stop falling for it. And if we do, maybe we’ll also stop falling for some of the other major tropes of our time, in which everyone is striving to be unlike everyone else—and in the process is just like everyone else:

The idea that authority is repressive, that status-seeking is humiliating, that work is alienating, that conformity is a form of death. . . none of this is remotely original. We have heard every variation of the tune, from nineteenth-century bohemians to twentieth-century counterculturalists to twenty-first century antiglobalists, and we know every part by heart.

It is not the sheer persistence but rather the amazing popularity of the stance that ought to give us reason to pause and maybe reconsider our attitude toward modernity. Look around. Is there anyone out there who does not consider him or herself to be an ‘antihero of authenticity’? Anyone who embraces authority, delights in status-seeking, loves work, and strives for conformity?

My guess would be yes: people in the military or law enforcement embrace authority. A lot of celebrities or others of very high status seem to delight in status-seeking. People who love work are common enough that we have a word for them: workaholics. And high school students either strive for conformity or for the anti-conformity of wearing all black as a group. But the overall point stands.

One reason I might find novels a more real or satisfying experience than cinema is that they feel further from the cash economy: although novels are obviously protected by copyright and charged for by their authors, many feel less crassly commercial. This is the problem with articles like “The Cobra: Inside a movie marketer’s playbook,” which detail exactly how calculating the movie industry is. Taken with Edward Jay Epstein’s The Hollywood Economist, it’s hard not to feel bamboozled most of the time when you go to a big Hollywood movie.

Elif Batuman might agree with much of The Authenticity Hoax, especially after she spends a summer in Uzbekistan, which she describes this way in The Possessed: Adventures with Russian Books and the People Who Read Them:

I have never been so hungry in my life as I was that summer [in Uzbekistan]. I remember lying across the bed with Eric, fantasizing about buying anything we wanted from the twenty-four hour Safeway across from our apartment in Mountain View.
[…]
When we first moved to Mountain View, I used to think it was depressing to look out the window and see a gigantic Safeway parking lot, but that was before I spent any time in the ‘Fourth Paradise.’

If the authentic is starvation, give me McDonald’s. If the authentic is local vegetables, give me the avocados and bananas shipped halfway around the world so I can have salads and smoothies in December. In the case of Batuman, Safeway is banal and boring and symptomatic of soul-deadening consumer capitalism, right up to the point where you just want to buy some french fries and maybe one of those takeaway meals that aren’t very good, unless you’ve been subsisting on tea and rancid borscht in a third-world former Soviet republic. Modern life probably also looks sterile and boring up to the point when you’re kidnapped by pirates and die in the ensuing firefight. Some experiences are better left to the movies, unless you have to undergo them.

For example, one thing that makes The Lord of the Rings so effective is the reluctance of the hobbits to leave the Shire; they don’t really want to go on an adventure, or if they do they only half do, and would tarry a long time unless forced. Sam wants to go see Elves on an adventure chiefly because he doesn’t really conceive of what’s ahead. But if they must go, they will.

Their longing for home, rather than for power or for the misery that traveling entailed in a world before planes, trains, and automobiles, is what makes their experience so real. The Authenticity Hoax is partially about what happens if you try to take fantasy experiences and make them into messy realities without the many amenities that people in developed countries now effectively assume will be there, invisibly woven into the fabric of our lives, like Safeway. So many generations have toiled so long to give us those amenities and the standard of living we now enjoy (despite the anxiety still generated around status issues).

The book is worth reading, but skim sections. Some of the later chapters in The Authenticity Hoax are weak: there’s a gross misinterpretation of Harold Bloom’s The Anxiety of Influence at one point. The chapter “Vote for Me, I’m Authentic” is funny but overly focused on contemporary issues, like the 2008 election. At one point Potter says that “[…] it is dangerous for anyone, no matter what their partisan alliance, to have so much contempt for voters. Democracy is based on the premise that reasonable people can disagree over issues of fundamental importance, from abortion and gay rights to the proper balance between freedom and security.” The problem isn’t that voters disagree—the problem is how little voters know. If you read Bryan Caplan’s The Myth of the Rational Voter, it’s hard not to have contempt for voters: their ideology is incoherent, they don’t understand how economics or politics work, they know their individual votes are unlikely to affect the outcome and thus can vote irrationally or against their best interests without consequences, and they don’t know how the government they’re voting on is structured. As Caplan points out, the politicians who are elected are often substantially more knowledgeable than the people who elect them.

Later, Potter says that “[…] a great share of the blame [for politicians who massage the truth] lies with the media and its obsession with controversy and scandal at the expense of more difficult question of policy (sic) and other serious issues.” But the real issue is, once again, within us: a lot more people subscribe to People magazine than Foreign Affairs or The Atlantic, and a lot more voters (“consumers” might be a better word here) watch brain-dead network news shows than good-for-you special reports on the situation in Lebanon, or South Ossetia, or wherever. The problem isn’t the media or politicians—the problem is us. It always has been, and it probably always will be. You can gloss authenticity problems over political ones, but the political ones really point elsewhere.

Skip the last third of the book and pay great attention to the first half. If you read The Authenticity Hoax, maybe you’ll come out with a better conception of your self as an authentic person—which is to say, an inauthentic person. You’ll come out caring less. And when your friend comes back from an “exotic” location you’ll roll your eyes—as you should.

David Shields' Reality Hunger and James Wood's philosophy of fiction

In describing novels from the first half of the 19th Century, David Shields writes in Reality Hunger: A Manifesto that “All the technical elements of narrative—the systematic use of the past tense and the third person, the unconditional adoption of chronological development, linear plots, the regular trajectory of the passions, the impulse of each episode toward a conclusion, etc.—tended to impose the image of a stable, coherent, continuous, unequivocal, entirely decipherable universe.”

I’m not so sure; the more interesting novels didn’t necessarily have “the unconditional adoption of chronological development” or the other features Shields ascribes to them. Caleb Williams is the most obvious example I can immediately cite: the murderers aren’t really punished in it and madness is perpetual. Gothic fiction of the 19th Century had a highly subversive quality that didn’t feature “the regular trajectory of the passions.” To my mind, the novel has always had unsettling features and an unsettling effect on society, producing change even when that change isn’t immediately measurable or apparent, or when we can’t get away from the fundamental constraints of first- or third-person narration. Maybe I should develop this thought more: but Shields doesn’t in Reality Hunger, so maybe innuendo ought to be enough for me too.

Shields is very good at making provocative arguments and less good at making those arguments hold up under scrutiny. He says, “The creators of characters, in the traditional sense, no longer manage to offer us anything more than puppets in which they themselves have ceased to believe.” Really? I believe if the author is good enough. And I construct coherence where it sometimes appears to be lacking. Although I’m aware that I can’t shake hands with David Kepesh of The Professor of Desire, he and the characters around him feel like “more than puppets” in which Roth has ceased to believe.

Shields wants something made new. Don’t we all? Don’t we all want to throw off dead convention? Alas: few of us know how to do so successfully, and that word “successfully” is especially important. You could write a novel that systematically eschews whatever system you think the novel imposes (this is the basic idea behind the anti-novel), but most people probably won’t like it—a point that I’ll come back to. We won’t like it because it won’t seem real. Most of us have ideas about reality that are informed by some combination of lived experience and cultural conditioning. That culture shifts over time. Shields starts Reality Hunger with a premise that is probably less contentious than much of the rest of the manifesto: “Every artistic movement from the beginning of time is an attempt to figure out a way to smuggle more of what the artist thinks is reality into the work of art.” I can believe this, though I suspect that artists begin getting antsy when you try to pin them down on what reality is: I would call it this thing we all appear to live in but that no one can quite represent adequately.

That includes Shields. Reality Hunger doesn’t feel as new as it should; it feels more like a list of N things. It’s frustrating even when it makes one think. Shields says, “Culture and commercial languages invade us 24/7.” But “commercial languages” only invade us because we let them: TV seems like the main purveyor, and if we turn it off, we’ll probably cut most of the advertising from our lives. If “commercial languages” are invading my life to the extent I’d choose the word “invade,” I’m not aware of it, partially because I conspicuously avoid those languages. Shields says, “I try not to watch reality TV, but it happens anyway.” This is remarkable: I’ve never met anyone who’s tried not to watch reality TV and then been forced to, or had reality TV happen to them, like a car accident or freak weather.

Still, we need to think about how we experience the world and depict it, since that helps us make sense of the world. For me, the novel is the genre that does this best, especially when it bursts its perceived bounds in particularly productive ways. I can’t define those ways with any rigor, but the novel has far more going on than its worst and best critics imagine.

Both the worst and best critics tend to float around the concept of reality. To use Luc Sante’s description in “The Fiction of Memory,” a review of Reality Hunger:

The novel, for all the exertions of modernism, is by now as formalized and ritualized as a crop ceremony. It no longer reflects actual reality. The essay, on the other hand, is fluid. It is a container made of prose into which you can pour anything. The essay assumes the first person; the novel shies from it, insisting that personal experience be modestly draped.

I’m not sure what a “crop ceremony” is or how the novel is supposed to reflect “actual reality.” Did it ever? What is this thing called reality that the novel is attempting to mirror? Its authenticity or lack thereof has, as far as I know, always been in question. The search for realism is always a search and never a destination, even when we feel that some works are more realistic than others.

Yet Sante and Shields are right about the dangers of rigidity; as Andrew Potter writes in The Authenticity Hoax: How We Get Lost Finding Ourselves, “One effect of disenchantment is that pre-existing social relations come to be recognized not as being ordained by the structure of the cosmos, but as human constructs – the product of historical contingencies, evolved power relations, and raw injustices and discriminations.”

Despite this, however, we feel realism—if none of us did, we’d probably stop using the term. Our definitions might blur when we approach a precise definition, but that doesn’t mean something isn’t there.

Sante writes, quoting Shields, that “‘Anything processed by memory is fiction,’ as is any memory shaped into literature.” Maybe: but consider these three statements, if I were to make them to you (keep in mind the context of Reality Hunger, with comments like “Try to make it real—compared to what?”):

Aliens destroyed Seattle in 2004.

I attended Clark University.

Alice said she was sad.

One of them is, to most of us, undoubtedly fiction. One of them is true. The other I made up: no doubt there is an Alice somewhere who has said she is sad, but I don’t know her and made her up for the purposes of example. The second example might be “processed by memory,” but I don’t think that makes it fiction, even if I can’t give you a firm, rigorous, absolute definition of where the gap between fact and interpretation begins. Jean Bricmont and Alan Sokal give it a shot in Fashionable Nonsense: “For us, as for most people, a ‘fact’ is a situation in the external world that exists irrespective of the knowledge that we have (or don’t have) of it—in particular, irrespective of any consensus or interpretation.”

They go on to observe that scientists actually face some problems of definition that I see as similar to those of literature and realism:

Our answer [as to what makes science] is nuanced. First of all, there are some general (but basically negative) epistemological principles, which go back at least to the seventeenth century: to be skeptical of a priori arguments, revelation, sacred texts, and arguments from authority. Moreover, the experience accumulated during three centuries of scientific practice has given us a series of more-or-less general methodological principles—for example, to replicate experiments, to use controls, to test medicines in double-blind protocols—that can be justified by rational arguments. However, we do not claim that these principles can be codified in a definite way, nor that the list is exhaustive. In other words, there does not exist (at least at present) a complete codification of rationality; rationality is always an adaptation to a new situation.

They lay out some criteria (beware of “revelation, sacred texts, and arguments from authority”) and “methodological principles” (“replicate experiments”) and then say “we do not claim that these principles can be codified in a definite way.” Neither can the principles of realism. James Wood does as good a job of exploring them as anyone. But I would posit that, despite our inability to pin down realism, either as convention or not, most of us recognize it: when I tell people that I attended Clark University, none have told me that my experience is an artifact of memory, or made up, or that there is no such thing as reality and therefore I didn’t. Such realism might merely be convention or training—or it might be real.

In the first paragraph of his review of Chang-Rae Lee’s The Surrendered, James Wood lays out the parameters of the essential question of literary development or evolution:

Does literature progress, like medicine or engineering? Nabokov seems to have thought so, and pointed out that Tolstoy, unlike Homer, was able to describe childbirth in convincing detail. Yet you could argue the opposite view; after all, no novelist strikes the modern reader as more Homeric than Tolstoy. And Homer does mention Hector’s wife getting a hot bath ready for her husband after a long day of war, and even Achilles, as a baby, spitting up on Phoenix’s shirt. Perhaps it is as absurd to talk about progress in literature as it is to talk about progress in electricity—both are natural resources awaiting different forms of activation. The novel is peculiar in this respect, because while anyone painting today exactly like Courbet, or composing music exactly like Brahms, would be accounted a fraud or a forger, much contemporary fiction borrows the codes and conventions—the basic narrative grammar—of Flaubert or Balzac without essential alteration.

I don’t think literature progresses “like medicine or engineering.” Using medical or engineering knowledge as it stood in 1900 would be extremely unwise if you’re trying to understand the genetic basis of disease or build a computer chip. Papers tend to decay within five to ten years of publication in the sciences.

But I do think literature progresses in some other, less obvious way, as we develop wider ranges of techniques and social constraints allow for wider ranges of subject matter or direct depiction: hence Nabokov can point out that “Tolstoy, unlike Homer, was able to describe childbirth in convincing detail,” and I can point out that mainstream literature effectively couldn’t depict explicit sexuality until the 20th Century.

While that last statement can be qualified some, it is hard to miss the difference between a group of 19th Century writers like Thackeray, Dickens, Trollope, George Eliot, George Meredith, and Thomas Hardy (who J. Hillis Miller discusses in The Form of Victorian Fiction) and a group of 20th Century writers like D.H. Lawrence, James Joyce, Norman Rush, and A.S. Byatt, who are free to explicitly describe sexual relationships to the extent they see fit and famously use words like “cunt” that simply couldn’t be effectively used in the 19th Century.

In some ways I see literature as closer to math: the quadratic equation doesn’t change with time, but I wouldn’t want to be stuck in a world with only the quadratic equation. Wood gets close to this when he says that “Perhaps it is as absurd to talk about progress in literature as it is to talk about progress in electricity—both are natural resources awaiting different forms of activation.” The word “perhaps” is essential in this sentence: it gives a sense of possibility and realization that we can’t effectively answer the question, however much we might like to. But both question and answer give a sense of some useful parameters for the discussion. Most likely, literature isn’t exactly like anything else, and its development (or not) is a matter as much of the person doing the perceiving and ordering as anything intrinsic to the medium.

I have one more possible quibble with Wood’s description, when he says that much contemporary fiction borrows “the codes and conventions—the basic narrative grammar—of Flaubert or Balzac without essential alteration.” I wonder whether it really hasn’t undergone “essential alteration,” and what would qualify as essential. Novelists like Elmore Leonard, George Higgins, or that Wood favorite Henry Green all feel quite different from Flaubert or Balzac because of how they use dialog to convey ideas. The characters in Tom Perrotta’s Election speak in a much more slangy, informal style than do any in Flaubert or Balzac, so far as I know. Bellow feels more erratic than the 19th Century writers and closer to the psyche, although that might be an artifact of how I’ve been trained by Bellow and writers after Bellow to perceive the novel and the idea of psychological realism. Taken together, however, the writers mentioned make me think that maybe “the basic narrative grammar” has changed for writers who want to adopt new styles. Yes, we’re still stuck with first- and third-person perspectives, but we get books that are heavier on dialog and lighter on formality than their predecessors.

Wood is a great chronicler of what it means to be real: his interrogation of this seemingly simple term runs through the essays collected in The Irresponsible Self: On Laughter and the Novel, The Broken Estate: Essays on Literature and Belief, and, most comprehensively, the book How Fiction Works. Taken together, they ask how the “basic narrative grammar” of fiction works or has worked up to this point. In setting out some of the guidelines that allow literary fiction to work, Wood is asking novelists to find ways to break those guidelines in useful and interesting ways. In discussing Reality Hunger, Wood says, “[Shields’] complaints about the tediousness and terminality of current fictional convention are well-taken: it is always a good time to shred formulas.” I agree, and doubt many would disagree, but the question is not merely one of “shred[ding] formulas” but of how and why those formulas should be shredded. One doesn’t shred the quadratic formula: it works. But one might build on it.

By the same token, we may have this “basic narrative grammar” not because novelists are conformist slackers who don’t care about finding a new way forward: we may have it because it’s the most satisfying or useful way of conveying a story. I don’t think this is true, but it might be. Maybe most people won’t find major changes to the way we tell stories palatable. Despite modernism and postmodernism, fewer people appear to enjoy the narrative confusion and choppiness of Joyce than enjoy the streamlined feel of the latest thriller. That doesn’t mean the latter is better than the former—by my values, it’s not—but it does mean that the overall thrust of fiction might remain where it is.

Robert McKee, in his not-very-good-but-useful book Story: Substance, Structure, Style, and the Principles of Screenwriting, gives three major kinds of plots, which blend into one another: “arch plots,” which are causal in nature and finish their story lines; “mini plots,” which he says are open and “strive for simplicity and economy while retaining enough of the classical […] to satisfy the audience”; and “antiplots,” where absurdism and the like fall.

He says that as one moves “toward the far reaches of Miniplot, Antiplot, and Non-plot, the audience shrinks” (emphasis in original). From there:

The atrophy has nothing to do with quality or lack of it. All three corners of the story triangle gleam with masterworks that the world treasures, pieces of perfection for our imperfect world. Rather, the audience shrinks for this reason: Most human beings believe that life brings closed experiences of absolute, irreversible change; that their greatest sources of conflict are external to themselves; that they are the single and active protagonists of their own existence; that their existence operates through continuous time within a consistent, causally interconnected reality; and that inside this reality events happen for explainable and meaningful reasons.

The connection between this and Wood’s “basic narrative grammar” might appear tenuous, but McKee and Wood are both pointing towards the ways stories are constructed. Wood is more concerned with language; although plot and its expression (whether in language or in video) can’t be fully separated, they can still be analyzed independently enough to make the distinction useful.

The conventions that underlie the “arch plots,” however, can become tedious over time. This is what Wood is highlighting when he discusses Roland Barthes’ “reality effect,” which fiction can achieve: “All this silly machinery of plotting and pacing, this corsetry of chapters and paragraphs, this doxology of dialogue and characterization! Who does not want to explode it, do something truly new, and rouse the implication slumbering in the word ‘novel’?” Yet we need some kind of form to contain story; what is that form? Is there an ideal method of conveying story? If so, what if we’ve found it and are now mostly tinkering, rather than creating radical new forms? If we take out “this silly machinery of plotting and pacing” and dialog, we’re left with something closer to philosophy than to a novel.

Alternately, maybe we need the filler and coordination that so many novels consist of if those novels are to feel true to life, which appears to be one definition of what people mean by “realistic.” This is where Wood parts with Barthes, or at least makes a distinct case:

Convention may be boring, but it is not untrue simply because it is conventional. People do lie on their beds and think with shame about all that has happened during the day (at least, I do), or order a beer and a sandwich and open their computers; they walk in and out of rooms, they talk to other people (and sometimes, indeed, feel themselves to be talking inside quotation marks); and their lives do possess more or less traditional elements of plotting and pacing, of suspense and revelation and epiphany. Probably there are more coincidences in real life than in fiction. To say “I love you” is to say something at millionth hand, but it is not, then, necessarily to lie.

“Convention may be boring, but it is not untrue simply because it is conventional,” and the parts we think of as conventional might be necessary to realism. In Umberto Eco’s Reflections on The Name of the Rose, he says that “The postmodern reply to the modern consists of recognizing that the past, since it cannot really be destroyed, because its destruction leads to silence, must be revisited: but with irony, not innocently.” That is often the job of novelists dealing with the historical weight of the past and with conventions that are “not untrue simply because [they are] conventional.” Eco and Wood both use the example of love to demonstrate similar points. Wood’s is above; Eco says:

I think of the postmodern attitude as that of a man who loves a very cultivated woman and knows he cannot say to her, ‘I love you madly,’ because he knows that she knows (and that she knows that he knows) that these words have already been written by Barbara Cartland. Still, there is a solution. He can say, ‘As Barbara Cartland would put it, I love you madly.’ At this point, having avoided false innocence, having said clearly that it is no longer possible to speak innocently, he will nevertheless have said what he wanted to say to the woman: that he loves her, but he loves her in an age of lost innocence. If the woman goes along with this, she will have received a declaration of love all the same. Neither of the two speakers will feel innocent, both will have accepted the challenge of the past, of the already said, which cannot be eliminated […]

I wonder if every age thinks of itself as “an age of lost innocence,” only to be later looked on as pure, naive, or unsophisticated. Regardless, for Eco postmodernism requires that we look to the past long enough to wink and then move on with the story we’re going to tell in the manner we’re going to tell it. Perhaps Chang-Rae Lee doesn’t do so in The Surrendered, which is the topic of Wood’s essay—but like so many essays and reviews, Wood’s starts with a long and very useful consideration before coming to the putative topic of its discussion. Wood speaks of reading […] “Chang-Rae Lee’s new novel, “The Surrendered” (Riverhead; $26.95)—a book that is commendably ambitious, extremely well written, powerfully moving in places, and, alas, utterly conventional. Here the machinery of traditional, mainstream storytelling threshes efficiently.” I haven’t read The Surrendered and so can’t evaluate Wood’s assessment.

Has Wood merely overdosed on the kind of convention that Lee uses, as opposed to convention itself? If so, it’s not clear how that “machinery” could be fixed or improved on, and the image itself is telling because Wood begins his essay by asking whether literature is like technology. My own tastes in literature have changed: as a teenager I loved Frank Herbert’s Dune and now find it almost unbearably tedious. Other revisited novels hold up poorly because I’ve overdosed on their conventions and start to crave something new—a lot of fantasy flattens over time like opened soda.

Still, I usually don’t know what “something new” entails until I read it. That’s the problem with saying that the old way is conventional or boring: the problem is easier to observe than the fix. Wood knows it, and he’s unusually good at pointing out the problems of where we’ve been and the places we might go to fix them (see, for example, his recent essay on David Mitchell, whom I now feel obliged to read). This, I suspect, is why he is so beloved by so many novelists, and why I spend so much time reading him, even when I don’t necessarily love what he loves. The Quickening Maze struck me as self-indulgent and lacking in urgency, despite the psychological insight Adam Foulds offers into a range of characters’ minds: a teenage girl, a madman, an unsuccessful inventor.

I wanted more plot. In How Fiction Works, Wood quotes from Adam Smith, writing in the eighteenth century, regarding how writers use suspense to maintain reader interest, and then says that “[…] the novel [as an art form; one could also say the capital-N Novel] soon showed itself willing to surrender the essential juvenility of plot […]” Yet I want and crave this element that Wood dismisses—perhaps because of my (relatively) young age: Wood says that Chang-Rae Lee’s Native Speaker was “published when the author was just twenty-nine,” older than I am. I like suspense and the sense of something major at stake, and that could imply that I have a weakness for weak fiction. If so, I can do little about it, any more than someone who wants chocolate over vanilla, or someone who wants chocolate despite having heard the virtues of cherries extolled.

When I hear about the versions of the real, reality, and realism that get extolled, I often begin to think about chocolate, vanilla, and cherries, and why some novelists write in such a way that I can almost taste the cocoa while others offer merely cardboard colored brown. Wood is very good at explaining this, and his work taken together represents some of the best answers to the questions that we have.

Even the best answers lead us toward more questions that are likely to be answered best by artists in a work of art that makes us say, “I’ve never seen it that way before,” or, better still, “I’ve never seen it.” Suddenly we do see, and we run off to describe to our friends what we’ve seen, and they look at us and say, “I don’t get it,” and we say, “maybe you just had to see it for yourself.” Then we pass them the book or the photo or the movie and wait for them to say, “I’ve already seen this somewhere before,” while we argue that they haven’t, and neither have we. But we press on, reading, watching, thinking, hoping to come across the thing we haven’t seen before so we can share it again with our friends, who will say, like the critics do, “I’ve seen it before.”

So we have. And we’ll see it again. But I still like the sights—and the search.
