Reading James Joyce's Ulysses for plunder

There’s a wonderful Paris Review interview with Robertson Davies, and the Interviewer says:

Bruce Chatwin once remarked that there were two ways of reading, reading for love and reading for plunder, in other words, reading to learn how writers accomplished certain effects, solved certain technical problems, or just in general went about doing their work. That’s a legitimate means of being influenced.

I’m precisely reading Ulysses (as previously discussed) for plunder. I find it hard to believe I will ever love Ulysses, but its technical effects (and the emotions they create) are astonishingly numerous and varied. More so perhaps than in any other novel I’ve ever read. The amount of stuff worth plundering in Ulysses is tremendous, and its ability to convey a great deal in a small number of words through incomplete thoughts is showing me how to loosen up some in my own writing. At a few moments in the novel I’m working on now, I’ve come across sentences that make me say, “Yeah, that’s Ulysses’s influence.”

Many of the novels I’ve read for grad school—The Crying of Lot 49, for instance—merely feel tedious. Ulysses, although I resisted it at first, feels like a trove of novelistic effects.

Note, however, that I’m not saying Ulysses is only good for those effects, as the kinds of emotional powers those effects create are equally impressive. But I’m reading much more for plunder.

Keith Richards’ Life and what the world used to look like

I skimmed Keith Richards’ memoir Life, which might be of interest to virulent Rolling Stones fans and people interested in how to live despite ingesting massive quantities of poisonous substances in search of altered states (answer: luck). Although most of the memoir is forgettable, this passage stands out because it describes a kind of insanity that feels completely foreign and bizarre to me:

It was 1975, a time of brutality and confrontation. Open season on the Stones had been declared since our last tour, the tour of ’72, known as the STP. The State Department had noted riots (true), civil disobedience (also true), illicit sex (whatever that is), and violence across the United States. All the fault of us, mere minstrels. We had been inciting youth to rebellion, we were corrupting America, and they had ruled never to let us travel in the United States again. It had become, in the time of Nixon, a serious political matter. He had personally deployed his dogs and dirty tricks against John Lennon, who he thought might cost him an election. We, in turn, they told our lawyer officially, were the most dangerous rock-and-roll band in the world.

Must be gratifying to be the most dangerous rock band in the world. It’s also astonishing to imagine that a rock-and-roll band could marshal this kind of attention; these days, the youth who were rebelling in the 1970s have grown up and assumed the reins of power, and rock-and-roll has grown up with them, becoming rock-and-roll instead of rock ‘n’ roll.

Now it’s no longer subversive, so we have to turn our attention to other topics, like rap, but even that doesn’t inspire so much fear as Richards says the Stones did; rap is regularly reviewed in the New Yorker. Today, nothing is worse than being square. Almost anything goes. 1975 looks bizarre from the perspective of someone born after it: what was all the fuss about? The real question is which subjects that generate all the fuss today will look the same way in the future. I could generate a list of them but choose not to, per Paul Graham’s “What You Can’t Say,” but I bet regular readers could imagine a few things that might end up on the list.

There are other moments of bizarre provincialism too:

When I was growing up, the idea of leaving England was pretty much remote. My dad did it once, but that was in the army to go to Normandy and get his leg blown off. The idea was totally impossible. You just read about other countries and looked at them on TV, and in National Geographic, the black chicks with their tits hanging out and their long necks. But you never expected to see it. Scraping up the money to get out of England would have been way beyond my capabilities.

Although many people today no doubt feel the same, the rise of deregulated air service makes leaving virtually any industrialized country within the reach of a large proportion of the population. Not everyone, to be sure, but it’s much more normal now than it once was. Many fewer find the idea “totally impossible.” It’s easy, at least for me, to forget what the past was like. I think we all have a tendency to assume that the present is “normal,” along with whatever our situation is, and the past different. Then I read about someone who “never expected to see” a foreign country and remember that the time and place I live in is very different from those others have lived in. Such moments are the most revealing part of Life. The book made it on the New York Times bestseller list. Prediction: a large number of copies hit the used book market within six months. If you want to read the book, wait and snag a used copy cheap, or get it from the library.

The Crying of Lot 49 — Thomas Pynchon

How do you describe the absence of coherence? It’s not easy, because you can’t really quote something only to point out what it is not. I bring up the point because The Crying of Lot 49 lacks coherence; it lacks a plot; it’s random in a way that is not random like life, but like life diced by a food processor; it’s the kind of tedious book you read primarily in order to tell others that you’ve read and understood it. I’m not the first to notice: James Wood cites Pynchon’s Mason & Dixon in “Human, All Too Inhuman: The Smallness of the ‘Big’ Novel.” The essay is now behind a paywall, but if you want a copy, send me an e-mail. And B.R. Myers has noticed the issue too, in A Reader’s Manifesto.

Let me try to cite an example. Chapter two of The Crying of Lot 49 conflates life and movies in something akin to parody. But it feels set nowhere—like most of the novel—and perhaps that’s intentional, because L.A. feels like nowhere; and one of the novel’s best sentences describes southern California well: “San Narciso lay further south, near L.A. Like many named places in California it was less an identifiable city than a grouping of concepts—census tracts, special purpose bond-issue districts, shopping nuclei, all overlaid with access roads to its own freeway.”

The nowhere of L.A., however, is a very particular kind of nowhere. I would give real context to the quote if I could figure out what the context might be. But we know that Oedipa is an executrix for an estate; Metzger is an investigator or lawyer or something. Here’s the block:

‘Maybe it’s a flashback,’ Metzger said. ‘Or maybe he gets it twice.’ Oedipa removed a bracelet. So it went: the succession of film fragments on the tube, the progressive removal of clothing that seemed to bring her no nearer nudity, the boozing, the tireless shivaree of voices and guitars from out by the pool. Now and then a commercial would come in, each time Metzger would say, ‘Inverarity’s,’ or ‘Big block of shares,’ and later settled for nodding and smiling. Oedipa would scowl back, growing more and more certain, while a headache began to flower behind her eyes, that they, among all possible combinations of new lovers, had found a way to make time itself slow down. Things grew less and less clear. At some point she went into the bathroom, tried to find her image in the mirror and couldn’t. She had a moment of nearly pure terror. Then remembered that the mirror had broken and fallen in the sink. ‘Seven years’ bad luck,’ she said aloud. ‘I’ll be 35.’ She shut the door behind her and took the occasion to blunder, almost absently, into another slip and skirt, as well as a long-leg girdle and a couple pairs of knee socks. It struck her that if the sun ever came up Metzger would disappear. She wasn’t sure if she wanted him to. She came back in to find Metzger wearing only a pair of boxer shorts and fast asleep with a hardon and his head under the couch. She noticed also a fat stomach the suit had hidden. On the screen New Zealanders and Turks were impaling one another on bayonets. With a cry Oedipa rushed to him, fell on him, began kissing him to wake him up. His radiant eyes flew open, pierced her, as if she could feel the sharpness somewhere vague between her breasts. 
She sank with an enormous sigh that carried all rigidity like a mythical fluid from her, down next to him; so weak she couldn’t help him undress her; it took him 20 minutes, rolling, arranging her this way and that, as if, she thought, he were some scaled-up, short-haired, poker-faced little girl with a Barbie doll. She may have fallen asleep once or twice. She awoke at last to find herself getting laid; she’d come in on a sexual crescendo in progress, like a cut to a scene where the camera’s already moving. Outside a fugue of guitars had begun, and she counted each electronic voice as it came in, till she reached six or so and recalled only three of the Paranoids played guitars; so others must be plugging in.

The paragraph is one giant block in the novel as well. Notice the moments where the narrative skips: we get in the bathroom, impressionistic moments there, and then a sex scene that comes from nowhere, goes nowhere, and appears to mean nothing. Is “It struck her that if the sun ever came up Metzger would disappear” figurative? “Maybe,” which is the answer to most questions raised by The Crying of Lot 49, except for the question of whether you should read it.

There are moments of nice writing here: “a headache began to flower behind her eyes.” I’d never thought about a headache that way, but it makes perfect sense, with the roots reaching into the mind. But the image is isolated from any larger narrative. It doesn’t connect to anything. We don’t know why the headache is important, unless it’s to signal the confusion of what’s coming next. But if everything is confusion, what are we supposed to take away?

I’ve heard that The Crying of Lot 49 is about the corruption of all meaning, the impossibility of escaping the system, the difficulty of representation, or something along those lines. I think such interpretations say more about the novel than the novel says about anything outside itself. Perhaps The Crying of Lot 49 is a joke, chiefly on those who read it—which is to say, people taking literature classes in universities.

What would Tolstoy say about the iPhone?

In discussing Tolstoy, A.N. Wilson says “[. . .] Tolstoy’s death still challenges us to ask the deepest political and personal questions. It is hard to think of any of the great public questions facing the world today that Tolstoy did not anticipate and address in some way, whether we speak of the environmental crisis, religious debate (creationist versus atheist) or the anti-war movement.”

I’m most intrigued that he or she (I don’t know the gender of “A”) doesn’t include “how we relate to technological progress or a rapidly changing world,” which might be the greatest question, or suite of questions, facing individuals in the West today. At 26 I’m relatively young, yet I’ve already seen how computers have insinuated themselves into people’s lives, the rise and fall of IM, Facebook addiction, the way we can now fight wars with relatively few troops, the integration of GPS devices into daily life, sexting scandals, and probably more—and that list was generated in a minute. All of them relate to technology. And the rate of change, as many commentators have observed, appears to be increasing.

Now, Tolstoy may address these questions: I’ve started War and Peace twice but haven’t made it through. But that’s not the point. The point is that A.N. Wilson doesn’t list them among the “deepest political and personal questions facing the world today,” which is itself notable, because I think they are the central questions many of us have, and the central questions that many of our other dilemmas spring from.

The Authenticity Hoax: How We Get Lost Finding Ourselves — Andrew Potter

A lot of us are searching for something “real” and “authentic” in the same way that Jake Barnes is searching, fruitlessly, in The Sun Also Rises:

We ate dinner at Madame Lecomte’s restaurant on the far side of the island. It was crowded with Americans and we had to stand up and wait for a place. Some one had put it on the American Women’s Club list as a quaint restaurant on the Paris quais as yet untouched by Americans, so we had to wait forty-five minutes for a table.

As soon as Americans arrive, the place is spoiled, but, more importantly, a paradox emerges: when something is identified as “untouched,” it immediately becomes the focus of attention and is touched. The same phenomenon occurs with bullfighting: a nominally pure activity becomes contaminated by Americans seeking authenticity. Notice, however, that no one is directly responsible for putting Madame Lecomte’s on the list: it just happens. “Someone” does it, with no effort to identify that someone: the action is as natural as the dawn and perhaps as inevitable. There is no sense in fighting. It just is, which is part of the small joke, and a rare one in The Sun Also Rises. The meal ends: “After the coffee and a fine we got the bill, chalked up the same as ever on a slate, that was doubtless one of the ‘quaint’ features, paid it [. . .]” The supposed authenticity is inauthentic, and made so by people who are seeking the authentic. This leads us into a paradox that we can’t really get out of.

Unless we acknowledge that authenticity itself is a pernicious desire. That’s Andrew Potter’s main point in The Authenticity Hoax: How We Get Lost Finding Ourselves, which is as authentic a book as I’ve read, because it doesn’t strive to be authentic. He says:

In the end, authenticity is a positional good, which is valuable precisely because not everyone can have it. The upshot is that, like the earlier privilege given to the upper classes, or the later distinction gained from being cool, the search for the authentic is a form of status competition. Indeed, in recent years authenticity has established itself as the most rarified form of status competition in our society, attracting only the most discerning, well-heeled, and frankly competitive players to the game.
Any status hierarchy is socially pernicious when it is used to allocate scarce goods and resources on the basis of arbitrary or unearned qualities. It is good to be the king, and almost as good to be a prince, or a duke, or a count, and on down the aristocratic chain. But not all forms of status are illegitimate: higher education is a status hierarchy that helps allocate wealth and privileges, yet for many people, the fact that the education system is for the most part a meritocracy makes it a fair, just, and even democratic form of status competition.

Once it becomes positional, it becomes fake. Still, I would argue that not everyone can have authenticity in the same way, but everyone can probably have it in some way. Even the seemingly inauthentic can become authentic if pursued with sufficient vigor: think of the pop-culture bubbles of Paris Hilton or Jersey Shore, in which crass commercialism becomes something like authenticity. Las Vegas exists by being inauthentic and appropriating the styles of other places—and the pastiche has become a style of its own. Once aware, you can never become unaware:

Authenticity is like authority or charisma: if you have to tell people you have it, then you probably don’t. […] authenticity has an uneasy relationship with the market economy. This is because authenticity is supposed to be something that is spontaneous, natural, innocent, and ‘unspun,’ and for most people, the cash nexus is none of these. Markets are the very definition of that which is planned, fake, calculating, and marketed. That is, selling authenticity is another way of making it self-conscious, which is again, self-defeating.

The best you can do is fight back by not using the language of authenticity, because once one uses it, the thing itself becomes its opposite. Potter is pointing to something like The Gift, which deals with how people tend to have two modes: a commercial mode and a gift mode. Authenticity is supposed to correspond mostly to the gift mode, in opposition to the commercial one, except that this often doesn’t work out.

In The Sun Also Rises, there are long passages about “aficion” that can’t be stated exactly but can be seen. Once seen, it is not spoken of as such; it is only felt, as in this scene with Jake Barnes describing his friend Montoya introducing Jake to other aficionados:

Somehow it was taken for granted that an American could not have aficion. He might simulate it or confuse it with excitement, but he could not really have it. When they saw that I had aficion, and there was no password, no set questions that could bring it out, rather it was a sort of oral spiritual examination with the questions always a little on the defensive and never apparent, there was this same embarrassed putting the hand on the shoulder, or a ‘Buen hombre.’ But nearly always there was the actual touching. It seemed as though they wanted to touch you to make it certain.

In a world where the language of authenticity has been stolen by advertisers and whoever else happens along to appropriate it, we’re stuck striving for “conspicuous authenticity,” a play on Thorstein Veblen’s term, “conspicuous consumption.” Instead of merely consuming goods, we’re consuming status, which might come in the form of goods, but might also come in the form of experiences, behaviors, acts, postures, and the like. Potter gets this, and he hopes that once we see that the authenticity game—and it is a game—is a phony one, we’ll stop falling for it. And if we do, maybe we’ll also stop falling for some of the other major tropes of our time, in which everyone strives to be unlike everyone else—and in the process is just like everyone else:

The idea that authority is repressive, that status-seeking is humiliating, that work is alienating, that conformity is a form of death. . . none of this is remotely original. We have heard every variation of the tune, from nineteenth-century bohemians to twentieth-century counterculturalists to twenty-first century antiglobalists, and we know every part by heart.

It is not the sheer persistence but rather the amazing popularity of the stance that ought to give us reason to pause and maybe reconsider our attitude toward modernity. Look around. Is there anyone out there who does not consider him or herself to be an ‘antihero of authenticity’? Anyone who embraces authority, delights in status-seeking, loves work, and strives for conformity?

My guess would be yes: people in the military or law enforcement embrace authority. A lot of celebrities and others of very high status seem to delight in status-seeking. People who love work are common enough that we have a phrase for them: workaholics. And high school students either strive for conformity or for the anti-conformity of wearing all black as a group. But the overall point stands.

One reason I might find novels a more real or satisfying experience than cinema is that they feel further from the cash economy: although novels are obviously protected by copyright and charged for by their authors, many feel less crassly commercial. This is the problem with articles like “The Cobra: Inside a movie marketer’s playbook,” which detail exactly how calculating the movie industry is. Taken with Edward Jay Epstein’s The Hollywood Economist, it’s hard not to feel bamboozled most of the time when you go to a big Hollywood movie.

Elif Batuman might agree with much of The Authenticity Hoax, especially after she spends a summer in Uzbekistan, which she describes this way in The Possessed: Adventures with Russian Books and the People Who Read Them:

I have never been so hungry in my life as I was that summer [in Uzbekistan]. I remember lying across the bed with Eric, fantasizing about buying anything we wanted from the twenty-four hour Safeway across from our apartment in Mountain View.
[…]
When we first moved to Mountain View, I used to think it was depressing to look out the window and see a gigantic Safeway parking lot, but that was before I spent any time in the ‘Fourth Paradise.’

If the authentic is starvation, give me McDonald’s. If the authentic is local vegetables, give me the avocados and bananas shipped halfway around the world so I can have salads and smoothies in December. For Batuman, Safeway is banal and boring and symptomatic of soul-deadening consumer capitalism, right up to the point where you just want to buy some french fries and maybe one of those takeaway meals that aren’t very good—unless you’ve been subsisting on tea and rancid borscht in a third-world former Soviet republic. Modern life probably also looks sterile and boring right up to the point when you’re kidnapped by pirates and die in the ensuing firefight. Some experiences are better left to the movies, unless you have to undergo them.

For example, one thing that makes The Lord of the Rings so effective is the reluctance of the hobbits to leave the Shire; they don’t really want to go on an adventure, or if they do they only half do, and would tarry a long time unless forced. Sam wants to go see Elves on an adventure chiefly because he doesn’t really conceive of what’s ahead. But if they must go, they will.

Their longing for home, rather than for power or for the misery that traveling entailed in a world before planes, trains, and automobiles, is what makes their experience so real. The Authenticity Hoax is partially about what happens if you try to take fantasy experiences and make them into messy realities, stripped of the many amenities that people in developed countries now effectively assume will be there, invisibly woven into the fabric of our lives, like Safeway: amenities that so many generations toiled so long to give us, along with the standard of living we now enjoy (despite the anxiety still generated around status issues).

The book is worth reading, but skim sections. Some of the later chapters in The Authenticity Hoax are weak: there’s a gross misinterpretation of Harold Bloom’s The Anxiety of Influence at one point. The chapter “Vote for Me, I’m Authentic” is funny but overly focused on contemporary issues, like the 2008 election. At one point Potter says that “[…] it is dangerous for anyone, no matter what their partisan alliance, to have so much contempt for voters. Democracy is based on the premise that reasonable people can disagree over issues of fundamental importance, from abortion and gay rights to the proper balance between freedom and security.” The problem isn’t that voters disagree—the problem is how little voters know. If you read Bryan Caplan’s The Myth of the Rational Voter, it’s hard not to have contempt for voters: their ideology is incoherent, they don’t understand how economics or politics work, they know their individual votes are unlikely to affect the outcome and thus can vote irrationally or against their best interests without consequences, and they don’t know how the government they’re voting on is structured. As Caplan points out, the politicians who are elected are often substantially more knowledgeable than the people who elect them.

Later, Potter says that “[…] a great share of the blame [for politicians who massage the truth] lies with the media and its obsession with controversy and scandal at the expense of more difficult question of policy (sic) and other serious issues.” But the real issue is, once again, within us: a lot more people subscribe to People magazine than Foreign Affairs or The Atlantic, and a lot more voters (“consumers” might be a better word here) watch brain-dead network news shows than good-for-you special reports on the situation in Lebanon, or South Ossetia, or wherever. The problem isn’t the media or politicians—the problem is us. It always has been, and it probably always will be. You can gloss authenticity problems over political ones, but the political ones really point elsewhere.

Skip the last third of the book and pay great attention to the first half. If you read The Authenticity Hoax, maybe you’ll come out with a better conception of your self as an authentic person—which is to say, an inauthentic person. You’ll come out caring less. And when your friend comes back from an “exotic” location you’ll roll your eyes—as you should.

Six books I wish someone had handed to me:

1. Flow: The Psychology of Optimal Experience by Mihaly Csikszentmihalyi

2. The Guide to Getting It On by Paul Joannides

3. The Mating Mind: How Sexual Choice Shaped the Evolution of Human Nature by Geoffrey Miller

4. Hackers & Painters by Paul Graham; you can also get this material from his essays, which are posted online.

5. Man’s Search for Meaning by Viktor Frankl

6. Stumbling on Happiness by Daniel Gilbert

Each book profoundly shaped not only how I think, but how I live and view the world. All revealed connections and ideas about the world I’d never encountered or expected to encounter. Their tendrils extend into a great deal of my thought and work.

Granted, no book can be removed from its context, and it’s possible that if I’d read some of these books as a younger person I wouldn’t have been ready to appreciate them. But Flow seems by far the most valuable of the choices listed above because it encompasses more of the others’ content than any of the rest. Still, each one changed my thinking so profoundly that I feel compelled to list them.

Scrivener or Devonthink Pro, with a side of James Joyce's Ulysses

James Fallows’ post about the writing program Scrivener “suggests broader truths about the ways computers help and hinder the way we think.” He’s right, although I’ve used Scrivener and didn’t love it enough to switch: for anything beyond blog posts I mostly use a combination of Microsoft Word and Mellel, a word processor that is very fast and stable but can’t track changes. For me, that’s not a minor flaw: it means I can’t use Mellel beyond first drafts.

The other problem with Mellel isn’t related to the program itself, but to the release cycle. It’s discouraging when a forum post from the developer says, “Yes, we have been slacking off. The pace of development of Mellel – that is, the number of new releases – have dropped significantly over the last three years.” That’s another way of saying, “We’re not really working on it.”

Word, in turn, gets used for any documents I have to share with others (since they already have Word).

Fallows describes how Scrivener offers “a ‘project’ organization system that makes it easy to amass many notes, files, quotes, research documents, etc related to the essay or article or book you’re writing.” I primarily use Devonthink Pro (DTP) for this kind of purpose, and it connects whatever ideas I have to other quotes, ideas, and the like. The “artificial intelligence” engine is surprisingly useful at making connections that I didn’t realize I had. Obviously I could use DTP with Scrivener, but the use of DTP makes the marginal value of Scrivener somewhat lower.

Scrivener 2.0, however, is intriguing; these videos demonstrate its power. More on that later, as I’d like to follow up on the idea that computers can “help and hinder the way we think.” Scrivener enables one to rearrange large chunks of material easily, which is how a lot of writers work in the off-line world. For example, I’ve been reading Critical Essays on James Joyce’s Ulysses for a seminar paper and came across this description of Joyce’s process in A. Walton Litz’s “The Design of Ulysses”:

[Joyce] did not write Ulysses straight through, following the final order of the episodes. First it was necessary to determine the design of the novel, to visualize its characters and the course of the action, and this entailed putting scattered portions on paper in order to clarify them. Then, like the mosaic worker, Joyce collected and sorted material to fit the design. Finally, the fragments were placed in their proper positions through a process of rough drafts and revisions.

The “design” and the ability to “visualize its characters and the course of the action” corresponds roughly to Scrivener’s idea pane. The “scattered portions on paper” come next so they can be rearranged, “collected” and “sorted.” There’s nothing wrong with using pieces of paper, of course—it worked for Joyce!—but I wonder what the great novelist would think of working digitally.

Joyce used notecards, and Litz liked the mosaic-worker analogy so much that he uses it again a few pages later:

It was the function of the note-sheets to assure that patterns and relationships already visualized by Joyce reached their fore-ordained positions in the text. Like the mosaic worker, he was continuously sorting and re-grouping his raw materials, assigning each fragment to its proper place in the general design. The mechanical nature of this process emphasizes the mechanical nature of those ordering principles which give Ulysses its superficial unity [. . .]

I used to write more like this and now I write less like this: it is often my goal to ensure that each chapter follows inexorably from the preceding chapter. The narrative threads and the desires of each character should force the novel in a particular direction. If I can rearrange the chapters relatively easily, then I feel like I’ve done something wrong. I still want “patterns and relationships” to reach conclusions, but I don’t want those conclusions “fore-ordained”: I want them to arise organically, and to be inevitable yet surprising. This is a difficult trick to pull off, and it means the serial nature of my writing is probably less likely to be helped by Scrivener’s structure than the writing some others might do.

In the essay after Litz’s, Anthony Cronin’s “The Advent of Bloom” begins with the structure of Ulysses: “[. . .] if Ulysses can be said to have a plot, its plot is formless and does not give form to the book – it is not shaped to produce a series of dramatic sensations for purposes aesthetic or otherwise; it has no conclusion in event, only a termination in time [. . .]” If a plot “does not give form to the book,” then something must; for some writers, Scrivener might organize it and help find a way to present formlessness. The program helps one create a mosaic, but I’m not trying to create a mosaic in my work, at least right now: I’m trying to create a linear plot. So I don’t think the program will help me as much as it could.

Nonfiction books, on the other hand, might be much better with Scrivener: in my papers, I move material around much more frequently than I do in fiction. Since I haven’t written any nonfiction books, however, I can’t comment as much on those.

I suspect that large, high-resolution monitors enable programs like Scrivener: at 24″ or larger, one can have a broad enough swatch of material open to really make a (computer) desktop feel like a (physical) desktop. You can layout and rearrange items much more easily. The new 27″ iMacs in particular are appealing for this purpose, and one can now find 27″ external monitors from Dell, Apple, and others. As desktops become more like desktops, being able to visualize large amounts of information at once makes tools like Scrivener more useful.

At the moment, I’m about 80K words into a novel that I think will end up in the neighborhood of 100K – 110K words, which is a bit long for a first published work but not impossibly long. Using a 24″ iMac, I can easily have two pages of text open at a time, which is very convenient. That’s what I use for my “notes” section (miscellaneous stuff I want to remember but can’t immediately add to the main narrative) and my main window, which has the novel progressing from Chapter 1 to “### END ###.” On my second monitor, a 20″ cheapie Dell, I have an outline and character list open.

Some of those functions could be taken over by Scrivener, based on what I’ve seen in the videos. For my next novel—if there is another in the immediate future; I need to devote more time to academic writing—I’d be willing to try Scrivener long enough to know if version 2.0 is a good fit. For this one, however, the thought of changing tools in the middle of the process would be too disruptive. There’s no reason, after all, that I can’t use both Scrivener and Devonthink Pro.

Scrivener or Devonthink Pro, with a side of James Joyce’s Ulysses

James Fallows’ post about the writing program Scrivener “suggests broader truths about the ways computers help and hinder the way we think.” He’s right, although I’ve used Scrivener and didn’t love it enough to switch: for anything beyond blog posts I mostly use a combination of Microsoft Word and Mellel, a word processor that is very fast and stable but can’t track changes. That limitation is not merely inconvenient: it means I can’t use Mellel beyond first drafts.

The other problem with Mellel isn’t related to the program itself, but to the release cycle. It’s discouraging when a forum post from the developer says, “Yes, we have been slacking off. The pace of development of Mellel – that is, the number of new releases – have dropped significantly over the last three years.” That’s another way of saying, “We’re not really working on it.”

Word, in turn, gets used for any documents I have to share with others (since they already have Word).

Fallows describes how Scrivener offers “a ‘project’ organization system that makes it easy to amass many notes, files, quotes, research documents, etc related to the essay or article or book you’re writing.” I primarily use Devonthink Pro (DTP) for this kind of purpose, and it connects whatever ideas I have to other quotes, ideas, and the like. The “artificial intelligence” engine is surprisingly useful at making connections that I didn’t realize I had. Obviously I could use DTP with Scrivener, but the use of DTP makes the marginal value of Scrivener somewhat lower.

Scrivener 2.0, however, is intriguing; these videos demonstrate its power. More on that later, as I’d like to follow up on the idea that computers can “help and hinder the way we think.” Scrivener enables one to rearrange large chunks of material easily, which is how a lot of writers work in the off-line world. For example, I’ve been reading Critical Essays on James Joyce’s Ulysses for a seminar paper and came across this description of Joyce’s process in A. Walton Litz’s “The Design of Ulysses:”

[Joyce] did not write Ulysses straight through, following the final order of the episodes. First it was necessary to determine the design of the novel, to visualize its characters and the course of the action, and this entailed putting scattered portions on paper in order to clarify them. Then, like the mosaic worker, Joyce collected and sorted material to fit the design. Finally, the fragments were placed in their proper positions through a process of rough drafts and revisions.

The “design” and the ability to “visualize its characters and the course of the action” corresponds roughly to Scrivener’s idea pane. The “scattered portions on paper” come next so they can be rearranged, “collected” and “sorted.” There’s nothing wrong with using pieces of paper, of course—it worked for Joyce!—but I wonder what the great novelist would think of working digitally.

Joyce used notecards, and Litz liked the mosaic-worker analogy so much that he uses it again a few pages later:

It was the function of the note-sheets to assure that patterns and relationships already visualized by Joyce reached their fore-ordained positions in the text. Like the mosaic worker, he was continuously sorting and re-grouping his raw materials, assigning each fragment to its proper place in the general design. The mechanical nature of this process emphasizes the mechanical nature of those ordering principles which give Ulysses its superficial unity [. . . ]

I used to write more like this and now I write less like this: it is often my goal to ensure that each chapter follows inexorably from the preceding chapter. The narrative threads and the desires of each character should force the novel in a particular direction. If I can rearrange the chapters relatively easily, then I feel like I’ve done something wrong. I still want “patterns and relationships” to reach conclusions, but I don’t want those conclusions “fore-ordained”: I want them to arise organically, to be inevitable yet surprising. This is a difficult trick to pull off, but it means that my kind of serial writing is probably less likely to benefit from Scrivener’s structure than the writing some others do.

In the essay after Litz’s, Anthony Cronin’s “The Advent of Bloom” begins with the structure of Ulysses: “[. . .] if Ulysses can be said to have a plot, its plot is formless and does not give form to the book – it is not shaped to produce a series of dramatic sensations for purposes aesthetic or otherwise; it has no conclusion in event, only a termination in time [. . .]” If a plot “does not give form to the book,” then something must; for some writers, Scrivener might organize it and help find a way to present formlessness. The program helps one create a mosaic, but I’m not trying to create a mosaic in my work, at least right now: I’m trying to create a linear plot. So I don’t think the program will help me as much as it could.

Nonfiction books, on the other hand, might be much better with Scrivener: in my papers, I move material around much more frequently than I do in fiction. Since I haven’t written any nonfiction books, however, I can’t comment as much on those.

I suspect that large, high-resolution monitors enable programs like Scrivener: at 24″ or larger, one can have a broad enough swath of material open to really make a (computer) desktop feel like a (physical) desktop. You can lay out and rearrange items much more easily. The new 27″ iMacs in particular are appealing for this purpose, and one can now find 27″ external monitors from Dell, Apple, and others. As computer desktops become more like physical desktops, being able to visualize large amounts of information at once makes tools like Scrivener more useful.

At the moment, I’m about 80K words into a novel that I think will end up in the neighborhood of 100K – 110K words, which is a bit long for a first published work but not impossibly long. Using a 24″ iMac, I can easily have two pages of text open at a time, which is very convenient. That’s what I use for my “notes” section (miscellaneous stuff I want to remember but can’t immediately add to the main narrative) and my main window, which has the novel progressing from Chapter 1 to “### END ###.” On my second monitor, a 20″ cheapie Dell, I have an outline and character list open.

Some of those functions could be taken over by Scrivener, based on what I’ve seen in the videos. For my next novel—if there is another in the immediate future; I need to devote more time to academic writing—I’d be willing to try Scrivener long enough to know if version 2.0 is a good fit. For this one, however, the thought of changing tools in the middle of the process would be too disruptive. There’s no reason, after all, that I can’t use both Scrivener and Devonthink Pro.

Thinking and doing: Procrastination and the life of the mind

I finally got around to reading James Surowiecki’s “What does procrastination tell us about ourselves?” (answer: maybe nothing; maybe a lot), which has been going around the Internet like herpes for a very good reason: almost all of us procrastinate, almost all of us hate ourselves for procrastinating, and almost all of us go back to procrastinating without really asking ourselves what it means to procrastinate.

According to Surowiecki, time preferences help explain procrastination. For a good introduction on the topic, see Philip Zimbardo and John Boyd’s The Time Paradox. The short, non-technical version: Some people tend to value present consumption more than future consumption, while others are the inverse. And it’s not just time preferences that change who we are; as Dan Ariely documents in Predictably Irrational, we also change our stated behaviors based on whether, for example, we’re aroused. We also sometimes prefer to bind ourselves through commitments to deadlines or to external structures that will “force” us to behave a certain way. How many dissertations would be completed without the social stigma that comes from working on a project for years and failing to complete it, coupled with the threat of funding removal?

The basic issue is that we have more than one “self,” and the self closest to the specious present (which lasts about three seconds) might be the “truest.” This comes out in the form of procrastination. To quote at length from Surowiecki, who is nominally reviewing The Thief of Time: Philosophical Essays on Procrastination:

Most of the contributors to the new book agree that this peculiar irrationality stems from our relationship to time—in particular, from a tendency that economists call “hyperbolic discounting.” A two-stage experiment provides a classic illustration: In the first stage, people are offered the choice between a hundred dollars today or a hundred and ten dollars tomorrow; in the second stage, they choose between a hundred dollars a month from now or a hundred and ten dollars a month and a day from now. In substance, the two choices are identical: wait an extra day, get an extra ten bucks. Yet, in the first stage many people choose to take the smaller sum immediately, whereas in the second they prefer to wait one more day and get the extra ten bucks.

In other words, hyperbolic discounters are able to make the rational choice when they’re thinking about the future, but, as the present gets closer, short-term considerations overwhelm their long-term goals. A similar phenomenon is at work in an experiment run by a group including the economist George Loewenstein, in which people were asked to pick one movie to watch that night and one to watch at a later date. Not surprisingly, for the movie they wanted to watch immediately, people tended to pick lowbrow comedies and blockbusters, but when asked what movie they wanted to watch later they were more likely to pick serious, important films. The problem, of course, is that when the time comes to watch the serious movie, another frothy one will often seem more appealing. This is why Netflix queues are filled with movies that never get watched: our responsible selves put “Hotel Rwanda” and “The Seventh Seal” in our queue, but when the time comes we end up in front of a rerun of “The Hangover.”

The lesson of these experiments is not that people are shortsighted or shallow but that their preferences aren’t consistent over time. We want to watch the Bergman masterpiece, to give ourselves enough time to write the report properly, to set aside money for retirement. But our desires shift as the long run becomes the short run.
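The preference reversal in the quoted experiment can be sketched numerically with the standard hyperbolic discount formula, V = A / (1 + kD), where A is the amount, D the delay, and k the discount rate. The k value below is purely illustrative, chosen only to make the reversal visible:

```python
# A minimal sketch of hyperbolic discounting, V = A / (1 + k * D).
# The discount rate k = 0.2 per day is illustrative, not empirical.
def hyperbolic_value(amount, delay_days, k=0.2):
    """Subjective present value of `amount` received after `delay_days`."""
    return amount / (1 + k * delay_days)

# Stage 1: $100 today vs. $110 tomorrow -- the smaller-sooner option wins.
print(hyperbolic_value(100, 0))   # 100.0
print(hyperbolic_value(110, 1))   # ~91.7, so take the $100 now

# Stage 2: the same choice pushed a month out -- now waiting a day wins.
print(hyperbolic_value(100, 30))  # ~14.3
print(hyperbolic_value(110, 31))  # ~15.3, so wait for the $110
```

Exponential discounting, by contrast, shrinks value by a constant factor per day, so the ranking of the two options never flips; it is the steep curvature of the hyperbola near D = 0 that produces the reversal.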

This probably explains why you have to like the daily process of whatever you’re becoming skilled at (writing, researching, law, programming) in order to get good at it: if you have a very long term goal (“Write a great novel” or “Write an entire operating system”), you’ll probably never get there because it’s very easy to defer that until tomorrow. But if you break the task down (I’m going to write 500 words today; I’m going to work on memory management) and fundamentally like the task, you might actually do it. If your short-term desires roughly align with your long-term desires, you’re doing something right. If they don’t, and if you can’t find a way to harmonize them, you’re going to be the kind of person who looks back in 20 years and says, “Where did the time go?”

The answer is obvious: minute by minute and second by second, into activities that don’t pass what Paul Graham calls “the obituary test” in “Good and Bad Procrastination” (as with so many topics others pass over, he’s already thought the issue through). Are you doing something that will be mentioned in your obituary? If so, then you’re doing something right. Most of us aren’t: we’re watching TV, hanging out on Facebook, thinking that we really should clean the house, waiting for 5:00 to roll around so we can get off work, thinking we should go shopping for that essential household item. As Graham says, “The most impressive people I know are all terrible procrastinators. So could it be that procrastination isn’t always bad?” It isn’t, as long as we’re deferring something unimportant for something important, and as long as we have appropriate values for “important.”

So how do we work against bad procrastination and towards doing something useful? The question has been on my mind lately, because a friend who’s an undergrad recently wrote:

A lot of my motivation comes from a fantasy of myself-as-_____, where the role that fills the blank tends to change erratically. Past examples include: writer, poet, monk, philosopher, womanizer. How long will the physicist/professor fantasy last?

I replied:

This is true of a lot of people. One question worth asking: Do you enjoy the day-to-day activities involved with whatever the fantasy is? For me, the “myself-as-novelist” fantasy continues to be closer to fantasy than reality, although “myself-as-writer” is definitely here. But I basically like the work of being a novelist: I like writing, I like inventing stories, I like coming up with characters, plot, etc. Do I like it every single day? No. Are there some days when it’s a chore to drag myself to the keyboard? Absolutely. And I hate query letters, dealing with agents, close calls, etc. But I like most of the stuff and think that’s what you need if you’re going to sustain something over the long term. Most people who are famous or successful for something aren’t good at the something because they want to be famous or successful; they like the something, which eventually leads to fame or success or whatever.

If you essentially like the day-to-day time in the lab, in running experiments, in fixing the equipment, etc., then being a prof might be for you.

One other note: writer, poet, and philosopher all have some aspect of money involved in them. So does physicist/professor. Unless you’re Neil Strauss or Tucker Max, “womanizer” is probably a hobby more than a profession. And think of Richard Feynman as an example: he sounds like he got a lot of play, but that wasn’t his main focus; it’s just something he did on the side, so to speak (“You mean, you just ask them?!”). The more you have some other skill (being a writer, a rock star, whatever), the easier it seems to be to find members of your preferred sex who are interested in you. In Assholes Finish First, Max notes that women started coming to him after his website became successful (note that I have not had the same experience writing about books and lit).

As for the physicist/prof fantasy, I have no idea how long it will last. You sound like you’re staying upwind, per Paul Graham’s essay “What You’ll Wish You’d Known,” which is important because that will let you re-deploy as time goes on. To my mind, reading/writing and math are upwind of almost everything else; if you work on those two or three subjects, you’ll probably be okay.

One nice thing about grad school in physics is that you can apparently leverage that to do a lot of other things: programming; becoming a Wall Street quant; doing various kinds of business analysis; etc. It’s probably a better fantasy than monk, poet, or philosopher for that reason. The “philosopher” thing is also (relatively) easy to do on the side, and I would guess it’s probably more fun writing a philosophy blog than writing peer-reviewed philosophy papers, which sounds eminently tedious, at least to me.

Oh: and I have a pile of unposted, half-written blog posts in my TextMate project drawer:

You can see a pile of them on the left. Most will eventually get written. Some will eventually be deleted. All were started with good intentions. Some have been sitting there for a depressingly long time. This post might have found its way among them too, half-finished and eventually abandoned, if I hadn’t decided to write it in a single blaze of activity and if it weren’t itself about procrastination.

One reason I’ve had staying power with this blog, while so many of my friends have written a blog for a few months and then quit, is because I basically like blogging for its own sake. Blogging hasn’t brought me fame, power, money, groupies, or other markers of conventional success (so far, anyway!), and it appears unlikely to do so in the short- to medium-term (the long term is anyone’s guess). Sometimes I worry that blogging keeps me from more important work, like writing fiction, but I keep doing it because I like it and because blogging teaches me a lot about the subject I’m writing about and is an excellent forum for small ideas that might one day grow into much larger ones. This is basically the issue that “Signaling, status, blogging, academia, and ideas” discusses.

If the small projects lead to the big projects, you’re doing something right. If the small projects supplant the big projects instead of supplementing them, you’re doing something wrong. But if you don’t like the small increments of whatever you’re working on, you’re not likely to get to the big project. You’re likely to procrastinate. You’re likely to skip from fantasy to fantasy instead of finding your place. You’re not likely to do the right kind of procrastinating. I wish I’d realized all this when I was younger. Of course, I wish I’d learned a lot of things when I was younger, but back then I didn’t have Surowiecki, Graham, Zimbardo, Max, and Feynman. Now I do, which enables me to say, “This blog post itself is a form of procrastination, but a productive one, and therefore one I’m going to finish because I like writing it.” That sure beats improbable resolutions.
