Eight years of writing and the first busted Moleskine

Most of my writing happens on a computer, which means it’s pretty hard to depict the final product in a visually satisfying way.* But I also carry around a pretentious Moleskine™ notebook for the random ideas that strike in grocery stores or at parties. The latest notebook, however, developed a split binding:

I’ve been using Moleskines for about eight years, which means I go through about two of them per publishable novel:

Notice how none of the others have the binding split that afflicted the latest one. I haven’t consciously treated this one differently from its predecessors or used it any longer. Maybe the quality control at Moleskine central has declined, although people have made claims in that direction for a very long time.

Regardless of the reason, the latest notebook has about twelve usable pages left; I tend to write nonfiction, blog post ideas, things I need to remember, reminders about e-mails, entries from an unkept diary, and stuff like that in the back. Ideas, quotes, things people say, and other material related to fiction goes in front. When back and front meet in the middle, it’s time to get a new one.

When I start working on a new novel, I usually go back through all the old notebooks at the beginning to see what material might be usable and when I started taking ideas for that specific project. Some ideas for novels have been burbling in the back of my mind for a very long time, waiting for me to have the time and skill to move them from a couple of scrawled lines to 80,000 words of story. The oldest Moleskines I have were bought in the 2002 neighborhood. They’ve held up pretty well; the ones I started buying in the 2005 neighborhood are showing their age. Tough to say if this is an indication of falling quality control or something else altogether.

While Googling around for the complaint about Moleskine quality I linked to above, I also found a site that recommends the Guildhall Notebook. I’ve already ordered one, although apparently Guildhall doesn’t have a U.S. distributor, so I have to wait for mine to ship from the UK. I hope the improved binding is worth the wait. EDIT 1: It wasn’t worth the wait, or the hassle; if that weren’t enough, Christine Nusse of Exaclair Inc./Quo Vadis Planners, which distributes or distributed Guildhall notebooks, said in an e-mail that her understanding is that the notebooks are being discontinued. She recommends the Quo Vadis Habana instead (although I think it too big) or a Rhodia notebook (which I think just right, as I said below).

So even if you want a Guildhall pocket notebook, you probably won’t be able to find one for long; fortunately, the Rhodia Webbie is a better alternative.

EDIT 2: Someone found me by asking, “are moleskines pretentious”? Answer, in post form: “Are Moleskines pretentious? Yup. Guildhall Notebooks are worse.”

EDIT 3: I’ve settled on the Rhodia Webbie as a full-time notebook: it’s expensive but much more durable than other notebooks I’ve found. I’ll write a full review at some point.

EDIT 4: I posted an updated photo of the stack. Or you can see it here:


* Even describing it using conventional prepositions is tough: do I write “on” or “in” or “with” a computer? Good arguments exist for any of the three.

I've been writing academic papers

For the last couple of weeks I’ve been spending a lot of time on my (second) publishable paper, this one on the contrasting temperaments in Elaine Dundy’s The Dud Avocado and Ernest Hemingway’s The Sun Also Rises. They share many superficial characteristics: both tell the stories of decadent Americans in Europe shortly after the World Wars; both feature protagonists who do not have major or pressing financial responsibilities; both feature a period of time in Paris punctuated by a trip to Spain that ends up back in Paris; both include characters lacking specific, tangible objectives that propel their travels. Thirty years after The Sun Also Rises, The Dud Avocado continues the tradition of having Americans wander through Europe, but the attitude it takes is predominantly comic, in contrast to the tragic temperament its predecessor shows.

I think it’s an interesting paper (though authors are inclined to think as fondly of their papers as parents are of their children), but writing it sucks up most of the time I’d otherwise use to blog. Blogging and academic writing are usually complements, not substitutes, but in this case the increasing price of blogging relative to paper writing makes me do less of it.

For now.

Where have I been? In exam land

Someone wrote to ask why I haven’t been posting much over the last month. He was too polite to say, “haven’t been posting much with real content,” but I think the last bit was implied. Anyway, the answer is short and unpleasant: studying for my master’s exams. The written is Thursday and the oral about a week after that. People I know in real life sometimes laugh when I say that I anticipate them like a combined rectal exam / execution. One girl responded, “In that order?” I said, “If it were the other way around it wouldn’t be that bad, would it?”

Expect me to resurface in about two weeks. Hopefully this resurfacing will be more like a whale coming up for oxygen than a whale simply beaching itself.

Why we need the third way: “What Are You Going to Do With That?” and the need for imagination

In “What Are You Going to Do With That?,” William Deresiewicz tells the freshman class at Stanford:

In the journey toward the success that you all hope to achieve, you have completed, by getting into Stanford, only the first of many legs. Three more years of college, three or four or five years of law school or medical school or a Ph.D. program, then residencies or postdocs or years as a junior associate. In short, an ever-narrowing funnel of specialization. You go from being a political-science major to being a lawyer to being a corporate attorney to being a corporate attorney focusing on taxation issues in the consumer-products industry. You go from being a biochemistry major to being a doctor to being a cardiologist to being a cardiac surgeon who performs heart-valve replacements.

But he goes on to point out why and how these kinds of defined professional paths—the ones high school and college students are so often told constitute “success”—might not be optimal, for either the person on the path or society in general. If you “simply go with the flow,” you can end up merely being defined by what someone else has laid out. Perhaps not surprisingly, Deresiewicz goes on to say, “There is an alternative.” He calls it “moral imagination” and defines it this way: “Moral imagination means the capacity to envision new ways to live your life.” I would call it something else: the “third way.”

Deresiewicz’s essay shows why we need more talk about the third way: there are more options out there than further advanced schooling. Stanford in particular is a good place to be reminded of this. Obviously, Deresiewicz doesn’t say you must choose grad school or the professions, but the absence of any acknowledgement about starting your own company implies that those are the two primary choices.

I’ve heard similar talk. In my interview with him, Tucker Max describes the primary speech he gives at colleges:

[. . . W]hen you’re an undergrad, generally you think you can do two things. You’re gonna have to get a job after you graduate or you gotta go do more school. Because everyone who’s giving you advice or telling you how to live your life are people who’ve done one of those two things.

He describes a “third way,” with his two normal paths defined a lot like Deresiewicz’s, but in a lower register:

You don’t generally have anyone in your life who has gone out on their own and done something entrepreneurial or done something artistic or truly risky or truly taken the path less traveled, because those people [. . .] don’t work in academics. And don’t become cubicle monkeys. So what I try and explain in my speeches is that there’s a third way. Because a lot of people—I think most people—want to do something besides those two things.

A lot of people want to do something else, but that something else is, in some ways, harder to do than the normal path. Yet the people who go the third way often talk about it as being more satisfying, and the people who go the “two paths” often speak wistfully of the third—despite the difficulty one is likely to encounter. A friend wrote this to me: “I know for a fact that I’d hate [Tucker] Max’s writing, but he’s dead right about how few students are aware that they can do something artistic or creative or entrepreneurial.” Too few students are aware of this—and too few people in general are. You can consider this post a very small step in the direction of increasing awareness.

So far I’ve noted two examples. Paul Graham talks about the problem of standard paths too, in “A Student’s Guide to Startups”: “Till recently graduating seniors had two choices: get a job or go to grad school. I think there will increasingly be a third option: to start your own startup.” His answer is more defined than Deresiewicz’s or Max’s, but the very language he uses is similar. He’s also got a way of generating the “third way” by funding startups: instead of merely telling people to find one, he’s creating a path for people to follow, which might be the most valuable contribution of all, at least for the technically inclined.

I think all three of these disparate writers—Deresiewicz, Max, and Graham—are pointing to a more fundamental need for the imagination necessary to exit the obvious paths that so often end up going nowhere. Of the three, Graham has done the most to institutionalize this process and make it available to others by starting Y Combinator. Max has probably done the most to be a living embodiment of an unusual third way. Deresiewicz is pointing to the possibility from within a well-defined path (and the same one I’m on) from undergrad to graduate school to being a professor. Taken together, they diagnose and offer treatment for the same malady that can’t quite be identified yet comes from so many sources and has so many symptoms: Dilbert, cubicles, malaise, ennui, fluorescent lights, midlife crises, 20-somethings with advanced degrees working as baristas, waiters, bartenders, or essay writers.

Artistic or creative activities don’t usually come prepackaged in convenient jobs that get handed to college graduates. They get created by people who are artistic and creative, who find a way to turn what they want to do, or their inchoate ideas, into something greater than the idea itself. The “inchoate idea” is important: I suspect most people don’t entirely know what they’re doing when they find a third way. Steven Berlin Johnson has a term for this in his book Where Good Ideas Come From: The Natural History of Innovation: the slow hunch. This happens when something that you’ve been gnawing on slowly develops over time. Johnson describes it much more fully, of course, but a lot of my ideas in writing novels or academic work come from slow hunches. Writing fiction isn’t an activity that really comes packaged in convenient job form: it is made by each practitioner individually. People who succeed as writers sometimes do so not through conventional publishing, but through alternate ways—as Max did with his website, or as J.A. Konrath apparently does with his blog, “A Newbie’s Guide to Publishing.”

Like Deresiewicz and Max, I don’t really have a solution to the problem other than to encourage you to think imaginatively. But who’s against thinking imaginatively? Partners are probably telling their third-year associates the same thing, even as the associates put in soul-killing seventy-hour weeks under those menacing fluorescent lights. The other part of my solution is to be aware of the problem. I’ll also channel Graham in “What You’ll Wish You’d Known” and encourage you to stay upwind:

In the graduation-speech approach, you decide where you want to be in twenty years, and then ask: what should I do now to get there? I propose instead that you don’t commit to anything in the future, but just look at the options available now, and choose those that will give you the most promising range of options afterward.

It’s not so important what you work on, so long as you’re not wasting your time. Work on things that interest you and increase your options, and worry later about which you’ll take.

Suppose you’re a college freshman deciding whether to major in math or economics. Well, math will give you more options: you can go into almost any field from math. If you major in math it will be easy to get into grad school in economics, but if you major in economics it will be hard to get into grad school in math.

Flying a glider is a good metaphor here. Because a glider doesn’t have an engine, you can’t fly into the wind without losing a lot of altitude. If you let yourself get far downwind of good places to land, your options narrow uncomfortably. As a rule you want to stay upwind.

“Work on things that interest you and increase your options”: the target of Graham’s essay is nominally high school students, but it’s applicable to a much broader swath of people. Maybe you’re one of them. If so, you’ll probably read this and then go back to filling out those TPS reports. Or maybe you’ll be one of the very rare people who realize there is no speed limit and react appropriately. At least you can’t say that no one told you. At least three people have: Deresiewicz, Max, and Graham. Four if you count me, writing a meta essay.

No one can agree on how to make tea

Since reading “A Hacker’s Guide to Tea” (and this worthy discussion) I’ve begun drinking more of the beverage, which I rather like now that I know how to make it: tea isn’t hard to prepare. But I came from the idiotic “more is better” school of thought and figured the longer and hotter that tea is steeped, the better it must be. In reality, this just makes it tremendously bitter and vile.

In fact, light teas—like green and white—need to be steeped at temperatures well below boiling for about a minute or two. Black teas should be steeped with boiling water for two to three minutes. Tea should be loose leaf and circulate freely with the hot water poured on it; I now use an IngenuiTEA from Adagio for one to two cups. The drink falls from the bottom of the device, rather like it’s peeing, but I find the overall effect quite amusing.

Still, the number of people with very strong and conflicting opinions about how to make tea is astonishing. “Very strong and conflicting opinions” would also have made an excellent title for Christopher Hitchens’ memoir, but today he merely offers bilious tea-making instructions—and that’s as strange a construction to write as it is to read—in “How To Make a Decent Cup of Tea: Ignore Yoko Ono and John Lennon, and heed George Orwell’s tea-making advice”:

It’s quite common to be served a cup or a pot of water, well off the boil, with the tea bags lying on an adjacent cold plate. Then comes the ridiculous business of pouring the tepid water, dunking the bag until some change in color occurs, and eventually finding some way of disposing of the resulting and dispiriting tampon surrogate. The drink itself is then best thrown away, though if swallowed, it will have about the same effect on morale as a reading of the memoirs of President James Earl Carter.

I love the overstated, overstuffed phrasing: “ridiculous business,” “dispiriting tampon surrogate,” “best thrown away.” But his advice is limited to black tea. He goes on to quote Orwell: “ ‘[O]ne should take the teapot to the kettle, and not the other way about. The water should be actually boiling at the moment of impact, which means that one should keep it on the flame while one pours.’ This isn’t hard to do, even if you are using electricity rather than gas, once you have brought all the makings to the same scene of operations right next to the kettle.” But in The Story of Tea: A Cultural and Drinking Guide (which is not very good and reads like a travelogue), Mary Lou and Robert J. Heiss say:

While millions of avid tea drinkers around the world ‘take the teapot to the kettle’ to use water that is as hot as possible to brew ‘proper English tea,’ we find that even the stoutest black teas prefer to be brewed in water that is slightly off the boil. Any perceived reduction in strength can be made up by steeping the tea a little longer.

Here is my proposition for Hitchens and innumerable others: instead of insisting that one way is better, why not take the Coke-Pepsi challenge? Brew a large number of cups both ways, give them to a large number of people over a large number of occasions, and see which one comes out ahead. More likely than not, neither will. Based on the large amount of contradictory advice I’ve read regarding tea, I would guess that once one has reasonably fresh loose-leaf tea and a reasonable knowledge of approximate brewing temperatures, the rest is superstition.

The analogy to wine is probably appropriate: except for people with very highly developed senses for wine, most of us can probably tell “bad,” “better,” and “best” but little more. So we decide what wine to drink based on price and innuendo more than anything else. By the same token, I bet that Hitchens can’t really tell the difference between tea brewed off the boil or not, but he probably derives a certain amount of status from having very strong opinions about how tea should be brewed. I leave it to the reader who is familiar with Hitchens’ work to decide whether this general principle might apply beyond the realm of caffeinated beverages.

Finally, Hitchens is only writing about black tea, but he doesn’t say as much. Making green or white tea as he recommends will yield something terrible. Still, even there the advice is contradictory: Tony at The Chicago Tea Company—quoted in the first link—says black tea should be steeped for one minute or so. “A guide to tea” by the foppish Chris Cason says that black tea should be steeped no more than five minutes, while white teas are more forgiving and could be steeped as long as seven. I am more inclined to agree with Tony, based on experiment. The issue of making tea should not, however, be argued with the fervor of someone discussing Middle Eastern politics.

EDIT: I’m now reading Orwell’s “A Nice Cup of Tea,” in which he says:

If you look up ‘tea’ in the first cookery book that comes to hand you will probably find that it is unmentioned; or at most you will find a few lines of sketchy instructions which give no ruling on several of the most important points.
This is curious, not only because tea is one of the mainstays of civilisation in this country, as well as in Eire, Australia and New Zealand, but because the best manner of making it is the subject of violent disputes.
When I look through my own recipe for the perfect cup of tea, I find no fewer than 11 outstanding points. On perhaps two of them there would be pretty general agreement, but at least four others are acutely controversial. (Orwell, Essays, p. 990)

To me, the most interesting part of this is his comment that on only two of 11 points “there would be pretty general agreement,” while “at least four others are acutely controversial.” This indicates that tea-making preferences have been contested for at least sixty years (the essay was published in 1946) and are likely to remain controversial in the near future. So far as I know, “violent disputes” haven’t resulted from tea making, but then perhaps Americans, especially those in overheated Arizona, are not so particular about tea, or there isn’t the critical mass necessary for violent factions to form.

EDIT 2: A redditor pointed me to the ISO 3103 standard on making tea. Even the parody, however, leans toward black tea: “The method consists in extracting of soluble substances in dried tea leaf, containing in a porcelain or earthenware pot, by means of freshly boiling water [. . .]” Follow the standard regarding green tea and you’ll find a less-than-optimal cup.

As long as we’re discussing Hitchens, here’s one of the more amusing quotes from Hitch-22: “I always take it for granted that sexual moralizing by public figures is a sign of hypocrisy or worse, and most usually a desire to perform the very act that is most being condemned.”

Why unpublished novelists keep writing: why not? An answer as to why this one does

Alix Christie’s “We Ten Million” asks why unpublished novelists write, the number being an estimate of the number of unpublished novels out there (hat tip Heather Horn). Very few books get published; very few that do get any attention; very few of those make any money; and delusion is a vital skill for many who continue writing. Rationally, most of these would-be writers would probably be better off if they quit writing and did something more economically and socially productive with their time, like working for Wal-Mart, digging holes and filling them up, writing blogs about their cats, etc.

According to Horn, possible answers include: the idea of a craft, the importance of literature (even if it’s unread?), the need for story, and art as courage. I’m not sure I buy any of those, or any of Christie’s answers. I think the real reason is simpler: novelists keep writing because they basically like the act of writing novels. Publishing, fame, fortune, and all the rest would be nice, as they certainly would be for this unpublished writer with an inbox full of requests for fulls and partials (industry lingo for “send me the full manuscript” or “send me some chapters”) from agents, but the possibility of future and unlikely accolades doesn’t fuel the work on a daily basis. Instead, the daily drive to succeed is about the material itself. I’ve mentioned this famous quote before and will do so again: “Robertson Davies, the great Canadian novelist, once observed: ‘There is absolutely no point in sitting down to write a book unless you feel that you must write that book, or else go mad, or die.’ ”

The people writing unpublished novels are presumably doing so in lieu of going mad or dying. They feel they have to or need to write.

In a recent post, I wrote about an exchange with a friend who’s an undergrad:

A lot of my motivation comes from a fantasy of myself-as-_____, where the role that fills the blank tends to change erratically. Past examples include: writer, poet, monk, philosopher, womanizer. How long will the physicist/professor fantasy last? 

I replied:

This is true of a lot of people. One question worth asking: Do you enjoy the day-to-day activities involved with whatever the fantasy is? For me, the “myself-as-novelist” fantasy continues to be closer to fantasy than reality, although “myself-as-writer” is definitely here. But I basically like the work of being a novelist: I like writing, I like inventing stories, I like coming up with characters, plot, etc. Do I like it every single day? No. Are there some days when it’s a chore to drag myself to the keyboard? Absolutely. And I hate query letters, dealing with agents, close calls, etc. But I like most of the stuff and think that’s what you need if you’re going to sustain something over the long term. Most people who are famous or successful for something aren’t good at the something because they want to be famous or successful; they like the something, which eventually leads to fame or success or whatever.

“I basically like the work of being a novelist,” including the writing and so forth. That’s why I keep going. I think anyone who continues for any other reason is probably already mad, to use Davies’ term. Alternately, a lot of the would-be novelists out there are probably writing not because they want to get published, but to work out their inner demons, or signal something, or because they don’t know what else to do with their lives, or because they’re misinformed. They’re doing something other than really trying to write something that someone else might want to read.

I’m reminded of a passage from Norah Vincent’s nonfiction book Self-Made Man, in which she describes dressing like and passing as a man. Vincent, dressed as a man named “Ned,” describes going out with a woman she met on an online dating site, who “was either the most conversationally inconsiderate person I’d ever met or the most socially impervious:”

Clearly she wasn’t ready to start dating again. She wasn’t looking for a relationship. She was looking for distraction and an ear to tell her troubles to. She didn’t have enough emotional energy left to get seriously involved with Ned [. . .]

A lot of would-be writers are probably doing much the same. I’d guess that relatively few of those ten million novels are publishable, and that few of their writers have any clue what something like “publishable” might mean (I didn’t when I started, which might’ve helped me; more on that below). As Laura Miller says regarding the “slush pile” of unsolicited queries agents and publishers get:

You’ve either experienced slush or you haven’t, and the difference is not trivial. People who have never had the job of reading through the heaps of unsolicited manuscripts sent to anyone even remotely connected with publishing typically have no inkling of two awful facts: 1) just how much slush is out there, and 2) how really, really, really, really terrible the vast majority of it is. Civilians who kvetch about the bad writing of Dan Brown, Stephenie Meyer or any other hugely popular but critically disdained novelist can talk as much trash as they want about the supposedly low standards of traditional publishing. They haven’t seen the vast majority of what didn’t get published — and believe me, if you have, it’s enough to make your blood run cold, thinking about that stuff being introduced into the general population.

So you can probably knock off at least 90% of those unpublished novels as not even being serious attempts, where “serious” means “at least thinking about what makes good novels good and bad novels bad.” Of those serious attempts, a lot of them are probably written by people who will one day be good but aren’t yet (Charlie Stross, the SF writer: “[. . .] I was averaging 1-2 novels a year, for very approximate values of ‘novel’. (They weren’t publishable. I was writing my million words of crap. You don’t want to read them, honest.)”). John Scalzi says something similar: “Writing an entire novel is something most people have to work up to,” and it’s really hard.

I started four novels and wisely abandoned them. I finally wrote two feature-complete novels, in the sense that they had beginnings, middles, and ends that the middles more or less led to, but they were terrible, and I sent them to agents and got deservedly rejected. If you were one of those slush pile readers, I apologize, but those attempts are so far in the past that you’ve probably forgotten them. Then I wrote the last three novels over the last three or so years and started getting those requests for fulls and partials, which was a lot like the typical dating experience in that they ended with variations of “I like you, but not in that way.”

Nonetheless, I would like to think I can stand far enough back from myself to say that, at the very least, they’re publishable and, I think, quite fun. Eventually I assume I will write something that gets a literary agent or press to agree with me—or I’ll go mad or die before that day arrives. Between now and then, I keep writing mostly because a) I’m an idiot (this shouldn’t be discounted) and b) I mostly like the work, as I described above. The second might seem a minor variation on what Christie says—”the only reason is my belief that I have got a story that I must tell”—but it’s a sufficiently important one that I’ll emphasize it here.

The function of stories in society and some of that other stuff is good, but I’m still guessing that my real reason (and, probably, hers) is that I like to write, which is slightly different from having a story to tell. I suspect the same is true of most artists and intellectuals and hackers; even most hacker/programmer types probably like the fact that they can change the world with their code, and so forth, but their big motivation is probably solving problems and writing code. Notice how the verb “writing” takes on a noun—code—that “writing prose” has lost. The word shows the similar impetus underlying both activities.

I’m not a hacker because, although I’ve written a little bit of code, I don’t like doing it all that much. If I did, it would’ve been vastly smarter to pursue that than to continue what I’m doing now. At least I’ve done enough to appreciate how hard it is to write code. And those who write good code are rewarded for their skill. Good hackers, programmers, or computer scientists (take your pick, each with its shades of connotation but denoting more or less the same activity) make a lot of money, and the smart ones often have an immediate, tangible effect on the world. This is sometimes but not always true of writers. But when I began writing fiction with some level of seriousness, I didn’t sit down and say to myself, “What is the optimal path?” I had some ideas and began typing. A depressingly large number of years later, I’m still doing the same basic thing in a way that might be detrimental to my own best interests. So why do I keep going? Why am I part of the ten million?

Because I like the work.

Being wrong, and a partial list of ways I’ve been wrong

A variety of somewhat big-deal econ bloggers have written about things they now believe they were wrong about. Looking back on changed opinions (which is a slightly more polite way of saying “I was wrong”) is a useful exercise in intellectual honesty—a trait most people lack. I might be among them, though I like to think I’m more intellectually honest than I actually am.

Still, here are some (unsorted) opinions on topics about which I’ve been wrong or at least not as right as I could be:

1) I basically believed that the stock market’s average rate of return would remain 10% per year over reasonable time periods. That it will still average somewhere close to 10% per year still seems probable, but the “reasonable time periods” (like two decades or so) no longer does, and in the long run, as a famous economist whose name escapes me observed, we’re all dead.

2) Like McArdle, I thought the “Great Moderation” was real up until the last six months or so.

3) There are some things I was wrong about that turned out well: I didn’t think we’d see a black president in my lifetime. In 2004, if you’d told me that a black man would be president in 2008, I probably would’ve laughed at you.

4) I didn’t get why people liked Jane Austen until I read James Wood’s How Fiction Works, with its description of free indirect speech, and his examples from Austen. Now I do.

5) The iPhone? Nice, but a fad. I didn’t think it would be as important as it has been, or that other phone manufacturers would be so slow to respond.

6) I didn’t think Facebook would become and stay as popular as it is; I signed up as an undergrad chiefly as a quick way of figuring out which girls already had boyfriends. Now I seldom log on, but evidently I’m in the minority.

7) I used to believe that it was possible to have rational discussions about religion and/or politics with most people. Now I don’t. Both subjects are seldom subjected to empirical tests, so no feedback mechanism can demonstrate when or if a belief is wrong. Politics is (slightly) more subject to such tests, via elections, studies, and the like, but the broadest political beliefs aren’t really. See Paul Graham’s “Keep Your Identity Small” for more on this subject, along with “What You Can’t Say.” At best one can have meta-conversations about religion and politics (“Why do people need religion?”).

8) During the ramp-up to the Iraq war, I was in college, and many of my professors were virulently against the war and thought that the government was perfectly capable of dissembling and distorting the debate about weapons of mass destruction; some had lived through Vietnam, with its phony Gulf of Tonkin incident, and the later Iran-Contra hearings. I hadn’t and thought it wildly implausible that so many people and institutions would be hoodwinked by faulty information, so I was more or less in favor of the war, like a lot of my equally gullible compatriots.

Oops.

9) On first reading Carlos Ruiz Zafón’s The Shadow of the Wind, I didn’t appreciate many of its most impressive qualities, especially regarding the narrative, the dialogue, and the extent to which the novel combines post-modern games with immense readability. Now I do.

10) I used to think that the sexual double standard was primarily due to misinformation, the cruel application of religious principles to individual lives, ignorance, and malice. Now I think the sexual double standard is primarily due to daughter-guarding by parents and parents’ influence on culture, female efforts to guard men through slandering their potential competitors’ reputations, general female competitiveness, the fact that the choosier sex is always the one that invests more in offspring, and the differing economic and pleasure incentives acting on children and on their parents.

These forces help explain a great deal of our culture’s confusion about sexuality and its mixed messages—especially among the young. I used to think this confusion would eventually devolve into a more laissez-faire, I’m-okay-you’re-okay attitude, which it still might, but now that day seems very far off.

(See my essay “The Weekly Standard on the New-Old Dating Game, Hooking Up, Daughter-Guarding, and much, much more” for details.)

11) A student question from two years ago prompted me to realize that I used to believe something close to the classical economic model of man, in which behavior automatically reveals preferences and anyone who does something must rationally believe it will benefit them. Now I’ve realized that context, framing effects, peer pressure, time preferences, and the like have a far greater effect than I once gave them credit for. Reading Dan Ariely’s Predictably Irrational, Philip Zimbardo’s The Lucifer Effect and The Time Paradox, Neil Strauss’ The Game, and Tim Harford’s The Logic of Life contributed to my change in views.

It might not hurt for you to try this test for yourself: if you can’t think of anything you’ve been wrong about, does that mean that you’re consistently right about everything, or does that mean something quite different? If you need help, there’s an entire book on the subject: Kathryn Schulz’s Being Wrong: Adventures in the Margin of Error, although I haven’t actually read said book yet.

Don’t rent an apartment from Navid Abedian in Tucson, Arizona, or, how I learned to be wary of lawsuits

In 2008 I moved to Tucson for grad school and rented a condo that turned out to be a decent place to live, except for the landlord and a neighbor universally referred to as “Crazy Nick” (he was not crazy in a good way). When my roommate and I moved out, the landlord kept about $500 of our deposit despite promising that he’d refund it. Stealing our security deposit violated Arizona’s Residential Landlord & Tenant Act, which regulates the usual tenant-landlord problems.

Because I’m such a smart guy and was both unhappy about his lies and interested in our money, I decided to sue him in small claims court, where I eventually won a $1,350 judgment. He paid $350 after a debtor’s hearing in January and promised to pay the rest; in June I sought another debtor’s hearing to compel him to pay, at which point he threatened to come over to my apartment and kill me. For those of you keeping score, this marks the second time someone has threatened to kill me in one summer, up from zero times previously in my entire life.

I filed a police report, stayed at friends’ houses for a few days, and canceled the hearing: improbable though Abedian’s threat might be, it’s not worth shooting or being shot over $1,000. He’s also a cipher to me: all I know is that he works in a carpet store, bought a condo in Tucson near the height of the ’00s real estate boom and, according to a Google search, might have his house foreclosed on. In other words, he might be desperate, and people have killed each other over far less than $1,000.

Although running away sets a bad precedent—will he just threaten to kill the next tenant who comes along? am I failing to do right by my fellow man?—I still think capitulating is wiser than continuing.

What originally seemed to be mostly entertainment (going to court and pontificating) began to suck up way too much mental energy. In the Hacker News discussion of Paul Graham’s “The Top Idea In Your Mind,” grellas wrote, “There is a lesson here about lawsuits, which will drain you of both money and peace of mind all at the same time. Sometimes you can’t turn the other cheek, much as you would like to do so, and have no choice but to fight. Having the guts to stand up for yourself (or for your company) is in itself a virtue and there are times when it is best not to walk away.”

He’s right, and a lawsuit I’d imagined as entertainment, and as teaching a useful lesson that might pay dividends for the next tenants, backfired. It also occupied way too much space in my mind—space that I should’ve spent writing or doing research. Instead I worried about the sanity and desperation of a guy I didn’t know and who was probably armed.

In Francine Prose’s novel Touch, the protagonist is a 14- or 15-year-old girl named Maisie, who tells her preening stepmother, Joan, a version of what happened on a bus when two or three boys touched her breasts in somewhat murky circumstances. It isn’t clear at the narrative’s start whether she consented, but the event as narrated to us is also one in which the characters act without enough culpability to call what they did anything beyond adolescent horseplay and power struggles.

Joan wants to meet a lawyer, and Maisie narrates: “I was filled with dread. Pure dread. It felt like icy water trickling down my back.” Joan says, “It would be a matter of principle.” But most people don’t lead their lives solely according to principle; pragmatics matter too. Few of us want to be martyrs for a cause, and if we do, that cause better be worth it. Most of us want to get along. Altruistic punishment is real but can be overrated, and pursuing the case would harm Maisie’s own well-being and self-interest. I thought I was standing up for the principle of tenants’ rights and for fairness, but I chose to give up that principle when Navid threatened to kill me. Pragmatics won.

Like Maisie, I’m choosing pragmatism—which I probably should’ve chosen in the first place. I’ve started Bleak House a couple of times (I’m not a Dickens fan) and understand Jarndyce and Jarndyce well enough to know that lawsuits are often vehicles for mutually assured destruction more than they are about fairness or rights. When in doubt or when it’s avoidable, don’t get the law involved. And, apparently, be ready to write off your security deposit.

EDIT: It’s 2014 and I’d mostly forgotten about Navid, but I just got a letter saying that he declared bankruptcy. I’m a) somehow listed as a creditor and b) the Department of Justice somehow got my current address, in order to c) invite me onto some kind of creditors’ committee. I wish I could say that I don’t feel a little schadenfreude, but, alas, I’m too small a person. Apparently his wife or ex-wife, Linda Kay Abedian Stevens—or Linda Kay Stevens? the wording is unclear—was also on the lease and on the property deed.

Since leaving Tucson I have been threatened with death zero times.