College graduate earning and learning: more on student choice

There’s been a lot of talk among economists and others lately about declining wages for college graduates as a group (for example: Arnold Kling, Michael Mandel, and Tyler Cowen) and males in particular. Mandel says:

Real earnings for young male college grads are down 19% since their peak in 2000.
Real earnings for young female college grads are down 16% since their peak in 2003.

See the pretty graphs at the links. These accounts are interesting but don’t emphasize, or don’t emphasize as much as they should, student choice in college majors and how that affects earnings. In “Student choice, employment skills, and grade inflation,” I said that colleges and universities are, to some extent, responding to student demand for easier classes and majors that probably end up imparting fewer skills and paying less. I’ve linked to this Payscale.com salary data chart before, and I’ll do it again; the majors at the top of the income scale are really, really hard and have brutal weed-out classes for freshmen and sophomores, while those at the bottom aren’t that tough.

It appears that students are, on average, opting for majors that don’t require all that much effort.

From what I’ve observed, even naive undergrads “know” somehow that engineering, finance, econ, and a couple other majors produce graduates who earn more, yet many end up majoring in simple business (notice the linked NYT article: “Business majors spend less time preparing for class than do students in any other broad field, according to the most recent National Survey of Student Engagement [. . .]”), comm, and other fields not noted for their rigor. As such, I wonder how much of the earnings picture in his graphs is about declining wages as such and how much of it is really about students choosing majors that don’t impart job skills or knowledge (cf. Academically Adrift, etc.) but do leave plenty of time to hit the bars on Thursday night. Notice too what Philip Babcock and Mindy Marks found in “The Falling Time Cost of College: Evidence from Half a Century of Time Use Data”: “Full-time students allocated 40 hours per week toward class and studying in 1961, whereas by 2004 they were investing about 26 to 28 hours per week. Declines were extremely broad-based, and are not easily accounted for by compositional changes or framing effects.”

If students are studying less, maybe we shouldn’t be surprised that their earnings decline when they graduate. I can imagine a system in which students are told that “college” is the key to financial, economic, and social success, so they go to “college” but don’t want to study very hard or learn much. They want beer and circus. So they choose majors in which they don’t have to study hard or learn much. Schools, in the meantime, like the tuition dollars such students bring—especially when freshmen and sophomores are often crammed into 300 – 1,000-person lecture halls that are extraordinarily cheap to operate because students are charged the same amount per credit hour for a class of 1,000 as they are for a seminar of 10. Some disciplines increasingly weaken their offerings in response to student demand.

Business appears to be one of those majors. It’s in the broad middle of Payscale.com’s salary data, which is interesting given how business majors presumably go into their discipline in part hoping to make money—but notice too just how many generic business majors there are. The New York Times article says “The family of majors under the business umbrella — including finance, accounting, marketing, management and ‘general business’ — accounts for just over 20 percent [. . .] of all bachelor’s degrees awarded annually in the United States, making it the most popular field of study.” That’s close to what Louis Menand reports in The Marketplace of Ideas: “The biggest undergraduate major by far in the United States is business. Twenty-two percent of all bachelor’s degrees are awarded in that field. Ten percent of all bachelor’s degrees are awarded in education.” If all these business majors graduate without any job skills, maybe we shouldn’t be all that surprised at their inability to command high wages when they graduate.

I’d like to know: has the composition of majors changed over the years Mandel documents? If so, from what to what? Menand has some coarse data:

There are almost twice as many bachelor’s degrees conferred every year in social work as there are in all foreign languages and literatures combined. Only 4 percent of college graduates major in English. Just 2 percent major in history. In fact, the proportion of undergraduate degrees awarded annually in the liberal arts and sciences has been declining for a hundred years, apart from a brief rise between 1955 and 1970, which was a period of rapidly increasing enrollments and national economic growth. Except for those fifteen unusual years, the more American higher education has expanded, the more the liberal arts sector has shrunk in proportion to the whole.

But he’s not trying to answer questions about wages. Note too that my question about composition is a genuine one: I have no idea what the answer is.

One other major point: if Bryan Caplan is right about college being about signaling, then there might also be a larger composition issue than the one I’ve already raised: people who aren’t skilled learners and who don’t have the willingness or capacity to succeed after college may be increasingly attending college. In that case, the signal of a college degree isn’t as valuable because the people themselves going through college aren’t as good—they’re on the margins, and the improvement to their skillset is limited. Furthermore, colleges and universities aren’t doing all that much to improve that skillset—see again Academically Adrift.

I don’t know what, if anything, can be done to improve this dynamic. Information problems about which college majors pay the most don’t seem to be a major issue, at least anecdotally; students know that comm degrees are easy and other, more lucrative degrees are hard. There may be Zimbardo / Boyd-style time preference issues going on, where students want to consume present pleasure in the form of parties and “hanging out” now at the expense of earnings later, and universities are abetting this in the form of easy majors.

This is the part where I’m supposed to posit how the issues described above might be improved. I don’t have top-down, pragmatic solutions to this problem—nor do I see strong incentives on the part of any major actors to solve it. Actually, I don’t see any solutions, whether top-down or bottom-up, because I don’t think the information asymmetry is all that great and consumption preferences mean that, even with better information, students might still choose comm and generic business.

Mandel ends his post by saying, “Finally, if we were going to design some economic policies to help young college grads, what would they be?” The answer might be something like, “make university disciplines harder, so students have to learn something by the end,” but I don’t see that happening. That he asks the question indicates to me he doesn’t have an answer either. If there were one, we wouldn’t have a set of interrelated problems regarding education, earnings, globalization, and economics, which aren’t easy to disentangle.

Although I don’t have solutions, I will say this post is a call to pay more attention to how student choices and preferences affect education and earnings discussions.

EDIT: See also College has been oversold, and pay special attention to the data on arts versus science majors. I say this as someone who majored in English and now is in grad school in the same subject, but by anecdotal observation I would guess about 75% of people in humanities grad schools are pointlessly delaying real life.

Facebook, go away—if I want to log in, I know where to find you

Facebook keeps sending me e-mails about how much I’m missing on Facebook; see the image at the right for one example. But I’m not convinced I’m missing anything, no matter how much Facebook wants me to imagine I am.

In “Practical Tips on Writing a Book from 23 Brilliant Authors,” Ben Casnocha says that writers need to “Develop a very serious plan for dealing with internet distractions. I use an app called Self-Control on my Mac.” Many other writers echo him. We have, all of us, a myriad of choices every day. We can choose to do something that might provide some lasting meaning or value. Or we can choose to tell people who are often effectively strangers what we ate for dinner, or that we’re listening to Lynyrd Skynyrd and Lil’ Wayne, or our ill-considered, inchoate opinions about the political or social scandal of the day, which will be forgotten by everybody except Wikipedia within a decade, if not a year.

Or we can choose to do something better—which increasingly means we have to control distractions—or, as Paul Graham puts it, “disconnect” them. Facebook and other entities that make money from providing distractions are, perhaps not surprisingly, very interested in getting you more interested in their distractions. That’s the purpose of their e-mails. But I’ve become increasingly convinced that Facebook offers something closer to simulacra than real life, and that the people who are going to do something really substantial are, increasingly, going to be the people who can master Facebook—just as the people who did really substantial things in the 1960 – 2005 period learned to master TV.

Other writers in the “Practical Tips” essay discuss the importance of setting work times (presumably distinct from Facebook times) or developing schedules or similar techniques to make sure you don’t let, say, six hours pass, then wonder what happened during those six hours—probable answers might include news, e-mail, social networks, TV, dabbling, rearranging furniture, cleaning, whatever. All things that might be worthwhile, but only in their place. And Facebook’s place should be small, no matter how much the site itself encourages you to make it big. I’ll probably log on to Facebook again, and I’m not saying you should never use Facebook, or that you should always avoid the Internet. But you should be cognizant of what you’re doing, and Facebook is making it increasingly easy not to be cognizant. And that’s a danger.

I was talking to my Dad, who recently got on Facebook—along with Curtis Sittenfeld joining, this is a sure sign Facebook is over—and he was creeped out by having Pandora find his Facebook account with no active effort on his part; the same thing happened when he was posting to TripAdvisor under what he thought was a pseudonym. On the phone, he said that everyone is living in Neuromancer. And he’s right. Facebook is trying to connect you in more and more places, even places you might not necessarily want to be connected. This isn’t a phenomenon unique to Facebook, of course, but my Dad’s experience shows what’s happening in the background of your online life: companies are gathering data from you that will reappear in unpredictable places.

There are defenses against the creeping power of master databases. I’ve begun using Ghostery, a brilliant extension for Firefox, Safari, and Chrome that lets you see the web bugs, beacons, and third-party sites that follow your movements around the Internet. Here’s an example of the stuff Salon.com, a relatively innocuous news site, loads every time a person visits:

What is all that stuff? It’s like the mystery ingredients in so much prepackaged food: you wonder what all those polysyllabic substances are but still know, on some level, they can’t be good for you. In the case of Salon.com’s third-party tracking software, Ghostery can at least tell you what’s going on. It also gives you a way to block a lot of the tracking—hence the strikethroughs on the sites I’ve blocked. The more astute among you will note that I’m something of a hypocrite when it comes to a data trail—I still buy stuff from Amazon.com, which keeps your purchase history forever—but at least one can, to some extent, fight back against the companies that are tracking everything you do.

But fighting back technologically, through means like Ghostery, is only part of the battle. After I began writing this essay, I began to notice things like this, via a Savage Love letter writer:

I was briefly dating someone until he was a huge asshole to me. I have since not had any contact with him. However, I have been Facebook stalking him and obsessing over pictures of the guys I assume he’s dating now. Why am I having such a hard time getting over him? Our relationship was so brief! He’s a major asshole!

I don’t think Facebook is making it easier for the writer to get over him or to improve their life. It wouldn’t be a great stretch to think Facebook is making the process harder. So maybe the solution is to get rid of Facebook, or at least limit one’s use, or unfriend the ex, or some combination thereof. Go to a bar, find someone else, reconnect with the real world, find a hobby, start a blog, realize that you’re not the first person with these problems. Optimal revenge, if you’re the sort of person who goes in that direction, is a life well-lived. Facebook stalking is the opposite: it’s a life lived through the lives of others, without even the transformative power of language that media like the novel offer.

Obviously, obsessive behavior predated the Internet. But the Internet and Facebook make it so much easier to engage in obsessive behavior—you don’t even have to leave your house!—that the lower friction costs make the behavior easier to indulge. One solution: remove the tool by which you engage in said obsessive behavior. Dan Savage observes, “But it sounds like you might still have feelings for this guy! Just a hunch!” And if those feelings aren’t reciprocated, being exposed to the source of those feelings on a routine basis, even in digital form, isn’t going to help. What is going to help? Finding an authentic way of spending your time; learning to get in a state of flow; building or making stuff that other people find useful. Notice that Facebook is not on that list.

Some of you might legitimately ask why I keep a Facebook account, given my ambivalence, verging on antipathy. The answers are severalfold: the most honest is probably that I’m a hypocrite. The next-most honest is that, if / when my novels start coming out, Facebook might be useful as an ad tool. And some people use Facebook and only Facebook to send out messages about events and parties. It’s also useful, when I’m going to a random city, to figure out who might’ve moved there. Those people you lost touch with back in college suddenly become much closer when you’re both strangers somewhere.

But those are rare needs. The common needs that Facebook fulfills—to quasi-live through someone else’s life, to waste time, to feel like you’re on an anhedonic treadmill of envy—shouldn’t be needs at all. Facebook is encouraging you to make them needs. I’m encouraging you to realize that the real answers to life aren’t likely to be found on Facebook, no matter how badly Facebook wants to lure you to that login screen—they’re likely going to be found within.


By the way, I love “Practical Tips on Writing a Book from 23 Brilliant Authors.” I’ve read it a couple times and still love it. It’s got a lot of surface area for such a short post, which is why I keep linking to it in various contexts.

The Steve Jobs Biography

Like everyone else, I started Walter Isaacson’s Steve Jobs biography today. It’s wonderful. In the first pages, Isaacson gives a sense of how Jobs both viewed himself and was viewed in his place at Apple: “When he was restored to the throne at Apple [. . . .]” How many companies could see their CEOs as occupying thrones? Almost no one has or had the medieval level of control Jobs did over Apple. But he didn’t exercise that control capriciously: he used it to make things people want. Lower on the same page, Isaacson describes his unwillingness to write on Jobs at first, but he says that he “found myself gathering string on the subject” of Apple’s early history. “Gathering string:” it’s something I do all the time, using the methods Steven Berlin Johnson describes in this essay about DevonThink Pro. One imagines the string eventually being knit into a sweater, but first one has to have the material.

A page later, Isaacson says “The creativity that can occur when a feel for both the humanities and the sciences combine in one strong personality was the topic that most interested me in my biographies of Franklin and Einstein, and I believe that it will be a key to creating innovative economies in the twenty-first century.” By now, such an assertion is almost banal, but that doesn’t mean it isn’t right and doesn’t mean it shouldn’t be asserted. Whenever you hear someone creating the false binary C. P. Snow describes in The Two Cultures, point them to Jobs, who is merely the most salient example of why there aren’t two or more cultures—there’s one. You can call it creative, innovative, human-centered, discovery-oriented, bound by makers, or any number of other descriptions, but it’s there. It’s not just “a key to creating innovative economies in the twenty-first century,” either. It’s a key to being.

One more impression: while discussing the Apple II and the role of marketing guy Mike Markkula, Isaacson describes the three principles Markkula adopts: “empathy,” “focus,” and, most interesting to this discussion, the “awkwardly named [. . .] impute.” The last principle “emphasized that people form an opinion about a company or product based on the signals that it conveys. ‘People DO judge a book by its cover,’ he wrote.” He’s right, and that brings up this book as a physical object: it’s beautiful. A single black and white picture of Jobs as an older man, still looking vaguely like a rapscallion, dominates the cover. Another picture of him, this time as a younger man, dominates the back. The pages themselves are very white, and the paper quality is high; ink doesn’t bleed through easily, and the paper resists feathering. Jobs agreed not to meddle with the text; Isaacson says “He didn’t seek any control over what I wrote.” But he did meddle around the text: “His only involvement came when my publisher was choosing the cover art. When he saw an early version of a proposed cover treatment, he disliked it so much that he asked to have input in designing a new version. I was both amused and willing, so I readily assented.”

Good. I wonder if Jobs had “input” in the paper quality too. Sometimes I wonder if publishers are themselves trying to encourage people to adopt eBooks through the use of lousy paper stock and flimsy spines, especially in hardcovers. Take Steven Berlin Johnson’s excellent book, Where Good Ideas Come From. The cover is black, with yellow text shaped like a lightbulb. Excellent design. But the pages themselves are a brownish gray, like newsprint, and the glued binding feels flimsy. The paperback is probably worse. It’s not the kind of book one would imagine Steve Jobs allowing, but the state of Johnson’s book as a physical object indicates what publishers value: cutting corners, making things cheap, and subtly conveying to readers that the publisher doesn’t care enough to make it good.

Publishers, in other words, are ruled by accountants who probably say that you can save $.15 per book by using worse paper. Apple was ruled by a megalomaniac with a persnickety attention to detail. People love Apple. No one, not even authors, loves publishers. The reasons are legion, but when I think about what a lot of recent books “impute” to the reader, I think about how Steve Jobs would make them do it differently if he could. If you’re reading this in the distant future, the idea of reading words printed on dead trees is probably as strange to you as riding a carriage would be to me, but for now it matters. And, more importantly, I think books will continue to exist as physical art objects as well as repositories for knowledge as long as the Jobs and Isaacsons of the world make them.

I’m not far into the biography and feel the call of other responsibilities. But I leave Steve Jobs reluctantly, which happens to too few books of any genre. And I have a feeling that thirty years from now I’ll be reading an interview with some inventor or captain of industry who cites Steve Jobs the man and Steve Jobs the book as inspirations in whatever that inventor accomplishes.

Follow-up to the eBook and publishing post

See the original post here, and pay special attention to the thoughtful and informed comments (which are a pleasant change from the usual Internet fare). They also bring up some points I’d like to address:

1) I don’t think publishers will go away altogether, even if they persist in some form as mere quality signals or brands. With millions of self-published books coursing through the Internet, making informed decisions as a reader gets even harder than it is now. In the previous post, I mentioned the problem of false negatives—books that should’ve been published but are rejected—without reiterating that most negatives are true negatives—that is, books that are rejected because they’re bad. Readers are having and will continue to have problems in this regard. As Laura Miller says in “When anyone can be a published author: How do you find something good to read in a brave new self-published world?”:

You’ve either experienced slush or you haven’t, and the difference is not trivial. People who have never had the job of reading through the heaps of unsolicited manuscripts sent to anyone even remotely connected with publishing typically have no inkling of two awful facts: 1) just how much slush is out there, and 2) how really, really, really, really terrible the vast majority of it is.

2) As a result of 1), I wouldn’t be surprised if “publishing” morphs into a much smaller, broader-based business in which editor-agent hybrids take on promising writers in a somewhat traditional manner but don’t offer advances or some of the conventional “perks.” Instead, they’ll work with writers to improve the writers’ writing, structure, and so forth, in exchange for somewhere in the neighborhood of 10 – 20% of the book’s profits.

Be very wary of writers who say they don’t need editors. Maybe Nabokov didn’t need an editor, but pretty much every other writer did and does. And editors are expensive—I know because I’ve looked into what hiring one would cost—and writers, especially young, untested writers, don’t have a lot of money. So I don’t think lump-sum upfront payments will work for most writers, particularly fiction writers. Editors might judge who is worth investment based on signals like, say, blog posts.

Laments like this one by Kristine Kathryn Rusch make me wonder about what function editors are performing now; I can’t excerpt it effectively, but it observes the extent to which junior editors at publishing houses treat her like an idiot. If the experience described in her post is routine or commonplace, I think it bodes ill for conventional publishing houses (assuming, of course, there’s not some mitigating factor she’s not describing in the post).

A lot of writers say publishers aren’t doing that much to promote their books as it is, which may be true, but they do at least send a quality signal. I wonder, though, about the cost of books, especially hardcovers, and still think this cost is going to fall. Which leads me to . . .

3) I think self-published writers are, over time, going to put pricing pressure on conventionally published books. If you’re a random mystery reader, don’t have especially high standards for prose quality or originality, and consume a very large number of mysteries, self-published ones that aren’t as polished as commercially published fiction might be just as good. If you’re buying books for $2.99 on a Nook instead of $5.99 – $9.99 via Nook or mass-market paperback, then you get a lot more words for your buck. This is especially true, it seems to me, in genre publishing, where series are common and so are relatively rapid, similar books.

Being the kind of “informed” reader discussed in #1 doesn’t stop most people from being undiscriminating.

4) Desperation is underrated as an inspiration to change. Jeff observes in a comment: “As an author just barely at the bottom of the midlist, if my choice is between self-publishing and not publishing at all, I’ll choose the former.” Me too, although, like him, I’d choose conventional publishing at this point in time, given the choice. But writers without a “choice” will increasingly lean towards self-publishing.

5) Blogs and other non-publisher signals of quality may become more important over time. If readers are debating an author’s merits, looking at their blog or other online writing may be a useful way to decide whether a writer is worth the time it takes to begin a novel. I suspect most non-established writers know or suspect this by now, but it’s worth reiterating anyway. These days, when people say things like, “I want to be a writer” to me, I ask if they have a blog. If the answer is “no,” that signals they’re probably not very serious about writing. Even if the blog only has one post a month, if that post is a substantial or interesting one I take it as a positive sign.

6) If you’re interested in how the publishing industry works now and why, despite the media portrayals, it works better than it’s sometimes depicted, take a look at Charlie Stross’s series of posts Common Misconceptions About Publishing, which were last updated in May 2010 but are still required reading for anyone interested in the subject.

7) I don’t think most of my analysis is terribly original, and you could find similar analyses elsewhere. Nonetheless, I find the changing business interesting both as a reader and writer / would-be writer.

8) I’m not sure much, if any, of this matters to readers, but it should matter a lot to writers who care at all about making some money from their work.

Kitchen Confidential: Adventures in the Culinary Underbelly — Anthony Bourdain

Kitchen Confidential: Adventures in the Culinary Underbelly is as good as a lot of people say it is, which is pretty uncommon. It moves quickly and cleverly: as a young man, Bourdain observes an older cook’s hands, which “looked like the claws of some monstrous science-fiction crustacean, knobby and calloused under wounds old and new.” Notice that word, “crustacean,” and how well it fits, especially since the kitchen is making seafood. The memoir is filled with evocative and expressive moments like that. I’m tempted to start listing them. But that would spoil the surprising pleasure they offer on the page.

There’s a moment when Bourdain points out one of the problems with writing about something as sensual as food, since you can never taste the food through words:

. . . the events described are somehow diminished in the telling. A perfect bowl of bouillabaisse, that first, all-important oyster, plucked from the Bassin d’Arcachon, both are made cheaper, less distinct in my memory, once I’ve written about them.

But the problems of something being “somehow diminished in the telling” or made “cheaper, less distinct in my memory” are perils not only of the food writer, although he might be particularly sensitive to them, but of the writer of almost any genre. Tactile sensations like food, sex, water, and the like might be especially susceptible, but even our descriptions of our thoughts are probably different once we’ve “written about them.” But writing about them is the only effective way we have of communicating them to others. And Bourdain is very, very good at that communication. I never thought I cared about what it was like to work in a kitchen, or about the tribulations of the chef. I didn’t realize just how dramatic being a chef could be. Now I understand, and am slightly closer to understanding the fascination with cooking TV shows. I say “closer,” however, because I’d still rather be in the kitchen with knife and spatula at hand than watching someone else in the kitchen, much as I’d rather be on the field with a soccer ball at my feet than playing the FIFA soccer video game.

I come out of Kitchen Confidential with a sense that I’ve read a religious story, in which the wayward one day finds God. Except most of us moderns don’t really find God, but we find something abstract to serve, and that something is greater than ourselves. For Bourdain it’s food, despite the many problems that come with it. For others it might be art, science, math, business, the ideal of the family. The things you can choose to admire proliferate. But most of us only choose one or maybe two things. Or the thing chooses us.

You have to love the thing, as Bourdain does cooking, but you can’t love it only for itself. I’ve read the unfortunate prose of plenty of people who say they love “writing” but don’t love it enough to learn basic grammar, expand their vocabularies, or think about the reader more than themselves (Bourdain holds chefs who cook attractive dishes that don’t taste very good in low regard, which is approximately how I feel about people who publish essays in novel format). Love might be necessary if you’re going to go the distance, but a lot of people have this silly, romantic idea that love is all about the moment, dying for each other, crashing emotional waves, love-at-first-sight, tussles-in-the-bedroom.

And it is about that—we learn about Bourdain’s apprenticeship—but that part is relatively small: a lot of love is about persevering during the tedious, boring parts of life, learning one’s craft, and learning how to get along with others. People who cook because they think they love to cook, without having considered that cooking professionally might mean doing it six to seven days a week for years on end, haven’t realized that no, maybe love isn’t enough. Here’s Michael Idov in “Bitter Brew: I opened a charming neighborhood coffee shop. Then it destroyed my life,” which every aspiring coffee artist should read:

Looking back, we (incredibly) should have heeded the advice of bad-boy chef Anthony Bourdain, who wrote our epitaph in Kitchen Confidential: “The most dangerous species of owner … is the one who gets into the business for love.”

Advice like this by its nature goes unheeded because most people probably can’t project themselves imaginatively into the mind of the advice giver. The advice is “diminished in the telling,” since we don’t have the sensory information and deep background that went into the person giving the advice. We’re bad at thinking about what doing something over and over for months or years at a time is like. We’ll probably never be good at it, but that’s not going to stop us from giving and taking advice anyway.

I like to cook and cook for myself and friends with what I imagine to be reasonable skill. If, for some unknown reason, Bourdain showed up at my apartment for dinner, I think I could make something he’d find passable, especially because he likes food you can eat better than food that’s designed to show off the chef’s smarts. But I probably don’t love cooking enough to do it as a pro. I don’t like it enough to put forth my best effort when I’m not in the mood. Maybe I once thought I liked cooking enough, because who hasn’t imagined themselves as a chef somewhere as they grease their pan with olive oil, knowing that an hour later perfect penne alla vodka and tender green beans with garlic will be served? We’ve all probably briefly imagined ourselves giving Nobel and Oscar acceptance speeches too.

But the gap between current skills and social admiration can only be bridged by the long honing of skill that requires incredible internal and psychological fortitude (or, possibly, dumb luck and not having anywhere else to go). Even if we do keep trying, the plaudits may never come. I know of Bourdain not because of his work as a chef, but because he’s so skilled a writer that I’ve seen him mentioned often enough to read his book. Which I will now recommend that you do too, because it’s fabulous. He probably could’ve amped up the sex part, though he does say that he doesn’t want the reader “to think that everything up to this point was about fornication, free booze, and ready access to drugs.” But for Bourdain it is, more than anything else, about the food. I think it would be extraordinarily difficult to fake his level of enthusiasm for food. And when you have an enthusiasm that you probably can’t fake, you’ve probably also got a shot at being the best.


I also wrote about Bourdain in “So you wanna be a writer: What Anthony Bourdain can tell you even when he’s not talking about writing.” I like that he views cooking as a craft. “Craft” sounds intellectually honest, as opposed to an art, which can fall prey to pretension, even though all arts require some level of craftsmanship. He raises cooking to an art form without overdramatizing it.

On “Amazon Signs Up Authors, Writing Publishers Out of Deal”

Seemingly everyone in the book “blogosphere” has something to say about Amazon Signs Up Authors, Writing Publishers Out of Deal, which points to Amazon’s growing presence not just in book retailing but in book publishing (“Amazon will publish 122 books this fall in an array of genres, in both physical and e-book form. [. . .] It has set up a flagship line run by a publishing veteran, Laurence Kirshbaum, to bring out brand-name fiction and nonfiction”). And that’s just its big-name efforts: it now offers a platform for any moron, including this one, to upload and publish eBooks.

Naturally, as someone mulling over options, I’ve been thinking about this stuff:

1) There are a couple of problems publishers have. One big problem is simple: they offer lousy standard royalties on eBooks. Publishers apparently offer a measly 17.5%, before the agent cut. Amazon, Barnes & Noble, Apple, and so forth will offer 70% (if the author is using an agent, presumably the agent gets a cut). Big-name authors can presumably get better deals, but probably not 70% deals. So an author can sell many, many fewer eBooks and still make more money.
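The arithmetic behind that last sentence is easy to sketch. A toy calculation (my own illustration, assuming the same list price in both channels and ignoring agent cuts and advances):

```python
def units_to_match(self_rate, trad_rate, trad_units):
    """Copies an author must sell at royalty rate self_rate to earn
    the same royalty income as trad_units copies sold at trad_rate,
    assuming an identical list price in both channels."""
    return trad_units * trad_rate / self_rate

# At a 70% self-published rate versus a 17.5% traditional rate,
# matching 10,000 traditionally sold eBooks takes only a quarter
# as many sales:
print(units_to_match(0.70, 0.175, 10_000))  # ≈ 2500
```

In other words, under these assumptions an author keeping 70% breaks even on income after selling a quarter as many copies, which is what makes the royalty gap hard for publishers to wave away.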

2) Smart authors are probably thinking about whether publishers are going to be in business at all in anything like their current form five years from now. This means authors, especially younger ones, might not want to lock in their eBooks at a 17.5% royalty rate for the rest of their lives only to discover that, five or ten years from now, virtually no one is reading paper books and virtually no one is using conventional publishers in conventional ways. If you’re a writer and you have a longer-than-the-next-quarter outlook, this makes a lot of sense.

3) On a subject closer to home, publishers and agents probably have too many false negatives—that is, writers they should offer representation to but don’t. For a long time, those people simply had no real recourse: they went away or kept trying through the rejections. Lots of now-famous writers went through dozens or hundreds of rejections. If I one day become a now-famous writer, I’ll have the same rejection story. But we don’t know about the could-have-been-famous writers who had to give up for various reasons. Today, if writers are sufficiently determined, they can start selling on their own (I may fall into this category shortly) and see what happens. Chances are good that “nothing” happens, but chances are good that “nothing” happens in traditional publishing land too. And if something does happen, it’s hard to imagine that writers used to taking home 70% will happily roll over and let publishers give them 17.5%.

4) Publishers more generally are facing a classic Innovator’s Dilemma-style problem: what happens when the old model is fading but the new one is less profitable in the short to medium term? You run the risk of startups and new-model companies overtaking your business, leaving you in the position of Kodak, old-school IBM, Polaroid, everyone who ever competed with Microsoft prior to about 2004, and innumerable other companies who’ve been killed by shifting markets.

5) Since the massive bloodletting at publishing companies in the 2008–2009 neighborhood, it seems to have gotten even harder to get the attention of publishers, which exacerbates numbers 3 and 4, and probably drives more people toward self-publishing, thus accelerating the overall dynamic.

The major publishers aren’t daft and know all this. But they are constrained and can’t do much about it. They can’t reliably distinguish between standard slush and what I’d like to think is my own worthwhile stuff, because if they could, they wouldn’t buy duds and pass on hits. So this may simply be the sort of thing everyone can see coming and no one can do anything about.

EDIT: This, from Jamie Byng, is worth remembering too: “Publishing is also about finding new talent, rigorous editing, championing the books you believe in, and all that doesn’t just disappear with digital books.” The essential challenge of writing remains even if the distribution changes.

EDIT 2: Literary agent Jane Dystel:

Last week while I was following up on a proposal I had out on submission to publishers, I heard back from a senior editor at one of the top six publishing houses. This person is someone who I consider to be very smart and who has great taste. I had sent him a proposal which he acknowledged was very well done and which covered a subject he was interested in. In turning it down, he sounded discouraged and demoralized as he said that the higher ups in his company were no longer allowing him to buy mid-list titles that in the past he had been able to turn into bestsellers. Rather, he said, they were only allowing him to buy “sure things,” which I took to mean books that can’t fail.

If this story is actually indicative of a general trend in publishing, the number of false negatives should be going up and, concomitantly, the number of writers willing to try new things should too.


Initial thoughts on Ann Patchett’s State of Wonder

I started State of Wonder last night. Today I needed to finish a lot of work. Some got done. A lot didn’t. The novel held me: by the power of its story, by the hypnosis of trying to figure out just who Dr. Swenson was and is, by the dilemmas each character faces, by the writing, by the agony of jungle life. The writing isn’t showy, exactly, but it’s good, strange and normal at the same time, with very average-seeming sentences like these: “Marina shrugged. It was a peculiar kind of therapy, lying flat out with the child you had only now realized you wanted while being asked if you had wanted a child.” The wants and desires of the second sentence wrap in on each other, implying paradox in a book full of paradoxes and choices just on the verge of being transcended. A paragraph below there’s this: “That was Dr. Rapp’s great lesson in the Amazon, in science: Never be so focused on what you’re looking for that you overlook the thing you actually find.”

Yes: Marina overlooks everything and finds everything. Not many characters in novels do. Not many people do. Not many characters or people articulate trade-offs like the ones in State of Wonder. Not many live their trade-offs so fully. Too few novels have characters who care about something, anything, with real depth. In State of Wonder, everyone who counts cares.

The last 100 pages are extraordinary, and if the early middle section is sometimes confusing (thanks in part to Marina’s use of Lariam (Wikipedia: “The FDA product guide states it can cause mental health problems including: anxiety, hallucinations, depression, unusual behavior, and suicidal ideations among others”)), the end more than makes up for perceived early deficits. With a novel like this I can’t help but wonder if the sections I perceived as confusing were essential in some way I haven’t grasped.

“Ravishing” is an overused and often stupid word critics use, but I will use it here.

I have other reading to be done but I am still stuck pondering Drs. Swenson and Singh; this doesn’t happen very often.

The Marriage Plot — Jeffrey Eugenides

The Marriage Plot is very competently done, and there’s nothing particularly wrong with it; some things may even be done particularly well. The problem is, as my faint praise indicates, a novel isn’t a student essay: it’s not enough for nothing to be particularly wrong. Something has to be smashing and fantastic for it to really matter. The Virgin Suicides, with its ceaseless questioning of what happened to the Lisbon sisters and its unusual narrative structure in the form of a chorus of outsider men who were once boys attempting to understand something they never quite can, had this quality. There’s a haunting, melancholy quality to the story and the way it’s told. Middlesex is imaginatively powerful because of Cal’s parents’ unusual relationship (does love conquer all, including biology?) and Cal’s own inter- or transsexual state, which is so unusual amid novels that mostly cover straight people, occasionally cover gay people, but very seldom cover people whose bodies and minds don’t quite match like they should.

I keep copies of both Eugenides’ earlier novels, but I’m selling my copy of The Marriage Plot. I can’t imagine rereading it. In The Curtain, Milan Kundera wrote something that has long stayed with me because of how right he is:

Every novel created with real passion aspires quite naturally to a lasting aesthetic value, meaning to a value capable of surviving its author. To write without having that ambition is cynicism: a mediocre plumber may be useful to people, but a mediocre novelist who consciously produces books that are ephemeral, commonplace, conventional—thus not useful, thus burdensome, thus noxious—is contemptible.

Eugenides has that ambition. I concede The Marriage Plot might outlast its author. But I’m skeptical it will: the religious stuff Mitchell experiences doesn’t measure up, and the mostly banal problems faced by recent college graduates don’t quite live up to anything. Leonard is the only person with real problems, both in terms of his ailment (manic depression) and his work (as a biologist: he is confronting the natural world, and some of the most interesting sections describe both his efforts in taming yeast and his status in a science lab). Madeleine, like so many of us, is committed to love with a person who maybe isn’t worth it. In listening to her sister’s trouble, we find this: “like anyone in love, Madeleine believed that her own relationship was different from every other relationship, immune from typical problems.” It isn’t, and her relationship is more like other people’s than she’d like to imagine it to be. And The Marriage Plot is more like other novels than I want it to be.

There are long sections of background that we might not need. We find that “Leonard had grown up in an Arts & Crafts house whose previous owner had been murdered in the front hall.” Grisly, but not vital to the story. “[. . . ] Madeleine took the opportunity to make herself more presentable. She ran her hands through her hair, finger-combing it.” Nothing wrong with this: it’s just average. Maybe too Victorian. Later: “Ground personnel rolled a metal stairway up to the plane’s first door, which opened from inside, and passengers began disembarking.” Do we need this? Or can it be eliminated? On their own, these sentences are okay, and I’ve committed such sentences many times, despite Martin Amis warning me not to. I want to put this book on a diet, to convince it to render only the essential. Too much of it makes me want to cut more; I can also now say that the only thing worse than taking an essay test of your own is reading about someone else’s essay test, especially when that essay test involves religion.

There are also some strange sentences; this one makes me wonder if the last word is a typo: “Years of being popular had left her with the reflexive ability to separate the cool from the uncool, even within subgroups, like the English department, where the concept of cool didn’t appear to obtain.” What does “appear to obtain” mean? Perhaps it’s supposed to be “appear to apply.” The good ones are still good, though: “Dabney had the artistic soul of a third-string tight end.” I’ve met Dabneys. And I get what Madeleine gets: people who declaim one kind of hierarchy or status system are always setting up another, whether they recognize it or not. I also find it intriguing that Madeleine can be an intense reader and also intensely popular. The two seldom appear together in fiction. Perhaps the combination makes her an astute social reader of everyone but herself.

She also understands Mitchell, who acts as a beta orbiter for most of the novel. He provides her with extra male romantic attention mostly because he’s a fool, and she knows it on some level: “Mitchell was the kind of smart, sane, parent-pleasing boy she should fall in love with and marry. That she would never fall in love with Mitchell and marry him, precisely because of this eligibility, was yet another indication, in a morning teeming with them, of just how screwed up she was in matters of the heart.” Being “smart, sane” and “parent-pleasing” is another way of saying “boring.” He also doesn’t make a move when he’s effectively asked to. At one point, Madeleine takes Mitchell home and goes to his attic room wearing only an old shirt—then resents him for not making a move when he obviously wants to and she does too.

She has a point.

When Mitchell is too eligible, that “eligibility” gets held against him. And he buys into ideologies that encourage him to remain a fool. A priest says to Mitchell: “Listen, a girl’s not watermelon you plug a hole in to see if it’s sweet.” Tell that to the many women who do the same to men. There are plenty of sexist assumptions in this statement alone to get a feminist writing an angry paper about women, innocence, desire, and sexuality. Perhaps you shouldn’t take romantic advice from someone sworn to a life of celibacy and thus ignorance in a realm that most of us take to be vitally important. To be fair, Mitchell mostly doesn’t, but that he’s seeking knowledge from a source like that tells us he doesn’t even know where to begin to look for help. And Madeleine exploits this weakness. As the novel tells us: “[. . .] one night the previous December, in a state of anxiety about her romantic life, Madeleine had run into Mitchell on campus and brought him back to her apartment. She’d needed male attention and had flirted with him, without entirely admitting it to herself.”

Rather nasty. Even worse, he falls for it. The optimal solution for Mitchell: find another girl, ideally one hotter than Madeleine, and use the other girl as leverage. Moping around doesn’t get the girl. As Sean Connery says in an otherwise lousy movie called The Rock, “Losers always whine about their best. Winners go home and fuck the prom queen.” Mitchell hasn’t realized or internalized this. Contrast Mitchell’s neediness with his rival’s distance: “The more Leonard pulled away, the more anxious Madeleine became.” She’s desperate for Leonard, which enables him to make her like him even more. Mitchell is on the opposite side of this recursive dynamic. He should read Janice Radway’s Reading the Romance, which describes how women like to read romance novels in which the heroine falls for major alpha males. Radway doesn’t use this term, of course, and works to explain away women’s preferences for alpha males, but the descriptions still shine through.

Still, there are funny bits to The Marriage Plot; on the same page where Madeleine assesses Mitchell as a beta, her mother says that she “saw a program about India recently,” as if “a program” on TV could convey much about the country—but wanting to say she’s seen it does convey a lot about her. She goes on to say, “It was terribly depressing. The poverty!” Mitchell says “That’s a plus for me [. . .] I thrive in squalor.” The unexpected reaction to Madeleine’s mother and the reframing of expected values make this funny and show us that Mitchell isn’t the stiff he might otherwise appear to be. And the book isn’t the stiff it might otherwise be. It’s just not funny consistently enough or deep consistently enough. It’s a muddle, even when I do laugh at lines like, “Madeleine’s love troubles had begun at a time when the French theory she was reading deconstructed the very notion of love.” Love isn’t so easily eliminated, however: it only takes belief to sustain it.

And the characters are more self-aware than I’ve sometimes depicted them here. Madeleine, for instance, knows that graduating from college, for a certain class of person who is expected to go to college, just isn’t that hard. On graduation day, “she wasn’t proud of herself. She was in no mood to celebrate. She’d lost faith in the significance of the day and what the day represented.” If college is mostly a test of showing up, it’s hard to blame her; and majoring in English probably isn’t very hard for most hard-core readers (it wasn’t for this one, anyway; to me reading was fun, which meant that I did so much more of it than most of my classmates that class itself wasn’t very hard). And she finds that the deconstructing education she receives isn’t much use when she’s confronted with the messy reality of interpersonal relationships, including her relationship with Leonard. Saying manic depression is a socially constructed discourse won’t get help like lithium will, even with lithium’s side effects.

Leonard’s stay at Pilgrim Lake, a biological research facility something like Cold Spring Harbor Laboratory, is among the novel’s most interesting sections. I would’ve liked it longer and Mitchell’s Indian sojourn shorter. Leonard is researching reproduction in yeast; this yields a predictable but impressive number of metaphors for human dilemmas. His work also can’t be solved by appeals to socially constructed discourse, and I suspect many of the scientists at the lab are more interesting than Madeleine and Mitchell. For example, Madeleine watches one of the very few female research scientists and observes:

Madeleine guessed that MacGregor [who just won a Nobel Prize] made people uneasy because of the purity of her renunciation and the simplicity of her scientific method. They didn’t want her to succeed, because that would invalidate the rationale for their research staffs and bloated budgets. MacGregor could also be opinionated and blunt. People didn’t like that in anyone, but they liked it less in a woman.

Tell us more about the “simplicity of her scientific method.” How does that relate to literary theory? Could we see MacGregor take more of an interest in Madeleine? Who are the people who “didn’t want her to succeed,” and how does she react to them? I wouldn’t want to turn the novel into Atlas Shrugged, but there are rich idea veins here that go unmined in favor of Mitchell’s noodlings. My suggestions are somewhat unfair, as I’m violating Updike’s first rule of book reviewing—”Try to understand what the author wished to do, and do not blame him for not achieving what he did not attempt”—but I think an exploration of gender in science is more interesting than an exploration of gender and mating habits among relatively average 20-somethings. Maybe because I fit into that group I’m too close to the subject to find it remarkable, but I think the novel has a smaller-than-life quality to it, in the same way B. R. Myers describes Jonathan Franzen’s novel Freedom at the link:

One opens a new novel and is promptly introduced to some dull minor characters. Tiring of them, one skims ahead to meet the leads, only to realize: those minor characters are the leads. A common experience for even the occasional reader of contemporary fiction, it never fails to make the heart sink. The problem is not only one of craft or execution. Characters are now conceived as if the whole point of literature were to create plausible likenesses of the folks next door. They have their little worries, but so what? Do writers really believe that every unhappy family is special? If so, Tolstoy has a lot to answer for—including Freedom, Jonathan Franzen’s latest. A suburban comedy-drama about the relationship between cookie-baking Patty, who describes herself as “relatively dumber” than her siblings; red-faced husband Walter, “whose most salient quality … was his niceness”; and Walter’s womanizing college friend, Richard, who plays in an indie band called Walnut Surprise, the novel is a 576-page monument to insignificance.

The Marriage Plot is a much better novel than this, but one detects the same kinds of maladies at work: “dull minor characters,” a problem beyond “craft or execution” (which are, again, well done here), “little worries” for the most part (until an unconvincing ending), and a general feel that life is elsewhere. Around the same time, Bill Gates, Steve Jobs, Bill Joy, Richard Stallman, and many others were coalescing around Silicon Valley to change the world. I wouldn’t be communicating with you right now via this medium if it weren’t for their work. Which isn’t to say every novel set in the late 70s or early 80s should be about computers, technology, or technologists: but in the face of banality, I can’t help drifting toward thoughts of people whose work really, incredibly, resolutely matters.

Eugenides is clearly interested in the inner workings of people—the problem is that Mitchell and Madeleine do not have particularly interesting or engaging insides. Mitchell needs a copy of The Game to be time-warped to him, stat, and Madeleine needs to better realize what reading nineteenth-century novels should prime her to know: that she’s not the first person in the universe with unwise love decisions or family problems. Why doesn’t she better analyze her own situation in terms of the novels she loves so much? Why doesn’t she better realize that, yes, her life could be one of the fragments in Roland Barthes’ A Lover’s Discourse? It could be, as Eleanor Barkhorn says in “What Jeffrey Eugenides Doesn’t Understand About Women,” that Madeleine doesn’t have any real female friends, but I’m not convinced: I’ve met women who have few or no real female friends, and I don’t think that aspect of Madeleine’s life is unrealistic. The bigger problem is her lack of friends in general, so those friends can’t say the obvious to her: Why Leonard? Do you realize what you’re giving up? And if she does, and she gives up much of herself anyway, then the problem is her own blindness—a topic that I don’t find tremendously satisfying to read about, since it basically implies Madeleine is stupid. Characters can only be stupidly blinded by love for so long before one removes the “blinded” and is left with plain “stupid.”

Most of my problems with the novel aren’t with its prose on a micro level, although it has those issues: they’re with the dearth of real ideas in the novel. It doesn’t quite go with the literary-theory-as-life metaphors, which drop out partway through. It doesn’t quite go with the alpha-beta-male decision that Madeleine faces. It doesn’t quite go with the manic-depression-as-serious-issue-maybe-linked-with-creativity issue that Leonard has. It’s a host of “almosts” that reminds me somewhat of a sunnier version of Michel Houellebecq, especially in The Elementary Particles and Platform.

Houellebecq, however, is willing to engage in a kind of brutal realism—for lack of a better phrase—that Eugenides doesn’t get to. Yet that’s what the characters need: less understanding of their petty problems and more context, or a harder eye, or someone to smack Mitchell and Madeleine and then explain their problems to them. I could explain their problems. I’ve met a million Mitchells and Madeleines. Hell, I used to be one in some respects. But the world has a habit of correcting your faults, if you’re paying attention to the signals the world is giving. Mitchell and Madeleine aren’t. That’s what makes them so unsatisfying. As the three main characters go, so does the very, very competent novel that doesn’t get past competence and into transcendence.


You can read my initial impressions here.

We lack perspective: notes from Alain de Botton’s The Pleasures and Sorrows of Work

Yet our world of abundance, with seas of wine and alps of bread, has hardly turned out to be the ebullient place dreamt of by our ancestors in the famine-stricken years of the Middle Ages. The brightest minds spend their working lives simplifying or accelerating functions of unreasonable banality. Engineers write theses on the velocities of scanning machines and consultants devote their careers to implementing minor economies in the movements of shelf-stackers and forklift operators. The alcohol-inspired fights that break out in market towns on Saturday evenings are predictable symptoms of fury at our incarceration. They are a reminder of the price we pay for our daily submission at the altars of prudence and order – and of the rage that silently accumulates beneath a uniquely law-abiding and compliant surface.

1) A lot of engineers like their jobs and look at them as solving a series of puzzles: “theses on the velocities of scanning machines” are only as banal as you make them. In addition, even if you do find them banal, if you can make a faster scanning machine and sell it for a lot of money, you may not care when you retire to paint watercolors for the rest of your life.

2) Fights say more about the dumb fighters than about the human condition.

3) Humans might simply never be, as a group, overtly happy in whatever conditions we experience; realizing this might release us from unreasonable expectations. A cultural fixation on happiness might paradoxically prevent us from experiencing what we think or imagine we most want or desire.

4) Related to three, people who leave work to drink on the weekends are probably intentionally looking for fights: I doubt the behavior can be blamed solely on alcohol. Many people seem to undergo a two-step process: they consciously drink so they can unconsciously act out in the ways they’d actually like to. My question is simple: why not just go to step two and be intellectually honest with ourselves?

5) Stumbling on Happiness discusses how and why we feel unhappy when we compare ourselves to others. Most of us don’t compare ourselves to people in the “Middle Ages”; we compare ourselves to our wives’ sisters’ husbands, to paraphrase that famous aphorism (switch the genders as appropriate to you, the reader, and your own orientation).

6) We submit “at the altars of prudence and order” because the alternative is often worse. That being said, I think Western society underestimates the power and importance of trance, ecstasy, transcendence, atë—all things that, denied and repressed, seem to manifest themselves in unusual ways (see The Secret History for more on this). Still, if the alternative to prudence and order is chaos, no iPhone, longer commutes, and living in a dicey part of town, prudence and order sound pretty good—as does self-imposed “incarceration.”

7) The Pleasures and Sorrows of Work is, like much of de Botton’s work, nicely balanced between readability and intellectual engagement, reasoned and learned without being pedantic. These are harder notes to strike than may be obvious at first.
