Summary Judgement: Sweet Tooth — Ian McEwan

For a novel about a spy, Sweet Tooth is surprisingly slack. Maybe it’s slack in defense of realism. The cause eludes me, since the writing is as customarily crisp as the story isn’t. Excellent quotes are easy, from the first page, with this description of Serena’s father, an Anglican Bishop: his “belief in God was muted and reasonable, did not intrude much on our lives and was just sufficient to raise him smoothly through the Church hierarchy and install us in a comfortable Queen Anne House.” The parents are distant to the point of barely believable indifference: much later in the novel, Serena thinks, “Would the Bishop even notice I’d been away?” She’s free of parents, like an orphan in a 19th Century novel or a teenager in a contemporary TV show.

That doesn’t detract from the aforementioned beauty, like this, to go back to the second page: “We liked to think of ourselves as bad girls, but actually we were rather good.” Serena, on learning about the difficulties of writing, “went for important walks,” the silliness and accuracy of the phrase “important walks” working so well to conceptualize her state of mind and what many people with intellectual dispositions end up doing.

But the beauty of sentences eventually feels like backdrop when a second or third act fails to develop. The novel ends with a great, revisionary secret, the sort of secret that powers PhD dissertations more often than it does readerly love. We’ve seen these surprise techniques before—most notably in Atonement, but also, after a fashion, On Chesil Beach.

Like many writers, including this one, McEwan, through Serena, is at least interested in and perhaps obsessed by what reading and books do to people. Serena works in books as much as she’s a spy, and she sleeps with authors (a practice I’d like to encourage). She notes what she reads and how she reads it. The book becomes about a love of books, but to the point that its protagonist becomes dull. What does the book talk add up to? I’m sympathetic to books and book talk, but in Sweet Tooth the answer is “not much.” It becomes easy to lose focus midway through. Sure, for Serena, reading is how she both constructs and understands her world, but then you have to, you know, go do something. That’s not to say that she isn’t artful or funny. Consider this problem, about Jeremy, Serena’s first lover who turns out, predictably, to prefer men:

I wanted him to have a secret and shameful desire that only I could satisfy. I wanted to make this lofty, courteous man all mind. Did he want to smack my backside, or have me smack his? Was he wanting to try on my underwear? This mystery obsessed me when I was away from him, and made it all the harder to stop thinking about him when I was supposed to be concentrating on the maths. Colette was my escape.

Colette was her escape, but into what and from what? From mysteries? From something she can’t quite articulate, perhaps. And Serena, as a narrator, is also willing to ostentatiously tell us that she’s older and wiser now: “What I took to be the norm—taut, smooth, supple—was the transient special case of youth. To me, the old were a separate species, like sparrows or foxes. And now, what I would give to be fifty-four again!” This intrusion of the future self reminds us that we’re reading something from the future of events, with two pairs of eyes: the eyes of the undergraduate Serena and the eyes of the much older Serena, imagining her younger self from a position of greater articulacy and knowledge. Done too often, though, it becomes tedious. The notes in my copy trail off as the novel advances, and as I hope for Serena to become more than an acted-upon reporter of events. Her own life feels like it happened to someone else. Later in the novel, much later, the reason for this is revealed. But the view at the end of a long trail doesn’t always redeem the journey. The reason is clever, cerebral, not expected and not forced, and doesn’t make me want to read Sweet Tooth again, unless the next reading is part of some academic project about the usual sorts of academic things.

Serena says this of her reading habit:

All thanks to my mother, I didn’t stop reading. I’d never read much poetry or any plays at school, but I think I had more pleasure out of novels than my university friends, who were obliged to sweat over weekly essays on Middlemarch or Vanity Fair. I raced through the same books, chatted about them perhaps, if there was someone around who could tolerate my base level of discourse, then I moved on. Reading was my way of not thinking about maths. More than that (or do I mean less?), it was my way of not thinking.

Reading can be a powerful way of not thinking. I know from experience, even if most people think of reading as a highbrow, intensely intellectual activity these days. It isn’t, necessarily. And the assigned essay can be a chore instead of a pleasure. Serena wants it to be a pleasure:

My needs were simple. I didn’t bother much with themes or felicitous phrases and skipped fine descriptions of weather, landscapes, and interiors. I wanted characters I could believe in, and I wanted to be made curious about what was to happen to them. Generally, I preferred people to be falling in and out of love, but I didn’t mind so much if they tried their hand at something else. It was vulgar to want it, but I liked someone to say ‘Marry me’ by the end. Novels without female characters were a lifeless desert. Conrad was beyond my consideration, as were most stories by Kipling and Hemingway. Nor was I impressed by reputations. I read anything I saw lying around. Pulp fiction, great literature and everything in between—I gave them all the same rough treatment.

Simple intellectual and erotic needs might be easier to fulfill than complex ones, in one sense, but also harder, in the way that a simple task executed perfectly may be harder than a complex task executed with a margin for error. Still, Serena should have known that it isn’t vulgar to want love and marriage and plot. What’s vulgar is that professors and highbrow critics might make her think it is vulgar to want those things, to want fiction that might be, to use that overused term, “relatable”: fiction one can follow effectively. Serena isn’t a close reader, or someone training to read professionally.

But she is someone who learns how to be through books, which makes her different from someone who learns how to be in other ways, or someone who never learns how to be. She says, “I caused amusement among my Newnham friends studying English when I told them that Valley of the Dolls was as good as anything Jane Austen ever wrote. They laughed, they teased me for months. And they hadn’t read a line of Susann’s work.” Her friends are snobby and dismissive. Given the choice between snobby and unrefined but passionate, I’ll take the latter. The difference between the two becomes a running issue, as when Serena begins to write a little column and, like many bloggers, something unfortunate happens: “I had written half a dozen jaunty pieces when something went wrong. Like many writers who come by a little success, I began to take myself too seriously.”

It’s a narrow act, the stance that straddles too serious and not serious enough. When I’m waffling between them, I try for “not serious enough:” after all, we’re talking about fiction here, not life and death. But for Serena the two become bound together because of her work. That’s an interesting theme; if only the plot were drilled more vigorously through the loam of Serena’s mind and story.

Jonah Lehrer’s Imagine is still worth reading

Jonah Lehrer, as is now well known, repeatedly misrepresented research and plagiarized other people’s writing in Imagine: How Creativity Works. But, as Roy Peter Clark points out, “Jonah Lehrer’s ‘Imagine’ is worth reading, despite the problems.” Clark goes on to say, “not all the sins [Lehrer commits . . .] are equally grievous,” but, despite that, “the reading of the book ‘Imagine’ helped me understand my world and my craft, and what else can you hope for from a non-fiction book.”

I’ve found the same thing after reading Imagine based on Clark’s endorsement. But reading it in light of Lehrer’s indiscretions reveals new potential layers of meaning, because a couple of passages have a very different resonance, like this one, about Shakespeare’s milieu:

His [Shakespeare’s] peers repeatedly accused him of plagiarism, and he was often guilty, at least by contemporary standards. What these allegations failed to take into account, however, was that Shakespeare was pioneering a new creative method in which every conceivable source informed his art. For Shakespeare, the act of creation was inseparable from the act of connection. {Lehrer “Imagine”@221}

Lehrer seems to be using the same method. But the age of the Internet makes tracking sources much, much easier than it used to be. And he goes on:

The point isn’t that Shakespeare stole. It’s that, for the first time in a long time, there was stuff worth stealing—and nobody stopped him. Shakespeare seemed to know this—he was intensely aware that his genius depended on the culture around him. {Lehrer “Imagine”@221}

In retrospect, this reads as a preemptive defense of Lehrer’s own method. But I don’t get why Lehrer made stuff up: most of what he invented doesn’t seem to be very important, and it’s the kind of peripheral material that makes for good reading but isn’t essential. Given contemporary attitudes towards plagiarism—the passages above show that he knows and understands those attitudes—why risk so much for so little gain? It’s like a millionaire stealing a pair of $20 jeans. Why tarnish success? I can imagine some possible answers to these questions, but none of them are very satisfying, and I ultimately want to ascribe Lehrer’s lies to simple human vanity.

Imagine is still pretty interesting. I doubt it’s a perfect book, and I wouldn’t cite Lehrer in my neuroscience PhD dissertation. But I am now conscious of the tension between free-form creative thought and focused attention to a particular, grinding problem (“We need structure or everything falls apart. But we also need spaces that surprise us. Because it is the exchanges we don’t expect, with the people we just met, that will change the way we think about everything”); I am conscious of the need for both longtime collaborators and for new faces; and I am conscious of how people with deep domain expertise may benefit from applying that expertise elsewhere. Some of Lehrer’s points, like his description of the virtues of cities or the eccentric greatness of Paul Erdos, are already familiar. But he helps me see them in new ways. A moment like this, for example, shows me something important about my own writing and creative work:

Friedrich Nietzsche, in The Birth of Tragedy, distinguished between two archetypes of creativity, both borrowed from Greek mythology. There was the Dionysian drive—Dionysus was the god of wine and intoxication—which led people to embrace their unconscious and create radically new forms of art. [. . .] The Apollonian artist, by contrast, attempted to resolve the messiness and impose a sober order onto the disorder of reality. Like Auden, creators in the spirit of Apollo distrust the rumors of the right hemisphere. Instead, they insist on paying careful attention, scrutinizing their thoughts until they make sense. Auden put it best: ‘All genuine poetry is in a sense the formation of private spheres out of public chaos.’ {Lehrer “Imagine”@64}

I am far more in the Apollonian mode than the Dionysian mode, but, perhaps for that reason, I’m fascinated by and perhaps even envious of Dionysian thinking, acting, and living. A novel like The Secret History thus becomes all the more important to me, because it has an Apollonian narrator, Richard, dealing with the aftermath of an attempt to reach Dionysian ecstasy. In the novel, not surprisingly, the outcomes are pretty bad, but the idea of deliberately trying to reach an ecstatic experience resonates with my temperament.

There are some moments that appear, on the surface, self-contradictory. Lehrer says, “The most creative ideas, it turns out, don’t occur when we’re alone. Rather, they emerge from our social circles, from collections of acquaintances who inspire novel thoughts. Sometimes the most important people in life are the people we barely know” {Lehrer “Imagine”@204}.

Earlier in Imagine, however, Lehrer discusses how many creative ideas occur when people are taking morning showers (where most of us are presumably alone). So do creative ideas emerge from chatting with others, or when the mind is in a relaxed state that lets it make disparate connections among ideas? The answer appears to be “both,” but Lehrer doesn’t explicitly discuss the implied contradiction. I’m not saying he couldn’t reconcile the two, but someone should have pointed out these kinds of contradictions.

Even if all of Imagine’s research and stories are somehow wrong—and I don’t think they are—the book still offers novel ways to think about creativity and how to structure one’s life or work more effectively, in ways I hadn’t foreseen. I wish the publisher hadn’t withdrawn it altogether. Used copies on Amazon now start at $25. Existing copies may continue to rise in value because of their scarcity; alternatively, readers might turn to pirate editions on the Internet, which I can only assume are easy enough to find (my copy came from the University of Arizona’s library).

Don’t Go to Law School (Unless) — Paul Campos

Paul Campos saves what might be the best paragraph of Don’t Go To Law School (Unless): A Law Professor’s Inside Guide to Maximizing Opportunity and Minimizing Risk for the very end of the book, so I’m going to invert his structure and start with it:

Have you ever said to yourself, “I don’t know what to do with my life – so I’m going to spend three years of it going deeply and irreversibly into debt, in a quite possibly futile attempt to enter a profession that I have no actual desire to join?” I bet you haven’t, because who would ever say something that idiotic? Every year, however, thousands of people are perfectly capable of doing something that idiotic. If they weren’t, half the law schools in the country would be out of business tomorrow.

We’ve looked into the mirror and seen the enemy, and the enemy is ourselves. Sure, someone else might hand us the weapons we use to mutilate ourselves—that is, student loans—but someone who hands you a loaded gun isn’t obligating you to shoot yourself in the foot. Perhaps they shouldn’t have handed you the gun, but they did, and you can’t wholly blame that person for your mistake. It sure is more fun, however, to blame someone else for your mistakes than it is to stand up and say, “I’m an idiot and I’ve made bad life choices.”

I, however, am an idiot and made a bad life choice—but I quit law school after one year. I went based largely on bad assumptions, fear, stupid desire, anachronistic beliefs about the legal market, and various other factors I’d rather not examine in detail. The problems with law school are slowly becoming better known: “for more than 30 years now the market for legal services has been contracting relative to the rest of the economy.” The basic problem is that law schools have been raising tuition faster than the rate of inflation for decades, and the legal market is a well-defined and well-studied one: about twice as many credentialed lawyers are being minted as there are jobs for them to enter.

You don’t have to be a mathematician to realize that some of those would-be lawyers are going to be left out. In the past, they would have been left with relatively little debt, which made arguments like “a law degree will open doors even if you don’t practice law” at least somewhat tenable. Now those arguments aren’t. There are lots of common, bad reasons people go to law school: “Like a lot of other people, I went to law school because I couldn’t think of anything better to do. At the time I applied I was three years removed from my undergraduate days as a somewhat aimless English major,” and, though this may sound odd, law school itself doesn’t prepare people for practicing law.

That wasn’t really a problem when tuition was cheap and proto-lawyers could work cheaply for a couple years to learn the trade. Now the stakes are high and law school’s inadequacies are a huge problem, because having more than $100,000 in law school debt that can’t be discharged through bankruptcy will hurt people for decades, especially if they can’t get the training necessary to actually practice law. As Campos says, “The two most important practical skills that any lawyer working in private practice must possess are the ability to acquire clients, and to get them to pay their bills, which happens to be two things that most legal academics have never done in their lives.”

There’s little pressure, at least right now, to change the system, and little pressure on legal academics to learn these kinds of skills and impart them to their students. The only way I can see to create that kind of pressure is to convince enough people not to go to law school that the schools themselves start feeling market pressure to reform. Without it, they can simply continue as they are.

Campos is a law professor who has spent the last year and a half writing about the problems of law school on the blog Inside the Law School Scam, which is like porn for academic eggheads: it’s got lots of well-researched money shots. But, also like porn, too much of it all at once is enervating, and by now the larger point—don’t go to law school—is or should be well known. For people considering law school, the only real question is binary: should I go? The answer is almost certainly “no.” For most people, ITLSS only needs to be read once: the problems of law schools are most pressing for insiders, not for the rest of us. We need to know that “most people currently attending law school would be better off not doing so.”

And it’s intellectually honest to admit as much: “I’ve become increasingly aware that my ridiculously good job is being paid for by people who are increasingly unable to get the kinds of jobs they came to law school to get.” But relatively few insiders are willing to admit that the systems they participate in and propagate aren’t good for outsiders. That’s one reason Campos’s book is so admirable. It also uses stories but eschews relying exclusively on them, focusing instead on money.

The more I pay attention to the world, the more I see how much money and financial constraints underlie a lot of surface phenomena. In an ideal world, money is a strong proxy for value; a company like Google or Apple is worth a lot because both provide a lot of value to people. The education world, however, has broken that link, and the breakage is getting worse with time.

I wonder how long it’s going to take until some law school decides to utterly reverse course and simply say that it’s going to have ugly buildings, a small library, huge class sizes, and very low tuition—say, $10,000 a year. Or $9,500. The professor-to-student ratio would be something like 1:100, and there’d be a dean and virtually no other administrative support or special programs. But this model would focus on being sustainable and making sure that students don’t face penury at the end of law school.

Instead of competing with the model that almost all law schools currently employ, this hypothetical school would do the opposite, and be proud of getting people a legal education for under $30,000 in total tuition, with a maximal focus on employability after graduation and a minimal focus on student loans (I’d also love to see open-source textbooks). It could advertise its alternate strategy, and maybe run a blog explaining the ways the conventional system is set up to screw students.

As far as I know, a couple of schools try the “admit everybody and charge them a lot” model, but few try the “admit many, but charge them a little” model. The notorious Thomas M. Cooley Law School does the former, charging $54,000 per year right now, and no one who knows anything about law school will go there. There might be institutional or ABA-imposed barriers that I’m unaware of. Still, if the low-cost model succeeded, it could at least challenge the hegemony of the Harvard-Yale-Stanford model of law school, which is untenable and getting worse.

See also “The specious reasoning in Lawrence M. Mitchell’s ‘Law School Is Worth the Money‘” and “Why You Should Not Go to Law School.” Do not listen to your parents, for whom law school might’ve made financial sense, or to your friends’ empty congratulations, because most of your friends don’t know any better. Law school enrollments have plummeted since their 2008 high, for good reason.

Here’s an interview with a Columbia law grad who quit law for a coding bootcamp. Skipping law school would’ve made more sense, but news about how bad the legal market is relative to the tech sector has not percolated through the entire country (yet).

Back to Blood — Tom Wolfe

The real problem with Back to Blood is that you’ve already read it, most notably in The Bonfire of the Vanities and A Man in Full—and if you haven’t read those, you should start with them. Back to Blood has the same assortment of obsessions and interests: there is the child with an unusual name and an elite pedigree: “Last week he totally forgot to call the dean, the one with the rehabilitated harelip, at their son Fiver’s boarding school, Hotchkiss [. . .]” But does anyone still care about elite boarding schools? Does anyone still care about the Miami Herald other than the people who work there? Fiver’s father is the editor, and he thinks it is “one of the half-dozen-or-so most important newspapers in the United States” at a moment when the era of newspapers has passed.

The Miami nightclub is named “Balzac’s,” after another of Wolfe’s preoccupations. There is a prurient mention of girls who “were wearing denim shorts with the belt lines down perilously close to the mons veneris and the pants legs cut off up to. . . here . . .” Has anyone in the U.S. ever used the term mons veneris, outside of Tom Wolfe and medical schools? I think it appeared in I am Charlotte Simmons a couple of times too, and there it was even more improbable. And the word loins! In this case, “juicy little loins and perfect little cupcake bottoms.” I’ve heard loins described as loins before, but only by Tom Wolfe and the writers of the Bible. Someone born more recently than 1931 would use “pussy” to be crude, “va jay jay” to be hipster, or “vagina” for clinical directness. But not loins. No one but Tom Wolfe would use loins, and use it again and again.

Sometimes a writer working out subtle variations on the same ideas, book by book, can succeed—Elmore Leonard is a good example. Others just feel like they’re repeating themselves. When I am Charlotte Simmons came out, I was in college and skipped class to read it, only to feel an increasing sense of disappointment with the wrongness of many scenes—like Charlotte feeling nervous about the cost of long-distance calls. That was an anachronism: most college students had free long distance by 2004, and I would’ve let anyone who asked use my phone to call home. Or, for another example of reportorial wrongness, Charlotte gets a salvaged, pieced-together computer, like a salvaged car. By 2004, however, older but working computers were $25 on Craigslist, or given away outright by schools. These two examples are salient, but there were others, just as I am Charlotte Simmons repeated words, phrases, and ideas from Wolfe’s earlier books. It, like Back to Blood, repeatedly describes moments of cowardly prurience, with men like wolves and women who didn’t want it, or didn’t want to want it, and submitted only reluctantly—like female characters from the 19th century and not at all like the contemporary women I know.

The period details in Back to Blood are wrong. Today, anyone cool would be driving a Tesla Roadster or a Fisker Karma, not a Ferrari 403; Ferraris might’ve been cool twenty years ago, but technology and culture have moved on. Then there’s the simply and wildly improbable: a French professor named Lantier thinks his daughter wasn’t ready for “snobbery” because “She was at the age, twenty-one, when a girl’s heart is filled to the brim with charity and love for the little people.” Someone exposed to live students every semester is unlikely to think of their hearts as “filled to the brim with charity and love” for much of anything, except perhaps alcohol, condoms, iPhones, verbing nouns, and obsessive Facebooking. Not that there’s anything wrong with those things, but familiarity is a great slayer of illusions like Lantier’s belief about the hearts of most 21-year-old girls.

Back to Blood isn’t a bad book, but it has the earlier novels’ strengths, diminished, and their weaknesses, exaggerated. We’re told, not shown, that “Mac was an exemplar of the genus WASP in a moral and cultural sense,” without knowing why, if at all, that matters. We’re told a lot of things, most of them not especially new to anyone familiar with the Wolfe oeuvre.

There are clever moments, as when Magdalena, in a fight with her Spanish-speaking mother (or, in Wolfe-land, Mother), resorts “to the E-bomb: English.” It’s a moment of generational cruelty, since “Her mother had no idea what colloquially meant. Magdalena didn’t, either, until not all that many nights ago when Norman used it and explained it to her. Her mother might know hang and possibly even slang, but the hang of slang no doubt baffled her, and the expression clueless was guaranteed to make her look the way she did right now, which is to say, clueless.” It’s clever, and the kind of cleverness that makes the scene fresh and unusual. It’s also the kind of cleverness missing in repeated references to the mons veneris, or to loins, or to high-end private schools.

Wolfe also gets and has gotten for decades the weirdness and power of modern media; its spotlight is restless yet powerful, and it plays a tremendous role in Bonfire. In Back to Blood, Nestor Camacho, a Miami cop, rescues a refugee from the mast of a ship and is recorded doing it; consequently, he becomes momentarily famous, such that: “Even now, at the midnight hour, the sun shone ’round about him.” The analogizing of fame to light seems obvious, even necessary, and although I don’t want to probe its deeper properties here I like how Wolfe avoids the spotlight metaphor, much as I didn’t a few sentences ago. Wolfe uses metaphor in an almost 19th Century fashion, usually effectively.

He gets the way civic booster types think of the arts not as a thing in and of itself, but as a checkbox; an editor at the Miami Herald thinks that “Urban planners all over the country were abuzz with this fuzzy idea that every ‘world-class’ city—world class was another au courant term—must have a world class cultural destination. Cultural referred to the arts. . . in the form of a world-class art museum” {Wolfe “Blood”@111}. He’s right, of course, but right in a generic way, like people who are right about love being like a rose. If you’ve read anything about urban planning, or cities (and I have), you won’t be surprised by the editor’s knowledge, which he probably picked up in the same places I did, and which says very little about him as a character, except that he, like so many Wolfe characters, is an information and status receptacle more than a person with his own needs and desires.

The complaint expressed throughout this post is similar to, but a bit different from, James Wood’s, which concerns how Wolfe’s characters tend to speak in similar or identical registers despite coming from wildly different backgrounds. That isn’t necessarily a weakness, but novels that portray such startlingly different people in a similar register must maintain the verisimilitude of their characters; that’s what The Bonfire of the Vanities does and what Back to Blood doesn’t, quite. The earlier novel also doesn’t feel reported, even if it was; the latter does, in the same way I am Charlotte Simmons misses the college milieu in a thousand subtle ways. If you swing, it doesn’t matter whether you miss the ball by a millimeter or a meter: the scrim of realism is pierced and the novel doesn’t quite work.

Wood also says that “Wolfe isn’t interested in ordinary life. Ordinary life is complex, contradictory, prismatic. Wolfe’s characters are never contradictory, because they have only one big emotion, and it is lust—for sex, money, power, status.” But this isn’t quite true: Wolfe is interested in ordinary life when it’s touched by big events, or when its inhabitants have a powerful yearning for something other than ordinary life. That yearning, that drive, can be fascinating. Plus, there’s nothing wrong with writing about extraordinary life, which can be just as “complex, contradictory, prismatic.” Wood obviously isn’t making this argument, and I doubt he would make it in the caricatured form I’m giving it here, but it’s an easy false lesson to draw from the Back to Blood review. Almost every Wood review is a momentary master class in the novel as a genre, which is why so many writers and would-be writers attend so carefully to them, and why it’s worth appending this brief commentary to a review that is in some ways more useful and interesting than the impressively hyped novel it discusses.

Back to Blood is drawing on capital built up from Wolfe’s earlier novels, and overall it leaves a sense of “Fool me once, shame on you; fool me twice, shame on me.” If another Wolfe novel appears, I don’t think I’m likely to be fooled again. There are better novels about the state of America—Gillian Flynn’s Gone Girl is one—even if they don’t announce themselves as tomes about the state of America. Given how the voices of Back to Blood don’t quite work and the book-report function doesn’t quite work, there are probably better uses of one’s reading time.

A Jane Austen Education: How Six Novels Taught Me About Love, Friendship, and the Things That Really Matter — William Deresiewicz

I really like and admire A Jane Austen Education, despite agreeing with the younger Deresiewicz, whom the older one mocks for believing sentiments like this one, about Jane Austen’s Emma: “The story seemed to consist of nothing more than a lot of chitchat among a bunch of commonplace characters in a country village. No grand events, no great issues, and, inexplicably for a writer of romance novels, not even any passion.” Deresiewicz is setting himself up to be knocked down, and yet when I read Emma I, too, was bored by the “chitchat” among the bumpkins.

But Deresiewicz goes on to explain why his younger self was totally wrong, and how he grew as a person through closely reading Jane Austen and applying her novels to his life experience. Though his explanation is persuasive, I still don’t buy it. To me, the characters in Emma are still “a pretty unpromising bunch of people to begin with, and then all they seemed to do was sit around and talk: about who was sick, who had had a card party the night before, who had said what to whom. Mr. Woodhouse’s idea of a big time was taking a stroll around the garden.” I usually call the ceaseless chatter without any action referent “empty status games,” because the games don’t refer to anything outside their immediate social situations (granted, it might also be that I don’t usually excel in them). These sorts of situations are akin to the ones Paul Graham describes in “Why Nerds Are Unpopular:”

I think the important thing about the real world is [that. . . ] it’s very large, and the things you do have real effects. That’s what school, prison, and ladies-who-lunch all lack. The inhabitants of all those worlds are trapped in little bubbles where nothing they do can have more than a local effect. Naturally these societies degenerate into savagery. They have no function for their form to follow.

Jane Austen’s societies obviously don’t degenerate into savagery—unless they’ve been transformed into Pride and Prejudice and Zombies (“Now with Ultraviolent Zombie Mayhem!”)—but their inhabitants do feel “trapped in little bubbles where nothing they do can have more than a local effect,” which makes them unsatisfying, at least to my temperament. Graham might also not be an ideal person to cite, given how much he admires Austen: “Everyone admires Jane Austen. Add my name to the list. To me she seems the best novelist of all time.” Still, strike me from the list: her style is amazing and her content vapid. Consider this description, also from Deresiewicz:

One whole chapter—Isabella had just brought her family home for Christmas—consisted entirely of aimless talk, as everyone caught up on one another’s news. For more than half a dozen pages, the plot simply came to a halt. But the truth was, for long stretches of the book there really wasn’t much plot to speak of.

Or this: “What could be duller, I thought, than a bunch of long, heavy novels, by women novelists, in stilted language, on trivial subjects?” There are much duller books—Beckett’s trilogy, Molloy, Malone Dies, The Unnamable comes to mind, since those are novels written to make some philosophical statement about the meaninglessness of life or to give English professors a bone to gnaw into scholarly papers—but the point stands. I’m not opposed to “women novelists,” and anyone who is on the grounds of perceived unimportance should try The Secret History and Gone Girl, but “long, heavy novels [. . .] on trivial subjects” are tedious regardless of their author’s gender.

Moreover, I’m not alone: “As it turned out, people had been reacting to Jane Austen exactly as I had for as long as they’d been reading her. The first reviews warned that readers might find her stories ‘trifling,’ with ‘no great variety,’ ‘extremely deficient’ in imagination and ‘entirely devoid of invention,’ with ‘so little narrative’ that it was hard to even describe what they were about.” At some level, as happens with much art, a preference for Austen may come down to temperament, and to what a person believes about what The Novel or a novel should do. I’ve never been able to get into novels that don’t have some kind of narrative drive or energy—both vague terms that I could spend the rest of this essay describing, or, rather, trying to describe—and, like Lev Grossman, I think “Plot makes perverts of us all:”

A good story is a dirty secret that we all share. It’s what makes guilty pleasures so pleasurable, but it’s also what makes them so guilty. A juicy tale reeks of crass commercialism and cheap thrills. We crave such entertainments, but we despise them.

For a century, however, if not longer, literary culture has been bifurcating between high-culture, non-plot types who inhabit universities, book reviews, and institutions, and common readers, who like something to happen and maybe some T&A or depraved longings in their fiction, even if the language used for the T&A and depraved longings isn’t very interesting. Most of us are taught that long, tedious books written in stilted language are more valuable than short, lively ones.

To be sure, I don’t think the people who genuinely love Austen have been academically brainwashed—I think they do authentically love her writing—but the original reviewers and the younger Deresiewicz have a point too, one mostly drowned out in school-based settings.

At the time Deresiewicz had his Austen breakthrough, he was seeing a waitress, and they “had little in common and had never progressed beyond the sex. She was gorgeous, bisexual, impulsive, experienced, with a look that knew things and a laugh that didn’t give a damn.” Perhaps this is a function of my being in my 20s, but this arrangement doesn’t sound so bad, and, having dated the equivalent woman, I rather enjoyed those things at the time. Furthermore, I don’t think such relationships are wrong—though I would also say, obviously, that they’re not the only kind of relationship available, or the only kind a person should have over the course of their life. Sometimes people eat fast food; other times they dine in fine restaurants, or at the Cheesecake Factory, or cook for themselves, or cook with another person, or cook simple foods, or complex ones, or have potlucks. I leave it to you to map that metaphor onto sexuality and relationships, but the point about variety in relationships is useful. For Deresiewicz, “Austen taught me a new kind of moral seriousness—taught me what moral seriousness really means. It means taking responsibility for the little world, not the big one. It means taking responsibility for yourself.” But people who are always morally serious can also be dull, just as people who are never morally serious are often unintentionally cruel.

The trick is being able to distinguish the two, and to find a middle way, and to develop some self-awareness, which is hard for many if not most of us. Certainly it was hard for Deresiewicz’s younger self:

If you’re oblivious to other people, chances are pretty good that you’re going to hurt them. I knew now that if I was ever going to have any real friends—or I should say, any real friendships with my friends—I’d have to do something about it. I’d have to learn to stop being a defensive, reactive, self-enclosed jerk.

On the other hand, being oblivious to other people sometimes means being very tuned into technical or other problems that need solving—for the best example of this I’ve seen in literature, consider Lawrence Waterhouse in Cryptonomicon, who is shockingly oblivious yet essential to the Allied war effort, and who advances cryptography. It should also be noted that he’s not intentionally mean to others, and in the novel no one is emotionally hurt by him in an obvious fashion, but the depiction of his thought process as an engineer / mathematician seems pretty accurate. You get moments like this: “In particular, the final steps of the organist’s explanation were like a falcon’s dive through layer after layer of pretense and illusion, thrilling or sickening or confusing depending on what you were. The heavens were riven open. Lawrence glimpsed choirs of angels ranking off into geometrical infinity,” perhaps in exchange for attention to other people. To what extent are dispositions trade-offs? It’s a decent question, I think, but also one I can’t really answer.

Asking unanswerable questions is the kind of thing I’m encouraged to do, though; at one moment, Deresiewicz praises the kind of professor we all hope to have: “When my professor asked a question, it wasn’t because he wanted us to get or guess ‘the’ answer; it was because he hadn’t figured out an answer yet himself, and genuinely wanted to hear what we had to say.” This is what I try to do in the classroom, although I’m guessing this kind of strategy works better for humanities students than for, say, math students, where the answer or answers are well-known, at least up to a fairly high level.

There are also intellectual surprises in A Jane Austen Education, and those surprises made me realize things I didn’t before:

Popular music is one giant shout of desire, one great rallying cry for freedom and pleasure. Pop psychology sends us the same signals, and so does advertising. ‘Trust your feelings,’ we are told. ‘Listen to your heart.’ ‘If it feels good, do it.’

And if everything is pointing you in one direction, it might be time to ask what lies in the other. Literature seems to ask this question. Pop music, as Deresiewicz points out, doesn’t. In Deresiewicz’s rendition, Austen herself was reacting against her time, which is to be commended:

Austen lived in the great age of trash fiction: the gothic novel, the sentimental novel, the bodice ripper—crumbling castles, creaking doors, and secret passageways; heavenly maidens and dark seducers, piercing shrieks and floods of tears, wild rides and breathless escapes; shipwrecks, deathbeds, abductions, avowals; poverty, misery, rape, and incest.

In other words, she lived in “the great age” of all the good stuff, though I would argue that the good stuff is still with us if we know where to look—I’m pretty sure Game of Thrones has every element in the Deresiewicz list.

Some weird stylistic quirks recur in the book, like the habit of “Austen was showing me” or “Austen was saying”-style constructions (“I could grow up and find happiness, Austen was letting me know, but only if I was willing to give up something very important” or “Austen taught me a new kind of moral seriousness—taught me what moral seriousness really means” or “Austen understood that kids are going to make mistakes, and she also understood that making mistakes is not the end of the world”). But the overall effectiveness is tremendous, and not only because I might be a major component of Deresiewicz’s target audience: self-absorbed people who secretly think they have the answers other people lack.

Alif the Unseen — G. Willow Wilson

Alif the Unseen almost works, but it persistently mischaracterizes technology in a distracting, false-sounding way that its eponymous hacker protagonist wouldn’t. On the first page of Alif’s narration, we find this about his phone: “Another hack had set this one up for him, bypassing the encryption installed by whatever telecom giant monopolized its patent.” But encryption algorithms are math, and math can’t be patented. Furthermore, a patent is by definition a limited monopoly right. As a result, the last part of the sentence seems incoherent. And what is being encrypted? The phone’s operating system? Its user data?

A few pages later, Alif is “watching as a readout began to scroll up the screen, tracking the IP address and usage statistics of whoever was attempting to break through his encryption software.” But reading “usage statistics” makes no sense here: Alif isn’t, say, providing blogging software or an e-commerce platform (a few pages later, he installs a keystroke logger and other software on the computer of his love interest, and says that he does so “to track her usage statistics.” This makes more sense). Someone wouldn’t “break through his encryption software;” he or she would attempt to penetrate Alif’s firewall. The same would-be intruder leaves after “executing Pony Express, a trojan Alif had hidden in what looked like an encryption glitch.” I don’t know what “an encryption glitch” means here, and I don’t think the author does either.

Alif notes that, in the Arab Spring, “the digital stratosphere became a war zone. The bloggers who used free software platforms were most vulnerable.” If anything, open-source software should be less vulnerable: well-known open-source systems are unlikely to contain obvious backdoors (because they’d be found), and they have the advantage of many eyes on their source code. There’s an equally jarring moment when Alif says that he’s written a piece of software in “C++. But the type system is sort of—new. I’ve made a lot of modifications.” He is probably referring to whether the language is dynamically or statically typed—that is, whether a program’s values are checked for type safety when the program runs or when it’s compiled. It isn’t clear why Alif would change C++’s type system. At another moment, Alif worries that a malevolent, governmental entity is watching him: “The Hand would see Alif using his e-mail and cloud computing accounts, but until he could crack his algorithm, Alif would appear to be working from Portugal, Hawaii, Tibet.” The phrase “crack his algorithm” is meaningless here. And “cloud computing” is the kind of term marketers use; programmers or hackers would probably say “servers.”
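To make the static-versus-dynamic distinction concrete, here is a minimal sketch of my own (in Python, chosen only for brevity; nothing like it appears in the novel). Python is dynamically typed, so the mismatch below is caught only when the offending line runs; a statically typed language like C++ rejects the equivalent code at compile time, before the program runs at all:

```python
# Dynamic typing: type errors surface at run time, not before.
def add(a, b):
    return a + b

print(add(2, 3))  # fine: both arguments are integers

try:
    add(2, "three")  # int + str is a type error, detected only when executed
except TypeError:
    print("runtime type error")

# A statically typed language such as C++ would refuse to compile the
# equivalent call, so the program could never reach this point at all.
```

Either way, “modifying” a mainstream language’s type system is a compiler-engineering project, not a tweak a lone hacker makes to his own program, which is part of what makes Alif’s line ring false.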

These kinds of persistent, distracting errors detract from the story and the novel’s realism. It might seem strange to discuss realism in a book that features Djinn, vampires, and other supernatural elements, but any writer still has a duty to get the language of the “real” or mundane world right. Wilson doesn’t, and that makes the whole novel feel fake when it shouldn’t. In The Name of the Rose, religious language and medieval thought infuse every line, even when contemporary philosophical ideas are being expressed through the language of the time. Eco knows the period like Wilson doesn’t know the language of hackers, programmers, and computer science. I’m not an expert, but I’ve read enough in the field to understand what she misses.

Still, there’s a sense of hidden knowledge that runs throughout Alif the Unseen, and a melding of old ideas with new technology. That’s an appealing idea, and so is the idea of an Arab Golden Compass. It’s got some religious elements that could come from The Name of the Rose. Much of the writing is skillful if not particularly memorable. Funny moments appear: Vikram the Vampire, on hearing one of Alif’s schemes, says, “I don’t want foreigners involved in my business. Jinn are one thing but I draw the line at Americans.” Such moments are just not common enough to merit reading this book over something better, like The Golden Compass or Gillian Flynn’s Gone Girl or Carlos Ruiz Zafon’s recent novels, all of which do language better.

Design.Y Notebook Review: The Record 216

EDIT: I sent an e-mail to Design.Y about the binding breakage described below, and they sent me a new notebook. I’ll write another update when I’ve filled the new one.

EDIT 2: The new one broke too. I’m not going to use these anymore and am switching to the much cheaper and much more durable Rhodia Webnotebooks mentioned below.  

The most salient feature of the Design.Y “Record 216”* is its price, which varies with the Yen-to-dollar exchange rate but currently hovers around $70 with shipping. Those of you who can do simple math are probably thinking that this is about 35 times the price of a drugstore pocket notebook and five times that of a Rhodia Webbie. I want the Design.Y notebooks to be five times better. Hell, I want them to be twice as good. But while the notebook is certainly a lovely object that’s been lovingly packed, like a florist’s rose or an undertaker’s corpse, the Record 216 suffers from one major flaw: the paper is too thin.

I’m constantly crinkling it or creasing the corners or bending the middle when I mean to turn the page (see the photo below for an example). The paper is definitely a joy to write on, but a heavier version would be an improvement; thinner is not always better, as anorexia counselors will remind us. Some pens will bleed through, as shown in the picture to the right. The bleeding problem is not great with my fountain pen, but then I use an extra-fine nib that writes a line at least as slender as a Pilot G2’s. Users of thicker nibs may have concomitantly greater problems. Still, I can forgive the bleed-through problem. It’s the lightness that bugs me, and the way I subconsciously worry about bending a page when I’m merely trying to turn it.

Almost everything else about the notebook is incredible. It’s been in my pocket for months without suffering any problem greater than a frayed band. The size, at 5.3″ by 3.1″, is quite handy, and I’ve come to like it better than the standard 5.5 x 3.5 size of Rhodia Webbies, Guildhall, Leuchtturm 1917, or Moleskine. The 3.1″ width makes it feel much more portable at no cost to usability; if anything, the sense of a long, narrow column is an enhancement. The line spacing on the page is neither too great (as it is on the Rhodia) nor too small, and lines extend to the edge of the page, as they should. The cover has a very slight lip that doesn’t distract. The Design.Y notebook also sits flat “out-of-the-box,” so to speak, and doesn’t suffer from the stiffness of a fresh Rhodia or Moleskine. That stiffness declines with age, but it’s still present. The binding is strong and supple.

These features don’t quite make up for the paper thinness or price, however. I only go through one notebook every six to twelve months, but even so, $70 is a substantial hit for what is basically a consumer trifle. A lovely consumer trifle, but with a fatal flaw that makes justifying its price difficult. Perfection is difficult, and the Rhodia Webbie isn’t perfect: its lines should extend to the edge of the page and should be closer together. If I were Steve Jobs, I’d be driven mad by these problems. Fortunately, I’m not, but it’s clear that Design.Y has noticed some of the same things Apple has. In his biography of Jobs, Walter Isaacson writes: “Early on, Mike Markkula had taught Jobs to ‘impute’—to understand that people do judge a book by its cover—and therefore to make sure all the trappings and packaging of Apple signaled that there was a beautiful gem inside.” Design.Y does the same. The company even includes a sheaf of extra paper with a hand-written note. I can’t help noticing the one flaw in the gem, however, that keeps me from wanting to give myself over to an otherwise shining light.

As you can probably tell, I want the notebook to be better than it is. A Japanese craftsman named Hiroshi Yoshino makes them (this also explains why their price is denominated in Yen) using “Tomoe River” paper, which the website accurately describes as “very thin and lightweight.” I wouldn’t want him to switch to the Rhodia’s tank-like paper. But something heavier and less bendable would make the Record 216 the perfect notebook.

EDIT: Unfortunately, the front and back of my notebook began to split after about five months of normal use:

And, to me, this disqualifies the Model 216 for day-to-day use; I haven’t had the problem with Rhodia Webnotebooks. Although I wish the Rhodia’s lines went to the edge of the page, and its paper is perhaps slightly too thick, I think the trade-offs—especially accounting for price—make it a better choice. A $70 notebook had better be perfect. This one isn’t, and its lack of durability is especially distressing.


* Or “Model 216,” depending on which part of the website you’re reading. Chalk this up to charming translation idiosyncrasies.

The accidentally bent page.

Product Review: The Leuchtturm 1917 notebook

The Leuchtturm 1917 is perfectly competent. It’s slightly larger than a Moleskine, when a notebook should be, if anything, slightly smaller. This is a small point. The paper quality is, to my eye and hand, indistinguishable from Moleskine’s, which in turn is very similar to Guildhall and most of the other non-Rhodia notebooks I’ve tried. It has one other annoying feature: the last 30 or so pages are perforated; this is another way of saying, “They’ll eventually fall out.” If you’re the kind of person who wants to desecrate your notebook by tearing out pages, then the Leuchtturm 1917 is for you. To be sure, perforated pages are a minor annoyance. But if you’re not trying to avoid minor annoyances, stick with Moleskines, since they’re widely available.

The only major problem with the Leuchtturm 1917 is simple: it doesn’t offer any major, obvious improvements over the Moleskine. It doesn’t have any real disadvantages, either, other than its departure from the canonical 3.5 x 5.5 size and the perforated final pages. Unlike the Quo Vadis Habana, however, the Leuchtturm 1917 isn’t so much larger that carrying it around becomes a chore.

If this review seems slight, that’s because it is—the differences between this notebook and a Moleskine are trivial. They experience the same corner tearing, although I didn’t use the Leuchtturm long enough for the tears to develop into the cover partially coming off. If you’ve used a Moleskine, you’ve already in effect used this notebook; both are decent, but neither beats the Rhodia Webbie.

More on that soon.

Worthless: The Young Person’s Indispensable Guide to Choosing the Right Major — Aaron Clarey

A lot of the content but little of the rhetoric in Worthless can be found in articles like Jordan Weissmann’s “53% of Recent College Grads Are Jobless or Underemployed—How? A college diploma isn’t worth what it used to be. To get hired, grads today need hard skills,” which says:

not all degrees are created equal [. . . graduates in the] sciences or other technical fields, such as accounting, were much less likely to be jobless or underemployed than humanities and arts graduates. You know that old saw about how college is just about getting a fancy piece of paper?

Weissmann is right; Clarey is right in places too, but even when he is mostly right, he overstates his case. The American education system has become like the proverbial elephant described by blind men: one touches its tusk, another its trunk, a third its legs, and a fourth its back, and each proclaims that he understands the essential shape of the elephant, while none of them sees the whole.

Derek Thompson describes the elephant problem in “The Value of College Is: (a) Growing (b) Flat (c) Falling (d) All of the Above.” The right answer is “d,” but even if the value of college is falling, it’s still an improvement, for most people, over not going to college. More people should probably major in science, technology, engineering, and math, as Clarey writes, but if your margin is between not going to college and entering the workforce straight from high school, or going to college and getting a communications or English degree, which is more valuable? To be sure, more marginal college candidates should consider vocational education, as Clarey says.

In an early passage, Clarey—who used to teach at a college—asks students to list what they want to buy. Most say gas, cars, or gadgets. He goes on to say that “there was a huge mismatch between what people wanted and what they were studying.” He’s partially correct. But he neglects to say that many people say they want one thing and then spend money on something else.

In the United States, for example, government expenditures consumed about 42% of GDP in 2012. Regardless of what this group of students says it wants, voters in the aggregate want relatively high levels of government spending—and they get it. None of the students mentioned “thousands of dollars in subsidized debt,” even though many if not most are getting it. None mentioned health care, either, even though health care consumes a growing percentage of GDP. Clarey writes, “There was also no shortage of psychology majors, but not one person ever listed ‘therapy’ on their wish list.” But few people wish to admit in public, or to their instructor, that they want or need therapy, which doesn’t signal reproductive or intellectual fitness. The quoted sentence also doesn’t need the word “also,” which appears again in the awkward first sentence of Clarey’s preceding paragraph: “Also ironic was how there were so many sociology majors, but not one person listed ‘social work’ in their wish list.”

While I agree with part of the larger point—you should think about how the things you want to consume match with the things you are learning how to produce, and you should focus on making things that people want—people don’t always know what they want, or what they’ll pay for, and what they say they want and what they actually buy are often quite different. Whenever possible, shoot for observed rather than reported behavior. Americans are willing to say that buying American products is important to them, but very few actually take place of origin into account in actual purchases. Pay attention to those gaps. In his example, Clarey doesn’t.

There also appears to be a growing dynamic in this country by which people who work in highly competitive tradable sectors, like software and finance, support a large and growing non-tradable sector (baristas, yoga teachers, people dependent on Social Security / Medicare). Like any trend, this one might change, but it might also lead to the kinds of problems Tyler Cowen describes in The Great Stagnation.

Clarey writes that “You will inevitably work eight hours a day for 30-40 years. This will be, hands down, the single biggest plurality of your conscious time on this planet.” There are a couple of problems with this description. First, not everyone works eight hours a day for 30-40 years. As Paul Graham observes in “How to Make Wealth,” “Economically, you can think of a startup as a way to compress your whole working life into a few years. Instead of working at a low intensity for forty years, you work as hard as you possibly can for four.” Beyond that, if you’re the kind of person who doesn’t spend a lot of money, you could conceivably work a normal job for a shorter period and then do something else; personally, I’d find idleness dull, but I suppose some people like it, or the idea of it. Stylistically, notice the cliche “hands down”: it adds nothing to the sentence. And notice how he uses the phrase “plurality of your conscious time.” I’m not really sure how conscious time gets divided into pluralities; the usage note in the Oxford American Dictionary distinguishes plurality from majority by saying, “A plurality is the largest number among three or more.” But what are the other contenders for your conscious time? Clarey doesn’t say.

There are other moments of overstatement—like the next page, where Clarey describes how you will be working, and then says “How enjoyable and rewarding all of this is boils down to one simple decision – what are you going to major in?” Leaving aside the further use of cliche, I’m not convinced this is true: many if not most people end up working in fields unrelated to their major. I suspect that the pleasure or lack thereof in one’s work life depends on temperament, attitude, motivation, and a myriad of other factors unrelated to college major. The issue doesn’t boil “down to one simple decision”—it relates to a whole host of personal, social and economic factors.

He also writes that degrees like “Sociology” and “Non-profit Administration” “are in the financial sense LITERALLY worthless.” This doesn’t appear to be true, given the well-known data on earnings premiums to college degrees—many of which are linked to earlier in this post.

Still, Worthless excels at telling you what The Atlantic won’t: if you want to make a lot of money and a difference in people’s lives, major in STEM fields, but you’re probably reluctant to do so because you’re lazy and those fields are hard; they haven’t experienced the same level of grade inflation as other fields. In this respect, the book is right. But it doesn’t excel at asking larger questions about what kinds of people major in each discipline and how many opportunities a degree—any degree—can still open. If you’re a generic student who isn’t especially passionate about anything and isn’t sure what to do, stay upwind. Increasingly, that means STEM. You can say it softly or brusquely and still get the same result.

But majoring in something you despise in pursuit of a paycheck isn’t optimal either. In Bronnie Ware’s Regrets of the Dying, Ware, who worked in “palliative care,” lists the regrets she heard patients express as they died. They said things like “I wish I’d had the courage to live a life true to myself, not the life others expected of me,” “I wish I didn’t work so hard,” and “I wish I’d had the courage to express my feelings.” In her telling, none say, “I wish I’d been a Senior Account Supervisor Level 5,” or “Making Executive Vice President was the apex of my life,” or “If only I’d been an engineer, everything would’ve been different.”

This isn’t an argument against majoring in the hard sciences, since no one is stopping engineers or hackers from working less hard or from expressing their feelings. But it is an argument about the value of a life as measured in non-financial terms, and attempting to measure life in solely financial terms might yield a less than optimal return on investment. Daniel Gilbert’s book Stumbling on Happiness offers an enormous amount of research showing that most people do not become substantially happier when they earn additional income beyond $40,000 per year, and that most of them value meaningful work, their sex lives, and friends much more than extra marginal income. Again, I’m not arguing against majoring in STEM fields, but if your sole purpose in majoring in a STEM field is to maximize your lifetime earning potential, you might be maximizing the wrong thing. If you major in something easy because it’s the default path, you’re making a mistake. But if you want the easy route, I don’t think Worthless is going to convince you to avoid it, even if its content will let you avoid saying, “No one told me.”

Arguing in favor of majoring in STEM fields might sound ironic coming from an English major and now English grad student like me, but I do so largely based on observing the life trajectories of the people around me. You can find innumerable arguments for liberal arts degrees—here’s a recent one, from Stanley Fish at the New York Times—but very few get around the income data, combined with the rising cost of degrees, let alone the way technology is ripping up and reshaping large parts of human life—which history, English, and philosophy aren’t doing (I’d argue that economics, neuroscience, and biology are doing more to shape the way we think about human behavior than history, English, philosophy, and the rest of the humanities; why argue about human nature when you can try to measure it?*).

Still, if you mostly view a degree as a signaling device—as Bryan Caplan does, and as he’s going to argue in The Case Against Education (you can read more about the ideas on his blog)—then what you major in doesn’t matter that much, because you’ve already signaled that you’re diligent and conscientious. In many fields, if you’re any good, you can teach yourself: there are numerous people working as programmers with little or no formal training in programming. Ditto for business; indeed, no one in my family had any formal training in any aspect of business, yet we’ve been running Seliger + Associates for decades, and watching the experience of many tech entrepreneurs makes me skeptical of the value of formal business training devoid of content from the business one presumably wants to enter. I read stories like “Patagonia’s Founder Is America’s Most Unlikely Business Guru: For years, Yvon Chouinard kept his eco-conscious, employee-friendly practices largely to himself. Now megacorporations like Walmart, Levi Strauss and Nike are following his lead” and wonder what the homo economicuses are learning in B-school.

In dealing with life, rather than just your major, a more viable book might be something like Po Bronson’s What Should I Do With My Life?, which is less didactic and certain—although it is also vague, wishy-washy, and overly long. It might have pointed out that, if you are defined primarily by external structures and expectations instead of an inner quest for growth, knowledge, and understanding, you will probably never be able to accomplish the kinds of things you should. For people externally motivated, hard degrees are especially important, because they’re not going to pick up a copy of Learn Python the Hard Way and learn Python the hard way. They’re not going to take charge of a business and figure out how to lead from the front.

If you find work that you love, it doesn’t really feel like work. Perhaps more people should work on finding that, if they can—not everybody can—and then seeing if they can extract money from what they like doing (see also Robin Hanson’s short post on the subject).

Are you better off reading this book, or reading the links above? The answer depends on how much you value judiciousness versus how much you value someone telling you what to do without exploring the nuances inherent in the situation. I did not notice any sentences that were beautiful, moving, or surprising. Many needed basic copy editing (sample: “You would obviously like to choose a field that you have an interest in” should be “You would obviously like to choose a field that interests you”), and the book works best if you don’t read it closely, which reinforces the question I posed in the first sentence of this paragraph. Nonetheless, Worthless is a symptom of larger problems in American education, and I expect those symptoms to get worse before they get better.


* Not everything can be measured, but given the choice between measurement and not, shoot for measurement.

Summary Judgment: The War of the Sexes — Paul Seabright

The War of the Sexes: How Conflict and Cooperation Have Shaped Men and Women from Prehistory to the Present isn’t a bad book, but you’ve already in effect read it if you have a cursory knowledge of the vast evolutionary biology literature—or if you’ve read books like Roy Baumeister’s Is There Anything Good About Men?: How Cultures Flourish by Exploiting Men, or Tim Harford’s The Logic of Life, or Sarah Blaffer Hrdy’s The Woman That Never Evolved. If you have read those books—especially the first—you don’t need to read this one, and that’s why I’m not linking directly to it. There are too many better books.

Given a choice between The War of the Sexes and Jonathan Haidt’s The Righteous Mind, choose the latter. You’ll learn more about topics like this one, from The War of the Sexes:

Much of the elusive, infuriating, and enchanting nature of what we feel and why we feel it. Far from being a flaw in our makeup, it is a testimony to the complexity of the problems natural selection had to solve to enable us to handle sexual reproduction at all.

Although this is true, it also feels perilously close to banal; by now, it’s well established that emotions and “intelligence” or “logic” aren’t really separable entities in the human cognitive makeup. What we might think of as “a flaw” is actually an adaptation. Haidt discusses this in far more detail. Seabright also points, again correctly, to the way our own desires are really trade-offs and tensions rather than absolutes:

All individuals, men and women, will also want contradictory things: to be successful and to be protected, to choose our partners and to be chosen by them, to be passionate and to be reasonable, to be forceful and to be tender, to make shrewd choices and to be seduced. With such contradictory impulses, all of us will sometimes make choices we regret. Sex is about danger as well as about tenderness: the two are inseparable, and they are what has made us such a tender and dangerous species.

Our romantic lives aren’t immune to trade-offs, which might be why we find those romantic lives so frustrating so much of the time: they’re hugely important and simultaneously impossible to do perfectly “right.” But, again, this doesn’t feel like news. It feels like olds.

The writing is competent and the research reasonably thorough, but, again, the book as a whole is only useful if you’ve read little or no evolutionary biology; as it went on, I skipped steadily more pages. It isn’t bad. I feel like I’m witnessing a guy burst into a room the day after a big game, breathlessly wanting to celebrate his team’s victory, only to find that the rest of the group spent its celebratory impulse the night before.