The Generals — Tom Ricks

The Generals has one of the best qualities a general nonfiction book can have: it’s about a specific topic that it covers well, but its lessons and ideas also transcend its topic and apply to many others. Let me explain. Take this section, about General Patton:*

Even now, more than six decades after his death, Patton remains one of our most remarkable generals. ‘You have no balance at all,’ Marshall’s wife once scolded the young Patton, correctly, years before World War II. Maj. Gen. Ernest Harmon, one of his peers, wrote that he was ‘strange, brilliant, moody.’ The blustery Patton behaved in ways that would have gotten other officers relieved, but he was kept on because he was seen, accurately, as a man of unusual flaws and exceptional strengths. Marshall concluded that Patton was both a buffoon and a natural and skillful fighter.

Knowledge, skill, and expertise in one domain don’t necessarily transfer to other domains. A brilliant physicist may be a terrible marriage therapist, and vice-versa. Someone who is a “buffoon” might also have a compensating skill that makes up for their possible deficits. Paul Graham implicitly writes about this in Is It Worth Being Wise?:

‘wise’ means one has a high average outcome across all situations, and ‘smart’ means one does spectacularly well in a few. [. . .] The distinction is similar to the rule that one should judge talent at its best and character at its worst. Except you judge intelligence at its best, and wisdom by its average. That’s how the two are related: they’re the two different senses in which the same curve can be high.

A lot of people seem to have trade-offs between peaks and averages. Steve Jobs comes to mind: Walter Isaacson’s biography is rife with examples of Jobs being wrong, cruel, and occasionally outright stupid. His lows were low. But he got big, important stuff right—and not just right, but very, spectacularly right. He found (or made) the right environment for his skills. It’s almost impossible to imagine Jobs being a good employee at, say, Wal-Mart, or any large company that values homogeneity over creativity.

It’s obviously possible to have high averages and high peaks, but that doesn’t appear to be common. Really spectacular peaks often come in unusual packages. Those unusual packages are often easy to dismiss by someone not paying attention.
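To make Graham’s distinction concrete, here is a minimal Python sketch; the outcome numbers are invented purely for illustration:

```python
from statistics import mean

# Invented "outcome" scores across five situations, one list per person.
wise_outcomes = [7, 7, 7, 7, 7]    # consistently decent: judge by the average
smart_outcomes = [2, 3, 2, 3, 25]  # mostly mediocre, one spectacular peak

print(mean(wise_outcomes), max(wise_outcomes))    # average 7, peak 7
print(mean(smart_outcomes), max(smart_outcomes))  # average 7, peak 25
```

The two lists have the same average but radically different peaks: the two different senses in which the same curve can be high.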

Unfortunately, as Ricks points out, America since the Korean War hasn’t judged its generals by their peaks or their averages: in fact, we haven’t judged generals on their competence much at all. That’s a tremendous, underappreciated problem. In Ricks’s description, the generals cut from the Marshall mold were primarily “team players” who needed to work effectively with others and defer to the group. That’s not necessarily a bad thing; as Ricks says:

Perhaps those who rose highest in World War II were organization men. But for the most part they were members of a successful organization, with the failures among them weeded out instead of coddled and covered up. That would not be the case in our subsequent wars, in which it would be more difficult to know what victory looked like or even whether it was achievable.

Different time periods reward different forms of industrial organization. If World War II rewarded “organization men,” many of today’s organizations reward people who figure out the weaknesses of large organizations, and then offer alternatives. But that can’t happen in the military, where the closest analogue to startups might be defense contractors and private, Blackwater-style armies. Those, however, have their own problems.

There’s also an analogy to teaching: almost no public school teacher is fired, ever, for bad teaching. Not being able to fire transparently terrible teachers is an impediment to getting better teachers, as almost anyone who’s ever been in a public school knows.

Organizations also need to make sure that they serve their major purpose, not primarily the interests of the people inside them:

Trying to be fair to officers can be lethal to the soldiers they lead on the battlefield. The Army was using the Korean War to give the staff officers of the earlier war ‘their chance’ to command in combat—with disastrous results. Well before Chosin, the Army had recognized that it had a problem with inexperienced combat leadership in the war.

The problem is “inexperienced combat leadership,” but the solutions became worse in some respects than the problem itself. Fairness to one group can mean extreme unfairness to others, who often have much less of a voice. No one speaks for the enlisted men who are led by incompetent generals. (No one speaks for those led by an incompetent president, either, but that’s a separate issue related to larger American society.)

Misaligned incentives create a deeper rot; Ricks says that, by the post-Korean-War era, generals

were acting less like stewards of their profession, answerable to the public, and more like keepers of a closed guild, answerable mainly to each other. Becoming a general was now akin to winning a tenured professorship, liable to be removed not for professional failure but only for embarrassing one’s institution with moral lapses.

Notice what this says about Ricks’s view of the university: by comparing a system that advances mediocrity to tenure, he implies that tenure advances mediocrity too. He doesn’t go on to explain the metaphor, because he assumes that his readers already believe as much. But tenured professors aren’t putting their students in life-or-death situations, and students can choose a different department or university. Service members can’t. During World War II, as Ricks says, the road to victory and home led through Berlin and Tokyo. In recent wars, the road to victory has been murkier, and the politico-military establishment mostly hasn’t selected generals adept at operating in the murk. The consequences are clear.

The Generals is too detailed for people who aren’t deeply interested in military affairs and history. It probably isn’t detailed enough for those who are immersed.

But it’s also the best intellectual explanation of why one should be wary of enlisting in today’s American military: you might get killed by someone incompetent who will never be held accountable for his performance. Contemporary generals who lose wars and cost soldiers their lives are fêted. They “retire” to lucrative consulting gigs with defense contractors and lobbying firms. The soldiers are disabled or dead. To me that argues against becoming a soldier or junior officer. In most businesses, if you think your boss is an asshat, you can quit and start a rival firm. In the military, obeying is the only option, and no one is making sure that your boss is actually good at his job.

EDIT: B.J. Khalifah has an interesting letter in The Atlantic:

Thomas Ricks overlooked something important. Sadly, nobody becomes a general (or equivalent) in the military until they have served for many years. Most colonels are 50 by the time they get promoted. Many younger officers have experience and drive; as a group, they adapt well. Older officers are more cautious, members of the “cover your ass and do not make waves” category. They know how to manipulate the good-old-boy game. The service should be, but is not, a strict meritocracy. In effect, it follows union-style rules of seniority and time in grade. From second lieutenant to first lieutenant to captain is automatic. Some lousy officers have made it past captain to become major by being on court-martial or combat duty when they are promoted. The rules are not negotiable.

This contrasts hugely with startup and good corporate cultures, which judge people almost purely on merit. Successful startups have famously been founded by 18-year-olds. Even law firm associates can be promoted to partner within as little as five years of hiring, while associates frustrated by a firm’s practices can start their own. The military apparently doesn’t do that, and I haven’t seen any evidence that 50-year-old generals will necessarily be better than 26-year-old (hypothetical) generals. Certainly among startups this isn’t true.

The comparison isn’t perfect—markets reward innovators for making things people want, and the military doesn’t have a clear feedback loop. But at the moment almost no one is even discussing the issue, or making the comparison.


* The movie Patton is also remarkably good, especially the speech at the beginning. Patton doesn’t have the American character down correctly—Americans don’t love the sting of battle unless we’re provoked—but the speech demonstrates a lot about the man doing the speaking.

The bit about loving a winner and not tolerating a loser is also fascinating in light of The Generals: we’ve tolerated a lot of losers, like Donald Rumsfeld and Tommy Franks, and sacked winners like Eric Shinseki.

Warning: Don’t buy James Scott Bell’s Plot & Structure: Techniques and Exercises for Crafting a Plot That Grips Readers from Start to Finish

I bought Plot & Structure because the issue of how a novel’s narrative moves seems to be understudied by academics, who tend to produce jargon-laden, overly analytical nonsense, and by novelists themselves. I’d really like an equivalent of How Fiction Works, but for an important matter that James Wood disdains (“the novel soon showed itself willing to surrender the essential juvenility of plot”). My ideal book would, as Wood says, ask “a critic’s questions and [offer] a writer’s answers.”

Unfortunately, Bell asks few questions and offers fewer answers. This is frustrating to me because, when I started writing, plot was a major weakness. The first two novels I actually wrote to completion had no real plots and thus weren’t very good novels (my dad pointed out the former and let me infer the latter). Since then I’ve spent a lot of time thinking about plot, and I’ve been dissatisfied that I’ve never seen it addressed well elsewhere. Self-consciously literary writers and critics tend to discount it (as Wood does), sometimes to the detriment of their own work.

Genre writers tend to understand plot but either aren’t known to me, critically speaking, or write so poorly on a sentence-by-sentence level that their work isn’t interesting. To me, the best novels combine plot/story and language in a single, cohesive package. That, however, is difficult to do, and the difficulty may explain why we see so many arid, academic-feeling novels about, oh, I don’t know, language and pure consciousness and What It Means To Be Alive Today, and so many genre novels with ticking bombs and handsome heroes and buxom heroines who put out surprisingly easily, all in simple words laid out in simple ways that won’t confuse anyone.

This dearth is annoying not only because of my own flaws, but also because I can’t point aspiring writers to a particular book and say, “Read this.” I can talk about some of my own techniques—I’ve written plot outlines for a number of books I admire, like an artist tracing his favorite paintings in order to imbibe their spirit and technique. With scenes, I’ve learned to ask what each character wants, what stands in his or her way, and what he or she is doing to overcome that barrier. I don’t always have answers—the characters often don’t have them, either—but at least asking the questions provides some structure to what might otherwise be a shapeless mess.

One can’t, of course, separate plot from character, setting, narration, and other technical features of a novel. It would be stupid to try. But plot is a great blind spot in Wood’s criticism, and it’s a blind spot I aspire to see, or to see someone else seeing.

Bell, however, is blind.

Jonah Lehrer’s Imagine is still worth reading

Jonah Lehrer, as is now well known, repeatedly misrepresented research and plagiarized other people’s writing in Imagine: How Creativity Works. But, as Roy Peter Clark points out, “Jonah Lehrer’s ‘Imagine’ is worth reading, despite the problems.” Clark goes on to say, “not all the sins [Lehrer commits . . .] are equally grievous,” but, despite that, “the reading of the book ‘Imagine’ helped me understand my world and my craft, and what else can you hope for from a non-fiction book.”

I’ve found the same thing after reading Imagine based on Clark’s endorsement. But reading it in light of Lehrer’s indiscretions reveals new potential layers of meaning, because a couple of passages have a very different resonance, like this one, about Shakespeare’s milieu:

His [Shakespeare’s] peers repeatedly accused him of plagiarism, and he was often guilty, at least by contemporary standards. What these allegations failed to take into account, however, was that Shakespeare was pioneering a new creative method in which every conceivable source informed his art. For Shakespeare, the act of creation was inseparable from the act of connection. (Imagine, 221)

Lehrer seems to be using the same method. But the age of the Internet makes tracking sources much, much easier than it used to be. And he goes on:

The point isn’t that Shakespeare stole. It’s that, for the first time in a long time, there was stuff worth stealing—and nobody stopped him. Shakespeare seemed to know this—he was intensely aware that his genius depended on the culture around him. (Imagine, 221)

In retrospect, this reads as a preemptive defense of Lehrer’s own method. But I don’t get why Lehrer made stuff up: most of what he invented doesn’t seem to be very important, and it’s the kind of peripheral material that makes for good reading but isn’t essential. Given contemporary attitudes towards plagiarism—the passages above show that he knows and understands those attitudes—why risk so much for so little gain? It’s like a millionaire stealing a pair of $20 jeans. Why tarnish success? I can imagine some possible answers to these questions, but none of them are very satisfying, and I ultimately want to ascribe Lehrer’s lies to simple human vanity.

Imagine is still pretty interesting. I doubt it’s a perfect book, and I wouldn’t cite Lehrer in my neuroscience PhD dissertation. But I am now conscious of the tension between free-form creative thought and focused attention to a particular, grinding problem (“We need structure or everything falls apart. But we also need spaces that surprise us. Because it is the exchanges we don’t expect, with the people we just met, that will change the way we think about everything”); I am conscious of the need for both longtime collaborators and for new faces; and I am conscious of how people with deep domain expertise may benefit from applying that expertise elsewhere. Some of Lehrer’s points, like his description of the virtues of cities or the eccentric greatness of Paul Erdős, are already familiar. But he helps me see them in new ways. A moment like this, for example, shows me something important about my own writing and creative work:

Friedrich Nietzsche, in The Birth of Tragedy, distinguished between two archetypes of creativity, both borrowed from Greek mythology. There was the Dionysian drive—Dionysus was the god of wine and intoxication—which led people to embrace their unconscious and create radically new forms of art. [. . .] The Apollonian artist, by contrast, attempted to resolve the messiness and impose a sober order onto the disorder of reality. Like Auden, creators in the spirit of Apollo distrust the rumors of the right hemisphere. Instead, they insist on paying careful attention, scrutinizing their thoughts until they make sense. Auden put it best: ‘All genuine poetry is in a sense the formation of private spheres out of public chaos.’ (Imagine, 64)

I am far more in the Apollonian mode than the Dionysian mode, but, perhaps for that reason, I’m fascinated by and perhaps even envious of Dionysian thinking, acting, and living. A novel like The Secret History thus becomes all the more important to me, because it has an Apollonian narrator, Richard, dealing with the aftermath of an attempt to reach Dionysian ecstasy. In the novel, not surprisingly, the outcomes are pretty bad, but the idea of deliberately trying to reach an ecstatic experience resonates with my temperament.

There are some moments that appear, on the surface, self-contradictory. Lehrer says, “The most creative ideas, it turns out, don’t occur when we’re alone. Rather, they emerge from our social circles, from collections of acquaintances who inspire novel thoughts. Sometimes the most important people in life are the people we barely know” (Imagine, 204).

Earlier in Imagine, however, Lehrer discusses how many creative ideas arrive when people are taking morning showers—where most of us are presumably alone. So do creative ideas emerge from chatting with others, or when the mind is in a relaxed state that lets it make disparate connections among ideas? The answer appears to be “both,” but Lehrer doesn’t explicitly discuss the implied contradiction. I’m not saying he couldn’t reconcile the two, but someone should’ve pointed these kinds of contradictions out.

Even if all of Imagine’s research and stories are somehow wrong—and I don’t think they are—the book still offers novel ways to think about creativity and how to structure one’s life or work more effectively and in ways that I hadn’t foreseen. I wish the publisher hadn’t withdrawn it altogether. Used copies on Amazon now start at $25. Existing copies may thus continue to rise in value because of their scarcity; alternatively, readers might turn to pirate editions on the Internet, which I can only assume are easy enough to find (my book came from the University of Arizona’s library).

Don’t Go to Law School (Unless) — Paul Campos

Paul Campos saves what might be the best paragraph of Don’t Go To Law School (Unless): A Law Professor’s Inside Guide to Maximizing Opportunity and Minimizing Risk for the very end of the book, so I’m going to invert his structure and start with it:

Have you ever said to yourself, “I don’t know what to do with my life – so I’m going to spend three years of it going deeply and irreversibly into debt, in a quite possibly futile attempt to enter a profession that I have no actual desire to join?” I bet you haven’t, because who would ever say something that idiotic? Every year, however, thousands of people are perfectly capable of doing something that idiotic. If they weren’t, half the law schools in the country would be out of business tomorrow.

We’ve looked into the mirror and seen the enemy, and the enemy is ourselves. Sure, someone else might hand us the weapons we use to mutilate ourselves—that is, student loans—but someone who hands you a loaded gun isn’t obligating you to shoot yourself in the foot. Perhaps they shouldn’t have handed you the gun, but they did, and you can’t wholly blame that person for your mistake. It sure is more fun, however, to blame someone else for your mistakes than it is to stand up and say, “I’m an idiot and I’ve made bad life choices.”

I, however, am an idiot and made a bad life choice—I went based largely on bad assumptions, fear, stupid desire, anachronistic beliefs about the legal market, and various other factors I’d rather not examine in detail—but I quit law school after one year. The problems with law school are slowly becoming better known: “for more than 30 years now the market for legal services has been contracting relative to the rest of the economy.” The basic problem is that law schools have been raising tuition faster than the rate of inflation for decades, and the legal market is a well-defined and studied one: there are about twice as many credentialed lawyers being minted as there are jobs for them to enter.

You don’t have to be a mathematician to realize that some of those would-be lawyers are going to be left out. In the past, they would have been left with relatively little debt, which made arguments like “a law degree will open doors even if you don’t practice law” at least somewhat plausible and mildly tenable. Now those kinds of arguments aren’t. There are lots of common, bad reasons people go to law school: “Like a lot of other people, I went to law school because I couldn’t think of anything better to do. At the time I applied I was three years removed from my undergraduate days as a somewhat aimless English major,” and, though this may sound odd, law school itself doesn’t prepare people for practicing law.

That wasn’t really a problem when tuition was cheap and proto-lawyers could work cheaply for a couple years to learn the trade. Now the stakes are high and law school’s inadequacies are a huge problem, because having more than $100,000 in law school debt that can’t be discharged through bankruptcy will hurt people for decades, especially if they can’t get the training necessary to actually practice law. As Campos says, “The two most important practical skills that any lawyer working in private practice must possess are the ability to acquire clients, and to get them to pay their bills, which happens to be two things that most legal academics have never done in their lives.”

There’s little pressure, at least right now, to change the system, and little pressure on legal academics to learn these kinds of skills and impart them to their students. The only way I can see to create that kind of pressure is by convincing enough people not to go to law school that the schools themselves start receiving market pressure to reform. Without that pressure, they can simply continue.

Campos is a law professor who has spent the last year and a half writing about the problems of law schools on the blog Inside the Law School Scam, which is like porn for academic eggheads. It’s got lots of well-researched money shots. But, also like porn, too much of it all at once is enervating, and by now the larger point—don’t go to law school—is or should be well-known. For people considering law school, the only real question is binary: Should I go to law school? The answer is almost certainly “no.” For most people, ITLSS only needs to be read once: the problems of law schools are most pressing for law school insiders, not for the rest of us. We need to know that “most people currently attending law school would be better off not doing so.”

And it’s intellectually honest to admit as much: “I’ve become increasingly aware that my ridiculously good job is being paid for by people who are increasingly unable to get the kinds of jobs they came to law school to get.” But relatively few insiders are willing to admit that the systems they participate in and propagate aren’t good for outsiders. That’s one reason Campos’s book is so admirable. It also uses stories but eschews relying exclusively on them, focusing instead on money.

The more I pay attention to the world, the more I see how much money and financial constraints underlie a lot of surface phenomena. In an ideal world, money is a strong proxy for value; a company like Google or Apple is worth a lot because both provide a lot of value to people. The education world, however, has broken that link, and the breakage is getting worse with time.

I wonder how long it’s going to take until some law school decides to utterly reverse course and simply say that it’s going to have ugly buildings, a small library, huge class sizes, and very low tuition—say, $10,000 a year. Or $9,500. The professor-to-student ratio would be something like 1:100, and there’d be a dean and virtually no other administrative support or special programs. But this model would focus on being sustainable and making sure that students don’t face penury at the end of law school.
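That back-of-envelope arithmetic looks workable; the short sketch below runs the numbers from the hypothetical above (they describe no real school):

```python
# Back-of-envelope arithmetic for the hypothetical cheap law school above;
# nothing here describes any actual institution.
tuition_per_student = 10_000    # dollars per year, as suggested above
students_per_professor = 100    # the 1:100 ratio suggested above

revenue_per_professor = tuition_per_student * students_per_professor
print(f"${revenue_per_professor:,} per professor per year")  # $1,000,000
# Even after a generous salary, a dean's share, and building upkeep,
# the margin looks wide, which is what makes the model seem sustainable.
```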

Instead of competing with the current model that almost all law schools employ, this hypothetical school would do the opposite, and be proud of getting people a legal education for under $30,000 in total tuition, with a maximal focus on employability following graduation and a minimal focus on student loans (I’d also love to see open-source textbooks). It could advertise its alternate strategy, and maybe run a blog that explains the ways the conventional system is set up to screw students.

As far as I know, a couple of schools try the “admit everybody and charge them a lot” model, but few try the “admit many, but charge them a little” model. The notorious Thomas M. Cooley Law School does the former, and although no one who knows anything about law school will go there, it charges $54,000 per year right now. There might be institutional or ABA-imposed barriers that I’m unaware of. Still, if that kind of model were successful, it could at least challenge the hegemony of the Harvard-Yale-Stanford model of law school, which is untenable and getting worse.

See also “The specious reasoning in Lawrence E. Mitchell’s ‘Law School Is Worth the Money’” and “Why You Should Not Go to Law School.” Do not listen to your parents, for whom law school might’ve made financial sense, or to your friends’ empty congratulations, because most of your friends don’t know any better. Law school enrollments have plummeted since their 2008 high, for good reason.

Here’s an interview with a Columbia law grad who quit law for a coding bootcamp. Skipping law school would’ve made more sense, but news about how bad the legal market is relative to the tech sector has not percolated through the entire country (yet).

We all have value systems, even if dollars aren’t their main currency

In Robert Skidelsky’s EconTalk interview, he mentions that we get restless if we have nothing to do, and that a certain amount of insatiability appears built into the human condition. He’s talking about money, but the observation made me realize something: academics and intellectuals are restless and insatiable too, but their currency isn’t dollars: it’s citation counts and perceived intellectual influence. They aren’t (mostly) acquisitively forward-looking, but they are interested in writing more and more, in order to have a greater and greater reputation.

Skidelsky’s most recent book is How Much is Enough?: Money and the Good Life, in which he evidently discusses the idea of material-good saturation, a topic that I suspect is going to become more and more interesting over the course of my life. Most of us, as he points out (and as he notes Keynes pointed out before him), reach a point of diminishing returns with goods and many other things: having a working car is very valuable to many of us, but having a $100,000 car is less so. Having a computer is very valuable, but having the latest model is less so. Yet we’re still working quite hard for goods that may not be worth the effort.
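The diminishing-returns point can be made concrete with a toy model. The sketch below assumes logarithmic utility, a conventional economist’s stand-in for diminishing marginal utility (my choice for illustration, not Skidelsky’s or Keynes’s formulation):

```python
import math

# Toy concave utility function: each additional dollar of goods adds less
# satisfaction than the one before it.
def utility(dollars_of_goods: float) -> float:
    return math.log(dollars_of_goods)

for spend in (5_000, 20_000, 100_000):
    gain = utility(spend + 1_000) - utility(spend)
    print(f"at ${spend:,}, the next $1,000 adds {gain:.4f} units of utility")
# The gain falls from ~0.18 to ~0.05 to ~0.01 as spending rises.
```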

I leave it to the reader’s imagination to apply the diminishing-returns point to academics and intellectuals, for whom it seems there is never enough respect to go around.

Skidelsky’s point about work is especially interesting to me because I’ve been working, so to speak, to make the kind of “work” I do fun—at which point it’s not really onerous. I wonder if that kind of move is the future of work. We also get a certain amount of satisfaction from doing a thing well, and perhaps that satisfaction will continue to drive us, collectively, even when we no longer need to work as much as we do now.

A Jane Austen Education: How Six Novels Taught Me About Love, Friendship, and the Things That Really Matter — William Deresiewicz

I really like and admire A Jane Austen Education, despite agreeing with the younger Deresiewicz whom the older one mocks for believing sentiments like this one, about Jane Austen’s Emma: “The story seemed to consist of nothing more than a lot of chitchat among a bunch of commonplace characters in a country village. No grand events, no great issues, and, inexplicably for a writer of romance novels, not even any passion.” Deresiewicz is setting himself up to be knocked down, and yet when I read Emma I, too, was bored by the “chitchat” among the bumpkins.

But Deresiewicz goes on to explain why his younger self was totally wrong, and how he grew as a person through closely reading Jane Austen and applying her novels to his life experience. Though his explanation is persuasive, I still don’t buy it. To me, the characters in Emma are still “a pretty unpromising bunch of people to begin with, and then all they seemed to do was sit around and talk: about who was sick, who had had a card party the night before, who had said what to whom. Mr. Woodhouse’s idea of a big time was taking a stroll around the garden.” I usually call the ceaseless chatter without any action referent “empty status games,” because the games don’t refer to anything outside their immediate social situations (granted, it might also be that I don’t usually excel in them). These sorts of situations are akin to the ones Paul Graham describes in “Why Nerds Are Unpopular:”

I think the important thing about the real world is [that. . . ] it’s very large, and the things you do have real effects. That’s what school, prison, and ladies-who-lunch all lack. The inhabitants of all those worlds are trapped in little bubbles where nothing they do can have more than a local effect. Naturally these societies degenerate into savagery. They have no function for their form to follow.

Jane Austen’s societies obviously don’t degenerate into savagery—unless they’ve been transformed into Pride and Prejudice and Zombies (“Now with Ultraviolent Zombie Mayhem!”)—but their inhabitants do feel “trapped in little bubbles where nothing they do can have more than a local effect,” which makes them unsatisfying, at least to my temperament. Graham might also not be an ideal person to cite, given how much he admires Austen: “Everyone admires Jane Austen. Add my name to the list. To me she seems the best novelist of all time.” Still, strike me from the list: her style is amazing and her content vapid. Consider this description, also from Deresiewicz:

One whole chapter—Isabella had just brought her family home for Christmas—consisted entirely of aimless talk, as everyone caught up on one another’s news. For more than half a dozen pages, the plot simply came to a halt. But the truth was, for long stretches of the book there really wasn’t much plot to speak of.

Or this: “What could be duller, I thought, than a bunch of long, heavy novels, by women novelists, in stilted language, on trivial subjects?” There are much duller books—Beckett’s trilogy of Molloy, Malone Dies, and The Unnamable comes to mind, since those are novels written to make some philosophical statement about the meaninglessness of life or to give English professors a bone to gnaw on in scholarly papers—but the point stands. I’m not opposed to “women novelists,” and anyone who is on the grounds of perceived unimportance should try The Secret History and Gone Girl, but “long, heavy novels [. . .] on trivial subjects” are tedious regardless of their author’s gender.

Moreover, I’m not alone: “As it turned out, people had been reacting to Jane Austen exactly as I had for as long as they’d been reading her. The first reviews warned that readers might find her stories ‘trifling,’ with ‘no great variety,’ ‘extremely deficient’ in imagination and ‘entirely devoid of invention,’ with ‘so little narrative’ that it was hard to even describe what they were about.” At some level, as happens with much art, a preference for Austen may come down to temperament, and to what a person believes about what The Novel or a novel should do. I’ve never been able to get into novels that don’t have some kind of narrative drive or energy—both vague terms that I could spend the rest of this essay describing, or, rather, trying to describe—and, like Lev Grossman, I think “Plot makes perverts of us all:”

A good story is a dirty secret that we all share. It’s what makes guilty pleasures so pleasurable, but it’s also what makes them so guilty. A juicy tale reeks of crass commercialism and cheap thrills. We crave such entertainments, but we despise them.

For a century now, however, if not longer, literary culture has been bifurcating into high-culture, non-plot types, who inhabit universities and book reviews and institutions, and common readers, who like something to happen and maybe some T&A or depraved longings in their fiction, even if the language used for the T&A and depraved longings isn’t very interesting. Most of us are taught that long, tedious books written in stilted language are more valuable than those that do the opposite.

To be sure, I don’t think the people who genuinely love Austen have been academically brainwashed—I think they authentically love her writing—but the original reviewers and the younger Deresiewicz have a point too, and that point is mostly drowned out in school-based settings.

At the time Deresiewicz had his Austen breakthrough, he was seeing a waitress, and they “had little in common and had never progressed beyond the sex. She was gorgeous, bisexual, impulsive, experienced, with a look that knew things and a laugh that didn’t give a damn.” Perhaps this is a function of my being in my 20s, but this arrangement doesn’t sound so bad, and, having dated the equivalent woman, I rather enjoyed those things at the time. Furthermore, I don’t think such relationships are wrong—though I would also say, obviously, that they’re not the only kind of relationship available, or the only kind a person should have over the course of their life. Sometimes people eat fast food; other times they dine in fine restaurants, or at the Cheesecake Factory, or cook for themselves, or cook with another person, or cook simple foods, or complex ones, or have potlucks. I leave it to you to map that metaphor onto sexuality and relationships, but the point about variety is useful. For Deresiewicz, “Austen taught me a new kind of moral seriousness—taught me what moral seriousness really means. It means taking responsibility for the little world, not the big one. It means taking responsibility for yourself.” But people who are always morally serious can also be dull, just as people who are never morally serious are often unintentionally cruel.

The trick is being able to distinguish the two, and to find a middle way, and to develop some self-awareness, which is hard for many if not most of us. Certainly it was hard for Deresiewicz’s younger self:

If you’re oblivious to other people, chances are pretty good that you’re going to hurt them. I knew now that if I was ever going to have any real friends—or I should say, any real friendships with my friends—I’d have to do something about it. I’d have to learn to stop being a defensive, reactive, self-enclosed jerk.

On the other hand, being oblivious to other people sometimes means being very tuned into technical or other problems that need solving—for the best example of this I’ve seen in literature, consider Lawrence Waterhouse in Cryptonomicon, who is shockingly oblivious, essential to the Allied war effort, and a man who extends cryptography itself. It should also be noted that he’s not intentionally mean to others, and in the novel no one is emotionally hurt by him in an obvious fashion, but the depiction of his thought process as an engineer / mathematician seems pretty accurate. You get moments like this: “In particular, the final steps of the organist’s explanation were like a falcon’s dive through layer after layer of pretense and illusion, thrilling or sickening or confusing depending on what you were. The heavens were riven open. Lawrence glimpsed choirs of angels ranking off into geometrical infinity,” perhaps in exchange for attention to other people. To what extent are dispositions trade-offs? It’s a decent question, I think, but also one I can’t really answer.

Asking questions I can’t answer is, however, the kind of thing I’m encouraged to do; in one moment, Deresiewicz praises the kind of professor we all hope to have: “When my professor asked a question, it wasn’t because he wanted us to get or guess ‘the’ answer; it was because he hadn’t figured out an answer yet himself, and genuinely wanted to hear what we had to say.” This is what I try to do in the classroom, although I’m guessing this kind of strategy works better for humanities students than for, say, math students, for whom the answer or answers are well-known, at least up to a fairly high level.

There are also intellectual surprises in A Jane Austen Education, and those surprises made me realize things I didn’t before:

Popular music is one giant shout of desire, one great rallying cry for freedom and pleasure. Pop psychology sends us the same signals, and so does advertising. ‘Trust your feelings,’ we are told. ‘Listen to your heart.’ ‘If it feels good, do it.’

And if everything is pointing you in one direction, it might be time to ask what lies in the other. Literature seems to ask this question. Pop music, as Deresiewicz points out, doesn’t. In Deresiewicz’s rendition, Austen herself was reacting against her time, which is to be commended:

Austen lived in the great age of trash fiction: the gothic novel, the sentimental novel, the bodice ripper—crumbling castles, creaking doors, and secret passageways; heavenly maidens and dark seducers, piercing shrieks and floods of tears, wild rides and breathless escapes; shipwrecks, deathbeds, abductions, avowals; poverty, misery, rape, and incest.

In other words, she lived in “the great age” of all the good stuff, though I would argue that the good stuff is still with us if we know where to look—I’m pretty sure Game of Thrones has every element in the Deresiewicz list.

Some weird stylistic quirks recur in the book, like the habit of “Austen was showing me” or “Austen was saying”-style constructions (“I could grow up and find happiness, Austen was letting me know, but only if I was willing to give up something very important” or “Austen taught me a new kind of moral seriousness—taught me what moral seriousness really means” or “Austen understood that kids are going to make mistakes, and she also understood that making mistakes is not the end of the world”). But the book’s overall effectiveness is tremendous, and not only because I might be a major component of Deresiewicz’s target audience: self-absorbed people who secretly think they have the answers other people lack.

Thinking about the process of being an artist and a writer: Lessons from David Galenson’s Old Masters and Young Geniuses

David Galenson’s Old Masters and Young Geniuses: The Two Life Cycles of Artistic Creativity is the rare academic book that’s also useful for artists—most academic books are as useful for artists as syphilis is for prostitutes (the metaphor is intentionally gross, as it’s designed to express the artist’s reaction to turgid academic books).* This long quote encapsulates Galenson’s main point:

There have been two very different types of artist in the modern era. These two types are distinguished not by their importance, for both are prominently represented among the greatest artists of the era. They are distinguished instead by the methods by which they arrive at their major contributions. In each case their method results from a specific conception of artistic goals, and each method is associated with specific practices in creating art. I call one of these methods aesthetically motivated experimentation, and the other conceptual execution.

Artists who have produced experimental innovations have been motivated by aesthetic criteria: they have aimed at presenting visual perceptions. Their goals are imprecise, so their procedure is tentative and incremental. The imprecision of their goals means that these artists rarely feel they have succeeded, and their careers are consequently often dominated by the pursuit of a single objective. These artists repeat themselves, painting the same subject many times, and gradually changing its treatment in an experimental process of trial and error. Each work leads to the next, and none is generally privileged over others, so experimental painters rarely make specific preparatory sketches or plans for a painting. They consider the production of a painting as a process of searching, in which they aim to discover the image in the course of making it; they typically believe that learning is a more important goal than making finished paintings. Experimental artists build their skills gradually over the course of their careers, improving their work slowly over long periods. These artists are perfectionists and are typically plagued by frustration at their inability to achieve their goals.

In contrast, artists who have made conceptual innovations have been motivated by the desire to communicate specific ideas or emotions. Their goals for a particular work can usually be stated precisely, before its production, either as a desired image or as a desired process for the work’s execution. Conceptual artists consequently often make detailed preparatory sketches or plans for their paintings. Their execution of their painting is often systematic, since they may think of it as primarily making a preconceived image, and often simply a process of transferring an image they have already created from one surface to another. Conceptual innovators appear suddenly, as a new idea immediately produces a result quite different not only from other artists’ work, but also from the artist’s own previous work. Because it is the idea that is the contribution, conceptual innovations can usually be implemented immediately and completely, and therefore are often embodied in individual breakthrough works that become recognized as the first statement of the innovation.

Malcolm Gladwell steals much of Galenson’s work for his article “Late Bloomers: Why do we equate genius with precocity?” I say “steals” because Gladwell’s treatment doesn’t go very far beyond Galenson’s. That might be overwrought, but I still find it mostly true. Gladwell, however, does cite Galenson, which is how I found Old Masters.

I tend more towards the experimental mode: I rarely feel that I’ve succeeded, per se, although I am committed to finishing works—largely because I’ve discovered that finishing is essential to any artist, and one way to separate posers, of whom there are many, from people with real potential is to see if they have something they can show: a story, a picture, a song, whatever—no matter how bad. Then see if they produce something else. I also often repeat themes about growing up, the possibility of real friendship (especially between men and women), the power and estrangement of metaphor, and how to have an artistic temperament that nonetheless is rigorous and interested in understanding the world. I think so, anyway, although it’s naturally hard to judge one’s own works: perhaps someone else would derive different ideas.

I do, however, “tend to make specific preparatory sketches or plans” when I write, more so than I used to, but I’m not bound by them, and those plans tend to be discarded about midway through a novel. Some writers apparently make very elaborate plans that they then simply execute; I am not one of them, and I do feel very much like I am in “a process of searching” and of discovery, with the discovery being quite pleasurable. In most of my novels, I want to tell a story—I am not as interested in being able to express or communicate “specific ideas or emotions.” Emotions are the reader’s responsibility. Most of the time I start with characters and/or situations and want to see what might happen when those characters or situations develop. Writers who seem highly conceptual and not very interested in narrative, like Joyce, Pynchon, Morrison, and DeLillo, are in turn not very interesting to me; they seem bloodless and dull, whatever their virtuosity with language. Unfortunately, they also occupy the academic high ground at the moment, perhaps because their methods and output lend themselves more easily to abstruse literary articles.

Writers like Robertson Davies, Elmore Leonard, (parts of) Tom Wolfe, and (parts of) Francine Prose are of much more interest. Someone like Philip Roth falls in the middle, but to me many of his novels become dull when their characters get bogged down in family or identity or political dilemmas (think of Sabbath in Sabbath’s Theater). In addition, there are very few writers whose entire oeuvres I like (Davies is an exception); most of the time I like particular books, or one or two books. Umberto Eco’s novels The Name of the Rose and Foucault’s Pendulum have not been matched, not even close, by anything else he’s done; ditto for Neal Stephenson’s Cryptonomicon, or Richard Russo’s Straight Man and Empire Falls. Martin Amis seems to me to be at the peak of his powers with Money, and nothing else he’s written that I’ve read has the same appeal.

Galenson also sees conceptual innovators as tending to peak when they’re younger. I wonder if this is also related to something Doris Lessing discussed in her Nobel Lecture:

Let us now jump to an apparently very different scene. We are in London, one of the big cities. There is a new writer. We cynically enquire: “Is she good-looking?” If this is a man: “Charismatic? Handsome?” We joke, but it is not a joke.

This new find is acclaimed, possibly given a lot of money. The buzzing of hype begins in their poor ears. They are feted, lauded, whisked about the world. Us old ones, who have seen it all, are sorry for this neophyte, who has no idea of what is really happening. He, she, is flattered, pleased. But ask in a year’s time what he or she is thinking: “This is the worst thing that could have happened to me.”

Some much-publicised new writers haven’t written again, or haven’t written what they wanted to, meant to. And we, the old ones, want to whisper into those innocent ears: “Have you still got your space? Your soul, your own and necessary place where your own voices may speak to you, you alone, where you may dream. Oh, hold on to it, don’t let it go.”

Perhaps this happens chiefly because the feted young writers are conceptual innovators who have run out of concepts they wish to explore. If fame and critical praise eventually come my way—not likely, and not something I spend a lot of time thinking about, but the idea arose in the course of writing this—I don’t think they would affect me very much. I would still probably spend a lot of time reading and writing, and going running, and so on. I don’t think I’d want to buy a boat, or believe the flattering lies I’d sometimes hear, or perceive myself as literature’s New Jesus.

It’s also possible that artistic innovators are becoming relatively older than they once were, thanks to increases in the artistic search space. Benjamin Jones sees this happening in scientific and technical leaders in “Age and Great Invention:”

Great achievements in knowledge are produced by older innovators today than they were a century ago. Using data on Nobel Prize winners and great inventors, I find that the mean age at which noted innovations are produced has increased by 6 years over the 20th Century. I estimate shifts in life-cycle productivity and show that innovators have become especially unproductive at younger ages. Meanwhile, the later start to the career is not compensated for by increasing productivity beyond early middle age.

It’s also not clear or obvious to me to what extent cultures and societies affect artistic and technical innovations. I do suspect the Internet allows these to spread more rapidly, but beyond that somewhat obvious point I don’t have any other useful, or possibly useful, observations. There’s a strong artistic culture of borrowing and adapting ideas that pays off, especially for Galenson’s conceptual innovators, and it may also pay off for his experimental innovators, who can more easily access works and ideas to react against in creating their own works. It does seem like artists are very good at “questioning, experimenting, observing, associating and networking,” to use Steve Lohr’s phrase, with that last one being associated with broader fame and the dissemination of one’s ideas to others. Galenson even mentions this:

Rapid borrowing and utilization of new artistic devices, across ever wider geographic areas, has become increasingly common in recent decades, in which conceptual approaches to art have predominated. One indication of this progressive globalization of modern art is that art historians are finding that they are no longer able to divide their subject as neatly along geographic lines as in the past.

But I don’t like conceptual visual art very much: most of it looks facile and superficial to me—exactly the claims that Galenson said tend to be made against such art. The Museum of Modern Art in New York was particularly disappointing: a lot of the supposed artists there were trying to be sexually shocking, but they still have nothing on what one can find online. A lot of their work also simply seemed random. An iMac or a Mercedes C-Class never seems random. Perhaps modern artists only have to please a small coterie of art insiders, while industrial designers have to please people who want to see and use beautiful objects, not random ones.

Another note on art and age: many people who are programmers / hackers make their greatest technical contributions when they’re young—think of Bill Joy, Bill Gates, Linus Torvalds (who created the operating system that bears his name in 1991, while he was a 21-year-old student), Mark Zuckerberg, or the general cult of the young hacker genius. This might be because computer programming is a relatively young field, and it’s still relatively easy for people without a lot of formal training to make major contributions to it at an early age. There are also other effects related to Moore’s Law, the Internet, and so on, but I still find the young age of many major contributors intriguing. It’s possible that people in their 40s or older have made major contributions that I’m simply not aware of, and that the press’s obsession with youth means I’m drawing on an unrepresentative sample, because the examples I can come up with are only the salient ones.

Galenson shouldn’t be considered the final word in artistic methods or outcomes, and he knows that his binary is not absolute (“it may be useful to consider the experimental-conceptual distinction not simply as a binary categorization, but rather as a quantitative difference. In this view there is a continuum, with extreme practitioners of either type at the far ends, and moderate practitioners of the two categories arrayed along the intermediate positions of the scale”). Nonetheless, Galenson offers a useful framework for considering how different people with different sorts of artistic temperaments tend to work. I would also add that he can only categorize artists who have actually finished work. Those who start many works and finish none presumably never achieve the fame that would be necessary for him to discuss.

Many artists probably don’t need or want a meta-awareness of their processes. Still, I don’t think anyone who is any kind of artist fails to think at all about how they do what they do, or how their processes might affect their outcomes. Some, however, publicly say that they just follow their feelings, or that they go into a kind of trance. When artists say things like that, they’re probably being partially truthful, but they could start asking: where do feelings come from, and how do I translate feelings that begin as chemicals or electrical impulses in the brain to colors or words? What’s the nature of the artistic trance? But they don’t ask those questions, or, if they do, they don’t share the answer publicly. That’s okay, but it strikes me as deliberate mystification (they’d probably see my relatively high level of awareness as false, as a set of intellectual pretenses masquerading as method).

Nor is one kind of artist necessarily better than the other: notice that I have said I have tendencies towards being experimental more than conceptual, but that doesn’t mean I would denigrate conceptual artists.

Other interesting moments from Old Masters:

“[A]rtistic innovations are not made by isolated geniuses, but are usually based on the lessons of teachers and the collaboration of colleagues.”

“What appears to be necessary for radical conceptual innovation is not youth, but an absence of acquired habits of thought that inhibit sudden departures from existing conventions.”

“Experimental movie directors typically stress the importance of telling a story, with a clear narrative. They generally consider visual images the most important element of a movie, with the script and sound track used to support the images. Many experimental directors specifically state that their primary goal is to entertain the audience, and they often take commercial success to be a sign of their achievement of that goal. Experimental directors typically aim to make the technical aspects of their movies unobtrusive, for they usually believe that the purpose of technique is to create an illusion of reality.”


* Galenson also wrote Conceptual Revolutions in Twentieth-Century Art, which might be interesting to visual artists; I haven’t read it, because I don’t find paintings and other non-cinematic forms of visual art compelling for consumption, let alone production.

Thinking about the process of being an artist and a writer: Lessons from David Galenson's Old Masters and Young Geniuses

David Galenson’s Old Masters and Young Geniuses: The Two Life Cycles of Artistic Creativity is the rare academic book that’s also useful for artists—most academic books are as useful for artists as syphilis is for prostitutes (the metaphor is intentionally gross, as it’s designed to express the artist’s reaction to turgid academic books).* This long quote encapsulates Galenson’s main point:

There have been two very different types of artist in the modern era. These two types are distinguished not by their importance, for both are prominently represented among the greatest artists of the era. They are distinguished instead by the methods by which they arrive at their major contributions. In each case their method results from a specific conception of artistic goals, and each method is associated with specific practices in creating art. I call one of these methods aesthetically motivated experimentation, and the other conceptual execution.

Artists who have produced experimental innovations have been motivated by aesthetic criteria: they have aimed at presenting visual perceptions. Their goals are imprecise, so their procedure is tentative and incremental. The imprecision of their goals means that these artists rarely feel they have succeeded, and their careers are consequently often dominated by the pursuit of a single objective. These artists repeat themselves, painting the same subject many times, and gradually changing its treatment in an experimental process of trial and error. Each work leads to the next, and none is generally privileged over others, so experimental painters rarely make specific preparatory sketches or plans for a painting. They consider the production of a painting as a process of searching, in which they aim to discover the image in the course of making it; they typically believe that learning is a more important goal than making finished paintings. Experimental artists build their skills gradually over the course of their careers, improving their work slowly over long periods. These artists are perfectionists and are typically plagued by frustration at their inability to achieve their goals.

In contrast, artists who have made conceptual innovations have been motivated by the desire to communicate specific ideas or emotions. Their goals for a particular work can usually be stated precisely, before its production, either as a desired image or as a desired process for the work’s execution. Conceptual artists consequently often make detailed preparatory sketches or plans for their paintings. Their execution of their painting is often systematic, since they may think of it as primarily making a preconceived image, and often simply a process of transferring an image they have already created from one surface to another. Conceptual innovators appear suddenly, as a new idea immediately produces a result quite different not only from other artists’ work, but also from the artist’s own previous work. Because it is the idea that is the contribution, conceptual innovations can usually be implemented immediately and completely, and therefore are often embodied in individual breakthrough works that become recognized as the first statement of the innovation.

Malcolm Gladwell steals much of Galenson’s work for his article “Late Bloomers: Why do we equate genius with precocity?” I say “steals” because Gladwell’s treatment doesn’t go very far beyond Galenson’s. That might be overwrought, but I still find it mostly true. Gladwell, however, does cite Galenson, which is how I found Old Masters.

I tend more towards the experimental mode: I rarely feel that I’ve succeeded, per se, although I am committed to finishing works, largely because I’ve discovered that finishing is essential to any artist. One way to separate posers, of whom there are many, from people with real potential is to see whether they have something they can show: a story, a picture, a song, whatever, no matter how bad. Then see if they produce something else. I also often repeat themes: growing up, the possibility of real friendship (especially between men and women), the power and estrangement of metaphor, and how to have an artistic temperament that is nonetheless rigorous and interested in understanding the world. At least I think those are my themes; it’s naturally hard to judge one’s own works, and perhaps someone else would derive different ideas.

I do, however, “tend to make specific preparatory sketches or plans” when I write, more so than I used to, but I’m not bound by them, and those plans tend to be discarded about midway through a novel. Some writers apparently make very elaborate plans that they then simply execute; I am not one of them. I feel very much like I am in “a process of searching” and of discovery, and the discovery is quite pleasurable. In most of my novels I want to tell a story; I am not as interested in expressing or communicating “specific ideas or emotions.” Emotions are the reader’s responsibility. Most of the time I start with characters and/or situations and want to see what might happen as those characters or situations develop. Writers who seem highly conceptual and not very interested in narrative, like Joyce, Pynchon, Morrison, and DeLillo, are in turn not very interesting to me; they seem bloodless and dull, whatever their virtuosity with language. Unfortunately, they also occupy the academic high ground at the moment, perhaps because their methods and output lend themselves more easily to abstruse literary articles.

Writers like Robertson Davies, Elmore Leonard, (parts of) Tom Wolfe, and (parts of) Francine Prose are of much more interest to me. Someone like Philip Roth falls in the middle, but to me many of his novels become dull when their characters get bogged down in family or identity or political dilemmas (think of Sabbath in Sabbath’s Theater). In addition, there are very few writers whose entire oeuvres I like (Davies is an exception); most of the time I like particular books, or one or two books. Umberto Eco’s novels The Name of the Rose and Foucault’s Pendulum have not been matched, not even close, by anything else he’s done; ditto for Neal Stephenson’s Cryptonomicon, or Richard Russo’s Straight Man and Empire Falls. Martin Amis seems to me to be at the peak of his powers in Money, and nothing else he’s written that I’ve read has the same appeal.

Galenson also sees conceptual innovators as tending to peak when they’re younger. I wonder if this is also related to something Doris Lessing discussed in her Nobel Lecture:

Let us now jump to an apparently very different scene. We are in London, one of the big cities. There is a new writer. We cynically enquire: “Is she good-looking?” If this is a man: “Charismatic? Handsome?” We joke, but it is not a joke.

This new find is acclaimed, possibly given a lot of money. The buzzing of hype begins in their poor ears. They are feted, lauded, whisked about the world. Us old ones, who have seen it all, are sorry for this neophyte, who has no idea of what is really happening. He, she, is flattered, pleased. But ask in a year’s time what he or she is thinking: “This is the worst thing that could have happened to me.”

Some much-publicised new writers haven’t written again, or haven’t written what they wanted to, meant to. And we, the old ones, want to whisper into those innocent ears: “Have you still got your space? Your soul, your own and necessary place where your own voices may speak to you, you alone, where you may dream. Oh, hold on to it, don’t let it go.”

Perhaps this happens chiefly because the feted young writers are conceptual innovators who have run out of concepts they wish to explore. If I ever achieve fame and critical praise—not likely, and not something I spend a lot of time thinking about, but the idea arose in the course of writing this—I don’t think it would affect me very much. I would still probably spend a lot of time reading and writing, and going running, and so on. I don’t think I’d want to buy a boat, or believe the flattering lies I’d sometimes hear, or perceive myself as literature’s New Jesus.

It’s also possible that artistic innovators now peak later than they once did, thanks to the growth of the artistic search space. Benjamin Jones sees this happening among scientific and technical leaders in “Age and Great Invention”:

Great achievements in knowledge are produced by older innovators today than they were a century ago. Using data on Nobel Prize winners and great inventors, I find that the mean age at which noted innovations are produced has increased by 6 years over the 20th Century. I estimate shifts in life-cycle productivity and show that innovators have become especially unproductive at younger ages. Meanwhile, the later start to the career is not compensated for by increasing productivity beyond early middle age.

It’s also not clear to me to what extent cultures and societies affect artistic and technical innovation. I do suspect the Internet allows innovations to spread more rapidly, but beyond that somewhat obvious point I don’t have any useful observations. There’s a strong artistic culture of borrowing and adapting ideas that pays off, especially for Galenson’s conceptual innovators, and it may also pay off for his experimental innovators, who can more easily access works and ideas to react against in creating their own. It does seem that artists are very good at “questioning, experimenting, observing, associating and networking,” to use Steve Lohr’s phrase, with the last being associated with broader fame and the dissemination of one’s ideas to others. Galenson even mentions this:

Rapid borrowing and utilization of new artistic devices, across ever wider geographic areas, has become increasingly common in recent decades, in which conceptual approaches to art have predominated. One indication of this progressive globalization of modern art is that art historians are finding that they are no longer able to divide their subject as neatly along geographic lines as in the past.

But I suspect I don’t like conceptual visual art very much: most of it looks facile and superficial to me—exactly the claims that Galenson says tend to be made against such art. The Museum of Modern Art in New York was particularly disappointing: a lot of its supposed artists were trying to be sexually shocking, but they have nothing on what one can find online. A lot of the work also simply seemed random. An iMac or a Mercedes C-Class never seems random. Perhaps modern artists only have to please a small coterie of art insiders, while industrial designers have to please people who want to see and use things that are beautiful, not random.

Another note on art and age: many programmers / hackers make their greatest technical contributions when they’re young—think of Bill Joy, Bill Gates, Linus Torvalds (who created the operating system that bears his name in 1991, as a 21-year-old student), Mark Zuckerberg, or the general cult of the young hacker genius. This might be because computer programming is a relatively young field, and it’s still relatively easy for people without much formal training to make major contributions to it at an early age. There are also other effects related to Moore’s Law, the Internet, and so on, but I still find the youth of many major contributors intriguing. It’s possible that people in their 40s or older have made major contributions that I’m simply not aware of, and that the press’s obsession with youth means I’m drawing on an unrepresentative sample: the examples I can come up with are only the salient ones.

Galenson shouldn’t be considered the final word on artistic methods or outcomes, and he knows that his binary is not absolute (“it may be useful to consider the experimental-conceptual distinction not simply as a binary categorization, but rather as a quantitative difference. In this view there is a continuum, with extreme practitioners of either type at the far ends, and moderate practitioners of the two categories arrayed along the intermediate positions of the scale”). Nonetheless, Galenson offers a useful framework for considering how people with different artistic temperaments tend to work. I would also add that he can only categorize artists who have actually finished work; those who start many works and finish none presumably never achieve the fame that would be necessary for him to discuss them.

Many artists probably don’t need or want a meta-awareness of their processes. Still, I don’t think anyone who is any kind of artist fails to think at all about how they do what they do, or how their processes might affect their outcomes. Some, however, publicly say that they just follow their feelings, or that they go into a kind of trance. When artists say things like that, they’re probably being partially truthful, but they could start asking: where do feelings come from, and how do I translate feelings that begin as chemicals or electrical impulses in the brain to colors or words? What’s the nature of the artistic trance? But they don’t ask those questions, or, if they do, they don’t share the answer publicly. That’s okay, but it strikes me as deliberate mystification (they’d probably see my relatively high level of awareness as false, as a set of intellectual pretenses masquerading as method).

Nor is one kind of artist necessarily better than the other: notice that I have said I have tendencies towards being experimental more than conceptual, but that doesn’t mean I would denigrate conceptual artists.

Other interesting moments from Old Masters:

“[A]rtistic innovations are not made by isolated geniuses, but are usually based on the lessons of teachers and the collaboration of colleagues.”

“What appears to be necessary for radical conceptual innovation is not youth, but an absence of acquired habits of thought that inhibit sudden departures from existing conventions.”

“Experimental movie directors typically stress the importance of telling a story, with a clear narrative. They generally consider visual images the most important element of a movie, with the script and sound track used to support the images. Many experimental directors specifically state that their primary goal is to entertain the audience, and they often take commercial success to be a sign of their achievement of that goal. Experimental directors typically aim to make the technical aspects of their movies unobtrusive, for they usually believe that the purpose of technique is to create an illusion of reality.”


* Galenson also wrote Conceptual Revolutions in Twentieth-Century Art, which might be interesting to visual artists; I haven’t read it, because I don’t find paintings and other non-cinematic forms of visual art compelling for consumption, let alone production.

Tina Fey’s Bossypants and its relationship to James Fallows’ Breaking the News

This passage appears in Tina Fey’s memoir / how-to guide Bossypants:

And Oh, the Cable News Reportage! The great thing about cable news is that they have to have something to talk about twenty-four hours a day. Sometimes it’s Anderson Cooper giggling with one of the Real Housewives of Atlanta. Sometimes it’s Rick Sanchez screaming about corn syrup. They have endless time to fill, but viewers get kind of ‘bummed out’ if they supply actual information about wars and stuff, so ‘Media Portrayal of Sarah Palin’ and SNL and I became the carrageenan in America’s news nuggets for several weeks. I was a cable news star, like a shark or a missing white child!

The downside of being a cable news star is that any ass-hair with a clip-on tie can come on as an ‘expert’ to talk about you. One day, by accident, I caught this tool Tom something on MSNBC saying that he thought I had not ‘conducted myself well’ during all this. In his opinion, Mrs. Palin had conducted herself with dignity and I had not. (I’m pretty sure Tom’s only claim to expertise is that he oversees a website where people guess incorrectly about who might win show biz awards.) There was a patronizing attitude behind Tom’s comments that I certainly don’t think he would have applied to a male comedian. Chris Rock was touring at the time and he was literally calling George W. Bush ‘retarded’ in his act. I don’t think Tom something would have expressed disappointment that Chris was not conducting himself sweetly. I learned how incredibly frustrating it is to watch someone talk smack about you and not be able to respond.

I love the word “reportage,” which sounds like “personage” and bears the same relationship to real news or reports that McDonald’s does to real food with real nutritional value. And the phrase “wars and stuff” lets Fey drop into the mindset of a network executive, perhaps just a few years out from his or her MBA, who is trying to decide what might maximize revenue this quarter. Answer: sharks, missing white girls, and fake controversy. We don’t need any stuff about wars, tough compromises, or deep trends! Let’s dazzle them with superficial bullshit, which a subset of viewers really likes, and hope no one notices what we’re not covering!

(Unfortunately, this works because we, collectively, don’t demand better. But that’s a subject for another time.)

Fey’s critique is close to James Fallows’ in Breaking the News: How the Media Undermine American Democracy. Fey is being funny and Fallows serious, and Fey is dealing with a media environment a decade and change later than the one Fallows describes, but on a basic level that environment has barely changed. If anything, the explosion in cable news has made it worse in many ways, with only a handful of exceptions (The Daily Show, which fights the dumbest parts of the contemporary media, or the coverage of Trayvon Martin’s murder). The net result is that Americans lose confidence in the institutions that are supposed to serve us. The responsibility is partially ours, but it’s also partially that of the people who nominally serve us.

Everyone who pays attention to the media knows it’s broken, and that the brokenness seems to have seeped into the larger culture as a form of blanket cynicism and condemnation. I don’t have a strong sense of how to reverse this dynamic, save perhaps on an individual level.

See also David Brin on how an idea has, over the last twenty years, become “fundamental dogma to millions of Americans”: “The notion that assertions can trump facts.” I wonder if the Western world’s enormous wealth insulates people from the potential consequences of their beliefs; very few people die or are seriously injured as a result of dumb beliefs based on erroneous or completely absent information. In other words, it’s now much cheaper to believe nonsense.

On a separate and more pleasant note, Fallows’ new book, China Airborne, will be published on May 15. In addition, Bossypants itself is funny throughout. Samples:

* “Politics and prostitution have to be the only jobs where inexperience is considered a virtue. In what other profession would you brag about not knowing stuff? ‘I’m not one of those fancy Harvard heart surgeons. I’m just an unlicensed plumber with a dream and I’d like to cut your chest open.’ The crowd cheers.”

* “In 1997 I flew to New York from Chicago to interview for a writing position at Saturday Night Live. It seemed promising because I’d heard the show was looking to diversify. Only in comedy, by the way, does an obedient white girl from the suburbs count as diversity.”

* “I feel about Photoshop the way some people feel about abortion. It is appalling and a tragic reflection on the moral decay of our society . . . unless I need it, in which case, everybody be cool.”

* “If you are a woman and you bought this book for practical tips on how to make it in a male-dominated workplace, here they are. No pigtails, no tube tops. Cry sparingly. (Some people say ‘Never let them see you cry.’ I say, if you’re so mad you could just cry, then cry. It terrifies everyone.) When choosing sexual partners, remember: Talent is not sexually transmittable. Also, don’t eat diet foods in meetings.”

Worthless: The Young Person’s Indispensable Guide to Choosing the Right Major — Aaron Clarey

A lot of the content but little of the rhetoric in Worthless can be found in articles like Jordan Weissmann’s “53% of Recent College Grads Are Jobless or Underemployed—How? A college diploma isn’t worth what it used to be. To get hired, grads today need hard skills,” which says:

not all degrees are created equal [. . . graduates in the] sciences or other technical fields, such as accounting, were much less likely to be jobless or underemployed than humanities and arts graduates. You know that old saw about how college is just about getting a fancy piece of paper?

Weissmann is right; Clarey is right in places too, but even when he is mostly right, he overstates his case. The American education system has become like the proverbial elephant described by blind men: one touches its tusk, another its trunk, a third its legs, and a fourth its back, and each proclaims that he understands the essential shape of the elephant, while none of them sees the whole.

Derek Thompson describes the elephant problem in “The Value of College Is: (a) Growing (b) Flat (c) Falling (d) All of the Above.” The right answer is “d,” but even if the value of college is falling, going is still an improvement, for most people, over not going. More people should probably major in science, technology, engineering, and math, as Clarey writes, but if your choice is between entering the workforce straight from high school and going to college for a communications or English degree, which is more valuable? To be sure, more marginal college candidates should consider vocational education, as Clarey also says.

In an early passage, Clarey—who used to teach at a college—asks students to list what they want to buy. Most say gas, cars, or gadgets. He goes on to say that “there was a huge mismatch between what people wanted and what they were studying.” He’s partially correct, but he neglects to note that many people say they want one thing and then spend money on something else.

In the United States, for example, government expenditures consumed about 42% of GDP in 2012. Regardless of what this group of students says it wants, voters in the aggregate want relatively high levels of government spending—and they get it. None of the students mentioned “thousands of dollars in subsidized debt,” even though many if not most are taking it on. None mentioned health care, either, even though health care consumes a growing percentage of GDP. Clarey writes, “There was also no shortage of psychology majors, but not one person ever listed ‘therapy’ on their wish list.” But few people wish to admit in public, or to their instructor, that they want or need therapy, which doesn’t signal reproductive or intellectual fitness. The quoted sentence doesn’t need the word “also,” which appears again in the awkward first sentence of the preceding paragraph: “Also ironic was how there were so many sociology majors, but not one person listed ‘social work’ in their wish list.”

While I agree with part of the larger point—you should think about how the things you want to consume match the things you are learning how to produce, and you should focus on making things that people want—people don’t always know what they want, or what they’ll pay for, and what they say they want and what they actually buy are often quite different. Whenever possible, look at observed rather than reported behavior. Americans are willing to say that buying American products is important to them, but very few take country-of-origin into account in actual purchases. Pay attention to those gaps. In his example, Clarey doesn’t.

There also appears to be a growing dynamic in this country by which people who work in highly competitive tradable sectors, like software and finance, support a large and growing non-tradable sector (baristas, yoga teachers, people dependent on Social Security / Medicare). Like any trend, this one might change, but it might also lead to the kinds of problems Tyler Cowen describes in The Great Stagnation.

Clarey writes that “You will inevitably work eight hours a day for 30-40 years. This will be, hands down, the single biggest plurality of your conscious time on this planet.” There are a couple of problems with this description. First, not everyone works eight hours a day for 30-40 years. As Paul Graham observes in “How to Make Wealth,” “Economically, you can think of a startup as a way to compress your whole working life into a few years. Instead of working at a low intensity for forty years, you work as hard as you possibly can for four.” Beyond that, if you’re the kind of person who doesn’t spend a lot of money, you could conceivably work a normal job for a shorter period and then do something else; personally, I’d find idleness dull, but I suppose some people like it, or the idea of it. Stylistically, notice the cliche “hands down”: it adds nothing to the sentence. And notice the phrase “biggest plurality of your conscious time.” I’m not really sure how conscious time gets divided into pluralities; the usage note in the Oxford American Dictionary distinguishes plurality from majority by saying, “A plurality is the largest number among three or more.” But what are the other portions of your conscious time? Clarey doesn’t say.

There are other moments of overstatement—like the next page, where Clarey describes how you will be working and then says “How enjoyable and rewarding all of this is boils down to one simple decision – what are you going to major in?” Leaving aside the further use of cliche, I’m not convinced this is true: many if not most people end up working in fields unrelated to their majors. I suspect that the pleasure or lack thereof in one’s work life depends on temperament, attitude, motivation, and a myriad of other factors unrelated to college major. The issue doesn’t boil “down to one simple decision”—it relates to a whole host of personal, social, and economic factors.

He also writes that degrees like “Sociology” and “Non-profit Administration” “are in the financial sense LITERALLY worthless.” This doesn’t appear to be true, given the well-known data on the earnings premiums of college degrees—data linked to earlier in this post.

Still, Worthless excels at telling you what The Atlantic won’t: if you want to make a lot of money and a difference in people’s lives, major in STEM fields, but you’re probably reluctant to do so because you’re lazy and those fields are hard; they haven’t experienced the same level of grade inflation as other fields. In this respect, the book is right. But it doesn’t excel at asking larger questions about what kinds of people major in each discipline, or about how many opportunities a degree—any degree—can still open. If you’re a generic student who isn’t especially passionate about anything and aren’t sure what you want to do, stay upwind. Increasingly, that means STEM. You can say it softly or brusquely and still get the same result.

But majoring in something you despise in pursuit of a paycheck isn’t optimal either. In Bronnie Ware’s “Regrets of the Dying,” Ware, who worked in palliative care, lists the regrets she heard patients express as they died. They said things like “I wish I’d had the courage to live a life true to myself, not the life others expected of me,” “I wish I didn’t work so hard,” and “I wish I’d had the courage to express my feelings.” In her telling, none say, “I wish I’d been a Senior Account Supervisor Level 5,” or “Making Executive Vice President was the apex of my life,” or “If only I’d been an engineer, everything would’ve been different.”

This isn’t an argument against majoring in the hard sciences, since no one is stopping engineers or hackers from working less hard or from expressing their feelings. But it is an argument about the value of a life as measured in non-financial terms, and attempting to measure life in solely financial terms might yield a less than optimal return on investment. Daniel Gilbert’s book Stumbling on Happiness gathers an enormous amount of research showing that most people do not become substantially happier when they earn additional income beyond $40,000 per year, and that most value meaningful work, their sex lives, and friends much more than extra marginal income. Again, I’m not arguing against majoring in STEM fields, but if your sole purpose in majoring in one is to maximize your lifetime earning potential, you might be maximizing the wrong thing. If you major in something easy because it’s the default path, you’re making a mistake. But if you want the easy route, I don’t think Worthless will convince you to avoid it, even if its content will let you avoid saying, “No one told me.”

Arguing in favor of majoring in STEM fields might sound ironic coming from an English major and current English grad student like me, but I do so largely based on observing the life trajectories of the people around me. You can find innumerable arguments for liberal arts degrees—here’s a recent one, from Stanley Fish at the New York Times—but very few grapple with the income data, the rising cost of degrees, or the way technology is ripping up and reshaping large parts of human life—something history, English, and philosophy aren’t doing (I’d argue that economics, neuroscience, and biology are doing more to shape the way we think about human behavior than history, English, philosophy, and the rest of the humanities; why argue about human nature when you can try to measure it?*).

Still, if you mostly view a degree as a signaling device, as Bryan Caplan does and as he’s going to argue in The Case Against Education (you can read more about the ideas on his blog), then what you major in doesn’t matter that much, because you’ve already signaled that you’re diligent and conscientious. And in many fields, if you’re any good, you’ll be able to teach yourself: there are numerous people working as programmers with little or no formal training in programming. Ditto for business; indeed, no one in my family had any formal training in any aspect of business, yet we’ve been running Seliger + Associates for decades. Watching the experience of many tech entrepreneurs also makes me skeptical of the value of formal business training divorced from the content of the business one presumably wants to enter. I read stories like “Patagonia’s Founder Is America’s Most Unlikely Business Guru: For years, Yvon Chouinard kept his eco-conscious, employee-friendly practices largely to himself. Now megacorporations like Walmart, Levi Strauss and Nike are following his lead” and wonder what the homo economicuses are learning in B-school.

In dealing with life, rather than just your major, a more useful book might be Po Bronson’s What Should I Do With My Life?, which is less didactic and certain—although it is also vague, wishy-washy, and overly long. Bronson might have pointed out that, if you are defined primarily by external structures and expectations instead of an inner quest for growth, knowledge, and understanding, you will probably never be able to accomplish the kinds of things you should. For the externally motivated, hard degrees are especially important, because they’re not going to pick up a copy of Learn Python the Hard Way and learn Python the hard way. They’re not going to take charge of a business and figure out how to lead from the front.

If you find work that you love, it doesn’t really feel like work. Perhaps more people should work on finding that, if they can—not everybody can—and then seeing if they can extract money from what they like doing (see also Robin Hanson’s short post on the subject).

Are you better off reading this book or the links above? The answer depends on the extent to which you value judiciousness versus someone telling you what to do without exploring the nuances inherent in the situation. I did not notice any sentences that were beautiful, moving, or surprising. Many needed basic copy editing (sample: “You would obviously like to choose a field that you have an interest in” should be “You would obviously like to choose a field that interests you”), and the book works best if you don’t read it closely, which reinforces the question I posed at the start of this paragraph. Nonetheless, Worthless is a symptom of larger problems in American education, and I expect those symptoms to get worse before they get better.


* Not everything can be measured, but given the choice between measurement and not, shoot for measurement.