For decades, books got published something like this: you, the writer, wrote and polished your book; you submitted a query letter and perhaps sample chapters to literary agents; an agent read the full manuscript; an agent took you on; the agent pitched your book to large publishing houses in New York; the editor, or ideally more than one editor, made an offer; the agent negotiated; and you got a book deal. This system worked kind of okay, and there wasn’t a better way to do it, but a lot of writers, including me, got hung up in the “an agent took you on” step.
Now, self-publishing has a realistic chance of success—defined as getting your work to readers and getting some amount of money from those readers—which offers opportunities and headaches. Big publishers know change is coming. The opportunities are obvious, and the headaches stem from having to learn a lot of stuff that publishers used to do, like cover design, knowing what a “widow” is, and figuring out how to hire a copy editor. APE: Author, Publisher, Entrepreneur wants to explain the new world, and it’s a book for a very specific group: people who are, for whatever reason, deeply interested in the publishing industry, and people who want to write a book, have written a book, or want to publish the book they’ve written. If you’re sure you don’t fall into those categories and aren’t likely to, stop reading. You’re probably wasting your time. If you want to know, keep going.
When they are starting out, writers rarely make anything at all for what they do. I wrote seven novels over a period of six years before one was accepted for publication. Rejected by some twenty publishers, that seventh eventually earned me an advance of £1,000 for world rights. Evidently, I wasn’t working for money. What then? Pleasure? I don’t think so; I remember I was on the point of giving up when that book was accepted. I’d had enough. However much I enjoyed trying to get the world into words, the rejections were disheartening; and the writing habit was keeping me from a “proper” career elsewhere.
These kinds of stories infect writer interviews, as do tales of heroic perseverance. John Barth and William Goldman almost quit writing too. But more interesting still are the dark matter writers, the ones we don’t hear about because they gave up and aren’t being interviewed or writing introductions to reprints of their older books. I don’t want to be one of them. And I bet I can make more than £1,000, though I don’t know how long ago Parks began writing: adjusted for inflation, £1,000 might be a lot of money.
Kawasaki and Welch explain how to avoid being a dark matter writer. They say, “Will your book add value to people’s lives? This is a severe test, but if your answer is affirmative, there’s no doubt that you should write a book.” Still, people write books for all sorts of reasons, though I suspect the major reasons are related and twofold: the book they’d like to read doesn’t already exist, and they have something to say. Answers like “to add value to people’s lives” are good reasons to write a book, and good reasons to do many things. There is still some doubt. Writing a book can consume all your mental energy. It might add value to, say, two people’s lives, which might not justify the costs. Not everyone has the impetus towards book writing; to get through the difficulties of writing a book, I think that writing itself has to be fun, or fun at times (more on that later).
But the number of people who could write books and aren’t, in part because of the daunting publishing process, is much larger than the number who do write books. And that pool is getting larger. One challenge is that writers are going to have to think more like publishers, and publishers are going to have to think more like entrepreneurs. APE is about these transformations, and it takes its place near J.A. Konrath and Jack Kilborn’s The Newbie’s Guide to Publishing (Everything A Writer Needs To Know) and Kristine Kathryn Rusch’s Surviving the Transition: How Writers Can Thrive in the New World of Publishing (one thing writers evidently do, once they spend the painful time learning to self-publish, is write guides so that others can learn the same).
How useful APE will be to you depends on how much other reading you’ve done in the how-to-be-a-writer genre. I have trouble resisting it, and so sections of APE are less useful; some, like chapter 6, were fun but already well-known to me. The later ones, on the finer points of Kindle, Nook, and iBooks publishing, were exceedingly useful. I follow digital publishing closely, because I’m going to do it, but I still learned things: for example, I didn’t realize that Google Play exists. Google Play might not matter for me, or for you, but uploading to it requires little time beyond the effort necessary for iBooks and Barnes and Noble’s Nook.
Kawasaki and Welch also have overly strong views on tools (which may make sense given Kawasaki’s background: “For four years I evangelized Macintosh to software and hardware developers and led the charge against world-wide domination by IBM;” the word “evangelized” is key here, implying religious fervor that’s been transferred from God to Mac). I’ve learned some about photography in the last two years, perhaps a reaction against the extreme amount of reading and writing I’ve done, and in cameras, there’s a continual debate between the people who want the newest, coolest gear, who argue that the latest gear enables them to get shots they couldn’t have gotten before, and their opponents. Their intellectual adversaries argue that the most important tool is between the photographer’s ears and that composition, subject matter, and skill with what you have matter more than the newest cameras and the best lenses.
I’ve read impassioned pleas from both sides, and agreed fully with one side, then read the opposite, and agreed fully with them. There isn’t a right answer. One cliche in the photography community holds that every image you’ve admired was captured with worse gear than what you’ve got. Yet there’s also no reason to ignore the tools you’re using and the potential that new tools may unlock.
Kawasaki and Welch write, “In our book (again, pun intended), you should use a Macintosh. No computer makes you more creative and productive, because a Macintosh becomes part of you whereas you need to overcome other operating systems.” I don’t think it matters that much, which is somewhat funny because I’m writing this on an iMac. But pretty much any computer made in the last ten years will do, because the most important parts of the writing process are a) a word processor and b) there is no b.
There are some nifty tools I use extensively, like Devonthink Pro, and some nifty tools that I’ve used less extensively but still helpfully at times, like Scrivener. Nonetheless, 95% of the real “work” of writing still happens on the level of the sentence and paragraph (though Kawasaki and Welch say of Scrivener, “I pride myself in having an organized mind, but my mind isn’t this organized”)*. A Mac is not going to give you great sentences. Neither is Windows or Linux or the tea you drink or the cafe you write at or the hot literary groupie offering you head or the pen you use. Great sentences, like change, come from within.
They also say, “We have never met anyone who regretted buying a Macintosh.” I have—like those who need perfect Exchange synchronization, or people who are seduced by the Mac’s cool factor, only to realize that the paying-the-rent factor is even more important. These are quibbles. Still, in one chapter the writers quote Zoe Winters, and I would repurpose her advice to apply to technology: “There is no shortcut to awesome.” Writing well is always a longcut, not a shortcut, and self-publishing arguably makes the road longer. There’s no real alternative, through software, hardware, or anything else.
The road may be long, but one can find comfort and encouragement along the way. Kawasaki and Welch write, “If You Want to Write by Brenda Ueland [. . .] changed my life by empowering me to write even though I didn’t consider myself a writer.” This is a common feeling, but it’s also one that’s long puzzled me: I spend very little, if any, time considering whether or not I’m “a writer.” I just do it. I didn’t need permission to be a writer, and neither do you. Alternately, if you do need permission, let me bestow it on you: a random stranger on the Internet has now dubbed thee a writer. Feel better?
You should. You should also realize that writing may be lonely in the moment, but it’s a way of bringing people together over time. This tension is implied in moments like these: “Authors who write to impress people have difficulty remaining true to themselves. A better path is to write what pleases you and pray that there are others like you.” I would also add that few people are likely to be impressed anyway, and those who might be impressed will be more impressed if your book is written, at least some of the time, because you’re having fun and seeing where things go. Think about your favorite sexual experiences: few of them probably arose because you were putting a lot of pressure on yourself or your partners to have a Great Sexual Experience. Most of them probably arose because you and your partner(s) were relaxed and ready to have a good time by seeing where things go. So too with writing, and many other activities.
Sometimes writing will be painful, as Kawasaki and Welch note. I won’t deny it. But parts should be fun, and the fun will show in the final product.
In a few places, I’d like to see better writing in a book about writing. One chapter begins, “This section explains how to take a manuscript and turn it into a book. We assume that you have a rock-solid draft of your book.” “Rock-solid” turns up 74 million hits on Google. It’s a cliche. A book about writing should itself be impeccably written. This one is close—very close. Perhaps the next update will fix that.
Elsewhere, the writers say, “For example, The Schmoe Way by Joe Schmoe from Schmoe Press doesn’t cut it.” And “Pure text posts don’t cut it in the highly visual world of social media.” And “While printed books may never die (an ebook of Annie Leibovitz’s photographs won’t cut it) [. . . .]” What does “cut it,” and what is being cut? All of these could be improved: for example, “an ebook of Annie Leibovitz’s photographs is as useful as sheet music for someone who wants to hear Beethoven’s Fifth.” Maybe that’s a little clunky too, but it’s still an improvement because the metaphor is fresh. One could say, “Pure text posts in the highly visual world of social media make more sense than a pure text movie, but both are improved by images.”
Some words are wasted. The last sentence in this paragraph:
Undaunted, [Amanda] Hocking decided to self-publish her novels with Kindle Direct Publishing to pay for the $300 trip. She started with My Blood Approves, and by October 2010, she made over $20,000. Over the next twenty months, she made $2.5 million. The rest, as the saying goes, is history.
could be removed. I can only think of two similar nonfiction books that had no wasted words: Rework (the 37signals book, and one of the few books I’ve read that should be expanded) and Derek Sivers’ Anything You Want (where Sivers even talks about brevity and clarity in “You should feel pain when unclear“—”Writing that email to all customers would take me all day, carefully eliminating every unnecessary word, and reshaping every sentence to make sure it could not be misunderstood”). The best writing advice I’ve ever received is “omit unnecessary words.” Almost everyone is guilty of this crime at times, including me, in this post, in this blog, and in my other writing.
Their advice on serial commas is askew; Kawasaki and Welch favor serial commas (“A serial comma (or Oxford comma, as they say across the pond) prevents confusion when you are listing several items”), but serial commas can also create ambiguity.
These are minor issues, but I bring them up because nonfiction should aspire to be art. Kawasaki and Welch agree—they say, “Metaphors and similes beat the crap out of adjectives and adverbs, so use them when you can. For example, rather than saying, ‘Hockey is very violent,’ you could say, ‘Hockey is war on ice.'” Perhaps I’m overly fastidious about the War Against Cliche. Others who are highly attuned to language will notice too.
Some sections of APE linger in the mind long after they’re read, like this:
There are two kinds of people: eaters and bakers. Eaters think the world is a zero-sum game: what someone else eats, they cannot eat. Bakers do not believe that the world is a zero-sum game because they can bake more and bigger pies. Everyone can eat more. People trust bakers and not eaters.
It expresses a sentiment I’ve discussed in many contexts, but in a way I hadn’t conceived. My closest approximation came in “How to think about science and becoming a scientist:”
while society needs a certain number of lawyers to function well, too many lawyers leads to diminishing returns as lawyers waste time ginning up work by suing each other over trivialities or chasing ambulances.
By contrast, an excess of scientists and engineers means more people who will build the stuff that lawyers then litigate over. Scientists and engineers expand the size of the economic pie; lawyers mostly work to divide it up differently. Whenever possible, work to be a person who creates things, instead of a person who tries to take stuff created by someone else.
Kawasaki and Welch are bakers. They’re creators. They want to help you be one too. Still, according to them, you have to be the kind of writer who wants to “take control of their fate and embrace the ideas here in order to maximize their success.” A fair number of writers don’t appear to care about being able to “maximize their success” as measured by sales and finances, and in some literary circles cachet comes from not marketing one’s book, or appearing not to market it; sometimes not marketing becomes marketing, as examples like J. D. Salinger and Cormac McCarthy show.
This underlying model of success can seem claustrophobic, and, just as I gave you permission to be a writer above, I give you permission to be selective with social networks here: plans for Facebook, Twitter, LinkedIn, e-mail, Google+, and more would leave me with less writing time. I want to do things that really interest me, and that’s mostly long-form writing. Facebook and Twitter aren’t interesting, and I want the mental space they would otherwise occupy to be occupied by better things. I’m also reluctant to trust Facebook and Google+ because that gives those companies so much control over what I do and who I talk to. There was a recent kerfuffle when Facebook “turned down the volume” of businesses that had Facebook pages. That’s good for Facebook’s users but terrible for anyone who spent time and money encouraging people to interact on Facebook.
Facebook is, of course, where the people are. Using it is good advice, but it might also be useful to ask what you can say no to. In Anything You Want, Derek Sivers has a chapter called “No more yes. It’s either HELL YEAH! or no,” where he says that your reaction to most propositions should be one of those two extremes. To me, Facebook, Google+, and Twitter are in the lukewarm middle. Kawasaki and Welch “recommend using Google+ as a blogging platform.” Does it allow one to export nicely-formatted XML that will allow you to easily switch, if necessary? That’s a prerequisite, at least to me.
Kawasaki and Welch might be overly enamored of social media, and I insufficiently enamored, but unless you want a Salinger-like existence you probably need to do something. There are few alternatives to social media, e-mail, and other promotional efforts, and those efforts are a boon to outsiders. The authors say, “I’ve never come across an author who was happy with the marketing efforts of his publisher.” That might be because publishers have one thing that can’t be replicated by outsiders: distribution. Publishers are set up for a world where they control distribution. That advantage is eroding over time.
The chapters about social networking show you how to make sure you have access to new advantages.
The downside is that learning the business consumes time like space shuttles consume rocket fuel. At the moment, however, APE is a relatively easy, comprehensible way of learning about all the steps that one should take to move from “guy with a story” or “guy with a long document” to “writes books that other people value and read.”
* I’ve only used Scrivener for one novel, called THE HOOK, that has different, named narrators at different times, like Tom Perrotta’s Election, Anita Shreve’s Testimony, or William Faulkner’s As I Lay Dying. Scrivener was an ideal tool for this task because it made rearranging sections easy, and it made reading each speaker’s full narrative, in order, easy. I can also see it being very useful for non-narrative nonfiction or dissertations and academic books (James Fallows is a convert). For most fiction, I think the bigger problem is making the story cohere, not rearranging it.
This essay started its life as an e-mail to a student who wanted to know if all writing was, on some level, “just subjective,” which would imply that grading is bogus and so is much of what we do in English classes. I didn’t have time to offer a nuanced explanation of what makes good writing good, so I wrote to him later that night. He didn’t reply to this e-mail.
I was thinking about our conversation and realized that I have more to say about the issues of subjectivity and skill in writing. As you observed, there is an element of subjectivity in judging what’s good writing and what isn’t. But it’s also worth noting that dominant opinions change over time—a lot of the writing from the 18th and 19th centuries, for example, was considered “good” if it contained long sentences with balanced, nested clauses. Such stylistic preferences are one reason why a lot of contemporary students have trouble reading such material today: most of us value variety in sentence structure and less overall complexity. This is normally the place where I could go off on a rant about Facebook and cell phones and texting speak and how the kids these days are going to hell, but I’ll avoid that because it doesn’t appear true overall and certainly isn’t true regarding writing. The trend, including among professional writers writing for other expert writers, has been towards simpler structures and informality (which may speak about the culture as a whole).
That being said, if you want to write a paper full of long, windy clauses and abstruse classical allusions, I’m not going to stop or penalize you and may even reward you, since few if any students write in such a fashion, and I (like most contemporary people) value novelty. As long as the content is strong, I’m willing to roll with somewhat unusual stylistic quirks, and I’m fairly pluralistic in my view of language use.
So how do you, the seeker, figure out what good writing is? You practice, you read, you think about it, you practice some more, like you would if you were learning to play a guitar. I’ve never heard guitar instructors say that their students say all music is subjective; playing the guitar appears to be transparently hard, in the sense that you know you’re bad at it, in a way that writing isn’t. Still, if you’d like to know a lot more about good writing, take a look at Francine Prose’s Reading Like a Writer, James Wood’s How Fiction Works, and Jan Venolia’s Write Right!
When you’re done with those, move on to B. R. Myers’ A Reader’s Manifesto. When you’re done with that, move on to the New York Times’ series Writers on Writing. Collectively, these books will teach you that every word counts and every word choice says something about the writer and the thing the writer is conveying, or trying to convey. Not only that, but every word changes, slightly, the meaning of every word around it. Good writers learn to automatically, subconsciously ask themselves, “Does this word work? Why? Why not? How should I change it? What am I trying to convey here?”
Eventually, over time, skilled writers and thinkers internalize these and other ideas, and their conscious mind moves to other issues, much like a basketball player’s shot happens via muscle memory after it’s been practiced and tweaked over 100,000 repetitions.
In addition, skilled writers are almost always skilled readers, so they have a fairly large, subconscious stock of built-in phrases, ideas, and concepts. Somewhere along the line I’ve read a fair amount about how athletes practice and how athletes become good (perhaps some of that material came from Malcolm Gladwell’s Outliers, or Mihaly Csikszentmihalyi’s Flow: The Psychology of Optimal Experience). I know how important practice and repetition are to any skill-based human endeavor. So I combined the idea of skill with writing and skill in basketball, since many students are more familiar with sports than with writing. Where did that analogy come from? I don’t know, exactly, but it’s there now, along with the idea that analogies are good, and explaining what I’m doing is good, and so are many other things.
To return to the athletic analogy, skill in sports also has a subjective element. Is Lebron James now better than Michael Jordan was when Jordan ruled? You can have this argument with morons in bars all day long. I’ve heard it and find it particularly tedious because the outcome is so unimportant. But both players are very clearly good, and at the top of their peer groups in their respective eras. The comparison at least makes sense.
One could also argue about whether Elmore Leonard or Alain de Botton is the better writer, although I would argue that they’re too different to make that a fruitful comparison; Elmore Leonard would be better matched against someone like Raymond Chandler or Patricia Highsmith. But Leonard and de Botton are both fantastically better writers than most freshmen; for one thing, most freshmen haven’t yet mastered the mechanical parts of writing, like how to use commas consistently and correctly (if they wish to), let alone higher questions about vocabulary, metaphor, and so on.
If you really want to get better, spend a lot of time reading, writing, and thinking about those activities. Then look back at your earlier work and judge its quality for yourself. Few students think the first draft of their first paper is as good as the final draft, and I tend to agree. Few people who consciously work throughout their lives think their work as, say, 20-year-old students is as good as their work at age 30.
With regard to thesis statements, good ones tend to have some aspect of how a text (I hate the term “text,” but it fits here) shows something (“Free-indirect speech in ‘She Wasn’t Soft. . .'”), what a text shows, usually symbolically (“is used to demonstrate how Paula and Jason, despite being a couple, really disdain each other”) and have some larger point to make (“which shows that what people think and how people behave don’t always match”).
That’s not a great thesis statement because I’m doing it quickly and freeform, but a better one might say something like, “The use of free-indirect speech in ‘She Wasn’t Soft’ demonstrates that Paula is actually soft, despite her repeated claims to the contrary, and that Jason and Paula’s mutual loathing sustains their relationship, despite what they say.” That’s still not the sort of thesis statement I’d use to write a publishable academic paper, but it’s closer. Many if not most student papers are missing one of those elements. Not every thesis needs all three, but they’re not bad ideas to check for.
Over time and with experience, I’ve developed, and you’ll develop, a fairly good eye for thesis statements. Eventually, when you’re sufficiently practiced, you won’t necessarily use explicit thesis statements—your thesis will be implied in your writing. Neal Stephenson doesn’t really have an explicit thesis statement in “Turn On, Tune In, Veg Out,” although his last line may function as one, and Roland Barthes definitely doesn’t have an explicit one in “The Brain of Einstein.” Thesis statements aren’t necessarily appropriate to all genres, all the time.
When I started teaching, I actually thought I was going to be a revolutionary and not teach thesis statements at all. I wrote about that experience here. The experiment didn’t work. Most undergrads need thesis statements. So I started teaching them, and student papers got better and more focused, and I’ve been doing so ever since.
Your question or questions are about the inherent challenges of writing, and those don’t have easily summarized answers. The problem also comes from language. Language itself is imprecise, or, alternately, layered with meaning; that’s where so much humor and misunderstanding comes from (and humor could be considered a kind of deliberate misunderstanding). I’ve read about how, when computer scientists tried to start making translation systems and natural-language processing systems, they ran into the ambiguity problem—and that problem still hasn’t been fully solved, as anyone who’s tried to use text-to-speech software, or Google translate, can easily find (I wish I could find any citations or discussions regarding this issue; if you happen to run across any, send them over).
This line of questioning also leads into issues of semiotics—how signs, signaling, and reception function—and the degree of specificity necessary to be good. Trying to specify every part of good writing is like trying to specify every aspect of good cooking: you get something like McDonald’s. While McDonald’s does a lot of business, I wouldn’t want to eat there, and it’s pretty obvious that something is lost in the process (Joel Spolsky’s article “Big Macs vs. the Naked Chef” (sfw) also uses McDonald’s as a cautionary tale, this time for software developers; you should definitely read it).
Actually, I’m going to interrupt this essay to quote from Joel:
The secret of Big Macs is that they’re not very good, but every one is not very good in exactly the same way. If you’re willing to live with not-very-goodness, you can have a Big Mac with absolutely no chance of being surprised in the slightest.
Bad high school teachers often try to get students to write essays that are not very good in exactly the same way. I’m trying to get students, and myself, to write essays that are good and that a human might actually want to read. This guarantees that different students will approach the problem space in different ways, some more successfully than others, and different essays are going to be good in different ways. I’m trying to get students to think about the process and, more broadly, to think not just about the solutions, but about the domain; how you conceptualize the problem domain will change what you perceive as the solution. Learning to conceptualize the problem domain is an essential part of the writing process that’s often left out of high school and even college. That being said, if you ever find yourself in front of 20 or 30 novice writers, you’ll quickly see that some are much better than others, even if there’s much wiggle room between a C and C+.
I don’t get the sense that students who are unhappy with their grades are unhappy out of a deeply felt and considered sense of aesthetic disagreement about fundamental literary or philosophical principles. I suspect I feel this way partially because I actually have a fairly wide range of what I would consider “good” writing, or at least writing good enough to get through undergrad English classes, and someone with sufficient sophistication and knowledge to make a good argument about aesthetics or the philosophy of writing would be very unlikely to get a sufficiently low mark to want to argue about it. Rather, I think most students who are unhappy about their grades just want better grades, without doing the thinking and writing necessary to get them.
These issues are compounded by a meta-issue: many if not most K–12 English (and other humanities) teachers are bad. And many of them aren’t that smart or knowledgeable (which tends to overlap with “bad”). So a lot of students—especially those on the brighter side—inchoately know that their teachers are bad, and that something stinks, and therefore they conclude that English is bogus anyway, as are related fields. This has a lot of unfortunate consequences on both the individual and societal level; books like C.P. Snow’s The Two Cultures are one manifestation of this larger problem.
In general, I would like for people to try and get along, see each other’s points of view, and be tolerant—not only in fields like religion and politics, but also things like the humanities / sciences, or reason / emotion, or any number of the other possibly false binaries that people love to draw for reasons of convenience.
Finally, at least on this topic, it’s worth noting that, if you think I’m completely wrong about what makes good writing (and what makes writing good), you have a huge world out there and can judge the reaction to your writing. Twilight and The Da Vinci Code are poorly written novels, yet millions of people have read and enjoyed them—many more than have read Straight Man, one of my favorite novels and one that’s vastly better written. Who’s right: the millions of teenage girls who think they’re in love with the stilted, wooden prose that makes up Edward, or me, who sees the humor in a petulant English department? It depends on what you mean by “right.” If I were a literary agent or editor, I would’ve passed on both Twilight and The Da Vinci Code. Definitions of “good” are uncertain, and the ones I embrace and impose on students are worth questioning. If you can at least understand where I’m coming from and why I hold the views I do, however, I’ll consider my work a relative success.
Most people also have different definitions of “good” at different points in their lives; I’m in my 20s and view writing very differently than I did in my teens. I would be surprised if I view writing the same way in my 40s. One major change is that I’ve done so much reading, and probably will do much more. Someone who doesn’t read very much, or doesn’t challenge themselves when they do read, may find that their standards don’t change as much either. I could write much more on this point alone, but for the most part you’ll have to trust me: your tastes will probably change.
This email is a long way of saying, “I’m not trying to bullshit you, but the problem domain itself is hard, and that domain is not easy to explain, without even getting into its solution.”
The short version of this email is “trust me,” or, alternatively, spend the next ten years of your life pondering and contemplating these issues while reading about them, and then you’ll have a pretty good grasp of what good writing means. Writing is one of those 10,000-hour skills, in that it probably takes 10,000 hours of deliberate practice to get good. Start now and you’ll be better in a couple years.
A friend wrote this story and sent it to me, which I post both as a warning and for its own sake.
Police officers have a long and storied history of lying during arrests, on police reports, and even perjuring themselves under oath. They’ve lied about me, in front of me. As Americans, we’re fortunate to live in a country where the burden of proof lies upon the law enforcer instead of the lawbreaker. This is why routine traffic stops should follow strict protocols. Even ruthless murderers have gone free because of technicalities. My particular story contains more of the former (protocol violation) and none of the latter (murder), but if you follow my recommendations you may find yourself where I was this morning—at the end of a gavel hearing the words, “Case dismissed.”
Throughout my life, I’ve had piss-poor luck when it comes to getting caught by authority figures. I nearly got suspended in middle school for de-pantsing a friend in biology class. Freshman year I was suspended for dicking around in English class (and inexplicably promoted to an honors class as a result). Junior year was a whirlwind of parties, subpar oral sex, and death threats from parents of said subpar fellatio perpetrators. The cherry on top of this year was a cold, rainy weekend in November where the police caught me drinking before a football game with a freshman girl. The very next night, I had the supreme idea that I should drink a beer and get behind the wheel of a drug dealer’s car. This left me with one DUI, a narrowly avoided felony possession charge, a night in jail, and $10,000 subtracted from my parents’ bank account.
In college I had a few run-ins with the law. A couple of fake ID charges, a minor in possession conviction, and assault charges (dropped because I was defending myself, of course). My point is this: I clearly never had a future as a cat burglar, or any other kind, and learned that I have the dubious ability to get caught every time I do something bad and/or illegal.
I live in Los Angeles now, a city which not only thrives on the fumes of automobiles but willfully ignores the need for public transportation. About 45 minutes south of LA lies the idyllic, WASP-y cove of Newport Beach, made famous by moronic reality television depicting the spoiled teenagers and the neurotic housewives who produced them. One of my best friend-girls, “Anastasia”—not her real name, but it’s equally stripper-esque—was dating one of Newport’s denizens and invited me to join her on his massive yacht for Memorial Day. She promised enough silicone to keep me afloat for days should the boat sink, and unlimited expensive booze served by nubile models and tennis instructors. Needless to say, I agreed.
I invited “Kelly,” another of my best friend-girls, along for the ride. Kelly is the rarest kind of woman in LA: an attractive blonde with a brain better suited to advanced biochemical formulae than to destroying douchey pseudo-actors in Hollywood, but she regularly used it for both (this will be important later).
The day passed as you might expect: I hopped onto an 80-foot yacht with thirty incredibly attractive women and five men I didn’t know, and I proceeded to drink and eat my way into perpetual bliss. Just kidding—you wouldn’t expect that unless you’ve been living in Los Angeles for long enough to meet these types. I met Anastasia’s boyfriend, “Alladin”—the owner of the boat. Curious about his opulent lifestyle, I asked him what he did for a living. He mumbled something about buying and selling “web properties.” I was in a similar industry at the time, but I elected not to press for details: I’ve learned many things in LA, and one of them is that if someone can’t explain to you what he does for work, you probably don’t want to know.
Several glasses of champagne, a few beers, a couple Grey Goose and tonics, at least five makeout sessions with some of the MILFier attendees, and one botched threesome attempt in the captain’s cabin later, we found ourselves heading back to shore. We ordered enough Chinese food to feed a clan of Hutts and watched the sun go down over Newport Harbor. Three hours later, after Kelly and I made the expert determination that I hadn’t imbibed for several hours and thus was capable of driving, we said our goodbyes.
This was my first mistake of the day, aside from failing my attempt at a threesome. Kelly and I were busy jabbering about how awesome our day was and how we couldn’t wait for our next yachting adventure. About fifteen minutes after getting on the 405 freeway (known as the “four or five hours” to LA residents), Kelly noticed a black-and-white pacing us. I remained calm—I wasn’t speeding and, to my knowledge, I was no longer drunk.
It turns out that the opposite was true in the eyes of the law.
The cop, who will henceforth be known as Officer Dipshit, turned on his flashers and directed me to get off the freeway. Before I could let out the breath I didn’t realize I was holding, he’s yelling at me over his PA system. Within five minutes he’s at my window telling me he detects the odor of alcohol and administering a preliminary eye test (without my consent) known ominously as the “Nystagmus.” He’s asking me to exit the vehicle. He’s asking me to submit to voluntary tests. Remembering my first encounter with a potential DUI in Washington State, and the video “Never Talk to the Police,” I raise my hand and emphatically state, “I REFUSE.”
Police don’t like their authority being questioned. They especially dislike it when a citizen knows his rights and chooses to exercise them. Read these words very carefully: voluntary DUI tests are designed for you to fail. You can be Michael Phelps or Usain Bolt on a midsummer’s day in 2012, sober as a rock, and still fail the tests. DUI tests were created to stack evidence against you in order to give the officer a defensible reason for arresting you. Let me repeat: if you have had anything to drink, ANYTHING AT ALL, do not submit to these tests.
Some of you are familiar with Tucker Max’s work, but most of you haven’t read his poignant piece (written with friend and business partner Nils Parker) about the different types of people who become cops. My arresting officer was certainly a “High School Napoleon”—5’4”, 220 lbs of seething, blubbery vengeance for all the wedgies and rejections from women throughout his life. I’m not stupid enough to be disrespectful to a cop. Should you find yourself in my position, do what I did: give him short, courteous answers, do not admit any guilt, and above all do not submit to his tests, no matter how much he tries to scare you.
Upon rejecting his voluntary DUI tests, Officer Dipshit threw his pad into the air and informed me of my imminent arrest. He pushed me against the car as he slammed the cuffs on my wrists, whispering that his colleagues had “fucked up bigger guys than me,” and tightened the steel links until my hands went numb. I knew that I was in for a joyous night. The officer then proceeded to threaten Kelly, telling her that he was going to arrest her too, asking her if she would like to spend a night in jail. The only thing I was guilty of thus far was driving a red sports car with a hot blonde in the passenger seat.
I was arrested, taken to the police station, and booked for DUI. After sixteen miserable hours, I was released to my disappointed mother. Eventually they got around to testing my blood. You must submit to this test—otherwise you’ll be automatically convicted of a DUI and have your license revoked for a year—but you want to have it administered at the police station. The results wouldn’t be known for weeks, but it turns out that my Blood Alcohol Content was .09, otherwise known as the equivalent of a little more than a beer per hour. In the eyes of the law, I was intoxicated. It doesn’t matter if you think you’re drunk. It only matters what your blood says (Dexter Morgan would appreciate this sentiment).
After receiving a citation for a DUI, you have a few options. You can go to your hearing, plead guilty, go to your alcohol classes, attend the M.A.D.D. panel, install the breathalyzer in your car, and deal with a suspended license for six months. Most people choose this—especially the ones who have an egregiously high BAC. All told, your first DUI will cost around $5,000 even if you choose not to hire a lawyer. This doesn’t count the peripheral costs, like explaining to your employer why your license is suspended, telling your date why you have to blow into a tube before you can start your car, or attempting to bum rides from your friends while you’re carless.
I hired a lawyer and went to war with Officer Dipshit. The truth is that most of you won’t be able to do anything about your DUI. Your case will be open and shut, the same kind of case that passes through municipal courts hundreds of thousands of times a year in the U.S. However, there are a few things that you can exploit to your advantage:
- If you were smart, you didn’t take the voluntary tests. Now the officer has to prove that he had a legitimate reason to pull you over in the first place.
- Your lawyer should subpoena the dashboard video of your arrest. Ever gotten stoned and watched an episode of COPS? It’s hilarious, right? Not so much when you’re the perp. Luckily for you, most states require police to videotape their arrests—thanks, Rodney King!
- The dashboard video will sometimes allow you to systematically refute most of the cop’s police report. As I mentioned in the opening paragraph, police will almost always embellish or outright lie on the police report. In my case, the cop claimed that I was speeding, swerving, driving erratically, refused to pull over, stumbled upon getting out of the vehicle, displayed an aggressive demeanor, and generally acted like a drunk lunatic. None of this was true. Cops, incidentally, are trying to eliminate the ability of citizens to record them, because they dislike objective evidence that documents their actions.
My lawyer built a case against Officer Dipshit, culminating in this morning’s hearing. Officer Dipshit took the stand and said all the “facts” in his police report were accurate. The police report was inadmissible as evidence; the dashboard video, however, was admissible, and showed him contradicting himself. Officer Dipshit lied his way into a corner, and the prehistoric judge presiding over the hearing ruled that he was an idiotic, lying, power-tripping asshole, just as we suspected all along.
Lawyer fees: $5,000
Hours spent worrying: countless
The look on the cop and district attorney’s faces when they realize that their asses have been handed to them on a silver platter: priceless
Your life lesson: Don’t talk to cops, and learn how to fight the system.
The Generals has one of the best qualities a general nonfiction book can have: it’s about a specific topic that it covers well, but its lessons and ideas also transcend its topic and apply to many others. Let me explain. Take this section, about General Patton:*
Even now, more than six decades after his death, Patton remains one of our most remarkable generals. ‘You have no balance at all,’ Marshall’s wife once scolded the young Patton, correctly, years before World War II. Maj. Gen. Ernest Harmon, one of his peers, wrote that he was ‘strange, brilliant, moody.’ The blustery Patton behaved in ways that would have gotten other officers relieved, but he was kept on because he was seen, accurately, as a man of unusual flaws and exceptional strengths. Marshall concluded that Patton was both a buffoon and a natural and skillful fighter.
Knowledge, skill, and expertise in one domain don’t necessarily transfer to other domains. A brilliant physicist may be a terrible marriage therapist, and vice-versa. Someone who is a “buffoon” might also have a compensating skill that makes up for their possible deficits. Paul Graham implicitly writes about this in Is It Worth Being Wise?:
‘wise’ means one has a high average outcome across all situations, and ‘smart’ means one does spectacularly well in a few. [. . .] The distinction is similar to the rule that one should judge talent at its best and character at its worst. Except you judge intelligence at its best, and wisdom by its average. That’s how the two are related: they’re the two different senses in which the same curve can be high.
A lot of people seem to have trade-offs between peaks and averages. Steve Jobs comes to mind: Walter Isaacson’s biography is rife with examples of Jobs being wrong, cruel, and occasionally outright stupid. His lows were low. But he got big, important stuff right—and not just right, but very, spectacularly right. He found (or made) the right environment for his skills. It’s almost impossible to imagine Jobs being a good employee at, say, Wal-Mart, or any large company that values homogeneity over creativity.
It’s obviously possible to have high averages and high peaks, but that doesn’t appear to be common. Really spectacular peaks often come in unusual packages. Those unusual packages are often easy to dismiss by someone not paying attention.
Unfortunately, as Ricks points out, America since the Korean War hasn’t judged its generals by their peaks or their averages: in fact, we haven’t judged generals on their competence much at all. That’s a tremendous, underappreciated problem. In Ricks’ description, the generals cut in the Marshall mold were primarily “team players” who needed to work effectively with others and defer to the group. That’s not necessarily a bad thing; as Ricks says:
Perhaps those who rose highest in World War II were organization men. But for the most part they were members of a successful organization, with the failures among them weeded out instead of coddled and covered up. That would not be the case in our subsequent wars, in which it would be more difficult to know what victory looked like or even whether it was achievable.
Different time periods reward different forms of industrial organization. If World War II rewarded “organization men,” many of today’s organizations reward people who figure out the weaknesses of large organizations, and then offer alternatives. But that can’t happen in the military, where the closest analogue to startups might be defense contractors and private, Blackwater-style armies. Those, however, have their own problems.
There’s also an analogy to teaching: almost no public school teacher is fired, ever, for bad teaching. Not being able to fire transparently terrible teachers is an impediment to getting better teachers, as almost anyone who’s ever been in a public school knows.
Organizations also need to make sure that they’re focused on their major purpose, not on primarily serving the interests of the people inside them:
Trying to be fair to officers can be lethal to the soldiers they lead on the battlefield. The Army was using the Korean War to give the staff officers of the earlier war ‘their chance’ to command in combat—with disastrous results. Well before Chosin, the Army had recognized that it had a problem with inexperienced combat leadership in the war.
The problem is “inexperienced combat leadership,” but the solutions became worse in some respects than the problem itself. Fairness to one group can mean extreme unfairness to others, who often have much less of a voice. No one speaks for the enlisted men who are led by incompetent generals. (No one speaks for those led by an incompetent president, either, but that’s a separate issue related to larger American society.)
Misaligning incentives creates a deeper sense of rot; Ricks says that generals, by the post-Korean-War era,
were acting less like stewards of their profession, answerable to the public, and more like keepers of a closed guild, answerable mainly to each other. Becoming a general was now akin to winning a tenured professorship, liable to be removed not for professional failure but only for embarrassing one’s institution with moral lapses.
Notice what this says about Ricks’ view of the university: by comparing one system that advances mediocrity with tenure, he implies that tenure advances mediocrity. He doesn’t go on to explain why he uses the metaphor, because he assumes that his readers already believe as much. But tenured professors aren’t putting their students in life-or-death situations, and students can choose to pick a different department or university. Service members can’t. During World War II, as Ricks says, the road to victory and home led through Berlin and Tokyo. In recent wars, the road to victory has been murkier, and the politico-military establishment mostly hasn’t selected generals adept at operating in the murk. The consequences are clear.
The Generals is too detailed for people who aren’t deeply interested in military affairs and history. It probably isn’t detailed enough for those who are immersed.
But it’s also the best intellectual explanation of why one should be wary of enlisting in today’s American military: you might get killed by someone incompetent but unaccountable on the basis of performance. Contemporary generals who lose wars and cost soldiers their lives are fêted. They “retire” to lucrative consulting gigs with defense contractors and lobbying firms. The soldiers are disabled or dead. To me that argues against becoming a soldier or junior officer. In most businesses, if you think your boss is an asshat, you can quit and start a rival firm. In the military, obeying is the only option, and no one is making sure that your boss is actually good at his job.
EDIT: B.J. Khalifah has an interesting letter in The Atlantic:
Thomas Ricks overlooked something important. Sadly, nobody becomes a general (or equivalent) in the military until they have served for many years. Most colonels are 50 by the time they get promoted. Many younger officers have experience and drive; as a group, they adapt well. Older officers are more cautious, members of the “cover your ass and do not make waves” category. They know how to manipulate the good-old-boy game. The service should be, but is not, a strict meritocracy. In effect, it follows union-style rules of seniority and time in grade. From second lieutenant to first lieutenant to captain is automatic. Some lousy officers have made it past captain to become major by being on court-martial or combat duty when they are promoted. The rules are not negotiable.
This contrasts hugely with startup and good corporate cultures, which judge people almost purely on merit. Successful startups have famously been founded by 18-year-olds. Even law firm associates can be promoted to partner within as little as five years of hiring, while associates frustrated by a firm’s practices can start their own. The military apparently doesn’t do that, and I haven’t seen any evidence that 50-year-old generals will necessarily be better than 26-year-old (hypothetical) generals. Certainly among startups this isn’t true.
The comparison isn’t perfect—markets reward innovators for making things people want, and the military doesn’t have a clear feedback loop. But at the moment almost no one is even discussing the issue, or making the comparison.
* The movie Patton is also remarkably good, especially the speech at the beginning. Patton doesn’t have the American character down correctly—Americans don’t love the sting of battle unless we’re provoked—but the speech demonstrates a lot about the man doing the speaking.
The bit about loving a winner and not tolerating a loser is also fascinating in light of The Generals: we’ve tolerated a lot of losers, like Donald Rumsfeld and Tommy Franks, and sacked winners like Eric Shinseki.
I’m an on-the-record fan of William Deresiewicz, which made reading “Tsunami: How the market is destroying higher education” distressing. It blames problems in contemporary higher education on capitalism and markets, but I think it ignores a couple of things, the most important of which is the role of colleges themselves in raising prices, increasing the number of administrators, and reducing teaching loads for tenured faculty.
Beyond that, Deresiewicz discusses Naomi Klein’s The Shock Doctrine, which is a dubious place to start; see, for example, “Shock Jock” for one critique. In it, Tyler Cowen notes that “Most of the book is a button-pressing, emotionally laden, whirlwind tour of global events over the last 30 years” and that “The book offers not so much an argument but rather a Dadaesque juxtaposition of themes and supposedly parallel developments in the global market.” Klein’s book reminds me of the bad academic writing that assumes the dubious evils of capitalism without quite spelling out what those dubious evils are or what plausible alternatives exist.
Returning to Deresiewicz: “College is now judged in terms of ‘return on investment,’ the delivery of immediately negotiable skills.” But this might simply be due to rising costs: when college was (relatively) inexpensive, it was easy to pay less attention to ROI issues; when it’s almost impossible to afford without loans for middle-class families, it becomes much harder. ROI on degrees that, in contemporary terms, cost $20,000 can be safely ignored. ROI on degrees that cost $150,000 can’t be.
Second, even at public (and private non-profit) schools, some people are getting rich: the college presidents and other managers (including coaches) whose salaries range well into the six figures and higher.
Presidents and other bureaucrats make popular punching bags—hell, I took a couple whacks in my first paragraph—and perhaps they are “overpaid” (though one should ask why Boards of Trustees are willing to pay them what they do), but such highly-paid administrators still aren’t very expensive relative to most colleges’ overall budgets. I would like to see universities exercise greater discipline in this area, but I doubt they will until they’re forced to by markets. At the moment, schools are underwritten by federally-backed, non-dischargeable loans taken out by students. Until we see real reform of that loan system, I don’t expect that discipline to appear.
The only good answer about the rise in college costs that I’ve seen comes from Robert Archibald and David Feldman’s Why Does College Cost So Much? Their short answer: “Baumol’s Cost Disease.” Unfortunately, it’s more fun pointing fingers at evil administrators, evil markets, evil capitalism, and ignorant students who want to know how much they’re going to make after they graduate.
At the very least, Why Does College Cost So Much? is a better place to start than The Shock Doctrine.
These questions are getting more and more play in the larger culture. “Is College a Lousy Investment?” appears in The Daily Beast. “A Generation Hobbled by the Soaring Cost of College” appears in The New York Times. A surprisingly large number of people with degrees are working in jobs that don’t require them: in coffee shops, as bartenders, as flight attendants, and so on. That’s a lot of money for a degree that turns out to be primarily about personal development and partying. So what should students, at the individual level, do?
To figure out whether college is a good idea, you have to start with what you’re trying to accomplish: getting a credential or gaining knowledge. If the primary purpose is the latter, and you have a strong sense of what you want to do and how you want to do it, college isn’t automatically the best option. It probably is if you’re 18, because, although you don’t realize this now, you don’t know anything. It might not be when you’re, say, 23, however.
Part of the problem with discussing “college” is that you’re discussing a huge number of varied institutions that do all sorts of things for all sorts of people. For people getting $200,000 English degrees from non-elite universities, college makes less sense (mine cost about half that much, and in retrospect I might’ve been better off with a state school at half that cost again, but it seemed like a good idea at the time and seems to have worked out for me, as an individual). For people getting technical degrees from state schools, college does a huge amount for lifetime earnings. Talking about these two very different experiences of “college” is like talking about eating at McDonald’s and eating at New York’s best restaurant: they’re both about selling food, but the differences dwarf the similarities. College is so many different things that generalizing is tough or simply dumb.
In response to paragraphs like mine, above, we’re getting essays like Keith Burgess-Jackson’s “You Are Not My Customer.” Burgess-Jackson is correct to say that not everything can be valued in terms of dollars—that’s a point that Lewis Hyde makes in The Gift and others have made in terms of market vs. non-market economies. The question is whether we should view university education through a market lens.
When tuition was affordable in both absolute and relative terms, it made sense to look at universities through a “gift”-style lens, as Burgess-Jackson wants us to. Now that tuition is extremely high, however, we basically don’t have the luxury of making this choice: we can’t be paying $50,000–$250,000 for an undergrad degree and have the attitude of “Thank you sir, may I have another.” It’s one or the other, not both, and universities are the ones setting prices.
Comments like this: “Good teachers know that most learning, certainly all durable learning, is self-effected” are true. But if Burgess-Jackson thinks that his students aren’t customers, wait until the administration finds that no one wants to take his classes. Unless he’s a publishing superstar, I suspect he’ll find out otherwise. I’d like universities to be less market-oriented and more gift-oriented, but an era of $20,000+ comprehensive costs for eight to nine months of instruction just doesn’t make that orientation plausible.
In his excellent The Program Era: Postwar Fiction and the Rise of Creative Writing, Mark McGurl writes this about Vladimir Nabokov:
In fact, one of his best-known quirks was a scientific passion for a certain family of butterflies, the Blues.
The word “was” is interesting, because Nabokov’s quirks are still well-known, in the present. But Nabokov’s quirks happened in the past—he’s obviously dead. So there’s a moment of verb tense weirdness in this sentence, which might otherwise read something like, “In fact, one of his best-known quirks is the scientific passion Nabokov had for a certain family of butterflies, the Blues.”
There’s no particular point to this post other than a writer’s duty to notice language, and the opportunity to observe a specific example of language’s sometimes bizarre ambiguity.
(Those of you who are reading the last sentence and thinking about its own weirdness, be aware: that is intentional.)