Journalism, physics and other glamor professions as hobbies

The short version of this Atlantic post by Alexis C. Madrigal is “Don’t be a journalist,” and, by the way, “The Atlantic.com thinks it can get writers to work for free” (I’m not quoting directly because the article isn’t worth quoting). Apparently The Atlantic is getting writers to work for free, because many writers are capable of producing decent-quality work and the number of paying outlets is shrinking. Anyone reading this and contemplating journalism as a profession should know that they need to seek another way of making money.

The basic problems journalism faces, however, are obvious and have been for a long time. In 2001, I was the co-editor-in-chief of my high school newspaper and thought about going into journalism. But it was clear that the Internet was going to destroy a lot of careers in journalism. It has. The only thing I still find puzzling is that some people want to major in journalism in college, or attempt to be “freelance writers.”

Friends who know about my background ask why I don’t do freelance writing. When I tell them that there’s less money in it than getting a job at Wal-Mart they look at me like I’m a little crazy—they don’t really believe that’s true, even when I ask them how many newspapers they subscribe to (median and mode answer: zero). Many, however, spend hours reading stuff for free online.

In important ways I’m part of the problem, because on this blog I’m doing something that used to be paid most of the time: reviewing books. Granted, I write erratically and idiosyncratically, usually eschewing the standard practices of book reviews (dull, two-paragraph plot summaries are stupid in my view, for instance), but I nonetheless do it and often do it better than actual newspapers or magazines, which I can say with confidence because I’ve read so many dry little book reports in major or once-major newspapers. Not every review I write is a critical gem, but I like doing it and thus do it. Many of my posts also start life as e-mails to friends (as this one did). I also commit far more typos than a decently edited newspaper or magazine, though I do correct them when readers point them out.

The trajectory of journalism is indicative of other trends in American society and indeed the industrialized world. For example, a friend debating whether he should consider physics grad school wrote this to me recently: “I think physics is something that is fun to study for fun, but to try to become a professional physicist is almost like too much of a good thing.” He’s right. Doing physics for fun, rather than trying to get a tenure-track job, makes more sense from a lifestyle standpoint.

A growing number of what used to be occupations seem to be moving in this direction. Artists got here first, but others are making their way here. I’m actually going to write a post about how journalism increasingly looks like this too. The obvious question is how far this trend will go—what happens when many jobs that used to be paid become unpaid?

Tyler Cowen thinks we might be headed towards a guaranteed annual income, an idea that was last popular in the 60s and 70s. When I asked Cowen his opinion about guaranteed annual incomes, he wrote back to say that he’d address the issue in a forthcoming book. The book hasn’t arrived yet, but I look forward to reading it. As a side note, apparently Britain has, or had, a concept called the “Dole,” which many people went on, especially poor artists. Geoff Dyer wrote about this some in Otherwise Known as the Human Condition. The Dole subsidized a lot of people who didn’t do much, but it also subsidized a lot of artists, which is pretty sweet; one can see student loans and grad school serving analogous roles in the U.S. today.

Even in programming, which is now the canonical “Thar be jobs!” (pirate voice intentional) profession, some parts of programming—like languages and language development—basically aren’t remunerative. Too many people will do it for free because it’s fun, like amateur porn. In the 80s there were many language and library vendors, but nearly all have died, and libraries have become either open source or rolled into a few large companies like Apple and Microsoft. Some aspects of language development are cross-subsidized in various ways, like professors doing research, or companies paying for specific components or maintenance, but it’s one field that has, in some ways, become like photography, or writing, or physics, even though programming jobs as a whole are still pretty good.

I’m not convinced that the artist lifestyle of living cheap and being poor in the pursuit of some larger goal or glamor profession is good or bad, but I do think it’s increasingly feasible (that we have a lot of good cheap stuff out there, especially cheap consumer electronics, may help: it’s possible to buy or acquire a nearly free, five-year-old computer that works perfectly well as a writing box).* Of course, many starving artists adopt that as a pose—they think it’s cool to say they’re working on a novel or photography project or “a series of shorts” or whatever, but don’t actually do anything, while many people with jobs put out astonishing work. Or at least work, which is usually a precursor to astonishing work.

For some people, the growing ability to disseminate ideas and art forms even without being paid is a real win. In the old days, if you wanted to write something and get it out there, you needed an editor or editors to agree with you. Now we have a direct way of resolving questions about what people actually want to read. Of course, the downside is that whole payment thing, but that’s the general downside of the new world in which we live, and, frankly, it’s one that I don’t have a society-wide solution for.

In writing, my best guess is that more people are going to book-ify blogs, and try to sell the book for $1 – $5, under the (probably correct) assumption that very few people want to go back and read a blog’s entire archives, but an ebook could collect and organize the material of those archives. If I read a powerful post by someone who seemed interesting, I’d buy a $4 ebook that covered their greatest hits or introduced me to their broader thinking.

This is tied into other issues around what people spend their time doing. My friend also wrote that he read “a couple of articles on Keynes’ predictions of utopia and declining work hours,” but he noted that work still takes up a huge amount of most people’s lives. He’s right, but most reports show that median hours worked in the U.S. have declined, and male labor force participation has declined precipitously. Labor force participation in general is surprisingly low. Ross Douthat has been discussing this issue in The New York Times (a paid gig, I might add), and, like most reasonable people, he has a nuanced take on what’s happening. See also this Wikipedia link on working time for some arguments that working time has declined overall.

Working time, however, probably hasn’t decreased for everyone. My guess is that working time has increased for some smallish number of people at the top of their professions (think lawyers, doctors, programmers, writers, business founders), with people at the bottom often relying more on government or gray market income sources. Douthat starts his essay by saying that we might expect working hours among the rich to decline first, so they can pursue more leisure, but he points out that the rich are working more than ever.

Though I am tempted to put “working” in scare quotes, because it seems like many of the rich are doing things they would enjoy doing on some level anyway; certainly a lot of programmers say they would keep programming even if they were millionaires, and many of them become millionaires and keep programming. The same is true of writers (though fewer become millionaires). Is writing a leisure or work activity for me? Both, depending. If I self-publish Asking Anna tomorrow and make a zillion dollars, the day after I’ll still be writing something. I would like to get paid, but some of the work I do for fun isn’t contingent on my getting paid.

Turning blogs into books and self-publishing probably won’t replace the salaries that news organizations used to pay, but it’s one means for writers or would-be writers to get some traction.

Incidentally, the hobby-ification of many professions makes me feel pretty good about working as a grant writing consultant. No one thinks when they’re 14, “I want to be a grant writer like Isaac and Jake Seliger!”, while lots of people want to be like famous actors, musicians, or journalists. There is no glamor, and grant writing is an example of the classic aphorism, “Where there’s shit, there’s gold” at work.

Grant writing is also challenging. Very few people have the weird intersection of skills necessary to be good, and it’s a decade-long process to build those skills—especially for people who aren’t good writers already. The field is perpetually mutating, with new RFPs appearing and old ones disappearing, so that we’re not competing with proposals written two years ago (whereas many novelists, for example, are in effect still competing with their peers from the 20s or 60s or 90s).

To return to journalism as a specific example, I can think of one situation in which I’d want The Atlantic or another big publisher to publish my work: if I were worried about being sued. Journalism is replete with stories about heroic reporters being threatened by entrenched interests; Watergate and the Pentagon Papers are the best-known examples, but even small-town papers turn up corruption in city hall and so forth. As centralized organizations decline, individuals are to some extent picking up the slack, but individuals are also more susceptible to legal and other threats. If you discovered something nasty about a major corporation and knew they’d tie up your life in legal bullshit for the next ten years, would you publish, or would you listen to your wife telling you to think of the kids, or your parents telling you to think about your career and future? Most of us are not martyrs. But it’s much harder for Mega Corp or Mega Individual to threaten The Atlantic and similar outlets.

The power and wealth of a big media company has its uses.

But such a use is definitely a niche case. I could imagine some of the bigger foundations, like ProPublica, offering a legal umbrella to bloggers and other muckrakers to mitigate such risks.

I have intentionally elided the question of what people are going to do if their industries turn towards hobbies. That’s for a couple of reasons: as I said above, I don’t have a good solution. In addition, the parts of the economy I’m discussing here are pretty small, and small problems don’t necessarily need “solutions,” per se. People who want to turn their hours into a lot of income should try to find ways and skills to do that, and people who want to turn their hours into fun products like writing or movies should try to find ways to do that too. Crying over industry loss or change isn’t going to turn back the clock, and just because someone could once make a career as a journalist doesn’t mean they can today.


* To some extent I’ve subsidized other people’s computers, because Macs hold their value surprisingly well and can be sold for a quarter to half of their original purchase price three to five years after they’ve been bought. Every computer replaced by my family or our business has been sold on Craigslist. It’s also possible, with a little knowledge and some online guides, to add RAM and an SSD to most computers made in the last couple of years, which will make them feel much more responsive.

Life: Love edition

“[T]he choice one makes between partners, between one man and another (or one woman and another) stretches beyond romance. It is, in the end, the choice between values, possibilities, futures, hopes, arguments (shared concepts that fit the world as you experience it), languages (shared words that fit the world as you believe it to be) and lives.”

—Zadie Smith, Changing My Mind

Martin Amis, the essay, the novel, and how to have fun in fiction

There’s an unusually interesting interview with Martin Amis in New York Magazine, where he says:

I think what has happened in fiction is that fiction has responded to the fact that the rate of history has accelerated in this last generation, and will continue to accelerate, with more sort of light-speed kind of communications. Those huge, leisurely, digressive, essayistic, meditative novels of the postwar era—some of which were on the best-seller lists for months—don’t have an audience anymore. [. . .]

No one is writing that kind of novel now. Well [. . . ] David Foster Wallace—that posthumous one looks sort of Joycean and huge and very left-field. But most novelists I think are much more aware than they used to be of the need for forward motion, for propulsion in a novel. Novelists are people too, and they’re responding to this just as the reader is.

I think people aren’t reading the “essayistic, meditative novels” because “essayistic, meditative novels” reads like code-words for boring. In addition, we’re living in “The Age of the Essay.” We don’t need novelists to write essays disguised as novels when we can get the real thing in damn near infinite supply.

The discovery mechanisms for essays are getting steadily better. Think of Marginal Revolution, Paul Graham’s essays, Hacker News, The Feature, and others I’m not aware of. Every Saturday, Slate releases a link collection of 5 – 10 essays in its Longform series. Recent collections include the Olympics, startups, madness in Mexico, and disease. The pieces selected tend to be deep, simultaneously intro- and extrospective, substantive, and engaging. They also feel like narrative, and nonfiction writers routinely deploy the narrative tricks and voice that fiction pioneered. The best essay writers have the writing skill of all but perhaps the very best novelists.

As a result, both professional (in the sense of getting paid) and non-professional (in the sense of being good but not earning money directly from the job) writers have an easy means of publishing what they produce. Aggregators help disseminate that writing. A lot of academics who are experts in a particular subject have fairly readable blogs (many have no blogs, or unreadable blogs, but we’ll focus on the readable ones), and the academics who once would have been consigned to journals now have an outlet—assuming they can write well (many can’t).

We don’t need to wait two to five years for a novelist to decide to write a Big Novel on a topic. We often have the raw materials at hand, and the raw material is shaped and written by someone with more respect for the reader and the reader’s time than many “essayistic” novelists. I’ve read many of those, chiefly because they’ve been assigned at various levels of my academic career. They’re not incredibly engaging.

This is not a swansong about how the novel is dead; you can find those all over the Internet, and, before the Internet, in innumerable essays and books (an awful lot of novels are read and sold, which at the very least gives the form the appearance of life). But it is a description of how the novel is, or should be, changing. Too many novels are self-involved and boring. Too many pay too little attention to narrative pacing—in other words, to their readers. Too many novels aren’t about stuff. Too many are obsessed with themselves.

Novels might have gotten away with these problems before the Internet. For the most part, they can’t any more, except perhaps among people who read or pretend to read novels in order to derive status from being readers. But being holier-than-thou via literary achievement, if it ever worked all that well, seems pretty silly today. I suppose you could write novels about how hard it is to write novels in this condition—the Zuckerman books have this quality at times, but who is the modern Zuckerman?—but I don’t think anyone beyond other writers will be much interested.

If they’re not going to be essayistic and meditative, what are novels to be? “Fun” is an obvious answer. The “forward motion” and “propulsion” that Amis mentions are good places to start. That’s how novels differ, ideally, from nonfiction.

Novels also used to have a near-monopoly on erotic material and commentary. No more. If you want to read something weird, perverse, and compelling, Reddit does a fine job of providing it (threads like “What’s your secret that could literally ruin your life if it came out?” provide what novels used to).

Stylistically, there’s still the question of how weird and attenuated a writer can make individual sentences before the work as a whole becomes unreadable or boring or both. For at least a century and change, writers could go further and further in breaking grammar, syntax, and point of view rules while still being comprehensible. By the time you get to late Joyce or Samuel Beckett’s novels, however, you start to see the limits of incomprehensibility and rule breaking regarding sentence structure, grammar, or both.

Break enough rules and you have word salad instead of language.

Most of us don’t want to read word salad, though, so Finnegans Wake and Malone Dies remain the province of specialists writing papers to impress other specialists. We want “forward motion” and “propulsion.” A novel must delight in terms of the plot and the language used. Many, many novels don’t. Amis is aware of this—he says, “I’m not interested in making a diagnostic novel. I’m 100 percent committed in fiction to the pleasure principle—that’s what fiction is, and should be.” But I’m not sure his own books show it (House of Meetings and Koba the Dread suggest otherwise). Nonetheless, I’m with him in principle, and, I hope, practice.

How to think about science and becoming a scientist

A lot of students want to know whether they should major in the humanities, business, or science, which is a hard choice because most of them have no idea whatsoever about what real science (or being a scientist) is like, and they won’t learn it from introductory lectures and lab classes. So freshmen and sophomores who are picking majors don’t, and can’t, really understand what they’re selecting—or so I’ve been told by a lot of grad students and youngish professors who are scientists.

One former student recently wrote me to say, “I was a biochemistry major with a dream of being a publisher and long story short, I am no longer a biochem major and I am going full force in getting into the publishing field right now” (emphasis added). I discouraged her from going “into” publishing, given that I’m not even convinced there is going to be a conventional publishing industry in five years, and forwarded her e-mail to a friend who was a biochemistry major. My friend’s response started as a letter about how to decide if you want to become a scientist but turned into a meditation on how to face the time in your life when you feel like you have to decide what, if anything, you want to become.


The thing about being “interested” in science is that the undergraduate survey classes rarely confirm if you really are. They’re boring. Rote. Dull. I credit my Bio 101 teacher with making the delicate, complicated mysteries of carbon-based life as engaging as listening to my Cousin “M” discuss the subtle differences among protein powder supplements. I spent most of class surfing Wikipedia on my laptop. The next semester I weaseled my way into an advanced cell bio class that was fast and deep and intellectually stimulating, thanks to an eccentric teacher with a keen mind and a weird tendency to act out enzymatic reactions in a sort of bastardized interpretive dance. I dropped Bio 102, which didn’t cripple my ability to keep up with advanced cell bio in any way (showing that survey classes can be unnecessary, boring, and confusing—confusing primarily because they leave out the details that are supposed to be too “advanced” but in fact clarify what the hell is going on), and got an unpaid research position in a faculty lab that eventually turned into a paid gig. By the way: there is significant pressure to dumb survey courses down and virtually no pressure on professors to teach them well; there are still good ones, but don’t let the bad ones dissuade you.

If any field of scientific inquiry interests you, if you have the impulse to ask your own questions and are excited by the idea that you can go find the answers yourself and use what you’ve discovered to tinker and build and ask new questions—which is to say, if you like the idea of research—you’ve got a much better chance of figuring out if you want to be a scientist. How? Go and be one. Or, at least, play at being a scientist by finding a lab that will train you at doing the work until you wake up one day and realize that you are teaching a new undergrad how to program the PCR machine and your input is being used to develop experiments.

I was a biochemistry undergrad major, and I absolutely deplored the labs that were required by classes, but it turned out I loved the actual work of being in a lab. Classes lacked the creativity that makes science so appealing; they felt designed to discourage interest in science. In class, we had 50 minutes to purify a protein and learn to use the mass spectrometer. Big deal. Can I go now? But put me in front of the PCR machine with a purpose? I’d learn how to use it in an afternoon because doing so meant that I was one step closer to solving a problem no one had solved before. You don’t find that kind of motivation in most classrooms. And you don’t need a Ph.D. to contribute to the field. All you need is intellectual appetite. (For an exception to the “class is boring” rule, check out Richard Feynman’s intro to physics lectures.)

So: I didn’t like glossing over information, memorizing for tests, and being told I had till the end of class to identify how many hydrogen ions were in compound X. I wasn’t excited by my major, but I was excited by my subject—and the longer I spent working in a real lab with a professor who saw that I was there every day and willing to learn (he eventually gave me a pet project), the more engaged I became with biochemistry. Sure, the day-to-day involved a lot of pipetting and making nutrient-agar plates to grow bacteria on, but I was working towards something larger than a grade.

I was splicing the DNA of glucose-galactose binding protein and green fluorescent protein to try to make a unique gene that could express a protein which fluoresced when binding to glucose. In essence, a protein flare. Then I built it into an E. coli plasmid so it would self-replicate, while a lab in Japan was trying to get the gene expressed into what effectively turned into glow-under-blacklight-just-add-sugarwater mice. The goal was to get the gene expressed in diabetic people who could wear a fluorimeter watch and check how brightly the genetically engineered freckle on their wrist glowed, in lieu of pricking their finger to check their blood glucose.

Do you have any idea how awesome it was to describe my research at parties? I left out the parts where I had to evacuate the whole lab for four hours after accidentally creating a chlorine cloud and especially the parts where I spent an entire day making 250 yeast-agar plates and went home with “nutrient powder” in my hair and lungs. But even with the accidents and drudgery, the bigger goal was motivating. Being part of the real scientific conversation gave my work purpose that a grade did not. I dreamed of building nanobots and splicing the DNA together to build biological machines. It sure as hell beat memorizing the Krebs cycle in return for a 4.0 GPA and med school.

That is what I love about science: you get to build something, you get to dream it up and ask questions and see if it works and even if you fail you learn something. What I loved was a long way from the dreary crap that characterizes so many undergrad classes. To be fair, the day-to-day isn’t all that whiz bang, but it’s rarely Dilbert-esque and I really liked the day-to-day challenges. There was something zen about turning on music and pipetting for three hours. That was right for me. It might not be for you; if you’re trying to be a scientist or get a feel for what science is like (more on that below), don’t be afraid to expose yourself to multiple labs if the first doesn’t work out for you.

My own heart will always be that of a splice-n-dicer. I’ll always love fiddling with DNA more than purifying compounds over a bunsen burner. But you don’t know what day-to-day tasks will give you the most pleasure. You don’t yet know that you might find zen at 3 a.m. setting up DNA assays, your mind clear, the fluid motion of your hand pulling you into a state of flow. You find out by doing, and you might be surprised—especially because classes don’t give you a good sense of what the life of a scientist is like. They also don’t introduce you to the graduate students, the postdocs, and the assistant professors who show you what kind of struggle comes from love, which in turn generates internal motivation. They don’t take you away from your university into summer programs that show you how amazing it is to be in a lab with real money and the resources to make your crazy ideas possible.

Which brings me to choosing a field: If you like science, but don’t know what kind, pick the most substantive one that interests you, with as much math as you’re willing to handle, and just get started (math is handy because it applies virtually everywhere in the sciences). Chemistry, biochem and biology overlap to such a degree that I was working in a biochem lab on a genetics project with the goal of creating a protein, and biology labs worked with us in tandem. When you get into the real work, the lines between fields blur. You can major in biochem and get a Ph.D. in neuroscience, or study organic chemistry and work for a physical chemistry research firm. Other scientists don’t care about what classes you took or what your degree says—they care about what you know and what you can do and if what you can do can be applied in a useful way. When in doubt, focus on developing technical skills more than the words on your degree.

One summer I applied to the Mayo Clinic Summer Undergraduate Research Fellowship (something I recommend anyone interested in science do—there are “SURF” programs at almost every major university and research center and they will give you a stipend, housing and exposure to a new lab. It can do amazing things for your CV, your career and your relationship to the larger scientific community. In math and some other fields, your best bet is the NSF’s Research Experiences for Undergraduates (REU) Program). But I didn’t get the job. I had six months in a lab at that point. I had a 3.96 GPA. I had a pretty great “why me” essay. Still, nothing.

A year later I applied again. By that time I’d been in the lab for a year and a half. I knew how to handle most of our major equipment. My CV described the tasks I could perform unsupervised, the problems I tackled by myself, and solutions I’d found. My advisor explained my role and the amount of autonomy I had been given. This time I got the job. When I met with the director of my summer lab in person, he made it clear that there were many fine applicants with stellar GPAs. I’d never even worked with radioactive-iodine-tagged proteins. They picked me because they knew undergrads only had three months to get substantive research done, and they simply didn’t have time to train someone (especially someone who might turn out to lack tenacity). They needed someone who knew how to work in the lab and could adapt quickly. They needed someone who knew how to work the machines my college lab used, and someone who knew how to work with E. coli plasmids. I could do that.

So pick whatever you think you like best, start with that, find a lab, and learn how to be adept at as many basic lab skills as possible. Delve more deeply into the ones associated with your research. Be ready to work when the right opportunity and research lab come along. The new lab will always ask what skills you have and whether they can be applied to the questions their lab is trying to solve, even if you’ve never asked similar questions. A chemistry major could therefore be what a certain biology lab needs at a given time.

A lot of what is frustrating and off-putting about science at first, including working in the research lab, is the same thing that’s frustrating and off-putting about math: to really enter the conversation you have to have the vocabulary, so there’s a lot of memorizing when you start. Which is just obnoxious. But it doesn’t take too long, and if you start interning in a lab early, the memorizing feels justifiable and pertinent, even if you feel initially more frustrated at a) not knowing the information and b) not knowing how to apply it. If you don’t get into a lab, however, the memorizing just feels hard and pointless (even though it isn’t).

(Virtually all fields have this learning curve, whether you realize it or not; one of Jake’s pet books is Daniel T. Willingham’s Why Don’t Students Like School: A Cognitive Scientist Answers Questions About How the Mind Works and What It Means for the Classroom, which describes how people move from no knowledge to shallow knowledge to deep knowledge. It’s bewildering and disorienting to start with no knowledge of a subject, but you have to endure and transcend that state if you’re going to move to deep knowledge. Jake says he’s climbed that mountain with regard to writing, which makes writing way more rewarding than it used to be.)

Once you have the language and are able to think about, say, protein folding, the way you would a paragraph of prose, or the rhythm in a popular song, science takes on a whole new life, like Frankenstein’s Monster but without the self-loathing or murder. You start to think about what questions you can ask, what you can build, and what you can do—as opposed to what you can regurgitate. The questions you pose to people in your lab will lead to larger conversations. Feeling like an insider is nice, not only because it’s nice to belong, but because you’ll realize that even being a small part of the conversation means you’re still part of the larger discussion.

Science is exciting, but not until you find a way to break through the barriers and into the real thing, so don’t give up prematurely. Like most things, however, your experience depends on whether you have or make the right opportunities. I went to medical school after a grad school detour. How I feel about that decision is an entirely different essay, and one I’ll post later. I ended up specializing in Emergency Medicine because I had enthusiastic ER docs teaching me. Before residency, I thought I’d do anesthesia, but the profs were boring and it seemed like awful work. I’m on a fabulous anesthesia rotation right now, the medical equivalent of a Riviera cruise, and am thinking, “Hey! Maybe I should have done this.” Same with rehab medicine. It’s a perfect fit for me, but I had two boring weeks of it in a non-representative place and so wasn’t about to sign myself over to a whole career without more to base my opinion on.

Some days I think that if I’d had a different lab, which exposed me to different things, if my Mayo summer had given me different connections, I would be pipetting merrily away at Cold Spring Harbor Laboratory, building a nanobot that would deliver the next big cancer treatment on a cellular level. Or maybe I would be a disgruntled post-doc, wishing that I could finally have a lab of my own. Or working for Pfizer. Anything could have changed my path. And just because you choose to study something you love doesn’t mean you’ll succeed.

But not choosing to study something you love is even worse. Point is, most choices in life are luck and chance, but you shouldn’t discard viable options—especially ones in science—based on a couple of survey courses designed to move the meat. Universities do view freshmen as piggybanks whose tuition dollars fund professors’ salaries and research, which is why they cram 1,000 of you into lecture halls and deliver an industrial-grade product that’s every bit as pleasant as industrial-grade mystery meat. Unfortunately, those classes are often the only real way to know if you like something and to be exposed to it unless you seek out more representative real-world opportunities. Most universities won’t go out of their way to shunt you into those opportunities. You have to want them and seek them out. So if you think you like biology? Or physics? Read The Elegant Universe**. The Greatest Show on Earth. The history of the polio vaccine. See if it stirs you.

That being said, if you don’t like science, you don’t like it; I’m just warning you that what you think you don’t like might simply be due to not quite knowing enough or having negative exposures. Still, you can have all the best intentions, follow my advice, find a great lab, try out different opportunities if the first or second don’t work out, and decide it’s just not for you. You probably can’t force it to be your passion, but you probably also underestimate the extent to which you, like most people, have a range of possible passions. I only caution you to make sure that you aren’t basing your choice on one bad class or a crappy lab advisor. This is good advice in any field.

Here’s an example of possible premature optimization: I received an email from Jake’s former student, saying she was thinking about being a judge as a “backup,” in case a career in publishing didn’t work out. Being a judge, that’s a life goal. A big one. And law does not make good on its promise of a comfortable income the way it once did. For more on that, see the Slate.com article “A Case of Supply v. Demand: Law schools are manufacturing more lawyers than America needs, and law students aren’t happy about it,” which points out that there are too many certified and credentialed “lawyers” for the amount of legal work available. Plus, while society needs a certain number of lawyers to function well, too many lawyers leads to diminishing returns as lawyers waste time ginning up work by suing each other over trivialities or chasing ambulances.

By contrast, an excess of scientists and engineers means more people who will build the stuff that lawyers then litigate over. Scientists and engineers expand the size of the economic pie; lawyers mostly work to divide it up differently. Whenever possible, work to be a person who creates things, instead of a person who tries to take stuff created by someone else. There is an effectively infinite amount of work in science because the universe is big and we don’t really understand it and we probably never will.* New answers to questions in science yield more questions. More lawsuits launched by lawyers just yield fighting over scraps.

Another personal example: I wasn’t just queen of the lab nerds. Sure, I tie-dyed my lab coat and dated a man who liked to hear me read aloud from my organic chemistry textbook, but I also wanted to write: not academic papers and book chapters, but novels and essays. I’d always been dual-minded and never bought the “Two Cultures” idea (one scientific and one humanistic) described in C.P. Snow’s eponymous book. This bifurcation is, to speak bluntly, bullshit. As a kid I spent as much time trying to win the science fair as I did submitting poetry to Highlights. May 1994’s Grumpy Dog issue was my first publication. You may have read it and enjoyed the story of “Sarah, the new puppy.” Or, you may not have been born yet. That was me as a kid. As an adult, I’m not confined to science either—and neither is any other scientist.

I imagine many of you reading this post who are struggling with whether or not to be a scientist are, fundamentally, not struggling with what you want to major in, but with what you want to be and how your decisions in college influence your options. Many of you are likely creatively-minded, as scientific types often are, despite how “poindexter” characters are portrayed on popular T.V. Staying close to your passions outside the test tube gives you the creative spark that makes your scientific thinking unique and fresh. So you don’t have to pick science and say, “That’s it, I’m a scientist and only a scientist.” You become a scientist and say: Now what do I want to build/ask/figure out?


Jake again:

So what should you do now to get into science? Here’s a list that I, Jake Seliger the non-scientist, wrote, based on the experiences described by friends in the sciences:

0) Look for profs in your department who are doing research in an area in or adjacent to what you might be interested in doing.

1) Read a couple of their recent papers. You probably won’t understand them fully, but you should try to at least get a vague sense of what they’re doing. You may want to prepare a couple of questions you can ask in advance; some profs will try to weed out people who are merely firing off random e-mails or appearing at office hours to beg.

2) Look for a website related to their lab or work, and try to get a sense of whether you might be interested in their work. Chances are you won’t be able to tell in advance. You should also figure out who their grad students are—most science profs will have somewhere between one and dozens of students working under them.

3) Go meet with said prof (or grad students) and say, “I’m interested in X, I’ve read papers W, Y, and Z, and I’d like to work in your lab.” Volunteer, since you probably won’t get paid at first.

4) They might say no. It’s probably not personal (rejection is rarely personal in dating, either, but it takes many people years or decades to figure this out). If the prof says no, go work on the grad students some, or generally make yourself a pest.

5) Try other labs.

6) Don’t give up. This is a persistent theme in this essay for good reason.

7) Keep reading papers in the area you’re interested in, even if you don’t understand them. Papers aren’t a substitute for research, but you’ll at least show that you’re interested and learn some of the lingo. Don’t underestimate the value of knowing a field’s jargon. Knowing the jargon can also be satisfying in its own right.

8) Take a computer science course or, even better, computer science courses. Almost all science labs have programming tasks no one wants to do, and your willingness to do scutwork will make you much more valuable. Simple but tedious programming tasks are the modern lab equivalent of sweeping the floor.
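To make item 8 concrete, here’s a minimal sketch of the kind of simple-but-tedious task a lab might hand to an undergrad who can program: merging a folder of instrument CSV exports into one tidy file. This is a hypothetical illustration, not any specific lab’s workflow; the directory and the column names (“well”, “absorbance”) are invented.

    # Hypothetical lab scutwork script: merge a folder of plate-reader CSV
    # exports into one tidy file. The directory and column names are
    # invented for illustration; real instrument exports will differ.
    import csv
    import glob

    rows = []
    files = glob.glob("plate_reader_exports/*.csv")
    for path in files:
        with open(path, newline="") as f:
            for record in csv.DictReader(f):
                try:
                    value = float(record.get("absorbance", ""))
                except ValueError:
                    continue  # skip blank or malformed readings
                if record.get("well"):
                    rows.append({"source_file": path,
                                 "well": record["well"],
                                 "absorbance": value})

    # Write everything to one combined file the lab can analyze directly.
    with open("combined_readings.csv", "w", newline="") as out:
        writer = csv.DictWriter(out, fieldnames=["source_file", "well", "absorbance"])
        writer.writeheader()
        writer.writerows(rows)

    print(f"Wrote {len(rows)} readings from {len(files)} files")

Nothing in it is hard, but a lab that has been collating these numbers by hand will remember the person who automated the job.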

If you don’t have bench research experience, you probably won’t get into grad school, or into a good grad school. You might have to pay for an MA or something like that to get in, which is bad. If you’re thinking about grad school, read Louis Menand’s The Marketplace of Ideas as soon as possible. See also Penelope Trunk’s Don’t Try to Dodge the Recession with Grad School and Philip Greenspun’s Women in Science. Ignore the questionable gender comments Greenspun makes and attend to his discussion of what grad school in the sciences is like, especially this, his main point: “Adjusted for IQ, quantitative skills, and working hours, jobs in science are the lowest paid in the United States.”

Another example: Alex Tabarrok points out in his book Launching The Innovation Renaissance: A New Path to Bring Smart Ideas to Market Fast that we appear to have too few people working in technical fields and too many majoring in business and dubious arts fields (notice that he doesn’t deal with graduate school, which is where he diverges from Greenspun). In his blog post “College has been oversold,” Tabarrok points out that student participation in fields that pay well and are likely “to create the kinds of innovations that drive economic growth” is flat. On an anecdotal level, virtually everyone I know who majored in the hard sciences and engineering is employed. Many of those who, like me, majored in English, aren’t.

According to a study discussed in the New York Times, people apparently leave engineering because it’s hard: “The typical engineering major today spends 18.5 hours per week studying. The typical social sciences major, by contrast, spends about 14.6 hours.” And:

So maybe students intending to major in STEM fields are changing their minds because those curriculums require more work, or because they’re scared off by the lower grades, or a combination of the two. Either way, it’s sort of discouraging when you consider that these requirements are intimidating enough to persuade students to forgo the additional earnings they are likely to get as engineers.

There’s another way to read these findings, though. Perhaps the higher wages earned by engineers reflect not only what they learn but also which students are likely to choose those majors in the first place and stay with them.

Don’t be scared by low grades. Yes, it’s discouraging to take classes where the exam average is 60, but keep taking them anyway. Low grades might be an indication that the field is more intellectually honest than one with easy, high grades.

In the process of writing and editing this essay, the usual panoply of articles about topics like “science majors are more likely to get jobs” has been published. You’ve probably read these articles. They’re mostly correct. The smattering linked here just happened to catch my attention.

Science grads may not get jobs just because science inherently makes you more employable—it may be that more tenacious, hard-working, and thus employable people are inclined to major in the sciences. But that means you should want to signal that you’re one of them. And healthier countries in general tend to focus on science, respect science, and produce scientists; hence the story about the opposite in “Why the Arabic World Turned Away from Science.”

If you’re leaving science because the intro courses are too hard and your friends majoring in business are having more fun at parties, you’re probably doing yourself a tremendous disservice that you won’t even realize until years later. If you’re leaving science because of a genuine, passionate interest in some other field, you might have a better reason, but it still seems like you’d be better off double majoring or minoring in that other field.


My friend again, adding to what I said above:

As someone who was going to do the science PhD thing before deciding on medical school, I agree with most of what Jake says. Let me emphasize: you will have to volunteer at first because you don’t have the skills to be hired in a lab for a job that will teach you something. Being hired without previous experience usually means the job doesn’t require the skills you want to learn, and so you won’t learn them. So you don’t want that job.

I had a paying job in a lab, so you can get them eventually—but I only started getting paid after I’d worked in it for a year, and even then the pay was more like a nice boost: the money just happened to show up because they thought, “What the heck, she’s been helpful.” Think of this time as paying your way into graduate school, because if you don’t have lab work, no matter how good your grades are, you will not get into a good graduate school with funding.

Here’s why: You have a limited amount of time in graduate school and are not just there to do independent research and learn. You’re there to do research with the department, and they need you to start immediately. If you already have years of bench research experience, the department and its professors know you can do exactly that—and there is no substitute for experience.

The place where you really learn how to work in a lab and develop your skills is in one, not in the lab classes where you learn, at best, some rote things (plus, you need to know if you like the basic, day-to-day experience of working in a lab and the kind of culture you’ll find yourself in; not everyone does). Even if a class does teach you the tools you need for a certain lab, it doesn’t demonstrate that you’re actually interested in research.

The only thing that demonstrates an interest in research, which is all graduate school really cares about, is working in a lab and doing real research. I can’t stress that enough, which is why I’ve repeated it several times in this paragraph. A 4.0 means you can study. It doesn’t mean you can do research. People on grad school committees get an endless number of recommendation letters that say, “This candidate did well in class and got an ‘A.’” Those count for almost nothing. People on grad school committees want letters that say, “This candidate did X, Y, and Z in my lab.”

I recommend starting with your professors—the ones whose classes you’ve liked and who know you from office hours. Hit them up first. Tell them your goal is to be a scientist and that, while academics are nice, you want to start being a scientist now. If they don’t have space for you, tell them to point you in the direction of someone who does. Keep an open mind. Ask everybody. I was interested in nanobots, gene work, molecular what-nots, etc.

I started by asking my orgo [“organic chemistry” to laymen] teacher. Nothing. I asked my Biochem [“biological chemistry” or “biochemistry”] professor and was welcomed with open arms. Point is, if the labs you want have no space, go to another. Don’t give up. Be relentlessly resourceful. Be tenacious—and these aren’t qualities uniquely useful to scientists.

The skills I ended up with in the biochem lab turned out to be almost 100% on point with what I wanted to do later, even though the research was different. The kind of research you end up doing usually springs from the lab skills you have, and it’s much harder to decide what you want and try to find a lab that will give you those skills. So instead of trying to anticipate what research you’ll want to do from a position where you can’t know, just learn some research skills. Any skills are better than none. Then you have something to offer the lab you want when space / funding becomes available. I took what I learned in that biochem lab and spent a summer doing research on protein folding—it wasn’t like my initial research, but the prof needed someone who knew how to do X, Y and Z, which I did, and he was willing to train me on the rest.

You’ll face other decisions. For example, in many fields you’ll have to decide: do you want wet-lab research (this does not refer to adult entertainment) or do you want more clinical research? “Wet lab” means that you’re mucking with chemicals, or big machines, and stuff like that. Clinical research means you’re dealing more with humans, or asking people questions, or something along those lines. I would suggest the wet lab if you think you may be even slightly interested (sort of like how you should experiment with lovers when you’re in college if you think you’re even slightly interested in different sorts of things). In fact, I’d suggest wet-lab work or some sort of computational lab in general, because clinical research skills can be extrapolated from wet lab—but not vice versa.

In a clinical lab you can show that you can think, but in a wet lab you need to be able to think and use a pipette, or be able to use modeling software if you’re interested in the computer side of things. That’s where the programming comes in handy if you’re capable of doing it; if not, then I feel less strongly than Jake about programming, because often labs need someone with serious computer training, like a postdoc, if their research revolves around modeling. But it could come in handy for you anyway, and certainly couldn’t hurt, so if you’re interested it could be an extra perk.

Once you’re in the lab, if you want to learn skills outside what you’re working with, ask. Ask everyone. Ask the computer guy, ask the woman on the other project. Get whatever you can, get good at it, then put it on your C.V. and make sure you can explain it clearly when someone asks. Even if you’re not an expert, be able to play one on T.V.

As for #2, about figuring out who their grad students are: I also find that less important. You need to talk to the primary investigator, the guy who runs the lab. If he’s not interested in you, it’s not worth going through grad student channels to convince him to take you. Someone is going to want you, and it’s best to go there in both science and love. Don’t fall for the false lead of the pretty girl in the alluring dress who disappears as soon as you get close. You can always try alternate channels later if you really want to get back into lab #1.

Think of it this way: if you’re struggling just to get a foot in the door, you’re going to struggle to get any research done. Not that the research will feel meaningful at first: you’ll be doing tasks assigned to you. But you should feel like this gets better, that you get more independence. And if that’s not the ethos of the lab to start with, it never will be. As I mentioned before, if I’d gotten into that orgo lab, I’d have been a scut monkey for years.

As Jake said: read your professors’ papers. You probably won’t have any idea what’s going on. I still have no idea what’s going on half the time, but read ’em anyway. Shows you’re worth the effort, especially when you ask for that lab spot. Jake’s 100% right about ways to get your professors’ attention.

Don’t give up. Just don’t give up. Take “no” for an answer and you can kiss grad school (at least a good PhD program with full funding, which is what you want: don’t pay for this under any circumstances) goodbye. Scientists are distinguished by their tenacity, whether they’re in grad school or not. And make sure you know what you’re giving up before you do.

What kind of research are you interested in? What gets you going? Even if you’re not sure, there are a certain number of fundamental skills that, I believe, will get you into whatever lab you want if you’re familiar with them, because they’re used in most labs and show you’re trainable for the other stuff. And you’ll know what science is like, which you simply don’t right now. Giving up on it based on some bad grades as a freshman or sophomore is folly.***


* One snarky but accurate commenter said, “There may be an infinite amount of work in science, but there is a finite (and very unevenly distributed) number of grants.”

** Although a different friend observed, “Books are a step above classes, but in my experience, many aspiring theoretical physicists are really people who like reading popular science books more than they like doing math.”

*** You can download this essay as a .pdf.

Are you more than a consumer? “The Once and Future Liberalism” and some answers

This is one of the most insightful things I’ve read about an unattractive feature of American society: we put an “emphasis on consumption rather than production as the defining characteristic of the good life.” It’s from “Beyond Blue 6: The Great Divorce,” where, in Walter Russell Mead’s reading, “Americans increasingly defined themselves by what they bought rather than what they did, and this shift of emphasis proved deeply damaging over time.” I’m not convinced this has happened equally for everybody, all the time, but it rings awfully true.

Which brings us back to the point made in the title: are you producing more than you consume? Are you focused on making things, broadly imagined, instead of “consuming” them? Is there more to your identity than the music you like and the clothes you wear? (“More” might mean things you know, or know how to do, or know how to make.) Can you do something or somethings few others can? If the answers are “no,” you might be feeling the malaise Mead is describing. In Anything You Want, Derek Sivers writes:

When you want to learn how to do something yourself, most people won’t understand. They’ll assume the only reason we do anything is to get it done, and doing it yourself is not the most efficient way.

But that’s forgetting about the joy of learning and doing.

If you never learn to do anything yourself—or anything beyond extremely basic tasks everyone else knows—you’re not going to lead a very satisfying life. Almost as bad, you probably won’t know it. You’ll only have that gnawing feeling you can’t name, a feeling that’s easy—too easy—to ignore most of the time. You can’t do everything yourself, and it would be madness to try. But you should be thinking about expanding what you can do. I’ve made a conscious effort to resist being defined by what I buy rather than what I do, and that effort has intensified since I read Paul Graham’s essay “Stuff;” notice especially where he says, “Because the people whose job is to sell you stuff are really, really good at it. The average 25 year old is no match for companies that have spent years figuring out how to get you to spend money on stuff. They make the experience of buying stuff so pleasant that ‘shopping’ becomes a leisure activity.” To me it’s primarily tedious.

But this tedious activity is everywhere, and in Spent: Sex, Evolution, and Consumer Behavior, Geoffrey Miller describes how companies and advertisers have worked to exploit evolved human systems for mating and status in order to convince you that you need stuff. Really, as he points out, you don’t: five minutes of conversation does more signaling than almost all the stuff in the world. Still, I don’t really take a moral view of shopping, in that I don’t think disliking shopping somehow makes me more virtuous than someone who does like shopping, but I do think the emphasis on consumption is a dangerous one for people’s mental health and well-being. And I wonder if these issues are also linked to larger ones.

A lot of us are suffering from an existential crisis and a search for meaning in a complex world that often appears to lack it. You can see evidence in the Western world’s high suicide rates, in Viktor Frankl’s book Man’s Search for Meaning (he says, “I do not at all see in the bestseller status of my book so much an achievement and accomplishment on my part as an expression of the misery of our time: if hundreds of thousands of people reach out for a book whose very title promises to deal with the question of a meaning to life, it must be a question that burns under the fingernails”), in Irvin Yalom’s Existential Psychotherapy (especially the chapter on despair), in Mihaly Csikszentmihalyi’s Flow: The Psychology of Optimal Experience, in All Things Shining: Reading the Western Classics to Find Meaning in a Secular Age, in The Joy of Secularism: 11 Essays for How We Live Now, in the work of Michel Houellebecq. I could keep going. The question isn’t merely about the number of responses to present conditions, but about what those present conditions are, how they came about, what they say about contemporary politics (Mead makes the political connection explicit in “The Once and Future Liberalism: We need to get beyond the dysfunctional and outdated ideas of 20th-century liberalism“), and what they say about how the individual should respond.

People respond in all kinds of ways. Despair is one. Fanaticism, whether towards sports teams or political parties or organized religion, is another, with religion being especially popular. You can retreat to religious belief, but most dogmatic religious beliefs are grounded in pre-modern beliefs and rituals, and too many religions are surrounded by fools (did Heinlein say, “It’s not God I have a problem with, it’s his fan club”? Google yields many variations). Those kinds of answers don’t look very good, at least to me. You have to look harder.

I think part of the answer has to lie in temperament, attitude, and finding a way to be more than a consumer. For a very long time, people had to produce a lot of what they consumed—including their music, food, and ideas. I don’t want to lapse into foolish romanticism about the pre-modern, pre-specialized world, since such a world would be impossible to recreate and ugly if we did. People conveniently forget about starvation and warfare when they discuss the distant past. Plus, specialization has too many benefits—like the iMac I’m looking at, the chair I’m sitting in, the program I’m using to write this, the tasty takeout I can order if I want it, the tea in my kitchen, the condoms in my bedroom, or the camera on my tripod. For all its virtues, though, I’m increasingly convinced that specialization has psychic costs that few of us are really confronting, even if many of us feel them, and those costs relate to how we relate to meaning and work.

According to Mead, in the 19th Century, families “didn’t just play together and watch TV together; they worked together to feed and clothe themselves.” Today, disparate activities drive specialization even within the family, and family life has become an increasingly consumption, status-oriented experience. To Mead, “If we wonder why marriage isn’t as healthy today in many cases, one reason is surely that the increasing separation of the family from the vital currents of economic and social life dramatically reduces the importance of the bond to both spouses – and to the kids.” We’ve gotten wealthier as a society, and wealth enables us to make different kinds of choices. Marriage is much more of a consumer good: we choose it, rather than being forced into it because the alternative is distressingly high resource diminishment. Charles Murray observes some effects this has on marriage in Coming Apart: The State of White America, 1960-2010, since getting and staying married has enormous positive effects on income—even if “the vital currents of economic and social life” conspire to make spouses less dependent on each other.

Kids are less economically useful and simultaneously more dependent on their parents. It also means they’re separated from the real world for a very long time. To Mead, part of this is education:

As the educational system grew more complex and elaborate (without necessarily teaching some of the kids trapped in it very much) and as natural opportunities for appropriate work diminished, more and more young people spent the first twenty plus years of their lives with little or no serious exposure to the world of work.

It starts early, this emphasis on dubious education and the elimination of “natural opportunities for appropriate work”:

Historically, young people defined themselves and gained status by contributing to the work of their family or community. Childhood and adulthood tended to blend together more than they do now. [. . .] The process of maturation – and of partner-seeking – took place in a context informed by active work and cooperation.

In the absence of any meaningful connection to the world of work and production, many young people today develop identities through consumption and leisure activities alone. You are less what you do and make than what you buy and have: what music you listen to, what clothes you wear, what games you play, where you hang out and so forth. These are stunted, disempowering identities for the most part and tend to prolong adolescence in unhelpful ways. They contribute to some very stupid decisions and self-defeating attitudes. Young people often spend a quarter century primarily as critics of a life they know very little about: as consumers they feel powerful and secure, but production frightens and confuses them.

I’m familiar with those “stunted, disempowering identities” because I had one for along time. Most teenagers don’t spend their adolescence becoming expert hackers, like Mark Zuckerberg or Bill Gates, and they don’t spend their time becoming experts musicians, like innumerable musicians. They spend their adolescences alienated.

I’m quoting so many long passages from Mead because they’re essential, not incidental, to understanding what’s going on. The result of an “absence of any meaningful connection to the world of work and production” is Lord of the Flies meets teen drama TV and movies. Paul Graham gets this; in one of my favorite passages from “Why Nerds Are Unpopular,” he writes:

Teenage kids used to have a more active role in society. In pre-industrial times, they were all apprentices of one sort or another, whether in shops or on farms or even on warships. They weren’t left to create their own societies. They were junior members of adult societies.

Teenagers seem to have respected adults more then, because the adults were the visible experts in the skills they were trying to learn. Now most kids have little idea what their parents do in their distant offices, and see no connection (indeed, there is precious little) between schoolwork and the work they’ll do as adults.

And if teenagers respected adults more, adults also had more use for teenagers. After a couple years’ training, an apprentice could be a real help. Even the newest apprentice could be made to carry messages or sweep the workshop.

Now adults have no immediate use for teenagers. They would be in the way in an office. So they drop them off at school on their way to work, much as they might drop the dog off at a kennel if they were going away for the weekend.

What happened? We’re up against a hard one here. The cause of this problem is the same as the cause of so many present ills: specialization. As jobs become more specialized, we have to train longer for them. Kids in pre-industrial times started working at about 14 at the latest; kids on farms, where most people lived, began far earlier. Now kids who go to college don’t start working full-time till 21 or 22. With some degrees, like MDs and PhDs, you may not finish your training till 30.

But “school” is so often bad that 30% of teenagers drop out—against their own economic self-interest. Only about a third of people in their twenties have graduated from college. What gives? Part of it must be information asymmetry: teenagers don’t realize how important school is. But the other part of the problem is what Graham describes: how dull school seems, and how disconnected it is from what most people eventually do. And that disconnection is real.

So, instead of finding connections to skills and making things, teenagers pick up status cues from music and other forms of professionally-produced entertainment. Last year, I was on a train from Boston to New York and sat near a pair of 15-year-olds. We talked a bit, and one almost immediately asked me what kind of music I liked. The question struck me because it had been so long since I’d been asked it so early in a conversation with a stranger. In high school and early college, I was asked it all the time: high school-aged people sort themselves into tribes and evaluate others based on music. In college, the first question is, “What’s your major?”, and in the real world it’s, “What do you do?” The way people ask those early questions reveals a lot about the assumptions underlying the person doing the asking.

Now: I like music as much as the next guy, but after high school I stopped using it to sort people. Why should high school students identify themselves primarily based on music, as opposed to some other metric? It’s probably because they have nothing better to signal who they are than music. It would make sense to discuss music if you are a musician or a genuine music aficionado, but I wasn’t one and most of the people I knew weren’t either. Yet the “What’s your favorite music?” question always arose. Now, among adults, it’s more often “What do you do?”, which seems to me an improvement, especially given its proximity to the questions, “What can you do?” and “What do you know?”

But that’s not a very important question for most high school students. They aren’t doing anything hard enough that errors matter. And in some ways, mistakes don’t matter much in most modern walks of life: they don’t cause people to die, or to really live, or do things differently. So finding a niche where mistakes do matter—as they do when you run your own business, or in certain parts of the military, or in some parts of medicine, or as an individual artist accountable to fans—can lead to a fuller, more intensely lived life. But that requires getting off the standard path. Few of us have the energy to bother. Instead, we feel underutilized, with the best parts of ourselves rusting from disuse–or perhaps gone altogether, because we never tried to develop the best parts of ourselves. That might explain, almost as much as my desire to tell stories, why I spend so much time writing fiction that, as of this writing, has mostly been fodder for agents and friends, and why I persist in the face of indifference.

Individuals have to learn to want something more than idle consumption. They have to want to become artists, or hackers, or to change the world, or to make things, all of which are facets of the same central application of human creativity (to me, the art / science divide is bullshit for similar reasons). For much of the 20th Century, we haven’t found “something” in work:

Since work itself was so unrewarding for so many, satisfaction came from getting paid and being able to enjoy your free time in the car or the boat that you bought with your pay. It was a better deal than most people have gotten through history, but the loss of autonomy and engagement in work was a cost, and over time it took a greater and greater toll.

A friend once told me about why he left a high-paying government engineering job for the hazards and debts of law school: at his engineering job, everyone aspired to a boat or a bigger TV. Conversations revolved around what people had bought or were planning to buy. No one thought about ideas, or anything beyond consumption. So he quit to find a place where people did. I mean, who cares that you buy a boat? Maybe it makes getting laid marginally easier, at least for guys, but that time, money, and energy would probably be better spent going out and meeting people, rather than acquiring material objects.

I’ve seen people who have virtually no money be extraordinarily happy and extraordinarily successful with the sex of their choice, and people in the exact opposite condition. The people with no money and lots of sex tend to get that way because of their personalities and their ability to be vibrant (again: see Miller’s book Spent). Even if you’re bad at being vibrant, you can learn to be better: The Game is, at bottom, about how to be vibrant for straight men, and the many women’s magazines (like Cosmo) are, at bottom, about how to be vibrant for women. Neither, unfortunately, really teaches one to be tolerant of other people’s faults, which might be the most important thing in the game of sex, but perhaps that comes through in other venues.

I don’t wish to deify Mead or his argument; when he says, “There was none of the healthy interaction with nature that a farmer has,” I think he’s missing how exhausting farming was, how close farmers were to starvation for much of agricultural history, and how nasty nature is when you’re not protected from it by modern amenities (we only started to admire nature in the late eighteenth century, when it stopped being so dangerous to city dwellers.) It’s easy to romanticize farming when we don’t have to do it. Likewise, Mead says:

A consumption-centered society is ultimately a hollow society. It makes people rich in stuff but poor in soul. In its worst aspects, consumer society is a society of bored couch potatoes seeking artificial stimulus and excitement.

But I have no idea what he means by “poor in soul.” Are Mark Zuckerberg or Bill Gates “poor in soul?” Is Stephen King? Tucker Max? I would guess not, even though all four are “rich in stuff.” We’ve also been “A consumption-centered society” for much of the 20th century, if not earlier, and, all other things being equal, I’d rather have the right stuff than no stuff, even if the mindless acquisition of stuff is a growing hazard. The solution might be the mindful acquisition of stuff, but even that is hard and takes a certain amount of discipline, especially given how good advertisers are at selling. I would also include “politicians” as being among advertisers these days.

Contemporary politics are (mostly) inane, for the structural reasons Bryan Caplan describes in The Myth of the Rational Voter. So I’m predisposed to like explanations along these lines:

Nobody has a real answer for the restructuring of manufacturing and the loss of jobs to automation and outsourcing. As long as we are stuck with the current structures, nobody can provide the growing levels of medical and educational services we want without bankrupting the country. Neither “liberals” nor “conservatives” can end the generation-long stagnation in the wage level of ordinary American families. Neither can stop the accelerating erosion of the fiscal strength of our governments at all levels without disastrous reductions in the benefits and services on which many Americans depend.

Most people on the right and the left have “answers” about contemporary problems that miss large aspects of those problems or the inherent trade-offs involved. A lot of the debate that does occur is dumb, sometimes militantly and sometimes inadvertently, but dumb nonetheless. As Mead says: “We must come to terms with the fact that the debate we have been having over these issues for past several decades has been unproductive. We’re not in a “tastes great” versus “less filling” situation; we need an entirely new brew.” Yet we’re getting variations on old brews, in which liberals look like conservatives in their defense of 1930s-era policies, and conservatives look like conservatives in their veneration of 19th century-style free-market policies. Only a few commentators, like Tyler Cowen in The Great Stagnation, even try earnestly to identify real problems and discuss those problems in non-partisan terms.

This post started as a pair of links, but it ended in an essay because Mead’s essays are so important in the way they get at an essential aspect of contemporary life. If you’re a writer, you can’t afford to ignore what’s happening on the ground, unless you want to be, at best, irrelevant, and I wonder if one reason nonfiction may be outpacing fiction in the race for importance involves the way nonfiction sidesteps questions of meaning by focusing on real things with real effects, instead of how people can’t or won’t find meaning in a world where most of us succeed, at least on a material level, by following a conventional path.

Naturally, I also think about this in the context of fiction. A while ago, I wrote this to a friend: “Too much fiction is just about dumb people with dumb problems doing dumb things that the application of some minor amount of logic would solve. Bored with life because you’re a vaguely artistic hipster? Get a real job, or learn some science, or be a real artist, or do something meaningful. The world is full of unmet needs and probably always will be. But so many characters wander around protected by their own little bubbles. Get out! The world is a big place.” Mead, I think, would agree.

It’s hard to disentangle the individual, education, acquisition, ideas, society, and politics. I’ve somewhat conflated them in my analysis, above, because one inevitable leads to the other, since talking about how you as a person should respond inevitably leads one to questions about how you were educated, and education as a mass-process inevitably leads one to society, and so forth. But I, as an individual, can’t really change the larger systems in which I’m embedded, though I can do a limited amount to observe how those systems work and how I respond to them (which often entails writing like this and linking to other writers).

Are you more than a consumer? "The Once and Future Liberalism" and some answers

This is one of the most insightful things I’ve read about an unattractive feature of American society: we put an “emphasis on consumption rather than production as the defining characteristic of the good life.” It’s from Walter Russell Mead’s “Beyond Blue 6: The Great Divorce,” where he argues that “Americans increasingly defined themselves by what they bought rather than what they did, and this shift of emphasis proved deeply damaging over time.” I’m not convinced this has happened equally for everybody, all the time, but it rings awfully true.

Which brings us back to the question in the title: are you producing more than you consume? Are you focused on making things, broadly imagined, instead of “consuming” them? Is there more to your identity than the music you like and the clothes you wear? (“More” might mean things you know, or know how to do, or know how to make.) Can you do something, or several things, that few others can? If the answers are “no,” you might be feeling the malaise Mead is describing. In Anything You Want, Derek Sivers writes:

When you want to learn how to do something yourself, most people won’t understand. They’ll assume the only reason we do anything is to get it done, and doing it yourself is not the most efficient way.

But that’s forgetting about the joy of learning and doing.

If you never learn to do anything yourself—or anything beyond extremely basic tasks everyone else knows—you’re not going to lead a very satisfying life. Almost as bad, you probably won’t know it. You’ll only have that gnawing feeling you can’t name, a feeling that’s easy—too easy—to ignore most of the time. You can’t do everything yourself, and it would be madness to try. But you should be thinking about expanding what you can do. I’ve made a conscious effort to resist being defined by what I buy rather than what I do, and that effort has intensified since I read Paul Graham’s essay “Stuff”; notice especially where he says, “Because the people whose job is to sell you stuff are really, really good at it. The average 25 year old is no match for companies that have spent years figuring out how to get you to spend money on stuff. They make the experience of buying stuff so pleasant that ‘shopping’ becomes a leisure activity.” To me it’s primarily tedious.

But this tedious activity is everywhere, and in Spent: Sex, Evolution, and Consumer Behavior, Geoffrey Miller describes how companies and advertisers have worked to exploit evolved human systems for mating and status in order to convince you that you need stuff. Really, as he points out, you don’t: five minutes of conversation does more signaling than almost all the stuff in the world. Still, I don’t really take a moral view of shopping, in that I don’t think disliking shopping somehow makes me more virtuous than someone who does like shopping, but I do think the emphasis on consumption is a dangerous one for people’s mental health and well-being. And I wonder if these issues are also linked to larger ones.

A lot of us are suffering from an existential crisis and a search for meaning in a complex world that often appears to lack it. You can see evidence in the Western world’s high suicide rates, in Viktor Frankl’s book Man’s Search for Meaning (he says, “I do not at all see in the bestseller status of my book so much an achievement and accomplishment on my part as an expression of the misery of our time: if hundreds of thousands of people reach out for a book whose very title promises to deal with the question of a meaning to life, it must be a question that burns under the fingernails”), in Irvin Yalom’s Existential Psychotherapy (especially the chapter on despair), in Mihaly Csikszentmihalyi’s Flow: The Psychology of Optimal Experience, in All Things Shining: Reading the Western Classics to Find Meaning in a Secular Age, in The Joy of Secularism: 11 Essays for How We Live Now, in the work of Michel Houellebecq. I could keep going. The question isn’t merely about the number of responses to present conditions, but about what those present conditions are, how they came about, what they say about contemporary politics (Mead makes the political connection explicit in “The Once and Future Liberalism: We need to get beyond the dysfunctional and outdated ideas of 20th-century liberalism”), and what they say about how the individual should respond.

People respond in all kinds of ways. Despair is one. Fanaticism, whether towards sports teams, political parties, or organized religion, is another, with religion being especially popular. You can retreat to religious belief, but most dogmatic religious beliefs are grounded in pre-modern beliefs and rituals, and too many religions are surrounded by fools (did Heinlein say, “It’s not God I have a problem with, it’s his fan club”? Google yields many variations). Those kinds of answers don’t look very good, at least to me. You have to look harder.

I think part of the answer has to lie in temperament, attitude, and finding a way to be more than a consumer. For a very long time, people had to produce a lot of what they consumed—including their music, food, and ideas. I don’t want to lapse into foolish romanticism about the pre-modern, pre-specialized world, since such a world would be impossible to recreate and ugly if we did. People conveniently forget about starvation and warfare when they discuss the distant past. Plus, specialization has too many benefits—like the iMac I’m looking at, the chair I’m sitting in, the program I’m using to write this, the tasty takeout I can order if I want it, the tea in my kitchen, the condoms in my bedroom, or the camera on my tripod. For all its virtues, though, I’m increasingly convinced that specialization has psychic costs that few of us are really confronting, even if many of us feel them, and those costs relate to how we relate to meaning and work.

According to Mead, in the 19th century, families “didn’t just play together and watch TV together; they worked together to feed and clothe themselves.” Today, disparate activities drive specialization even within the family, and family life has become an increasingly consumption- and status-oriented experience. To Mead, “If we wonder why marriage isn’t as healthy today in many cases, one reason is surely that the increasing separation of the family from the vital currents of economic and social life dramatically reduces the importance of the bond to both spouses – and to the kids.” We’ve gotten wealthier as a society, and wealth enables us to make different kinds of choices. Marriage is much more of a consumer good: we choose it, rather than being forced into it because the alternative is a distressing loss of resources. Charles Murray observes some effects this has on marriage in Coming Apart: The State of White America, 1960-2010, since getting and staying married has enormous positive effects on income—even if “the vital currents of economic and social life” conspire to make spouses less dependent on each other.

Kids are less economically useful and simultaneously more dependent on their parents, which also means they’re separated from the real world for a very long time. To Mead, part of this is education:

As the educational system grew more complex and elaborate (without necessarily teaching some of the kids trapped in it very much) and as natural opportunities for appropriate work diminished, more and more young people spent the first twenty plus years of their lives with little or no serious exposure to the world of work.

It starts early, this emphasis on dubious education and the elimination of “natural opportunities for appropriate work”:

Historically, young people defined themselves and gained status by contributing to the work of their family or community. Childhood and adulthood tended to blend together more than they do now. [. . .] The process of maturation – and of partner-seeking – took place in a context informed by active work and cooperation.

In the absence of any meaningful connection to the world of work and production, many young people today develop identities through consumption and leisure activities alone. You are less what you do and make than what you buy and have: what music you listen to, what clothes you wear, what games you play, where you hang out and so forth. These are stunted, disempowering identities for the most part and tend to prolong adolescence in unhelpful ways. They contribute to some very stupid decisions and self-defeating attitudes. Young people often spend a quarter century primarily as critics of a life they know very little about: as consumers they feel powerful and secure, but production frightens and confuses them.

I’m familiar with those “stunted, disempowering identities” because I had one for a long time. Most teenagers don’t spend their adolescence becoming expert hackers, like Mark Zuckerberg or Bill Gates, and they don’t spend it becoming expert musicians. They spend their adolescences alienated.

I’m quoting so many long passages from Mead because they’re essential, not incidental, to understanding what’s going on. The result of an “absence of any meaningful connection to the world of work and production” is Lord of the Flies meets teen drama TV and movies. Paul Graham gets this; in one of my favorite passages from “Why Nerds Are Unpopular,” he writes:

Teenage kids used to have a more active role in society. In pre-industrial times, they were all apprentices of one sort or another, whether in shops or on farms or even on warships. They weren’t left to create their own societies. They were junior members of adult societies.

Teenagers seem to have respected adults more then, because the adults were the visible experts in the skills they were trying to learn. Now most kids have little idea what their parents do in their distant offices, and see no connection (indeed, there is precious little) between schoolwork and the work they’ll do as adults.

And if teenagers respected adults more, adults also had more use for teenagers. After a couple years’ training, an apprentice could be a real help. Even the newest apprentice could be made to carry messages or sweep the workshop.

Now adults have no immediate use for teenagers. They would be in the way in an office. So they drop them off at school on their way to work, much as they might drop the dog off at a kennel if they were going away for the weekend.

What happened? We’re up against a hard one here. The cause of this problem is the same as the cause of so many present ills: specialization. As jobs become more specialized, we have to train longer for them. Kids in pre-industrial times started working at about 14 at the latest; kids on farms, where most people lived, began far earlier. Now kids who go to college don’t start working full-time till 21 or 22. With some degrees, like MDs and PhDs, you may not finish your training till 30.

But “school” is so often bad that 30% of teenagers drop out—against their own economic self-interest. Only about a third of people in their twenties have graduated from college. What gives? Part of it must be information asymmetry: teenagers don’t realize how important school is. But the other part of the problem is what Graham describes: how dull school seems, and how disconnected it is from what most people eventually do. And that disconnection is real.

So, instead of finding connections to skills and making things, teenagers pick up status cues from music and other forms of professionally produced entertainment. Last year, I was on a train from Boston to New York and sat near a pair of 15-year-olds. We talked a bit, and one almost immediately asked me what kind of music I liked. The question struck me because it had been so long since I’d been asked it so early in a conversation with a stranger. In high school and early college, I was asked it all the time: high school-aged people sort themselves into tribes and evaluate others based on music. In college, the first question is “What’s your major?”, and in the real world it’s “What do you do?” The way people ask those early questions reveals a lot about the assumptions of the person doing the asking.

Now: I like music as much as the next guy, but after high school I stopped using it to sort people. Why should high school students identify themselves primarily based on music, as opposed to some other metric? It’s probably because they have nothing better than music to signal who they are. It would make sense to discuss music if you were a musician or a genuine music aficionado, but I wasn’t one and most of the people I knew weren’t either. Yet the “What’s your favorite music?” question always arose. Now, among adults, it’s more often “What do you do?”, which seems to me an improvement, especially given its proximity to the questions “What can you do?” and “What do you know?”

But that’s not a very important question for most high school students. They aren’t doing anything hard enough that errors matter. And in some ways, mistakes don’t matter much in most modern walks of life: they don’t cause people to die, or to really live, or to do things differently. So finding a niche where mistakes do matter—as they do when you run your own business, or in certain parts of the military, or in some parts of medicine, or as an individual artist accountable to fans—can lead to a fuller, more intensely lived life. But that requires getting off the standard path. Few of us have the energy to bother. Instead, we feel underutilized, with the best parts of ourselves rusting from disuse—or perhaps gone altogether, because we never tried to develop them. That might explain, almost as much as my desire to tell stories, why I spend so much time writing fiction that, as of this writing, has mostly been fodder for agents and friends, and why I persist in the face of indifference.

Individuals have to learn to want something more than idle consumption. They have to want to become artists, or hackers, or to change the world, or to make things, all of which are facets of the same central application of human creativity (to me, the art/science divide is bullshit for similar reasons). For much of the 20th century, we didn’t find that “something” in work:

Since work itself was so unrewarding for so many, satisfaction came from getting paid and being able to enjoy your free time in the car or the boat that you bought with your pay. It was a better deal than most people have gotten through history, but the loss of autonomy and engagement in work was a cost, and over time it took a greater and greater toll.

A friend once told me why he left a high-paying government engineering job for the hazards and debts of law school: at his engineering job, everyone aspired to a boat or a bigger TV. Conversations revolved around what people had bought or were planning to buy. No one thought about ideas, or anything beyond consumption. So he quit to find a place where people did. I mean, who cares if you buy a boat? Maybe it makes getting laid marginally easier, at least for guys, but that time, money, and energy would probably be better spent going out and meeting people rather than acquiring material objects.

I’ve seen people who have virtually no money be extraordinarily happy and extraordinarily successful with the sex of their choice, and people in the exact opposite condition. The people with no money and lots of sex tend to get that way because of their personalities and their ability to be vibrant (again: see Miller’s book Spent). Even if you’re bad at being vibrant, you can learn to be better: The Game is, at bottom, about how to be vibrant for straight men, and the many women’s magazines (like Cosmo) are, at bottom, about how to be vibrant for women. Neither, unfortunately, really teaches one to be tolerant of other people’s faults, which might be the most important thing in the game of sex, but perhaps that comes through in other venues.

I don’t wish to deify Mead or his argument; when he says, “There was none of the healthy interaction with nature that a farmer has,” I think he’s missing how exhausting farming was, how close farmers were to starvation for much of agricultural history, and how nasty nature is when you’re not protected from it by modern amenities (we only started to admire nature in the late eighteenth century, when it stopped being so dangerous to city dwellers). It’s easy to romanticize farming when we don’t have to do it. Likewise, Mead says:

A consumption-centered society is ultimately a hollow society. It makes people rich in stuff but poor in soul. In its worst aspects, consumer society is a society of bored couch potatoes seeking artificial stimulus and excitement.

But I have no idea what he means by “poor in soul.” Are Mark Zuckerberg or Bill Gates “poor in soul”? Is Stephen King? Tucker Max? I would guess not, even though all four are “rich in stuff.” We’ve also been “a consumption-centered society” for much of the 20th century, if not earlier, and, all other things being equal, I’d rather have the right stuff than no stuff, even if the mindless acquisition of stuff is a growing hazard. The solution might be the mindful acquisition of stuff, but even that is hard and takes a certain amount of discipline, especially given how good advertisers are at selling. I would also include “politicians” among advertisers these days.

Contemporary politics are (mostly) inane, for the structural reasons Bryan Caplan describes in The Myth of the Rational Voter. So I’m predisposed to like explanations along these lines:

Nobody has a real answer for the restructuring of manufacturing and the loss of jobs to automation and outsourcing. As long as we are stuck with the current structures, nobody can provide the growing levels of medical and educational services we want without bankrupting the country. Neither “liberals” nor “conservatives” can end the generation-long stagnation in the wage level of ordinary American families. Neither can stop the accelerating erosion of the fiscal strength of our governments at all levels without disastrous reductions in the benefits and services on which many Americans depend.

Most people on the right and the left have “answers” to contemporary problems that miss large aspects of those problems or the inherent trade-offs involved. A lot of the debate that does occur is dumb, sometimes militantly and sometimes inadvertently, but dumb nonetheless. As Mead says: “We must come to terms with the fact that the debate we have been having over these issues for the past several decades has been unproductive. We’re not in a ‘tastes great’ versus ‘less filling’ situation; we need an entirely new brew.” Yet we’re getting variations on old brews, in which liberals look like conservatives in their defense of 1930s-era policies, and conservatives look like conservatives in their veneration of 19th century-style free-market policies. Only a few commentators, like Tyler Cowen in The Great Stagnation, even try earnestly to identify real problems and discuss them in non-partisan terms.

This post started as a pair of links, but it ended as an essay because Mead’s essays are so important in the way they get at an essential aspect of contemporary life. If you’re a writer, you can’t afford to ignore what’s happening on the ground, unless you want to be, at best, irrelevant. And I wonder if one reason nonfiction may be outpacing fiction in the race for importance is that nonfiction sidesteps questions of meaning by focusing on real things with real effects, instead of on how people can’t or won’t find meaning in a world where most of us succeed, at least on a material level, by following a conventional path.

Naturally, I also think about this in the context of fiction. A while ago, I wrote this to a friend: “Too much fiction is just about dumb people with dumb problems doing dumb things that the application of some minor amount of logic would solve. Bored with life because you’re a vaguely artistic hipster? Get a real job, or learn some science, or be a real artist, or do something meaningful. The world is full of unmet needs and probably always will be. But so many characters wander around protected by their own little bubbles. Get out! The world is a big place.” Mead, I think, would agree.

It’s hard to disentangle the individual, education, acquisition, ideas, society, and politics. I’ve somewhat conflated them in my analysis above because one inevitably leads to the other: talking about how you as a person should respond leads to questions about how you were educated, and education as a mass process leads to society, and so forth. But I, as an individual, can’t really change the larger systems in which I’m embedded, though I can do a limited amount to observe how those systems work and how I respond to them (which often entails writing like this and linking to other writers).

Distrust That Particular Flavor — William Gibson

As with most essay collections, the essays in Distrust That Particular Flavor are uneven: a few feel like period pieces that’ve outlived their period, but most maintain their vitality (Gibson admits as much in the introduction). Gibson knows predictions and commentary have expiration dates, and building that awareness into his essays helps them endure. It’s a useful form of admitting a potential weakness and thus nullifying it. In place of dubious predictions, Gibson makes predictions about not being able to predict and about how we should respond:

I found the material of the actual twenty-first century richer, stranger, more multiplex, than any imaginary twenty-first century could ever have been. And it could be unpacked with the toolkit of science fiction. I don’t really see how it can be unpacked otherwise, as so much of it is so utterly akin to science fiction, complete with a workaday level of cognitive dissonance we now take utterly for granted.

I’d like to know what that last sentence means: what’s a “workaday level of cognitive dissonance,” as opposed to a high or low level? How do we take it for granted now, in a way we didn’t before? I’d like clarification, but I have some idea of what he means: that things are going to look very different in a couple of years, in a way that we can’t predict now. His own novels offer an example of this: in Pattern Recognition, published in 2003, Cayce Pollard is part of a loose collaborative of “footage” fetishists, who hunt down a series of mysterious videos and debate what, if anything, they mean (as so many people do on so many Internet forums: the chatter too often means nothing, as I’ve discovered since starting to read about photography). In 2005, YouTube came along as the de facto repository of all non-pornographic things video. The “material of the actual twenty-first century” changed from 2003 to 2012. What remains is the weirdness.

In writing and in ideas, though, Gibson is less weird and easier to follow here than in his recent fiction. There are transitions, titles, and short descriptions in italicized blue at the end of each essay, where the contemporary-ish, 2011 Gibson comments on his earlier work. He gets to grade himself on what he’s gotten right and what he hasn’t. He’s self-aware, about both his faults and his mode of work:

A book exists at the intersection of the author’s subconscious and the reader’s response. An author’s career exists in the same way. A writer worries away at a jumble of thoughts, building them into a device that communicates, but the writer doesn’t know what’s been communicated until it’s possible to see it communicated.

After thirty years, a writer looks back and sees a career of a certain shape, entirely unanticipated.

It’s a mysterious business, the writing of fiction, and I thank you all for making it possible.

Comments like this, on the nature of the book and of writing, are peppered throughout Distrust That Particular Flavor. Technology changes but writing remains, though we again get the idea of fundamental unpredictability (“the writer doesn’t know what’s been communicated”), which is the hallmark of our time and perhaps the hallmark of life since the Industrial Revolution. It’s the kind of life that science fiction prepares us for, even when the science fiction is wrong about the particulars. It still gets the temperament right. Hence science fiction as a toolkit for the present and future—and, to some extent, as a toolkit for the past. One could view the past as a series of social disruptions abetted and enabled by technology that creates winners and losers in the struggle or cooperation for resources, sex, and power:

Much of history has been, often to an unrecognized degree, technologically driven. From the extinction of North America’s mega-fauna to the current geopolitical significance of the Middle East, technology has driven change. [. . .] Very seldom do nations legislate the emergence of new technology.

The Internet, an unprecedented driver of change, was a complete accident, and that seems more often the way of things. The Internet is the result of the unlikely marriage of a DARPA project and the nascent industry of desktop computing. Had nations better understood the potential of the Internet, I suspect they might well have strangled it in its cradle. Emergent technology is, by its very nature, out of control, and leads to unpredictable outcomes.

The first step is recognition, which is part of the work Gibson is doing. Nations also might not “legislate the emergence of new technology,” but they do create conditions more or less favorable to the emergence of technology. Economic historians, general historians, and others have been trying to figure out why the Industrial Revolution emerged from England when it did, as opposed to emerging somewhere or sometime else. I find the Roman example most tantalizing: the Romans appear to have missed the printing press and gunpowder as two major pre-conditions, since the printing press allows the rapid dissemination of ideas and gunpowder, if used correctly, lowers the cost of defense against barbarians.

I find the idea of history being “technologically driven” intriguing: technology has enabled progressively larger agglomerations of humans, whether in what we now call “countries” or “corporations,” to act in concert. The endgame isn’t obvious and probably never will be, unless we manage to destroy ourselves. We can only watch, participate in, or ignore the show. Most people do the last, to the extent they can.

I use a fountain pen and notebook and so identify with this:

Mechanical watches are so brilliantly unnecessary.
Any Swatch or Casio keeps better time, and high-end contemporary Swiss watches are priced like small cars. But mechanical watches partake of what my friend John Clute calls the Tamagotchi Gesture. They’re pointless in a peculiarly needful way; they’re comforting precisely because they require tending.

Much of life, especially cultural life beyond food, shelter, and sex, might be categorized as “brilliantly unnecessary”; it’s awfully hard to delineate where the necessary ends and the superfluous begins—as the Soviet Union discovered. To me, haute couture is stupidly unnecessary, but a lot of fashion designers would call fountain pens the same. Necessity changes. Pleasure varies by person. Being able to keep “better time” isn’t the sole purpose of a watch, which is itself increasingly an affectation, given the ubiquity of computers with embedded clocks (we sometimes call these computers “cell phones”). We want to tend. Maybe we need to. Maybe tending is part of what makes us who we are, part of what makes us different from the people who like hanging out with their friends, watching TV, and shopping. Gibson also mentions that his relationship to TV, or lack thereof, relates to his life as a writer:

I suspect I have spent just about exactly as much time actually writing as the average person my age has spent watching television, and that, as much as anything, may be the real secret here.

Notice that word, “may,” weakening his comment, but not fatally. TV is the mostly invisible vampire of time, and it’s only when people like Gibson, or Clay Shirky, point to it as such that we think about it. Doing almost anything other than watching TV with the time most people spend watching it means you’re going to learn a lot more, provided you’re doing something even marginally active (this is Shirky’s point about the coming “cognitive surplus” enabled by the Internet). Gibson did something different from most people of his generation, which is why we now know who he is, and why his thoughts go deeper. Like this, variations of which I’ve read before but which still resonate:

Conspiracy theories and the occult comfort us because they present models of the world that more easily make sense than the world itself, and, regardless of how dark or threatening, are inherently less frightening.

They’re less frightening because they have intentionality instead of randomness, and randomness is really scary to many people, who prefer to see causality where little or none exists. Instead, we have all these large systems with numerous nodes and inherent unpredictability in the changes and interactions between the nodes; one can see this from a very small to a very large scale.

This is easier to perceive in the abstract, as stated here, than in the concrete, as seen in life. So we get stories, often in “nonfiction” form, about good and evil and malevolent consciousnesses, often wrapped up in political narratives, that don’t really capture reality. The weirdness of reality, to return to a term I used above. Reality is hard to capture, and perhaps that science fiction toolkit gives us a method of doing so better than many others. Certainly better than a lot of the newspaper-story toolkits, or literary-theory toolkits, to name two I’m familiar with (and probably better than religious toolkits, too).

I’m keeping the book; given that I’ve become progressively less inclined to keep books I can’t imagine re-reading, this is a serious endorsement of Distrust That Particular Flavor. I wish Gibson wrote more nonfiction—at least, I wish he did if he could maintain the impressive quality he does here.
