Sex, Drugs, and Cocoa Puffs: A Low Culture Manifesto — Chuck Klosterman

Pop-culture essays age in dog years, though the occasional long-term insight stays fresh by accident. I’m reading Sex, Drugs, and Cocoa Puffs: A Low Culture Manifesto and have mostly noticed age spots, but I’ve also seen a few prescient moments, like this one:

But Junod claims that he [made up details about Michael Stipe of R.E.M. in an article] in order to make people reevaluate how the press covers celebrity, and that’s valid. It’s valid because conventional celebrity journalism is inevitably hounded by two problems: Either the subject is lying, or the writer is guessing. Junod just happened to embrace both of those obstacles simultaneously.

The relationship of the Klosterman essay to, say, John Jeremiah Sullivan’s more recent Real World essay, “Leaving Reality,” is obvious, but I think Klosterman is also forgetting—or doesn’t want to simply say—that people read celebrity profiles in part because they want to be lied to. There is more than a little complicity in the lie, which changes the relation of the liar to the person being lied to. Or perhaps people want to feel false intimacy, which can be achieved partially through lying.

The “subject” of these profiles—like the Michael Stipe one, or others in its genre—is probably trying mostly not to say or do anything that will make him or her look like an asshole when taken out of context. This can be shockingly hard to do, since the subject can’t tell when the writer is “guessing” or what the writer is “guessing.” In this context “guessing” can be another word for “interpretation.” One reason to read the New Yorker, incidentally, is that its writers appear to attempt to be scrupulously fair and to avoid gossip—yet those are the very qualities that can give rise to accusations of being “boring.” One person’s boring is another’s accurate.

Imagine someone followed you around, all the time, for a couple of days or longer, and that the person bears you some ill will, or at least wants to make your life into a story. Could the person get some stuff that would make you look bad? Probably. I know that someone who could observe everything I write and watch everything I do could make me look really bad. So smart celebrities avoid the real press, or only interact with the relatively small, non-jerk parts of the press—like The New Yorker.

Let’s take a specific example of an article about the world behind celebrity journalism: Sarah Miller’s hilarious “Anna Nicole Smith Kind of Made a Pass at Me.” I dramatically read parts of it to some friends the other night. This paragraph stands out in particular:

I wrote a first draft, in which, without spelling everything out, I attempted to give some real sense of that day. “I can’t publish this,” my editor said, and in her defense, I’m sure she was right. I wrote another version that made it sound like I’d had fun, which took hours and hours, because it was not real; writing something that is not real is not impossible, but it is very close to it. Through every long moment I worked on it I cursed myself for not taking that stupid trip to Magic Mountain, which would have made it all so much easier. Anyway, they published that version, and I got my money.

Miller describes what actually happened this way:

“Sarah Miller,” [Anna Nicole Smith] said, “You’ve got the prettiest blue eyes.” If we were in a movie, she’d have added, “I do declare.”

“Thank you,” I said formally.

“You ever had sex with a girl?”

It was none of her business, but I thought being honest might somehow give her back some of the dignity my mind had robbed her of, and I thought she might sense it, and that we might have a real conversation. “Yes, actually, Anna. I have.”

“Well, did you like it?” The word “like” lasted for several seconds.

“I actually did not,” I said. “It was a…misbegotten adventure.” I was pleased at how much I sounded like my father.

But that couldn’t be published, not at the time Miller was trying to get the story. Her editor, however, didn’t want “real.” The number of readers who do is small. How many people watch PBS versus celebutainment shows? How many read The New Yorker versus US Weekly? The truth is hard and amusing fictions are easy, so we choose the latter. In the introduction to Sex, Drugs, and Cocoa Puffs Klosterman writes that “accelerated culture [. . .] doesn’t speed things up as much as it jams everything into the same wall of sound. But that’s not necessarily tragic.” I’m not convinced there is such a thing as “accelerated culture,” but I am convinced that elements of what passes for low or contemporary or whatever culture do emerge from the collective decisions of millions of individuals.

But it is also worth stepping back and looking for larger patterns, which is what Klosterman almost but doesn’t quite do. He is a little too fond of grand pronouncements. Like:

The main problem with mass media is that it makes it impossible to fall in love with any acumen of normalcy. There is no “normal,” because everybody is being twisted by the same forces simultaneously. You can’t compare your relationship with the playful couple who lives next door, because they’re probably modeling themselves after Chandler Bing and Monica Geller. Real people are actively trying to live like fake people, so real people are no less fake. Every comparison becomes impractical. This is why the impractical has become totally acceptable; impracticality almost seems cool.

What is an “acumen of normalcy”? I’m not sure either. I had to check Google for “Chandler Bing” and “Monica Geller.” And has it ever been the case that “real people” have not tried to model themselves on “fake people”? If you read major religious texts as fundamentally mythological, as I do, the answer is “no”: people have been trying to emulate the Christian Bible and the Old Testament for literally thousands of years. Early novels with melodramatic endings encouraged their readers to attempt to reenact those endings. We seek narrative fiction in order to learn how to live—and that isn’t at all new. I don’t think there has ever been as firm a “normal” as we’d like to project onto the past.

Eventually, with paragraphs like the quoted section, one comes to the conclusion that either everything is “fake” or everything is “real”—which is the sort of conclusion high freshmen hit when they’re in their dorm rooms at 2:00 a.m. The next day they still get up for class and go to breakfast. What is one supposed to do differently if one decides that real people are fake?

Perhaps not surprisingly, the next essay in the Klosterman collection concerns the video game “The Sims.” Also not surprisingly, some SF writers have wondered what might happen if we get a wholly immersive and wholly fake world. One possible solution to the Fermi Paradox is that sufficiently advanced civilizations make video games so cool that they’d rather live in constructed worlds than explore the real universe.

That’s an interesting thought experiment, but, like the high freshmen mentioned above, no one does anything differently today based on it. Klosterman tells tales about meaningless arguments. Eventually, however, generative people come to realize that arguments that don’t lead to any sort of change or growth are pointless, and they get on with their lives. One sign of “low culture” may be that winning or losing the argument means nothing, and the participants should go build or make something instead.

“Sisu”: a new favorite word that comes from Finnish and was popularized by war

“A Thousand Lakes of Red Blood on White Snow” brilliantly describes how tiny Finland successfully fought the Soviet Union twice during World War II:

Thus with a thousand lakes of warm red blood on cold white snow did the Finns purchase their escape from assimilation into the Soviet Union, ensuring that when the Iron Curtain was drawn, it ran along the eastern side of Finland rather than the western one.

The word “sisu” captures the mindset necessary to persevere against formidable, unlikely odds, though it is unlikely to have the resonance it needs unless you’ve read the entire article:

Sisu resists exact translation into other languages but loosely translated refers to a stoic toughness consisting of strength of will, determination, and perseverance in the face of adversity and against repeated setbacks; it means stubborn fortitude in the face of insurmountable odds; the ability to keep fighting after most people would have quit, and fighting with the will to win.

Sisu is more than mere physical courage, requiring an inner strength nourished by optimism, tempered by realism, and powered by a great deal of pig-headed obstinacy.

“Grit,” “stoicism,” and “tenacity” express similar concepts in English.

Anyone know a good, general history of Finland? Many people are currently enamored of its schools, but perhaps the same cultural qualities that enabled the country to fight the Winter War also enable it to succeed educationally where others fail.

We are our own enemies: “Arts & Entertainments” edition

In “The Collective Conscience of Reality Television: In a format without a code of conduct, viewers drive the limits of the exploitation and privacy invasions allowed onscreen,” Serena Elavia writes that “What viewers will or won’t watch matters immensely to networks; in fact, they seem to function as the networks’ sole ‘conscience.’” She’s right, and it’s a point too infrequently made: most of the cultural “problems” the commentariat identifies arise because the audience responds to whatever the “problem” might be, whether it’s improbably hot and photoshopped models or reality TV or football or soda.

This is important because words like “society” or “the media” are actually shorthands for “the aggregated preferences of many, perhaps millions, of individuals.” You can’t really blame “society” for much of anything; you can at best blame the many individuals who hold and perpetuate beliefs or practices or whatever. “Conscience” is distributed, and it’s arguably becoming more distributed in the Internet age, when the means of discussions are (literally) at everyone’s fingertips. This blog is a good example of that principle in action.

Elavia’s point is also similar to one made by Brian Moody, the producer in Christopher Beha’s novel Arts & Entertainments. Towards the end of the novel he and Eddie, the everyman nebbish protagonist, discuss the nature of TV and, beneath that, the nature of God, and Moody says:

The audience has only one way of expressing its interest—by watching. They might watch because they love you. They might watch because they hate you. They might watch because they’re sick. Doesn’t matter. Is that good or bad? The question doesn’t make any sense. Good is whatever the audience watches [. . . .] The audience is all there is [. . . .] I care about the audience, and I won’t defy them.

That last line, about how Moody “won’t defy” the audience, is scary because it implies he’ll do anything. Kill a man? If the audience wants it—and some dark corners of the Internet imply there is a market for murder. Moody is unsettling because he’ll do anything to anyone around him if the audience wills it. Most of us would like to imagine our friends, and even strangers, will not under any circumstances murder, torture, or rape us. Moody implies that in the right circumstances he would, or he would allow it to happen, almost as a form of worship.

Right now we don’t live in Moody’s world: as Elavia observes, producers only stop when audiences protest. Which raises a question: What happens if audiences don’t protest? That sort of question underlies books like The Hunger Games. Over time it may become more salient. Fiction and history teach us that we don’t really know what our neighbors and friends and strangers will do in real crises. Many, however, will indulge or release the darkness within.

Owning vs sharing: Don’t get caught in the ugly middle

In a tweet Paul Graham writes: “As buying and selling become easier, owning approaches sharing.” That describes my behavior with many objects, especially electronics: for as long as I’ve been buying Macs and Mac products, I’ve been selling the old versions on Craigslist for a third to half of their initial value. In some sense I’m actually leasing them, but acting as my own leasing agent. Although I’ve owned a car, I actually prefer not to, and Uber is accelerating the ability to rent cars when needed and avoid the hassles of ownership. Housing has of course long been both rented and owned, and like many economists I find the U.S. obsession with owning housing misguided.

But there are other ways too that owning approaches sharing in my life:

  • Old cameras and lenses get sold to fund new ones. Like Macs, they tend to retain a fair amount of value—usually about half for lenses and a third for camera bodies.
  • It’s not uncommon for me to sell books that look promising but don’t live up to expectations, almost always through Amazon (despite Amazon’s encouragement of buyers who scam sellers; for objects worth less than $20 I don’t think the issue is overwhelmingly important).
  • Although I haven’t begun doing this yet, I think that selling bikes may be more economical than moving them. The last bike I moved from Tucson to New York was probably a net loss and should’ve been sold instead of shipped.

There are some items that still aren’t easily sold, like beds and furniture, in part because they’re heavy, in part because they can harbor bed bugs, and in part because they just aren’t that valuable. I don’t have the citation handy, but I’ve read that Ikea might be facilitating mobility by making it cheap and easy to set up new apartments: it’s possible to buy a couch, a chair, some dishes, a bed, and some shelves for under $1,000, in the course of an afternoon (although I’d prefer a Tuft & Needle bed, but that’s an aside).

Among my friends, city-to-city moves often entail dumping most of their stuff and buying it again at the destination, since the moving cost is too high to justify the hassle. That’s less true of me because I have a sit-stand desk and some other pretty expensive gear, but in this respect I’m in the minority. Keeping only a fraction of one’s stuff may also lead to a more satisfying, experience-rich life, at least for some people.

The habit of owning either very expensive, durable stuff or throwaway stuff may also be indicative of the polarization of many domains, in which it makes sense either to buy or be the best, or to buy throwaway stuff and not bother competing. Don’t get caught in the ugly middle. Like “Death before inconvenience,” “Don’t get caught in the ugly middle” is something companies should contemplate.

Owning cars and houses in particular is just insanely expensive. In “The Cheapest Generation,” Derek Thompson and Jordan Weissmann observe that

Smartphones compete against cars for young people’s big-ticket dollars, since the cost of a good phone and data plan can exceed $1,000 a year.

But cars cost close to $10,000 a year, according to AAA—roughly an order of magnitude more than a phone. Even if other transportation expenses (Uber, bikes, subways where available) cost a couple thousand dollars, they’re still significantly cheaper than owning a car. And a phone plus a data plan enables those alternatives. Owning and sharing may be less opposed than they were once believed to be.

Paul Graham and the artist

Paul Graham’s new essay “Before the Startup” is, as always, fascinating, but Graham also says several things that apply to artists:

The way to come up with good startup ideas is to take a step back. Instead of making a conscious effort to think of startup ideas, turn your mind into the type that startup ideas form in without any conscious effort. In fact, so unconsciously that you don’t even realize at first that they’re startup ideas.

The same is true of ideas for novels, which often come from minute observations or moments or studies of character. They often don’t feel like novels at first: they feel like a situation (“What if a guy did this…”) and the full novel comes later. Artists often work at the margins.

He also writes in a footnote:

I did manage to think of a heuristic for detecting whether you have a taste for interesting ideas: whether you find known boring ideas intolerable. Could you endure studying literary theory, or working in middle management at a large company?

This may be why I and perhaps many other grad students find grad school worse as time goes on, and why MFA programs have been growing. Too many critics have ceased focusing on how “to be an expert on your users and the problem you’re solving for them”—or, in this context, “readers” instead of “users”—and instead focus on straightforward careerism, which rarely seems to overlap with what people want to read.

What happened with Deconstruction? And why is there so much bad writing in academia?

“How To Deconstruct Almost Anything” has been making the online rounds for 20 years for a good reason: it’s an effective satire of writing in the humanities and some of the dumber currents of contemporary thought in academia.* It also usually raises an obvious question: How did “Deconstruction,” or its siblings “Poststructuralism” or “Postmodernism,” get started in the first place?

My take is a “meta” idea about institutions rather than a direct comment on the merits of deconstruction as a method or philosophy. The rise of deconstruction has more to do with the needs of academia as an institution than the quality of deconstruction as a tool, method, or philosophy. To understand why, however, one has to go far back in time.

Since at least the 18th Century, writers of various sorts have been systematically (key word: before the Enlightenment and Industrial Revolution, investigations were rarely systematic by modern standards) asking fundamental questions about what words mean and how they mean them, along with what works made of words mean and how they mean them. Though critical ideas go back to Plato and Aristotle, Dr. Johnson is a decent place to start. We eventually began calling such people “critics.” In the 19th Century this habit gets a big boost from the Romantics and then writers like Matthew Arnold.

Many of the debates about what things mean and why have inherent tensions, like: “Should you consider the author’s time period or point in history when evaluating a work?” or “Can art be inherently aesthetic, or must it be political?” Other such tensions can be formulated. Different answers predominate in different periods.

In the 20th Century, critics start getting caught up in academia (I. A. Richards is one example); before that, most of them were what we’d now call freelancers who wrote for their own fancy or for general, educated audiences. The shift happens for many reasons, and one is the invention of “research” universities; this may seem incidental to questions about Deconstruction, but it isn’t, because Deconstruction wouldn’t exist, or wouldn’t exist in the way it does, without academia. Anyway, research universities get started in Germany, then spread to the U.S. through Johns Hopkins, which was founded in 1876. Professors of English start getting appointed. In research universities, professors need to produce “original research” to qualify for hiring, tenure, and promotion. This makes a lot of sense in the sciences, which have a very clear discover-and-build model in which new work is right and old work is wrong. It doesn’t work quite as well in the humanities, and especially in fields like English.

English professors initially study words and where they come from—these days we’d primarily call them philologists—and there is also a large contingent of professors of Greek or Latin who teach some English on the side. Over time English professors move from being primarily philological in nature towards being critics. The first people to really ratchet up the research-on-original-works game were the New Critics, starting in the 1930s. They are young whippersnappers who can ignore their elders in part because getting a job as a professor is a relatively easy, relatively genteel endeavor.

New Critics predominate until the 1950s, when Structuralists seize the high ground (think of someone like Northrop Frye) and begin asking what sorts of universal questions literature might ask, or what universal qualities it might possess. After 1945, too, universities expand like crazy due to the G.I. Bill, and then baby boomers go to college. Pretty much anyone who can get a PhD can get a tenure-track job teaching English. That lets waves of people with new ideas who want to overthrow the ideas of their elders into academia. In the 1970s, Deconstructionists (otherwise known as Post-structuralists) show up. They’re the French theorists who are routinely mocked outside of academia for obvious reasons:

The move from a structuralist account in which capital is understood to structure social relations in relatively homologous ways to a view of hegemony in which power relations are subject to repetition, convergence, and rearticulation brought the question of temporality into the thinking of structure, and marked a shift from a form of Althusserian theory that takes structural totalities as theoretical objects to one in which the insights into the contingent possibility of structure inaugurate a renewed conception of hegemony as bound up with the contingent sites and strategies of the rearticulation of power.

That’s Judith Butler, quoted in Steven Pinker’s witty, readable The Sense of Style, in which he explains why this passage is terrible and how to avoid inflicting passages like it onto others. Inside of academia, she’s considered beyond criticism.

In each generational change of method and ideology, from philology to New Criticism to Structuralism to Poststructuralism, newly-minted professors needed to get PhDs, get hired by departments (often though not always in English), and get tenure by producing “original research.” One way to produce original research is to denounce the methods and ideas of your predecessors as horse shit and then set up a new set of methods and ideas, which can also be less charitably called “assumptions.”

But a funny thing happens to the critical-industrial complex in universities starting around 1975: the baby boomers finish college. The absolute number of students stops growing and even shrinks for a number of years. Colleges have all these tenured professors who can’t be gotten rid of, because tenure prevents them from being fired. So colleges stop hiring (see Menand’s The Marketplace of Ideas for a good account of this dynamic).

Colleges never really hired en masse again.

Other factors also reduced or discouraged the hiring of professors by colleges. In the 1980s and 1990s court decisions struck down mandatory retirement. Instead of getting a gold watch (or whatever academics gave), professors could continue being full profs well into their 70s or even 80s. Life expectancies lengthened throughout the 20th Century, and by now a professor who gets tenure at, say, 35 could still be teaching at 85. In college I had a couple of professors who should have been forcibly retired at least a decade before I encountered them, but forcing retirement is no longer possible.

Consequently, the personnel churn that used to produce new dominant ideologies in academia stops around the 1970s. The relatively few new faculty slots from 1975 to the present go to people who already believed in Deconstructionist ideals, though those ideals tend to go by the term “Literary Theory,” or just “Theory,” by the 1980s. When hundreds of plausible applications arrive for each faculty position, it’s very easy to select for comfortable ideological conformity. As noted above, the humanities don’t even have the backstop of experiment and reality on which radicals can base major changes. People who are gadflies like me can get blogs, but blogs don’t pay the bills and still don’t have much pull inside the academic edifice itself. Critics might also write academic novels, but those don’t seem to have had much of an impact on those inside. Perhaps the most salient example of institutional change is the rise of the MFA program for both undergrads and grad students, since those who teach in MFA programs tend to believe that it is possible to write well and that it is possible and even desirable to write for people who aren’t themselves academics.

Let’s return to Deconstruction as a concept. It has some interesting ideas, like this one: “he asks us to question not whether something is an X or a Y, but rather to get ‘meta’ and start examining what makes it possible for us to go through life assigning things to ontological categories (X or Y) in the first place,” and others, like those pointing out that a work of art can mean two opposing things simultaneously, and that there often isn’t a single best reading of a particular work.

The problem, however, is that Deconstruction’s sillier adherents—who are all over universities—take a misreading of Saussure to argue that Deconstruction means that nothing means anything, except that everything means that men, white people, and Western imperialists oppress women, non-white people, and everyone else, and hell, as long as we’re at it capitalism is evil. History also means nothing because nothing means anything, or everything means nothing, or nothing means everything. But dressed up in sufficiently confusing language—see the Butler passage from earlier in this essay—no one can tell what if anything is really being argued.

There has been some blowback against this (Paglia, Falck, Windschuttle), but the sillier parts of Deconstructionist / Post-structuralist nonsense won, and the institutional forces operating within academia mean that that victory has been depressingly permanent. Those forces show no signs of abating. Almost no one in academia asks, “Is the work I’m doing actually important, for any reasonable value of ‘important’?” The ones who ask tend to find something else to do. As my roommate from my first year of grad school observed when she quit after her M.A., “It’s all a bunch of bullshit.”

The people who would normally produce intellectual churn have mostly been shut out of the job market, or have moved to the healthier world of ideas online or in journalism, or have been marginalized (Paglia). Few people welcome genuine attacks on their ideas, and few of us are as open-minded as we’d like to believe; academics like to think they’re open-minded, but my experience with peer review thus far indicates otherwise. So real critics tend to follow the model Albert Hirschman describes in Exit, Voice, and Loyalty, and exit.

The smarter ones who still want to write go for MFAs, where the goal is to produce art that someone else might actually want to read. The MFA option has grown for many reasons, but one is as an alternative for literary-minded people who want to produce writing that might matter to someone other than other English PhDs.

Few important thinkers have emerged from the humanities in the last 25 or so years. Many have emerged from the sciences, as should be apparent from the Edge.org writers. As John Brockman, the Edge.org founder, says:

The third culture consists of those scientists and other thinkers in the empirical world who, through their work and expository writing, are taking the place of the traditional intellectual in rendering visible the deeper meanings of our lives, redefining who and what we are.

One would think that “the traditional intellectual” would wake up and do something about this. There have been some signs of it happening—like Franco Moretti or Jonathan Gottschall—but so far those green shoots have been easy to miss and far from the mainstream. “Theory,” and the bad writing associated with it, remains king.

Works not cited but from which this reply draws:

Menand, Louis. The Marketplace of Ideas: Reform and Resistance in the American University. New York: W.W. Norton, 2010.

Paglia, Camille. “Junk Bonds and Corporate Raiders: Academe in the Hour of the Wolf.” Arion Third Series 1.2 (Spring 1991): 139–212.

Paglia, Camille. Sex, Art, and American Culture: Essays. 1st ed. New York: Vintage, 1992.

Falck, Colin. Myth, Truth and Literature: Towards a True Post-modernism. 2nd ed. New York: Cambridge University Press, 1994.

Windschuttle, Keith. The Killing of History: How Literary Critics and Social Theorists Are Murdering Our Past. 1st Free Press ed. New York: Free Press, 1997.

Star, Alexander, ed. Quick Studies: The Best of Lingua Franca. 1st ed. New York: Farrar, Straus and Giroux, 2002.

Cusset, François. French Theory: How Foucault, Derrida, Deleuze, & Co. Transformed the Intellectual Life of the United States. Trans. Jeff Fort. Minneapolis: University of Minnesota Press, 2008.

Pinker, Steven. The Sense of Style: The Thinking Person’s Guide to Writing in the 21st Century. New York: Viking Adult, 2014.


* Here is one recent discussion, from which the original version of this essay was drawn. “How To Deconstruct Almost Anything” remains popular for the same reason academic novels remain popular: it is often easier to criticize through humor and satire than direct attack.

Finally! Someone else notices that the best instructors aren’t necessarily the most credentialed

Finally! Someone else notices that a lot of academic practices don’t make any sense: “Pictures from an Institution: Leon Botstein made Bard College what it is, but can he insure that it outlasts him?” makes me like Bard; this in particular stands out: “In the thirty-nine years that Botstein has been president of Bard, the college has served as a kind of petri dish for his many pedagogical hypotheses [. . . including] that public intellectuals are often better teachers than newly minted Ph.D.s are.” Why isn’t anyone else following the Bard model?

The question is partially rhetorical. College presidents and trustees are probably systematically selected for conformity, but I’ve gotta think there are other people out there who are going, “Aping the Ivy League model is not going to work for us. What can we do differently?” The current order of things, driven by bogus ranking systems, discourages this sort of thinking. Colleges love the rhetoric of being different, but very few follow that rhetoric to actually being different. Perhaps rising costs will eventually force them to differentiate or die. Then again, the article says that Bard may be on its way to death or drastic restructuring because of financial problems. Still, I don’t see overspending as being fundamentally and intrinsically linked with the other issues. Instead, it seems that being a maverick in one field may simply translate to being a maverick in many, including places one doesn’t want mavericks (like finances).

A few weeks ago I wrote about donating to Clark, my alma mater. Although I still think Clark a good school, I’d love to see it move in a more Bard-ish direction. The current president and trustees, however, appear to have come up through the system and do not seem like shake-it-up types, regardless of their rhetoric.
