Distrust That Particular Flavor — William Gibson

As with most essay collections, the ones in Distrust That Particular Flavor are uneven: a few feel like period pieces that’ve outlived their period, but most maintain their vitality (Gibson admits as much in the introduction). Gibson knows about the expiration date of predictions and commentary, and building this awareness into his essays makes them endure better. It’s a useful form of admitting a potential weakness and thus nullifying it. In place of dubious predictions, Gibson makes predictions about not being able to predict, and about how we should respond:

I found the material of the actual twenty-first century richer, stranger, more multiplex, than any imaginary twenty-first century could ever have been. And it could be unpacked with the toolkit of science fiction. I don’t really see how it can be unpacked otherwise, as so much of it is so utterly akin to science fiction, complete with a workaday level of cognitive dissonance we now take utterly for granted.

I’d like to know what that last sentence means: what’s a “workaday level of cognitive dissonance,” as opposed to a high or low level? How do we take it for granted now, in a way we didn’t before? I’d like clarification, but I have some idea of what he means: that things are going to look very different in a couple years, in a way that we can’t predict now. His own novels offer an example of this: in Pattern Recognition, published in 2003, Cayce Pollard is part of a loose collaborative of “footage” fetishists, who hunt down a series of mysterious videos and debate what, if anything, they mean (as so many people do on so many Internet forums: the chatter too often means nothing, as I’ve discovered since starting to read about photography). By 2005, YouTube comes along as the de facto repository of all non-pornographic things video. The “material of the actual twenty-first century” changes from 2003 to 2012. What remains is the weirdness.

In writing and in ideas, though, Gibson is less weird and easier to follow here than in his recent fiction. There are transitions, titles, short descriptions in italicized blue at the end of each essay, where the contemporary-ish, 2011 Gibson comments on his earlier work. He gets to grade himself on what he’s gotten right and what he hasn’t. He’s self-aware, about both his faults and his mode of work:

A book exists at the intersection of the author’s subconscious and the reader’s response. An author’s career exists in the same way. A writer worries away at a jumble of thoughts, building them into a device that communicates, but the writer doesn’t know what’s been communicated until it’s possible to see it communicated.

After thirty years, a writer looks back and sees a career of a certain shape, entirely unanticipated.

It’s a mysterious business, the writing of fiction, and I thank you all for making it possible.

Comments like this, on the nature of the book and of writing, are peppered in Distrust That Particular Flavor. Technology changes but writing remains, though we again get the idea of fundamental unpredictability (“the writer doesn’t know what’s been communicated”), which is the hallmark of our time and perhaps the hallmark of life since the Industrial Revolution. It’s the kind of life that science fiction prepares us for, even when the science fiction is wrong about the particulars. It still gets the temperament right. Hence science fiction as a toolkit for the present and future—and, to some extent, as a toolkit for the past. One could view the past as a series of social disruptions abetted and enabled by technology that creates winners and losers in the struggle or cooperation for resources, sex, power:

Much of history has been, often to an unrecognized degree, technologically driven. From the extinction of North America’s mega-fauna to the current geopolitical significance of the Middle East, technology has driven change. [. . .] Very seldom do nations legislate the emergence of new technology.

The Internet, an unprecedented driver of change, was a complete accident, and that seems more often the way of things. The Internet is the result of the unlikely marriage of a DARPA project and the nascent industry of desktop computing. Had nations better understood the potential of the Internet, I suspect they might well have strangled it in its cradle. Emergent technology is, by its very nature, out of control, and leads to unpredictable outcomes.

The first step is recognition, which is part of the work Gibson is doing. Nations also might not “legislate the emergence of new technology,” but they do create more or less favorable conditions for the emergence of technology. Economic historians, general historians, and others have been trying to figure out why the Industrial Revolution emerged from England when it did, as opposed to emerging somewhere else or sometime else. I find the Roman example most tantalizing: they appear to have missed the printing press and gunpowder as two major pre-conditions, since the printing press allows the rapid dissemination of ideas and gunpowder, if used correctly, lowers the cost of defense against barbarians.

I find the idea of history being “technologically driven” intriguing: technology has enabled progressively large agglomerations of humans, whether in what we now call “countries” or “corporations,” to act in concert. The endgame isn’t obvious and probably never will be, unless we manage to destroy ourselves. We can only watch, participate in, or ignore the show. Most people do the latter, to the extent they can.

I use a fountain pen and notebook and so identify with this:

Mechanical watches are so brilliantly unnecessary.
Any Swatch or Casio keeps better time, and high-end contemporary Swiss watches are priced like small cars. But mechanical watches partake of what my friend John Clute calls the Tamagotchi Gesture. They’re pointless in a peculiarly needful way; they’re comforting precisely because they require tending.

Much of life, especially cultural life, beyond food, shelter, and sex might be categorized as “brilliantly unnecessary”; it’s awfully hard to delineate where the necessary ends and the superfluous begins—as the Soviet Union discovered. To me, haute couture is stupidly unnecessary, but a lot of fashion designers would call fountain pens the same. Necessity changes. Pleasure varies by person. Keeping “better time” isn’t the sole purpose of a watch, which is itself increasingly an affectation, given the ubiquity of computers with embedded clocks (we sometimes call these computers “cell phones”). We want to tend. Maybe we need to. Maybe tending is part of what makes us who we are, part of what makes us different from the people who like hanging out with their friends, watching TV, and shopping. Gibson mentions that his relationship, or lack thereof, to TV also relates to his life as a writer:

I suspect I have spent just about exactly as much time actually writing as the average person my age has spent watching television, and that, as much as anything, may be the real secret here.

Notice that word, “may,” weakening his comment, but not fatally. TV is the mostly invisible vampire of time, and it’s only when people like Gibson, or Clay Shirky, point to it as such that we think about it. Spend the time most people devote to TV on almost anything even marginally active and you’re going to learn a lot more (this is Shirky’s point about the coming “cognitive surplus” enabled by the Internet). Gibson did something different from most people of his generation, which is why we now know who he is, and why his thoughts go deeper. Like this, variations of which I’ve read before but that still resonate:

Conspiracy theories and the occult comfort us because they present models of the world that more easily make sense than the world itself, and, regardless of how dark or threatening, are inherently less frightening.

They’re less frightening because they have intentionality instead of randomness, and randomness is really scary to many people, who prefer to see causality where none or little exists. Instead, we have all these large systems with numerous nodes and inherent unpredictability in the changes and interactions between the nodes; one can see this from a very small to a very large scale.

This is easier to perceive in the abstract, as stated here, than in the concrete, as seen in life. So we get stories, often in “nonfiction” form, about good and evil and malevolent consciousnesses, often wrapped up in political narratives, that don’t really capture reality. The weirdness of reality, to return to a term I used above. Reality is hard to capture, and perhaps that science fiction toolkit gives us a method of doing so better than many others. Certainly better than a lot of the newspaper story toolkits, or literary theory toolkits, to name two I’m familiar with (and probably better than religious toolkits, too).

I’m keeping the book; given that I’ve become progressively less inclined to keep books I can’t imagine re-reading, this is a serious endorsement of Distrust That Particular Flavor. I wish Gibson wrote more nonfiction—at least, I wish he did if he could maintain the impressive quality he does here.

Adapt — Tim Harford

Adapt is deep—much deeper than most pop economics books, and deeper than Harford’s last book, The Logic of Life. I can’t define precisely how—“deeper” is not the sort of thing that lets me compare quotes from one section against another. But there’s a sense of inevitability about this book.

Harford describes how Thomas Thwaites, “a post-graduate design student at the Royal College of Art in London,” attempted to make a toaster from scratch. He failed, and not subtly, either. This leads to Harford’s larger observation: “The modern world is mind-bogglingly complicated. Far simpler objects than a toaster involve global supply chains and the coordinated efforts of many individuals, scattered across the world. Many do not even know the final destination of their efforts.” It’s easy to find this alienating, especially if you’re a random paper pusher who manages information and never sees anything tangible that you’ve created. Hence the derogatory term—”paper pusher”—that presumably sets up some kind of binary, with the paper pusher contrasted to, say, a lumberjack, or something. I don’t even know what, other than that the phrase is common.

Yet we exist as paper-pushers and bureaucratic cogs because people will pay for cogs and because if we didn’t, we also wouldn’t have the modern economy. We don’t think about this much, however; as Harford says, “The complexity we have created for ourselves envelops us so completely that, instead of being dizzied, we take it for granted.” Maybe we need to. But we also need an unusual set of skills in such a vast landscape: ones that will let us try new ideas, let them fail or succeed, and then try something else. That’s a top-level view of Harford’s point.

In a blog post, Harford writes:

[. . .] the message of Adapt isn’t really “practice makes perfect,” or even “learn from your mistakes,” at least not as a straightforward self-help cliché. It’s about building systems – whether markets, businesses, governments or armies – that solve complex problems. And it turns out that complex problem-solving usually means experimenting, quickly discovering what works and what doesn’t, and somehow letting what’s working replace what isn’t.

Unfortunately, we often don’t realize how complex problems should be solved, and individual egos often get in the way of solving them. That was the basic issue with Rumsfeld as Defense Secretary: he didn’t accept the need to improvise, which appears to be getting more important over time, not less. This also sounds similar to the subject of Nassim Taleb’s next book, Antifragility: How to Live in a World We Don’t Understand.

Or consider this passage from Adapt, which should make us humbler about the large political problems we face and how we can solve them:

We badly need to believe in the potency of leaders. Our instinctive response, when faced with a complicated challenge, is to look for a leader who will solve it. It wasn’t just Obama: every president is elected after promising to change the way politics works; and almost every president then slumps in the polls as reality starts to bite. This isn’t because we keep electing the wrong leaders. It is because we have an inflated sense of what leadership can achieve in the modern world.

Perhaps we have this instinct because we evolved to operate in small hunter-gatherer groups, solving small hunter-gatherer problems. The societies in which our modern brains developed weren’t modern: they contained a few hundred separate products, rather than ten billion. The challenges such societies faced, however formidable, were simple enough to have been solved by an intelligent, wise, brave leader. They would have been vastly simpler than the challenges facing a newly elected US president.

Notice the key word in the first sentence: “need.” Is it really a need we have to believe in our leaders? At first I wanted to say no, but thinking about all the symbolic capital we invest in our leaders (and actors, and others, especially if those “others” are credentialed) makes me think otherwise. Those needs should make us somewhat uncomfortable, since leaders might not be able to fix as much as they might imagine. This is also an aspect of the “New Jesus” complex, which James Fallows described in the context of David Petraeus becoming the commander of American troops in Iraq. As Fallows says:

Everyone who has ever worked in an office will recognize the idea. The New Jesus is the guy the boss has just brought in to solve the problems that the slackers and idiots already on the staff cannot handle. Of course sooner or later the New Jesus himself turns into a slacker or idiot, and the search for the next Jesus begins.*

We want some Messianic figure to sweep away all our problems. In the real world, that just doesn’t happen, or it very seldom happens. Petraeus was certainly important, but he was also implementing ideas that had percolated around the military for some time—as Harford discusses in his chapter on “Conflict or: How Organisations Learn.” The military is an obvious environment for exploring adaptation, since the consequences of failing to adapt are severe: people die. Blockbuster went under because it couldn’t or wouldn’t compete with Netflix (see here for more), but the consequences mostly happened in terms of shares lost. On the battlefield or in the emergency room, it happens in terms of lives lost. We want a leader to somehow “clean house” or “cut through red tape” to solve problems, but that often doesn’t happen, especially outside secular hagiography. Instead, we need to learn as individuals and organizations how to adapt to circumstances and how to make circumstances adapt to us. Few would disagree with this banal assertion. Many would disagree in a particular circumstance that requires adaptation.

The word “potency” hearkens to the Middle Ages, when the fecundity of the King was linked to the fecundity of the realm, as so many fairy tales hold. Yet we’re still using the same kinds of words to describe leaders today, even when leaders get in trouble for being overly, uh, potent (see, for example, Bill Clinton, or whoever is involved in the scandal du jour).

There’s a recurrent thread of very old ideas and needs running up against modern complexity in this book, although Harford doesn’t discuss such issues directly. But they’re present, if you’re watching for words like “potency” used to describe leaders, or words like “instinct” that contrast with the cool, cerebral mastery we’d like to associate with modern technical accomplishment. Underlying contemporary achievements sit older ideas. When we deny those ideas, we get into trouble. Harford is trying to get us back out.


* For the origins of the New Jesus complex, see this post, also from Fallows.

Caitlin Flanagan and narrative fallacies in Girl Land

In “The King of Human Error,” Michael Lewis describes Daniel Kahneman’s brilliant work, which I’ve learned about slowly over the last few years as I’ve seen him cited more and more; only recently have I come to understand just how pervasive and deserved his influence is. Kahneman’s latest book, Thinking, Fast and Slow, is the kind of brilliant summa that makes even writing a review difficult, because it’s so good and contains so much material all in one place. In his essay, Lewis says that “The human mind is so wedded to stereotypes and so distracted by vivid descriptions that it will seize upon them, even when they defy logic, rather than upon truly relevant facts. Kahneman and Tversky called this logical error the ‘conjunction fallacy.’”

Caitlin Flanagan’s Girl Land is superficially interesting but can be accurately summarized as simply the conjunction fallacy in book form.

So we need to be doubly dubious of narrative and of narrative fallacies; when we hear things embedded in stories, we ought to be thinking about how those things might not be true, how we’re affected by anecdotes, and how our reasoning holds up under statistical and other kinds of analysis. I like stories, and almost all of us like stories, but too many of us appear unwilling to acknowledge that the stories we tell may be inaccurate or misleading. Think of Tyler Cowen’s TED talk on this subject too.

In the Lewis article, Kahneman also says: “People say your childhood has a big influence on who you become [. . .] I’m not at all sure that’s true.” I’m not sure either. Flanagan and Freud think so; Bryan Caplan is more skeptical. I am leaning steadily more towards the Caplan / Kahneman uncertain worldview. I wish Flanagan would move in that direction too. She starts Girl Land by saying, “Every woman I’ve known describes her adolescence as the most psychologically intense period of her life.” Which is pretty damn depressing: most people spend their adolescence under their parents’ yoke, stuck in frequently pointless high school classes, and finishing it without accomplishing anything of note. That this state could be “the most psychologically intense” of not just a single person’s life, but of every woman’s life, is to demean the accomplishments and real achievements of adult women. It might be that having a schlong disqualifies me from entering this discussion, but see too the links at the end of this post—which go to female critics equally unimpressed with Girl Land.

I’m not even convinced Flanagan has a strong grasp of what women are really like—maybe “girl land” looks different on the inside, because from the outside, as a teenager, I saw very little of the subtlety and sensitivity and weakness Flanagan suggests girls have. Perhaps it’s there, but if so, it’s well-hidden; to me a lot of the book reads like female solipsism and navel-gazing, very disconnected from how women and teenage girls actually behave. Flanagan decries “the sexually explicit music, the endless hard-core and even fetish pornography available twenty-four hours a day on the Internet [. . .]” while ignoring that most girls and women appear to like sexually explicit music; if they didn’t, they’d listen to something else and shun guys who like such music. But they don’t.

Since Flanagan’s chief method of research is anecdote, let me do the same: I’ve known plenty of women who like fetish pornography. She also says puzzling stuff like, “For generations, a girl alone in her room was understood to be doing important work.” What? Understood by whom? And what constitutes “important work” here? In Flanagan’s view, it isn’t developing a detailed knowledge of microbiology in the hopes of furthering human understanding; it’s writing a diary.

There are other howlers: Flanagan says that “they [girls] are forced—perhaps more now than at any other time—to experience sexuality on boys’ terms.” This ignores the power of the female “no”—in our society women are the ones who decide to say yes or no to sex. She misses how many girls and women are drawn to bad-boy alpha males; any time they want “to experience sexuality on [girls’] terms,” whatever that might mean, they’re welcome to. Flanagan doesn’t have a sense of agency or how individuals create society. She says that “the mass media in which so many girls are immersed today does not mean them well; it is driven by a set of priorities largely created by men and largely devoted to the exploitation of girls and young women.” But this only works if girls choose to participate in the forms of mass media Flanagan is describing. That they do, especially in an age of infinite cultural possibilities, indicates that girls like whatever this “mass media” is that “does not mean them well.”

I’m not the only one to have noticed this stuff. See also “What Caitlin Flanagan’s new book Girl Land gets wrong about girls.” And “Facts and the real world hardly exist in Caitlin Flanagan’s ‘Girl Land,’ where gauzy, phony nostalgia reigns”: “Flanagan works as a critic, was once a teacher and counselor at an elite private school, and is the mother of two boys, but somehow nothing has matched the intensity of that girlhood; it forms the only authentically compelling material here.” Which is pretty damn depressing: to have the most intense moments of one’s life happen at, say, 15.

Really late January 2012 links: Innovation, undergrads, TSA, Updike, the evils of JSTOR, and more

* This is our national identity crisis in a nutshell: Do we want government spending half its money on redistribution and military, or re-dedicating itself to science, infrastructure, and health research?

* Do STEM Faculties Want Undergraduates To Study STEM Fields?

* “This might seem a small thing — hey, so what if these foreign jet-setters endure some hassle? — but I think it is emblematic of some cumulatively larger issues. Americans are habituated to griping about our airports and airlines, but I sense that people haven’t internalized how comparatively backward and unpleasant this part of our “modern” infrastructure has become.”

* “The scale and the brutality of our prisons are the moral scandal of American life.”

* Locked in the Ivory Tower: Why JSTOR Imprisons Academic Research.

* Rabbit at Rest: The bizarre and misguided critical assault on John Updike’s reputation. I suspect there are a couple of things going on:

1) His fiction isn’t easily categorizable, so you can’t lump him in and say he’s part of group X: hysterical realism, postmodernism, whatever.

2) Many of his novels don’t have much plot, so non-academic readers aren’t likely to love him as much as academic readers do.

3) When he began writing, explicit sex was rare, or relatively rare, in fiction; now that it’s common, some of the tension in his earlier books is absent for contemporary readers.

4) You can read Updike and figure out who’s speaking and where a scene is occurring, which isn’t fashionable in some literary circles and hasn’t been for a long time.

5) I suspect most average readers would prefer Robertson Davies to Updike, yet Davies is barely known in the United States or anywhere outside Canada; I think over time Updike will share his fate.

* On programmers:

Formal logical proofs, and therefore programs—formal logical proofs that particular computations are possible, expressed in a formal system called a programming language—are utterly meaningless. To write a computer program you have to come to terms with this, to accept that whatever you might want the program to mean, the machine will blindly follow its meaningless rules and come to some meaningless conclusion. In the test the consistent group showed a pre-acceptance of this fact: they are capable of seeing mathematical calculation problems in terms of rules, and can follow those rules wheresoever they may lead. The inconsistent group, on the other hand, looks for meaning where it is not. The blank group knows that it is looking at meaninglessness, and refuses to deal with it.

The “inconsistent group” sounds like many of the humanities grad students and profs I know.

* “In the high-rise offices of the big publishers, with their crowded bookshelves and resplendent views, the reaction to Amazon’s move is analogous to the screech of a small woodland creature being pursued by a jungle predator.”

* The Business Rusch: Readers:

When I started, it wasn’t possible to make a living as a self-published writer. It is now. In fact, weirdly, you can make more money as a self-published writer than you ever could as a midlist writer—and in some cases, more than you could make as a bestselling writer.

Honestly, I find that astounding. This change has happened in just the past few years. A number of readers of this blog have commented on how fun it’s been to watch my attitudes change toward self- and indie-publishing. I’m still educating myself on all of this, and I’m still astonished by some things that I learn.

This might be me, shortly.

* “Students aspiring to technical majors (science/mathematics/engineering) were more likely than other students to report a sibling with an autism spectrum disorder (p = 0.037). Conversely, students interested in the humanities were more likely to report a family member with major depressive disorder (p = 8.8×10−4), bipolar disorder (p = 0.027), or substance abuse problems (p = 1.9×10−6).”

(Hat tip Marginal Revolution.)

* A Company Built on a Crisper Gin and Tonic: The quest for a better G&T led Jordan Silbert to start beverage company Q Tonic.

* “If I were a zombie, I’d never eat your brain / I’d just want your heart.”

The meanest thing I've ever said

Someone asked, and I thought about it for a while: what makes a comment really mean? Context counts: strangers can say cruel stuff that should roll off, because you can’t take everything said by a random asshole seriously—especially on the Internet. Accuracy should count too: people who say mean but obviously false things can be laughed off, so mean things probably need enough truth to sting; they could be untrue but the sort of thing you’re worried about being true—especially coming from people who know you well. Power dynamics might count too: a nasty comment from a boss or advisor might count for more than one from a peer.

With those parameters in mind: when I was an undergrad, I was hanging out at a party, and a girl who was, uh, not conventionally attractive began doing a mock strip-tease (I think / hope it was “mock,” anyway). One or two guys offered her dollar bills, and then she came over to me, and I said, extremely loudly, that I’d only pay her to keep her clothes on. The other guys laughed, but she looked like I’d just murdered her puppy.

I was mostly being funny. But women are used to being pursued and to having sexual power over men; when they don’t, and when they have their lack of sexual power pointed out directly, they become extremely upset, in a way that I suspect most guys are used to (this is part of Norah Vincent’s point in the fourth chapter of Self-Made Man). This was around the same time I realized that being inured to a woman’s attractiveness yields the paradoxical-seeming result of being more successful with women. And I was realizing how many women are susceptible to status plays in sexual marketplace value, especially if they’re worried that theirs is low. An astonishingly large number are. The mean thing is using this kind of status play on someone who isn’t conventionally attractive.
