Skin in the Game – Nassim Taleb

Skin in the Game is congruent with Tom Ricks’ book The Generals. Almost all generals and high-ranking officers in the U.S. military are now exempt from real risk, as Ricks argues—they are exempt even from the risk of being fired or reassigned for simple incompetence (or being ill-suited to a role). Almost all enlisted men and junior officers, however, are heavily exposed to real risk, like being killed. That risk asymmetry should give pause to anyone contemplating joining. The risk profile for generals prior to the Korean War, while not as great as the risk profile for regular soldiers, was more reasonable than it is today. Military contractors are arguably the greatest beneficiaries of the military today. If more people knew (and acted like they knew) this, we might see changes.

In Skin in the Game Taleb has many, many unusual examples, many of them good; he reads more like an old-fashioned philosopher (that is: one who wants to be read, heard, and understood, as opposed to one who wants tenure), and I mean that as a compliment. One of his rules is, “No person in a transaction should have certainty about the outcomes while the other one has uncertainty.” I wonder how this rule could be applied to colleges, especially under a student-loan system, in which the college is certain to be paid by the student, the student’s family, or the student’s bank (which is really to say, the bank’s student), while the student may see a variable return on investment—especially if the student is ill-equipped in the first place. Colleges may be selling credentials more than skills. But almost no one thinks about those things in advance.

Skin in the Game will, like Antifragile, frustrate you if you demand that every single sentence be true and useful. Some of Taleb’s micro-examples are bad, like his thing against GMOs:

In my war with the Monsanto machine, the advocates of genetically modified organisms (transgenics) kept countering me with benefit analyses (which were often bogus and doctored up), not tail risk analyses for repeated exposures

This view is incoherent because virtually every food eaten today has been “genetically modified,” inefficiently, through selective breeding. If you wish to learn just how hard this is, see The Wizard and the Prophet by Charles Mann. Transgenics speed the process. See this sad tale, and the links, for one researcher in the field who is giving up due to widespread opposition. He points out that, over and over again, transgenics have been shown to be safe.

Taleb is right that there are tail risks to transgenics… but that’s also theoretically true of traditional cross-breeding, and it’s also true of not engaging in transgenics. The alternative to high-efficiency transgenics is environmental degradation and, in many places, starvation. That’s pretty bad, and there’s a serious, usually unstated, environmental trade-off between signaling environmental concern by opposing transgenics and actually protecting the environment (the same is true of nuclear energy).

Despite incorrect micro-examples, Skin in the Game is great and you should read it. It is less uneven than Antifragile. It’s also an excellent book to re-read (don’t expect to get everything the first time through) because Taleb gives so many examples and is overflowing with ideas.

Like: “If your private life conflicts with your intellectual opinion, it cancels your intellectual ideas, not your private life.” Something easily and frequently forgotten, or never considered in the first place. Look at what people do, not what they say. One of the many charming parts of Alain de Botton’s The Consolations of Philosophy is the apparently wide gap between what many philosophers wrote and how they appeared to live. Maybe the truest philosophers don’t write but do.

Or consider:

the highest form of virtue is unpopular. This does not mean that virtue is inherently unpopular, or correlates with unpopularity, only that unpopular acts signal some risk taking and genuine behavior.

A very Peter Thiel point: he asks what popular view is wrong and what unpopular views a given person holds.

Or consider:

The only definition of rationality that I’ve found that is practically, empirically, and mathematically rigorous is the following: what is rational is that which allows for survival.

This may be true, but most of us in the West now survive unless we do something truly stupid, dangerous, or brave. So our wealth and comfort may enable us to be irrational, because we’re much less likely to pay the ultimate penalty than we once were. Darwin Awards aside, we mostly make it. We can afford to worry more about terrorism than the much more immediate and likely specter of death by car, which kills far more people every year in the United States than terrorism does.

To his credit, though, Taleb does write:

The Chernoff bound can be explained as follows. The probability that the number of people who drown in their bathtubs in the United States doubles next year [. . .] is one per several trillions lifetimes of the universe. This cannot be said about the doubling of the number of people killed by terrorism over the same period.

He’s right that the number who could be killed by terrorism is massive, especially given the risk of nuclear and biological weapons. But the disproportionate focus on terrorism takes too much attention from risks that seem mundane, like getting into cars. Everyone expects to get into car crashes. Perhaps we should be thinking more seriously about that. Too bad almost no one is.

“University presidents: We’ve been blindsided.” Er, no.

“University presidents: We’ve been blindsided” is an amazing article—if the narrative it presents is true. It’s amazing because people have been complaining about political correctness and nothing-means-anything postmodernism since at least the early ’90s, yet the problems with reality and identity politics seem to have intensified in the Internet age. University presidents haven’t been blindsided, and some of the problems in universities aren’t directly their fault—but perhaps their biggest failure, with some notable exceptions (like the University of Chicago), is not standing up for free speech.

I don’t see how it was possible not to see this coming; the right’s attack on academia has its roots in the kind of scorn and disdain I write about in “The right really was coming after college next.” As I say there, I’ve been hearing enormous, overly broad slams against the right for as long as I’ve been involved in higher education. That sort of thing has gone basically unchecked for I-don’t-know-how-long. It would be surprising if there weren’t a backlash eventually, and institutions that don’t police themselves eventually get policed, or at least attacked, from the outside.

(Since such observations tend to generate calls of “partisanship,” I’ll again note that I’m not on the right and am worried about intellectual honesty.)

There is this:

“It’s not enough anymore to just say, ‘trust us,'” Yale President Peter Salovey said. “There is an attempt to build a narrative of colleges and universities as out of touch and not politically diverse, and I think … we have a responsibility to counter that — both in actions and in how we present ourselves.”

That’s because universities are not politically diverse. At all. Heterodox Academy has been writing about this since it was founded. Political monocultures may in turn encourage freedom-of-speech restrictions, especially against the other guy, who isn’t even around to make a case. For example, some of you may have been following the Wilfrid Laurier University brouhaha (if not, “Why Wilfrid Laurier University’s president apologized to Lindsay Shepherd” is an okay place to start, though the school is in Canada, not the United States). Shepherd’s department wrote a reply, “An open letter from members of the Communication Studies Department, Wilfrid Laurier University,” that says, “Public debates about freedom of expression, while valuable, can have a silencing effect on the free speech of other members of the public.” In other words, academics who are supposed to support free speech and disinterested inquiry don’t. And they get to decide what counts as free speech.

If academics don’t support free speech, they’re just another interest group, subject to the same social and political forces that all interest groups are subject to. I don’t think the department that somehow thought this letter to be a good idea realizes as much.

The idea that “trust us” is good enough doesn’t seem to be good enough anymore. In the U.S., the last decade of anti-free-speech and left-wing activism on campus has brought us a Congress that is in some ways more retrograde than any since… I’m not sure when. Maybe the ’90s. Maybe earlier. Yet the response on campus has been to shrug and worry about pronouns.

Rather than “touting their positive impacts on their communities to local civic groups, lawmakers and alumni,” universities need to re-commit to free speech, open and disinterested inquiry, and not prima facie opposing an entire, large political group. Sure, “Some presidents said they blame themselves for failing to communicate the good they do for society — educating young people, finding cures for diseases and often acting as major job creators.” But, again, universities exist to learn what’s true, as best one can, and then explain why it’s true.

Then there’s this:

But there was also an element of defensiveness. Many argue the backlash they’ve faced is part of a larger societal rethinking of major institutions, and that they’re victims of a political cynicism that isn’t necessarily related to their actions. University of Washington President Ana Mari Cauce, for one, compared public attitudes toward universities with distrust of Congress, the legal system, the voting system and the presidency.

While universities do a lot right, they (or some of their members) are also engaging in dangerous epistemic nihilism that’s contrary to their missions. And people are catching on to that. Every time one sees a fracas like the one at Evergreen State College, universities as a whole lose a little of their prestige. And the response of many administrators hasn’t been good.

Meanwhile, the incredible Title IX stories don’t help (or see Laura Kipnis’s story). One can argue that these are isolated cases. But are they? With each story, and the inept institutional response to it, universities look worse and so do their presidents. University presidents aren’t reaffirming the principles of free speech and disinterested research, and they’re letting bureaucrats create preposterous and absurd tribunals. Then they’re saying they’ve been blindsided! A better question might be, “How can you not see a reckoning in advance?”

“The right really was coming after college next”

Excuse the awkward headline and focus on the content in “The right really was coming after college next.” Relatively few people point out that college has been coming after the right for a very long time; sometimes college correctly comes after the right (e.g. Iraq War II), but the coming after is usually indiscriminate. I’ve spent my entire adult life hearing professors say that Republicans are stupid or people who vote for Romney or whoever are stupid. Perhaps we ought not to be surprised when the right eventually hits back?

A few have noticed that “Elite colleges are making it easy for conservatives to dislike them.” A few have also noticed that we ought to be working towards greater civility and respect, especially regarding ideological disagreement; that’s one purpose of Jonathan Haidt’s Heterodox Academy. Still, on the ground and on a day-to-day level, the academic vituperation towards the right in the humanities and most social sciences (excluding economics) has been so obvious and so clear that I’m surprised it’s taken this long for a backlash.

Because I’m already imagining the assumptions in the comments and on Twitter, let me note that I’m not arguing this from the right—I find that I’m on the side of neither the right nor the left, in part because neither the right nor the left is on my side—but I am arguing this as someone who cares about freedom of speech and freedom of thought, which have never been free and have often been unpopular. It’s important to work towards understanding before judgment or condemnation, even though that principle too has likely never been popular or widely adopted.

It seems to me that homogeneous, lockstep thought is dangerous wherever it occurs, and increasingly it appears to be occurring in large parts of colleges. One hopes that the colleges notice this and try to self-correct. Self-correction will likely be more pleasant than whatever political solution might be devised in statehouses.


‘Maybe in 50 years there won’t be novels’

“Claire Messud: ‘Maybe in 50 years there won’t be novels’: As her fifth novel is published, the American writer warns that shrinking attention spans could prove the death of long fiction” makes an interesting point that is definitely plausible and may also be correct. Still, while average attention spans may be shrinking, elite attention spans may be as long as they ever were—they have to be to do good work. The people who make Twitter, Facebook, and Snapchat need intense concentration to do the work they do (everyone ought to at least attempt a programming class, if for no other reason than to understand the kind of mental effort it entails). If we’re going to keep the lights on, the Internet working at all, and the world running, we need to be able to concentrate long enough to really understand a topic deeply.

As fewer people can do this, the value of doing it rises. My own work as a grant writer depends on concentration; part of the reason we have a business is because most people can’t concentrate long enough to learn to write well and then apply that learning to grant applications. As I wrote in 2012, “Grant writing is long-form, not fragmentary.” Cal Newport makes a similar point, although not about grant writing, in Deep Work.

The contemporary tension between an attention-addled majority and a deep-working minority fuels Neal Stephenson’s novel Anathem. It’s not the most readable of novels because the made-up vocabulary of the future is so grating. The idea is a reasonable one (our present vocabulary is different from the past’s vocabulary, so won’t the same be true of the future’s?), but the novel also shows the technical problems that attempting to implement that idea entails. I wonder if Messud has read Anathem.

Anyway, to return to Messud, I suspect this is true: “That we can’t fathom other people, or ourselves, is the engine of fiction” and as long as it remains true there will be an appetite for novels among at least some people.

By the way, I’ve started a couple of Messud’s books and never cottoned to them. Maybe the flaw is mine.

A field guide to Trump Resistance demonstrations

My father, Isaac, wrote this post.

As a young man, I was very involved in the antiwar movement, as well as other social justice causes. I went to many demonstrations, organized some, and spent two weeks registering African American voters in Natchez, MS during one Spring Break (I wasn’t smart enough to go to Fort Lauderdale). Like President Obama, I was trained by the Industrial Areas Foundation (IAF) as a community organizer and spent over a year working as a community organizer in a low-income African American neighborhood. You can read about my background as a true-believer radical at the link.

While I long ago hung up my demonstrating spurs, looking back over four decades to my activist and community organizing days gives me an unusual perspective on the sudden eruption of large street demonstrations by the Resistance to Trump.* This is my guide to Resistance fighters, who are willing to leave their coffee shops, yoga studios and Whole Foods for the suddenly trendy act of street demonstrations.

Not everyone in the thousands of people marching will necessarily share your desire for the peaceful exercise of group free speech. Some demonstrators likely have ulterior motives, whether they be the black masked anarchists who seem to love to break Starbucks windows, or, potentially much worse agent provocateurs. Such infiltrators could be aligned with some interest group, which could include Trump supporters, police—or, I suppose, even Putin, if you’re a conspiratorial sort.

As the protests grow, local cops and politicians, including Democrats, are going to be under increasing pressure to contain demonstrations to enable people to get to work, etc. It only takes a few agitators to turn a peaceful march into a riot. I know this firsthand, as I was part of a demonstration of about 5,000 students protesting a speech by then-HUD Secretary George Romney that suddenly turned ugly. Within a few minutes, some people in the crowd started taunting the police and rushing the stage where Romney was speaking. A later investigation found that the people who rushed the stage were actually police agent provocateurs planted in the march. Waves of baton-wielding cops moved on us from all sides. Tear gas canisters went flying. I got fairly badly tear gassed and retreated to a nearby classroom building to hide in a bathroom and wash my eyes—which gives “sheltering in place” a different meaning than today’s “safe space” advocates have in mind.

In a large march or demonstration, you’ll only see what’s happening immediately around you, and you can quickly get caught up in a riot without realizing that a riot is happening. Always be aware of your surroundings and who’s nearby.

Try to give yourself an escape route and be prepared to bolt. This is not the time for flip-flops and t-shirts. Wear running shoes, long pants, and a long-sleeved shirt, since tear gas and CS gas (much stronger than tear gas and sometimes used by police) irritate the skin as well as the eyes. Carry a hanky and a water bottle in case you need to wash your eyes out if you get teargassed. Above all, resist the urge to get in the face of cops, throw projectiles, or otherwise provoke police—they will use their batons to defend themselves and you might easily be handcuffed and carted off to jail.

Leave your babies, kids, and dogs at home. While pushing a stroller makes for great TV/Facebook/Instagram videos, kids and pets will be in extreme danger if a riot develops and you won’t be able to protect them.

The biggest danger is that counter-marchers will appear. So far, the Resistance marchers have had the field to themselves. Eventually, counter-marchers will start showing up. Over 60 million Americans voted for Trump and they also know how to use social media. Two opposing groups of peaceful demonstrators are not a problem, and police will usually try to keep them separated.

But just as there are anarchists aligned with the Resistance, some pro-Trumpsters may show up with ax handles and be intent on mayhem. Trump has fairly strong support among police unions, veterans, and bikers, for example, and these guys may show up ready to fight.

The police rank and file may also be more sympathetic to the counter-marchers than the Resistance and might just let the two groups tangle for a while. Or the police themselves can become the counter-marchers, like the Chicago Police Department at the Democratic Convention in 1968, which was later deemed a “police riot.” I don’t think any cops were disciplined, but hundreds of demonstrators were badly injured and the Chicago Seven faced years of litigation.

The free expression of political opinions is the bedrock of American democracy and I’m all for showing opposition to the politics and politicians of the day. As the Rolling Stones sang in “You Can’t Always Get What You Want,” “I went down to the demonstration / To get my fair share of abuse.” Don’t needlessly open yourself to abuse.

* I think conjuring up images of WW II France is overwrought, unless volunteers from Brooklyn and Santa Monica are going to parachute into coal country and disguise themselves by wearing red Make America Great Again hats, chewing tobacco, and driving Ford F-150s. If I were a community organizer on this project, I would use the term “Rebel Alliance” instead. The French Resistance were all white folk, while Star Wars goes for ethnic and inter-planetary species diversity.

Caught in the nerd-o-sphere or researcher bubble

In a tweet, Benedict Evans mentions, “I’m always baffled when people are surprised by charts like this. What do people think the world was like 250 years ago? Isn’t this obvious?”


I replied, “I teach undergrads; it isn’t obvious to most, and most either don’t think about it or rely on TV-based historical fiction,” but that’s too glib; the chart’s demonstration of growing wealth is obvious to people who’ve read a lot of history and who’re immersed in the nerd-o-sphere or researcher bubble, but that’s a small part of the population. Most people don’t really, really think about or study history, and to the extent they think about it at all they rely on hazy, unsourced stereotypes.

I’ve read lots of student papers (and for that matter Internet comments) saying things like, “In the past, [claim here].” Some will even say, “In the old days…” In the margins I will write in reply, “Which years and geographic areas are you thinking about?” When I ask those kinds of questions in class students look at me strangely, like I’ve suddenly demanded they perform gymnastics.

The past really is a foreign country and unless someone has made the effort to learn about it directly, meta-learn how to learn, and learn how the people in a given time period likely thought, it can look like the present but with different clothes. That’s often how it’s presented in TV, movies, and pop fiction (see e.g. “Rules for Writing Neo-Victorian Novels“). To take one obvious example, characters in such TV shows and movies often have modern sexual and religious mores, ignoring that many of the sexual mores and rules of the last ~500 years of European and American history evolved because a) reliable contraception was unavailable or extremely limited, b) a child born to a single woman could end up killing both child and woman due to lack of money and/or food, and c) many STIs that are now treated with a quick antibiotic were death sentences.

In most countries today, people don’t worry about starving to death, so the kind of absolute poverty that’s stunningly declined in the last couple centuries takes a strong imaginative leap to inhabit. People also seem to experience hedonic adaptation, so the many things that make our lives easy and pleasant become invisible (that’s true of me too).

So the average person probably never thinks about what the world was like 250 years ago, and, if they do, they probably don’t have the baseline knowledge necessary to conceptualize and contextualize it properly. Those of us caught in the nerd-o-sphere and researcher bubble, like myself, do. Our sense of “obvious” shifts with the environment we inhabit and the education we’ve had (or the education we’re continuing all the time).

And about that education system. Years ago I used to read tech sites in which autodidacts would fulminate about the failures of the conventional school system and prophesy about how the liberation of information would remake the educational sector into a free intellectual utopia in which students would learn much faster and at their own pace, leading to peace, harmony, and knowledge; in this world, rather than being bludgeoned by teachers and professors, students would become self-motivated because they’d be unshackled from conventional curricula. To some extent I believed those criticisms and prophecies. One day we would set students free and they’d joyously learn for the sake of learning.

Then I started teaching and discovered that the conventional school system exists to work on or with the vast majority of the population, which doesn’t give a fig about the joy of knowledge or intrinsic learning or whatever else Internet nerds and PhDs love. The autodidacts who wrote on Slashdot back then, and who write on Hacker News or Reddit or blogs today, are a distinct minority, at most a couple percent of the total population. Often they were or are poorly served in some ways by the conventional education system, especially because they often have unusual ways of interacting socially.

Now, today, I’ve both taught regular, non-nerd students and read books like Geek Heresy: Rescuing Social Change from the Cult of Technology, and I’ve realized why the education system has evolved the way it has. Most people, left to their own devices, don’t study poetry and math and so on. They watch videos on YouTube and TV and play videogames and chat with their friends. Those are all fine activities and I’ve of course done all of them, but the average person doesn’t much engage in systematic skill- and knowledge-building of the sort that dedicated study is (ideally) supposed to do.

In short, the nerds who want to reform the education system are very different from the average student the system is designed to serve, just as the average person in the nerd-o-sphere or researcher bubble is very different from the average person overall, whom hardcore nerds may not know or interact with very much.

I’m very much in that nerd-o-sphere and if you’re reading this there’s a high probability you are too. And when I write about undergrads, remember that I’m writing about the top half of the population in terms of motivation, cognition, and tenacity.

“From Pickup Artist to Pariah” buries the lead

In “From Pickup Artist to Pariah: Jared Rutledge fancied himself a big man of the ‘manosphere.’ But when his online musings about 46 women were exposed, his whole town turned against him,” oddly, the most interesting and perhaps important parts of the article are buried or de-emphasized:

In 2012, he slept with three women; in 2013, 17; in 2014, 22. In manosphere terms, he was spinning plates — keeping multiple casual relationships going at once.

In other words… it worked, at least according to this writer. And:

I met four women at a downtown bar. All were on Jared’s List of Lays. Over cocktails and ramen, the women told me about Jared’s sexual habits, his occasional flakiness, his black-and-white worldview. [. . .] They seemed most troubled by just how fine he had been to date. “I really liked him,” said W. “And that’s what makes me feel so gullible.”

In other words… it worked, at least according to the women interviewed as framed by this writer.

How might a Straussian read “From Pickup Artist to Pariah”? Parts of the article, and not those already quoted, could be inserted directly into Onion stories.

The first sentence of Public Enemies: Dueling Writers Take On Each Other and the World is “Dear Bernard-Henri Lévy, We have, as they say, nothing in common—except for one essential trait: we are both rather contemptible individuals.” Is being contemptible sometimes a sign of status? As BHL implies, the greatest hatred is often reserved for that which might be true.*

In other news, the Wall Street Journal reports today that “Global Temperatures Set Record for Second Straight Year: 2015 was the warmest year world-wide since reliable global record-keeping began in 1880.”

In Julie Klausner’s book, I Don’t Care About Your Band: What I Learned from Indie Rockers, Trust Funders, Pornographers, Felons, Faux Sensitive Hipsters, and Other Guys I’ve Dated, she writes at the very end, “Around this time of graduation or evolution or whatever you call becoming thirty, I started fending off the guys I didn’t like before I slept with them. It was the first change I noticed in my behavior that really marked my twenties being over.” Maybe Rutledge’s mistake is one of tone: Comedians are sometimes forgiven and sometimes thrown into the fire. No one is ever forgiven seriousness.

Houellebecq also writes, “there is in those I admire a tendency toward irresponsibility that I find only too easy to understand.” He is not the first person to admire irresponsibility. In Surely You’re Joking, Mr. Feynman!, Richard Feynman says:

Von Neumann gave me an interesting idea: that you don’t have to be responsible for the world that you’re in. So I have developed a very powerful sense of social irresponsibility as a result of Von Neumann’s advice. It’s made me a very happy man ever since. But it was Von Neumann who put the seed in that grew into my active irresponsibility.
