Ninety-five percent of people are fine — but it’s that last five percent

“How Airline Workers Learn to Deal with Passengers” reminds me of teaching; I’ve spent a bunch of years teaching college students and being a grant writing consultant, and I suspect that part of the problem airline workers experience is simple and akin to the problems I experience: 95% of people are fine, but that last 5% can occupy a lot of time and mental energy.* So there’s a temptation to become somewhat armored against that last 5%, which impacts interactions with the vast majority of people, who are normal and reasonable.

A lot of public-facing professions seem to have this problem, including emergency medicine doctors, cops, retail workers, and public school teachers. Because that bottom 5% is so noisy and time-consuming, a kind of misanthropy can set in, as one begins to think the bottom represents the whole—even if intellectually one knows it does not. Mental, psychological, and emotional armoring can reduce one’s overall effectiveness; this is particularly obvious in teaching, in which person-to-person connection plays a stronger role than it does in, say, consulting.

There seems to be something about the human mind that makes one negative interaction stand out more than 10 positive or normal interactions. So there’s a kind of crowding-out effect going on. And when I have to deal with someone who is unreasonable, I try to actively, consciously remind myself that they don’t represent the whole and that behind every irrationally unhappy person there are 49 to 99 normal people who aren’t giving me unwarranted grief.

Colleges in particular have been in the news lately, and a lot of people have seen crazy social justice warrior stuff out there, like the Middlebury College thing or the Halloween costumes at Yale. This stuff is in fact outrageous, but it’s salient precisely because it’s unusual, and because it’s unusual it makes the news (it does represent a real problem, though one that tends to be overstated).

Friends and acquaintances who know what I do sometimes ask me about whether I see this kind of stuff. I do, a little. The vast majority of college students, however, seem to want what college students have always wanted: to learn something; to get by; to get a job when they’re done; to get laid; to learn something about themselves and the societies they live in; to make friends; to individuate from their families. You could add other items. Many students feel a vague sense of worry about being excellent sheep, and that worry is itself a sign of intellectual health. Most students, if they’ve thought about free-speech issues at all, vaguely support free speech. But a minority of well-organized and angry activists can make a lot more noise and news than the silent majority!

That last point is one many casual news readers don’t realize, which is why I emphasize it. It’s related to my essay, “How do you know when you’re being insensitive? How do you know when you’re funny?” Similar issues play out in many fields beyond teaching and consulting. One angry, unreasonable, or irrational customer or client drowns out a lot of generically happy or satisfied ones. Or consider “I was a landlord: This is what it taught me about people.” Landlords often have to be prepared for worst-case scenarios, and that preparation bleeds into their everyday scenarios and interactions.

Social media probably amplifies many of the problem traits described above by allowing the least-reasonable people to organize, scream, and (not infrequently) lie. I don’t know what, if any, solution exists to these problems, apart from most individuals attempting to be as reasonable as possible and refusing to succumb to the noisy but unhinged minority. Not much of a rallying cry, is it?

* You can change the ratios some; I doubt the number of problem students reaches 10% in most scenarios, and I also doubt that the number declines below 2% (among professions that face the general public).

Why hasn’t someone tried to build or fund a very low-cost, very high-quality college?

As the title asks, why hasn’t someone tried to build or fund a very low-cost, very high-quality college? Or, if they have, what school is out there and has tried this?

It seems like a ripe strategy because virtually every (even slightly) selective school is pursuing the same prestige strategy. Yet even as they do so, news about outrageous student loan burdens is everywhere and probably affecting the choices made by students. At the same time, college tuition has been outpacing inflation for decades—and everyone knows it. Education is a component of the “cost disease” that is afflicting other sectors too. The number of college administrators has grown enormously (though that may not be the prime factor behind public-school cost increases). Still, it used to be possible to work summer jobs and graduate with little or no debt; schools in the 1960s or 1970s don’t appear to have been dramatically worse at education than schools today, and in some ways they may have been better, yet today colleges are many times more expensive.

College costs and debts have soared, and at the same time the number of PhDs granted far outstrips the number of tenure-track and teaching jobs. Most universities and even many colleges care far more about research, much of which is bogus anyway, than teaching. Many universities don’t care about teaching at all, as long as the professor shows up to lecture, isn’t drunk, and doesn’t trade sex for grades. I hear many, many grad students and early professors lament the way their schools don’t care about teaching. So there’s a surplus of cheap PhDs out there who would desperately like to be professors. While professors who only teach two or three classes per semester complain relentlessly about all the “work” they supposedly have and how “busy” they allegedly are, it could be very easy to get professors to teach far more than they currently do at most schools, further reducing costs.

In short, the supply of faculty is there, and the supply of students ought to be there. So, with the setup above, let me repeat: why hasn’t anyone attempted to start a teaching-focused college with low tuition and extremely high-quality academics? I’m thinking of a school with a mandate to minimize the number of administrators and sports teams. One could even eliminate tenure, and thus ensure that PhDs hired today won’t still be on the payroll in 40 years.

This may sound like a community college, but I’m imagining a school that still draws from a national applicant pool and still maintains or attempts to maintain an elite or comprehensive academic character. Think of a liberal arts school but scaled up somewhat and with fewer administrators. If I were a billionaire I might try to do this; stupendously rich people loved endowing schools in the 19th Century, but that seems to have fallen out of fashion. Still, it worked then, so perhaps it could work now.

It may be that schools are really selling prestige and status, and consequently a low-cost, high-quality teaching school would be too low prestige and low status to attract students.

Still, and again as noted previously, pretty much every school, public or private, is pursuing the exact same prestige, admissions, and marketing strategy. With one or two exceptions (Caltech, University of Chicago—okay, there are a few others, but not many), they don’t even try (really or seriously) to distinguish themselves, and almost every school competes for the same BS college rankings. Such a uniform market seems ripe for alternate approaches, yet none are being tried or have taken off (so far as I know).

What am I missing?

1. Maybe it was easier to start colleges in the 19th Century, when regulation was nonexistent and complex subsidies of various kinds weren’t available. In the 19th Century, many colleges were also founded with the explicit intent of saving students’ souls, so perhaps the lack of religiosity in today’s billionaires and/or most of today’s students is a factor.

2. Current schools might just be too damn good at marketing for others to break in.

3. Maybe there are efforts afoot and they’ve either failed or are too small for me to have noticed.

4. Current schools are pursuing a complex price discrimination strategy, in which the sticker price is paid by a relatively small number of students, and much of the student body receives “scholarships” that are really tuition discounts. Maybe this system is more appealing to students, and possibly schools, than a transparent, everyone-pays-$5,000-per-year strategy.

5. Students by and large pay with their parents’ money or with loans, so an unbundled version of a school may really be less attractive than one with lots of administrators, feel-good projects, fancy gyms, etc.

6. Billionaires who might fund this are busy doing other things with their money.

7. The number of “good” or at least weird and different students who would try such a school is not great enough (given the current cost of college and the number of students out there, I find this one hard to believe, but it isn’t impossible).
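
The price-discrimination possibility can be made concrete with a toy calculation; all the figures below are hypothetical, chosen only to illustrate the mechanism, not drawn from any real school’s numbers.

```python
# Toy sketch of tuition price discrimination: each student pays the
# sticker price minus an individually assigned "scholarship" discount.
# All numbers here are made up for illustration.

def total_revenue(sticker, discounts):
    """Revenue collected when each student pays sticker minus their discount."""
    return sum(sticker - d for d in discounts)

sticker = 50_000
# Suppose 3 of 10 students pay full price and the rest get varied discounts.
discounts = [0, 0, 0, 10_000, 15_000, 20_000, 25_000, 30_000, 35_000, 40_000]

discriminating = total_revenue(sticker, discounts)  # 325_000 in total
flat_equivalent = total_revenue(32_500, [0] * 10)   # same total at one flat price

print(discriminating, flat_equivalent)  # 325000 325000
```

The two schemes raise identical revenue, but the discriminating one charges each student close to the most that student (or family) is willing to pay, which may be why schools prefer opaque sticker-plus-discount pricing to a transparent flat fee.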

I’m guessing that number four is most likely, but maybe there are other factors I’m missing.

Caught in the nerd-o-sphere or researcher bubble

In a tweet, Benedict Evans asks, “I’m always baffled when people are surprised by charts like this. What do people think the world was like 250 years ago? Isn’t this obvious?”


I replied, “I teach undergrads; it isn’t obvious to most, and most either don’t think about it or rely on TV-based historical fiction,” but that’s too glib; the chart’s demonstration of growing wealth is obvious to people who’ve read a lot of history and who’re immersed in the nerd-o-sphere or researcher bubble, but that’s a small part of the population. Most people don’t really, really think about or study history, and to the extent they think about it at all they rely on hazy, unsourced stereotypes.

I’ve read lots of student papers (and for that matter Internet comments) saying things like, “In the past, [claim here].” Some will even say, “In the old days…” In the margins I will write in reply, “Which years and geographic areas are you thinking about?” When I ask those kinds of questions in class students look at me strangely, like I’ve suddenly demanded they perform gymnastics.

The past really is a foreign country and unless someone has made the effort to learn about it directly, meta-learn how to learn, and learn how the people in a given time period likely thought, it can look like the present but with different clothes. That’s often how it’s presented in TV, movies, and pop fiction (see e.g. “Rules for Writing Neo-Victorian Novels”). To take one obvious example, characters in such TV shows and movies often have modern sexual and religious mores, ignoring that many of the sexual mores and rules of the last ~500 years of European and American history evolved because a) reliable contraception was unavailable or extremely limited, b) a child born to a single woman could mean death for both mother and child due to lack of money and/or food, and c) many STIs that are now treated with a quick antibiotic were death sentences.

In most countries today, people don’t worry about starving to death, so the kind of absolute poverty that’s stunningly declined in the last couple centuries takes a strong imaginative leap to inhabit. People also seem to experience hedonic adaptation, so the many things that make our lives easy and pleasant become invisible (that’s true of me too).

So the average person probably never thinks about what the world was like 250 years ago, and, if they do, they probably don’t have the baseline knowledge necessary to conceptualize and contextualize it properly. Those of us caught in the nerd-o-sphere and researcher bubble, like myself, do. Our sense of “obvious” shifts with the environment we inhabit and the education we’ve had (or the education we’re continuing all the time).

And about that education system. Years ago I used to read tech sites in which autodidacts would fulminate about the failures of the conventional school system and prophesy about how the liberation of information would remake the educational sector into a free intellectual utopia in which students would learn much faster and at their own pace, leading to peace, harmony, and knowledge; in this world, rather than being bludgeoned by teachers and professors, students would become self-motivated because they’d be unshackled from conventional curriculums. To some extent I believed those criticisms and prophecies. One day we would set students free and they’d joyously learn for the sake of learning.

Then I started teaching and discovered that the conventional school system exists to work on or with the vast majority of the population, which doesn’t give a fig about the joy of knowledge or intrinsic learning or whatever else Internet nerds and PhDs love. The autodidacts who wrote on Slashdot back then, and on Hacker News or Reddit or blogs today, are a distinct minority, at most a couple percent of the total population. Often they were or are poorly served in some ways by the conventional education system, especially because they often have unusual ways of interacting socially.

Now, today, I’ve both taught regular, non-nerd students and read books like Geek Heresy: Rescuing Social Change from the Cult of Technology, and I’ve realized why the education system has evolved the way it has. Most people, left to their own devices, don’t study poetry and math and so on. They watch videos on YouTube and TV and play videogames and chat with their friends. Those are all fine activities and I’ve of course done all of them, but the average person doesn’t much engage in the systematic skill- and knowledge-building that dedicated study is (ideally) supposed to provide.

In short, the nerds who want to reform the education system are very different from the average student the system is designed to serve, just as the average person in the nerd-o-sphere or researcher bubble is very different from the average person overall, whom hardcore nerds may not know or interact with very much.

I’m very much in that nerd-o-sphere and if you’re reading this there’s a high probability you are too. And when I write about undergrads, remember that I’m writing about the top half of the population in terms of motivation, cognition, and tenacity.

Circumstances under which going to law school can make sense

The reasons you should avoid law school are well known and I won’t repeat them here, but the other day I was explaining to a former student why she shouldn’t go to law school and she asked a perceptive question: Who should go? Under what conditions should a person go?

The answer is “almost no one” and “almost never,” but law school can be okay in a handful of circumstances:

* People who have already worked in law firms, probably as a paralegal but maybe under other circumstances, and who thus understand what the day-to-day life of a lawyer is like. That firm should have a job waiting and ready to go for the person before the person starts law school.

* People who have family (or close family friends) in law firms who can set the law school applicant up with a job straight out of school. If your uncle has a firm and wants you to take over that firm, law school can make sense.

* People with a very specific sense of what they want a law degree for and what they want to do with it—for example, people who desperately want to fight for voting rights, or immigrant rights, or something along those lines, and are convinced that those fights will be their life’s work, regardless of other challenges.

That’s really it; if I’m missing something, leave a comment or send an email. Law school mostly works for people who don’t need law school and only need the credentials that law school entails. There is a reason why most lawyers once learned the craft on the job as apprentices; law school only became a requirement in the post-World War II era as a way of raising the salaries and status of then-existing lawyers.

Even going to highly ranked schools doesn’t make sense because, while you may get a big-firm job straight out of school, you’ll still be shackled to the work by student loan debt slavery, and you’ll still have to be a lawyer at the end (which most people don’t really want to do), and you’ll still probably not make partner (which means that you’re mostly working to line someone else’s pocket).

Don’t go to law school.

Why do so many people continue to pursue doctorates?

In “The Ever-Tightening Job Market for Ph.D.s: Why do so many people continue to pursue doctorates?”, Laura McKenna reviews the data on how terrible grad programs and the academic job market are, then goes on to ask: “Why hasn’t all this information helped winnow down the ranks of aspiring professors—why hasn’t it proved to be an effective Ph.D. prophylactic?” Having observed and participated in the mass delusion, I have some possible answers:

1. It’s a way to (pointlessly) delay adulthood.

2. Fear of the job market.

3. Don’t know what else to do.

4. Magical thinking, despite the numerous articles out there (like mine) that attempt to dissuade it. I think this is the biggest issue. In addition, there seems to be a Lake Wobegon effect: everyone thinks they’re going to be above average.

5. Contrary to what grad students often say, in many disciplines and programs grad school is pretty easy and fun! You get to hang out on campus, think about ideas, take a minimal number of classes, do a bit of teaching, and have copious free time. Also, let me be euphemistic and say that many straight guys spend a lot of time with female undergrads. The problem is that, as time advances and your priorities start changing (you want a real job and a real life, you want to date people who won’t date someone whose life isn’t together, etc.), reality starts to intrude. Many grad students have an unacknowledged Peter Pan complex.

6. For most, academic success has been rewarded every step of the way (thanks to Hacker News reader mathattack for this point). The individuals who’ve gotten the most mileage listening to their teachers are also the ones who most need to stop listening to them. Professors are very keen on producing more professors and reproducing themselves, even though doing so is often not in the best interests of a particular individual.

7. People mistakenly focus on the outliers who accomplish major, important breakthroughs and think that they’ll be like the outliers, not the medians. This is another variant of the Lake Wobegon effect.

Note that a few fields (econ, computer science) appear to have relatively robust job outcomes for PhDs, so some of the above likely doesn’t apply to them.

Life: There is no authority left edition

All modern thought is falsified by a mystique of transgression, which it falls back into even when it is trying to escape. For Lacan, desire is still a by-product of the law. Even the most daring thinkers nowadays do not dare to recognize that prohibition has a protective function with regard to the conflicts inevitably provoked by desire. They would be afraid that people might see them as ‘reactionary’. In the currents of thought that have dominated us for a century, there is one tendency we must never forget: the fear of being regarded as naive or submissive, the desire to play at being the freest thinker—the most ‘radical’, etc. As long as you pander to this desire, you can make the modern intellectual say almost anything you like. This is the new way in which we are still ‘keeping up with the Joneses’.

— René Girard, Things Hidden Since the Foundation of the World.

The book itself is a hodgepodge of brilliance and incoherence / irrelevance, but the former outweighs the latter. The notion of defining the difference between “man” and “animal” seems to fascinate older philosophers in a way that I find bizarre or unimportant.

To my mind, the strongest part of American culture is its meta-ability to rapidly re-write itself in response to changing conditions and outside influences, including conditions related to Girard’s conception of desire.

Universities treat adjuncts like they do because they can

“There Is No Excuse for How Universities Treat Adjuncts: Students are paying higher tuition than ever. Why can’t more of that revenue go to the people teaching them?” is well-summarized by its headline, but there is a very good “excuse” why universities treat adjuncts how they do: because they can. When people stop signing up for grad school and/or to be adjuncts, universities will have to offer better pay and/or conditions. Until that happens, universities won’t.

Markets are clearing.

“There Is No Excuse” keeps popping up in my inbox and on discussion sites, and I keep saying the same thing. The usual response is to decry administrators (which is fine with me, although I’m not sure they’re the fundamental driver of cost) and to promote unions. On unions in universities I don’t have incredibly strong opinions; there may be some benefits to the people who get into the union on the ground floor, but raising the pay of some adjuncts will leave others with no work at all.

If the market-clearing price is $3,000 per class, and universities have to pay $5,000, there’ll be a large pool of people who can’t get those higher wages because there aren’t a sufficient number of jobs out there for them. So one can trade plentiful but somewhat poorly remunerative jobs for a smaller number of somewhat more remunerative jobs for those who grab them.
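
The arithmetic behind that trade-off can be sketched in a few lines; the per-class figures come from the paragraph above, while the department budget is a made-up assumption chosen only to show the mechanism.

```python
# Minimal sketch of the wage-floor trade-off: a fixed instructional budget
# funds fewer class sections at a mandated higher per-class rate.
# The $3,000 and $5,000 figures come from the text; the budget is hypothetical.

def classes_funded(budget, pay_per_class):
    """Number of class sections a fixed budget can cover at a given rate."""
    return budget // pay_per_class

budget = 300_000  # hypothetical departmental budget for adjunct sections

at_market = classes_funded(budget, 3_000)  # sections at the market-clearing rate
at_floor = classes_funded(budget, 5_000)   # sections at the mandated rate

print(at_market, at_floor, at_market - at_floor)  # 100 60 40
```

Under these toy numbers, mandating $5,000 per class leaves 40 fewer sections to go around: some adjuncts get better-paid work, and the rest get none.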

Universities will also be somewhat more reluctant to hire anyone in the first place, because the people who are hired won’t leave. In the late ’80s and early ’90s, a series of court cases eliminated mandatory retirement policies. Those rulings, combined with lengthening life expectancies, meant that tenured faculty could stay on indefinitely—blocking the path forward for younger academics. Which led to… the explosion of adjuncts presently being decried in the media. Unions (usually) prevent their members from being fired, and, if lay-offs do happen, they happen on a “last-in, first-out” basis. So the youngest and freshest workers will be gone first.

Still, I find the rhetoric of faculty who argue that the job of grad students is to be students hilarious; at the University of Arizona, grad students in English taught the same number of classes for the same number of hours as full-time faculty. Grad students, like medical residents, are workers, regardless of what else you call them.

It’s also worth contemplating alternatives to academia, which has lots of barriers to entry, formal credentials, and unspoken rules. The marginal product of labor for academics is hard to measure. By contrast, technology employment works well in part because there are close to zero barriers to entry. Someone who wants to learn to code can type “learn to code” into Google and start. Unions will make the existing barriers to academia higher, leaving it less like the healthier parts of the economy.

Academics are not exempt from the law of supply and demand, and it turns out that tenured academics are savvy marketers, singing a song of the life of the mind to the unwary who throw themselves onto the shore of a barren island. Supposedly smart people seem to be doing a lot of not-so-smart things.

By the way, I also like teaching as an adjunct because the jobs are plentiful, the hours are flexible, and the work is quite different from what I usually do. On a dollars-per-hour basis the pay is much worse, but the work itself is sometimes gratifying, and one gets the usual and much-discussed pleasures of teaching (the joy of young minds, etc.). It seems like the online conversation around adjuncts is dominated by people who teach at five different schools and have neither time nor money. Like any job it is not perfect. I do it for the same reason everyone, everywhere, works any job: it beats the alternatives.
