Computers and network effects: Why your computer is “slow”

“Going Nowhere Really Fast, or How Computers Only Come in Two Speeds” is half-right. Here’s the part that’s right:

[…] it remains obvious that computers come in just two speeds: slow and fast. A slow computer is one which cannot keep up with the operator’s actions in real time, and forces the hapless human to wait. A fast computer is one which can, and does not.

Today’s personal computers (with a few possible exceptions) are only available in the “slow” speed grade.

So far so good: I wish I didn’t have to wait as long as I do for Word to open documents or for OS X to become responsive after a reboot. But then there’s the reason offered for why computers feel subjectively slower in many respects than they once did:

The GUI of my 4MHz Symbolics 3620 lisp machine is more responsive on average than that of my 3GHz office PC. The former boots (into a graphical everything-visible-and-modifiable programming environment, the most expressive ever created) faster than the latter boots into its syrupy imponade hell.

This implies that the world is filled with “bloat.” But such an argument reminds me of Joel Spolsky’s Bloatware and the 80/20 myth. He says:

A lot of software developers are seduced by the old “80/20” rule. It seems to make a lot of sense: 80% of the people use 20% of the features. So you convince yourself that you only need to implement 20% of the features, and you can still sell 80% as many copies.

Unfortunately, it’s never the same 20%. Everybody uses a different set of features.

Exactly. And he goes on to quote Jamie Zawinski saying, “Convenient though it would be if it were true, Mozilla [Netscape 1.0] is not big because it’s full of useless crap. Mozilla is big because your needs are big. Your needs are big because the Internet is big. There are lots of small, lean web browsers out there that, incidentally, do almost nothing useful.”

That’s correct; Stanislav’s 4MHz Symbolics 3620 lisp machine was, and no doubt remains, a nice computer. But modern, ultra-responsive computers don’t exist, and not because people like bloat: they don’t exist because people in the aggregate choose trade-offs that favor a very wide diversity of uses. Not enough people want to make the trade-offs that fast responsiveness implies for there to be a market for such a computer.

Nothing is stopping someone from making a stripped-down version of, say, Linux that will boot “into a graphical everything-visible-and-modifiable programming environment, the most expressive ever created,” faster than Windows boots into its “syrupy imponade hell.” But most people evidently prefer the features that modern OSes and programs offer. Or, rather, they prefer that modern OSes support THEIR pet feature and make everything as easy to accomplish as possible at the expense of speed. If you take out their favorite feature… well, then you can keep your superfast response time and they’ll stick with Windows.

To his credit, Stanislav responded to a version of what I wrote above, noting some of the possible technical deficiencies of Linux:

If you think that a static-language-kernel abomination like Linux (or any other UNIX clone) could be turned into a civilized programming environment, you are gravely mistaken.

That may be true: my programming skill and knowledge end around simple scripting and CS 102. But whatever the weaknesses of Linux, OS X, and Windows, taken together they represent uncounted hours of programming and debugging time and effort. For those of you who haven’t tried it, I can only say that programming is an enormous challenge. To try to replicate all that modern OSes offer would be hard, and probably effectively impossible. If Stanislav wants to do it, though, I’d be his first cheerleader. But the history of computing is rife with massive rewrites of existing software and paradigms that fail; GNU/Hurd is a classic example. It’s been in development since 1990. Did it fail for technical or social reasons? I have no idea, but the history of new operating systems, however technically advanced, is not a happy one.

Stanislav goes on to say:

And if only the bloat and waste consisted of actual features that someone truly wants to use.

The problem, as Joel Spolsky points out, is that one man’s feature is another’s bloat, and vice-versa. That’s why the computer experience looks the way it does today: people hate bloat, unless it’s their bloat, in which case they’ll tolerate it.

He links to a cool post on regulated utilities as seen in New York (go read it). But I don’t think the power grid metaphor is a good one because transmission lines do one thing: move electricity. Computers can be programmed to do effectively anything, and, because users’ needs vary so much, so does the software. You don’t have to build everything from APIs to photo manipulation utilities to web browsers on top of power lines.

Note the last line of Symbolics, Inc.: A failure of heterogeneous engineering, which is linked to in Stanislav’s “About” page:

Symbolics is a classic example of a company failing at heterogeneous engineering. Focusing exclusively on the technical aspects of engineering led to great technical innovation. However, Symbolics did not successfully engineer its environment, custormers [sic], competitors and the market. This made the company unable to achieve long term success.

That kind of thinking sounds, to me, like the kind of thinking that leads one to lament how “slow” modern computers are. They are—from one perspective. From another, they enable things that the Lisp machine didn’t have (like, say, YouTube).

However, I’m a random armchair quarterback, and code talks while BS walks. If you think you can produce an OS that people want to use, write it. But when it doesn’t support X, where “X” is whatever they want, don’t be surprised when those people don’t use it. Metcalfe’s Law is strong in computing, and there is a massive amount of computing history devoted to the rewrite syndrome; for another example, see Dreaming in Code, a book that describes how an ostensibly simple task became an engineering monster.
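The Metcalfe’s Law point can be made concrete with a toy calculation. In its usual back-of-the-envelope form, the law says a network’s value scales with the number of possible pairwise connections among its users, roughly n² for large n. A minimal sketch (the function name is mine, and this is purely illustrative, not anything from the sources quoted above):

```python
def pairwise_connections(n: int) -> int:
    """Possible user-to-user connections in a network of n users.

    Metcalfe's Law takes a network's value as roughly proportional
    to this quantity, which grows on the order of n**2.
    """
    return n * (n - 1) // 2

# Doubling the user base roughly quadruples the connection count,
# which is why a new OS with few users starts so far behind:
for n in (10, 20, 40):
    print(n, pairwise_connections(n))
```

Going from 10 users to 40 multiplies the user base by four but the connection count by more than seventeen, which is one way of seeing why an incumbent platform that supports everyone’s pet feature is so hard to displace.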

Columbia or prison: similarities and differences?

Terry Gross’ interview with Scott Spencer (of A Man in the Woods) notes that the author has “taught fiction writing at Columbia University, and in prison” (1:10; I think she says “in prison,” although it might be “at prisons”). The tone makes this sort of trajectory sound completely normal, like a sandwich and soup. To me, it invites questions:

  • Can I be the only one who finds the juxtaposition of those two fine American institutions curious or notable?
  • How many writers or professors have taught at an Ivy League school and a penal facility?
  • Is teaching at the one pretty much like teaching at the other?
  • If you’ve currently got a gig at a prison, how do you make the transition to Columbia? I assume relatively few people want to make the opposite leap.

Jane Austen, Emma, and what characters do

I’m rereading Jane Austen’s Emma and realized that when the characters in the novel debate the validity, respectability, or wisdom of the minor actions of other characters in the novel—which is essentially all that happens—they are really judging themselves and their own choices. For example, there’s a moment when Emma is considering Knightley’s observations about Elton’s real motives:

He had frightened her a little about Mr. Elton; but when she considered that Mr. Knightley could not have observed him as she had done, neither with the interest, nor (she must be allowed to tell herself, in spite of Mr. Knightley’s pretensions) with the skill of such an observer on such a question as herself, that he had spoken it hastily and in anger, she was able to believe, that he had rather said what he wished resentfully to be true, than what he knew any thing about.

When Emma says that Knightley “could not have observed him as she had done,” she’s really saying that she’s a more able observer than Knightley and that she doesn’t merely base things on what she “wished resentfully to be true.” This is proved wrong, of course, like many of her comments and ideas, and it shows that while she thinks she values seeing things clearly, given her “skill” as “such an observer,” she actually sees no more clearly than anyone else. The reader figures out that Emma is self-deceptive, while within the novel she is proclaiming that her own choice of Elton as a sexual partner for Harriet is an appropriate one.

Emma also tends not to have much meta-cognition—instead, we, the readers, act as her meta evaluator. For example, she moves briefly in this direction after Elton foolishly declares his love, but she pulls back before it can come to fruition:

She had had many a hint from Mr. Knightley and some from her own heart, as to her deficiency—but none were equal to counteract the persuasion of its being very disagreeable,—a waste of time—tiresome women—and all the horror of being in danger of falling in with the second-rate and third-rate of Highbury, who were calling on them for ever, and therefore she seldom went near [the Bates, who she considers inferiors].

Whatever hints Knightley drops, Emma ignores through most of the novel—likewise the ones “from her own heart.” Her own choices must be right because they come from her, even when those choices spring from unarticulated values that don’t hold up to Knightley’s clarifying vision. Emma never interrogates what “the second-rate and third-rate” means: that’s one of the frustrating parts about this novel and so many others. The characters lack the ability to explicitly question their own values, even as they express what values they hold by denigrating the values of other characters. This is part of the joke and the irony of the novel, of course, but I tend to prefer characters with somewhat greater self-awareness.

But the pleasure of Emma is realizing that its characters lack much of the self-awareness we think they should have. They debate values when they should be debating their debate on values. That, instead, is left to the critics.

The Novel: An Alternative History — Steven Moore

Novels really start when an important technology (the printing press) allows novelists to respond to one another.

Steven Moore’s The Novel: An Alternative History: Beginnings to 1600 is a very alternative history that points, even more than most histories of the novel, to the question of what defines the genre. But it answers that question less satisfyingly: a novel is any prose work of some length that is what we would now call fiction. The distinction between fiction and nonfiction, however, wasn’t particularly well established until the late eighteenth century, as discussed in some of those conventional histories, like The Rise Of The Novel: Studies In Defoe, Richardson And Fielding and Institutions of the English Novel: From Defoe to Scott.

Without that epistemological distinction, critics lack the intellectual scaffolding necessary to really talk about fiction: you have a muddle of stuff that people haven’t really figured out how to deal with. In The Disappearance of God, J. Hillis Miller puts it differently: “The change from traditional literature to a modern genre like the novel can be defined as a moving of once objective worlds of myth and romance into the subjective consciousness of man,” but he’s getting at a similar idea: the “objective worlds of myth” turn out not to be as “objective” as they appear, and the “subjective consciousness of man” reevaluates those worlds of myth. We get at distinctions between what’s true and what’s false based on our ability to recognize our own subjective position, which the novel helps us do.

Moore discusses these issues, of course: he notes the standard history I’m espousing and his reasons for doubting it:

And today our best novelists follow in this great tradition [from Defoe, Swift, and Richardson to the 19th Century realists through Joyce and Faulkner to the present]: that is, realistic narratives driven by strong plot and peopled by well-rounded characters struggling with serious ethical issues, conveyed in language anybody can understand.

Wrong. The novel has been around since at least the 4th century BCE […] and flourished in the Mediterranean area until the coming of the Christian Dark Ages.

That’s on page three. I’ve responded to the philosophical and intellectual aspects of what I think is problematic, but there’s another issue: Moore’s argument ignores the technological history that enabled the novel to occur. I’ll return to my first paragraph.

Without the printing press, it’s wrong-headed to speak of novels. They couldn’t be sufficiently read, distributed, and disseminated to enable the “speaking to each other” that I think of as central to fiction. There wasn’t a “creativity revolution” along the lines of the runaway Industrial Revolution of the eighteenth century (see, for example, Joel Mokyr’s The Enlightened Economy, which I discuss at the link). Books didn’t react enough to other books; that’s part of what the novel got going, and this aspect was enabled by the Industrial Revolution and the press. The two are fundamentally linked.

Some works that we would now classify as fiction definitely were written or compiled, as Moore rightly points out, but they didn’t gain the epistemological distinctions that we grant novels until much later, and novels evolved with a mass reading public that could only occur when novels were mass-produced—produced in numbers that allowed them to be read and responded to by other writers. Claiming that early quasi-fiction forms are novels is like saying that a play and a TV show are the same thing because both rely on visual representations of actors who are pretending to be someone else. In some respects, that’s true, but it still misses how form changes function. It misses the insights of Marshall McLuhan.

He almost gets to this issue:

Sorting through the various ancient writings that have come down to us on cuneiform tablets, papyri, scrolls, and ostraca (potsherds or limestone flakes), it is not difficult to find prototypes for literary fiction and what would eventually be called the novel. What’s difficult is sorting prose from poetry, and fiction from mythology and theology.

But the problem of sorting deserves more attention: until it is discussed with greater depth, accounts like Moore’s miss essential features of the genre. Histories of the novel need to take two major issues into account: a technological one and an intellectual one. The technological one, as mentioned, is the invention and improvement of the printing press, without which the sheer labor necessary to produce copies of novels would have prevented many writers from working at all; you can read more about this in Elizabeth L. Eisenstein’s The Printing Press as an Agent of Change. The second is the growth of subjectivity and the acknowledgment of subjectivity in fiction, as also discussed above. Without both the technological and the intellectual facets, I don’t think you really have novels, at least as they’re conceived of in contemporary times.

The other thing I’d like to note is that Moore is writing more of a taxonomy than a history: his book has brief sections on more than 200 works with relatively little analysis of each. This lessens its depth and makes it more tedious as we go from culture to culture without a great deal of discussion about what common threads link novel to novel. But that’s part of the problem: proto-novels weren’t linked, because their authors didn’t know of one another or of what made fiction fiction and nonfiction nonfiction. Moore is forced into this basic shape for The Novel: An Alternative History by his material; in short, form undercuts argument. Too bad, because it’s an argument worth paying attention to if for no other reason than its novelty.

Signaling, status, blogging, academia, and ideas

Jeff Ely’s Cheap Talk has one of those mandatory “Why I Blog” posts, but it’s unusually good and also increasingly describes my own feeling toward the genre. Jeff says:

There is a painful non-convexity in academic research. Only really good ideas are worth pursuing but it takes a lot of investment to find out whether any given idea is going to be really good. Usually you spend a lot of time doing some preliminary thinking just to prove to yourself that this idea is not good enough to turn into a full-fledged paper.

He’s right, but it’s hard to say which of the 100 preliminary ideas one might have over a couple of months “are worth pursuing.” Usually the answer is, “not very many.” So writing blog posts becomes a way of exploring those ideas without committing to attempting to write a full paper.

But to me, the other important part is that blogs often fill in my preliminary thinking, especially in subjects outside my field. I’m starting my third year of grad school in English lit at the University of Arizona and may write my dissertation about signaling and status in novels. My interest in the issue arose partially because of Robin Hanson’s relentless focus on signaling in Overcoming Bias, which got me thinking about how this subject works now.

The “big paper” I’m working on deals with academic novels like Richard Russo’s Straight Man and Francine Prose’s Blue Angel (which I’ve written about in a preliminary fashion—for Straight Man, a very preliminary fashion). Status issues are omnipresent in academia, as every academic knows, and as a result one can trace my reading of Overcoming Bias to my attention to status to my attention to theoretical and practical aspects of status in these books (there’s some other stuff going on here too, like an interest in evolutionary biology that predates reading Overcoming Bias, but I’ll leave that out for now).

Others have contributed too: I think I learned about Codes of the Underworld from an econ blog. It offers an obvious way to help interpret novels like those by Elmore Leonard, Raymond Chandler, and other crime / caper writers who deal with characters who need to convincingly signal to others that they’re available for crime but also need not to be caught by police, and so forth.

In the meantime, from what I can discern from following some journals on the novel and American lit, virtually no English professors I’ve found are using these kinds of methods. They’re mostly wrapped up in the standard forms of English criticism, literary theory, and debate. Those forms are very good, of course, but I’d like to go in other directions as well, and one way I’ve learned about alternative directions is through reading blogs. To my knowledge no one else has developed a complete theory of how signaling and status work in fiction, even though you could call novels long prose works in which characters signal their status to other characters, themselves, and the reader.

So I’m working on that. I’ve got some leads, like William Flesch’s Comeuppance: Costly Signaling, Altruistic Punishment, and Other Biological Components of Fiction and Jonathan Gottschall’s Literature, Science, and a New Humanities, but the field looks mostly open at the moment. Part of the reason I’ve been able to conceptualize the field is that I’ve started many threads through this blog and frequently read the blogs of others. If Steven Berlin Johnson is right about where good ideas come from, then I’ve been doing the right kinds of things without consciously realizing it until now. And I only realized it thanks to Jeff Ely’s Cheap Talk—it took a blog to crystallize the nascent idea about why blogging is valuable, how different fields contribute to my own major interests, and how ideas form.

How Universities Work, or: What I Wish I’d Known Freshman Year: A Guide to American University Life for the Uninitiated

Note that you can also read this essay as a .pdf.

Introduction

Fellow graduate students sometimes express shock at how little many undergraduates know about the structure and purpose of universities. It’s not astonishing to me: I didn’t understand the basic facts of academic life or the hierarchies and incentives universities present to faculty and students when I walked into Clark University at age 18. I learned most of what’s expressed here through osmosis, implication, inference, discussion with professors, and random reading over seven years.

Although most of it seems obvious now, as a freshman I was like a medieval peasant who conceived of the earth as the center of the universe; Copernicus’ heliocentric[1] revolution hadn’t reached me, and the much more accurate view of the universe discovered by later thinkers wasn’t even a glimmer to me. Consequently, I’m writing this document to explain, as clearly and concisely as I can, how universities work and how you, a freshman or sophomore, can thrive in them.

The biggest difference between a university and a high school is that universities are designed to create new knowledge, while high schools are designed to disseminate existing knowledge. That means universities give you far greater autonomy and in turn expect far more from you in terms of intellectual curiosity, personal interest, and maturity.

Universities are also supposed to help students help themselves. That is, you, the student, are or should be most responsible for your own learning.

Degrees

This section might make your eyes glaze over, but it’s important for understanding how universities work. If you’re a freshman in college, you’ve probably just received your high school diploma. Congratulations: you’re now probably working toward your B.A. (bachelor of arts) or B.S. (bachelor of science), which will probably take four years. If you earn that, you’ll have received your undergraduate degree.

From your B.A./B.S., if you wish to, you’ll be able to go on to professional degrees like law (J.D.), medicine (M.D.), or business (M.B.A.), or to further academic degrees, which usually come in the form of an M.A., or Master’s Degree. An M.A. usually takes one to two years after a B.A. After or concurrently with an M.A., one can pursue a Ph.D., or Doctor of Philosophy degree, which usually takes four to ten years after a B.A.

The M.A. and Ph.D. are known as research degrees, meaning that they are conferred for performing original research on a specific topic (remember: universities exist to create new knowledge). Professional degrees are designed to give their holder the knowledge necessary to be a professional: a lawyer, a doctor, or a business administrator.

Many if not most people who earn Ph.D.s ultimately hope to become professors, as described in the next section. The goal of someone earning a Ph.D. is essentially to become the foremost expert in a particular, narrow subject.

Professors, Adjuncts, and Graduate Students

There are two to three main groups—one could even call them species—you’ll interact with in a university: professors, adjunct professors, and graduate students.

Professors almost always have a Ph.D. Many will have written important books and articles in their field of expertise. They can be divided into two important classes: those with tenure—a word you’ll increasingly hear as you move through the university system—and those without. “Tenure,” as defined by the New Oxford American Dictionary that comes with Mac OS X 10.6, is “guaranteed permanent employment, esp. as a teacher or professor, after a probationary period.” It means that the university can’t fire the professor, who in turn has proven him or herself through the publication of those aforementioned books and papers along with a commitment to teaching. This professor will probably spend her career at the university she’s presently at.

Those without tenure but hoping to achieve it are on the “tenure track,” which means that, sometime between three and six years after they’re hired, a committee composed of their peers in the department will, along with university administrators and others, decide whether to offer tenure. Many professors on the tenure track are working feverishly on books and articles meant for publication. Without those publications, they will be denied tenure and fired from their position.

Adjuncts, sometimes called adjunct professors, usually have at least an M.A. and often have a Ph.D. They do not have tenure and are not on the “tenure track” that could lead to tenure. They usually teach more classes than tenured or tenure-track professors, and they also have less job security. Usually, but not always, adjuncts teach lower-level classes. They are not expected to do research as a condition of staying at the university.

Graduate Students (like me, as of this writing) have earned a B.A. or equivalent and are working towards either an M.A. or a Ph.D. From the time they begin, most graduate students will spend another two to eight years in school. They take a set number of small, advanced classes followed by tests and/or the writing of a dissertation, which is an article- or book-length project designed to show mastery in their field.

Many—also like me—teach or help teach classes as part of their contract with the university. In my case, I teach two classes most semesters, usually consisting of English 101, 102, or 109 for the University of Arizona. As such, I take and teach classes. In return, the university doesn’t charge me tuition and pays me a small stipend. Most graduate students who teach you ultimately want to become professors. To get a job as a professor, they need to show excellence in research—usually by writing articles and/or books—as well as in teaching.

For all three groups, much of their professional lives revolve around tenure, which brings additional job security, income, and prestige.

Two Masters

Most graduate students and non-tenured professors serve two masters: teaching and research. As an undergraduate, you primarily see their teaching side, and your instructors might seem like another version of high school teachers. For some if not most instructors, however, teaching is not their primary duty and interest; rather, they primarily want to conduct original research, which usually takes the form of writing articles (also sometimes called “papers”) and books. The papers you are assigned for many classes are supposed to help you prepare for more advanced writing and research.

Graduate students and professors feel constant tension between their teaching and their research / writing responsibilities. Good ones try to balance the two. For most graduate students and professors, however, published research leads to career advancement, better jobs, and, ultimately, tenure.

Many of your instructors will have stronger incentives to work on research than teaching. This doesn’t mean they will shirk teaching, but many do. Some teach creatively and diligently, as they should. But it’s nonetheless wise to understand the two masters most of your instructors face; they are usually rewarded much more for research than teaching.

In graduate school multiple professors told me to minimize my time spent teaching and maximize my time spent researching. This isn’t unusual advice. Grad students and non-tenured professors are often explicitly told not to waste time on teaching, since that doesn’t lead to advancement, and often imbibe a cultural atmosphere that denigrates teaching. This is important if you’re wondering why your professors seem distracted or uninterested in the classroom. Professors are often incentivized not to focus on teaching. Professional academics understand these facts well, but they’re surprisingly poorly understood by everyone else:

There is only one problem with telling students to seek out good teaching in college. They’re going to have some trouble finding it, because academic institutions usually don’t care about it. Oh, they’ll tell you otherwise, in their promotional material. But I advise you to be skeptical. The profession’s whole incentive structure is biased against teaching, and the more prestigious the school, the stronger the bias is likely to be. (Deresiewicz 180-1)

I personally think teaching is of great importance and that schools ought to reward teaching, but “what I personally think” and “what is true” are different in this situation.

Interacting with Professors, Adjuncts, and Graduate Students

To earn tenure (or to work toward a Ph.D., which is itself usually a step toward tenure), many professors and grad students spend long periods of time intensely studying a subject, most often but not exclusively through reading. They expect you to read the assigned material and to have some background in reading more generally; if you don’t, expect a difficult time in universities.

Professors and other instructors have devoted or are devoting much of their lives to their subjects. As you might imagine, having someone say that they find a subject boring, worthless, or irrelevant often irritates professors, since if professors found their subject boring, worthless, or irrelevant, they wouldn’t have spent or be planning to spend their lives studying it.

Most make their subject their lives and vice-versa. They could in theory earn more money in other professions but choose not to pursue them; they are often excited by knowledge itself and want to find others who share that excitement. If you say or imply their classes are worthless, you’ve said or implied that their entire lives are worthless. Most people do not like to think that their lives are worthless.

Professors can sometimes seem aloof or demanding. This is partially due to the demands placed on them (see “Two Masters,” above). Being aloof or demanding doesn’t mean a professor doesn’t like you. Most professors are interested in their students to the extent that students are interested in the subject being taught. Engaged professors often try to stir students’ interest in a subject, but actively hostile or uninterested students will often find their instructors uninterested in them. Motivated and interested students often inspire the same in their professors.[2] It’s a virtuous cycle.

To be sure, there are exceptions: some professors will be hostile or uninterested regardless of how much effort a student shows, and some will be martyrs who try to reach even the most distant, disgruntled student. But most professors are in the middle, looking for students who are engaged and focusing on those students.

Nearly all your instructors have passed through the trials and tests they’re giving you: if they hadn’t done so, and excelled, they wouldn’t be teaching you. Thus, few are impressed when you allocate time poorly, try to cram before tests, appear hungover in class, and show up late to or miss class repeatedly. On the other hand, many will cut slack for diligent students who show promise.

One reason professors don’t think much of student excuses is because many students have different priorities than professors. As undergraduates, most professors were part of the “academic culture” on campus, to use Murray Sperber’s term (5); in contrast, many undergraduates are part of the collegiate (interested in the Greek system, parties, and football games) or vocational (interested in job training) cultures. The academic culture, according to Sperber, “[has a] minimal understanding of, and sympathy for, the majority of their undergraduate students” (7) at big public schools.

I think Sperber is too harsh, but the principle is accurate: if you aren’t in school to learn and develop your intellect—and most students in most schools aren’t, as Sperber shows—you probably won’t understand your professors and their motivations. But they will understand yours. Academics are a disproportionately small percentage of the student population at most schools but an extraordinarily large proportion of grad students and professors.

Another book, Paying for the Party: How College Maintains Inequality, describes how many universities have evolved two or more tracks, but those tracks are mostly concealed from the students. One track is primarily academic, with hard, usually technical, majors that are highly demanding and that usually lead to developing important skills. The other track is primarily social and leaves students with fewer skills but lots of time to party. The latter track works reasonably well, or is at least not catastrophic, for students from wealthy and/or well-connected families that can get intellectually weak, low-skill students jobs upon graduation—even graduation with a dubious degree and four years of intense partying. The party/social track doesn’t work well for students with poorer or disconnected families. The more time I spend in the system the more apparent the two tracks become—and the more I wish students were explicitly told about them.

Requirements for Undergraduates

You can only graduate from a university if you pick a major and fulfill its requirements. Clark called its undergraduate requirements “Perspectives,” while the University of Arizona calls them “Gen Eds” or “General Education Requirements.” There is no way to avoid fulfilling requirements, and most requirements demand that you spend a certain amount of time with your rear end in a seat at a certain number of classes. Fulfill as many requirements as possible as soon as you realize those requirements exist, assuming you want to graduate on time.

You’ll often be assigned an “academic advisor,” whose job is to keep you on track to graduate and to help you pick courses. Don’t be afraid of this person: he or she will often help you or point you to people who can help you. At bigger schools, your advisor will often seem harried or uninterested, but even if that person is, remember that he or she is still a valuable resource. And if you can’t get help from your advisor, find the requirements of potential majors or all majors and work toward checking them off, because you won’t be able to get out of them.

As an undergrad, I found that there is virtually no negotiating with requirements, even when some are or seem silly. For example, Clark required that students take a “science perspective” course. In studying my schedule and options, I figured that astronomy was the easiest way out. Considering how useless astronomy looked, I decided to petition the Dean of Students to be excused from it so I could take better classes, arguing that I’d taken real science classes in high school and that I could be more productively engaged elsewhere. The answer came quickly: “no.”

Astronomy, as it was taught to me, consisted of tasks like memorizing the distances of the planets from the sun, what the Kuiper Belt is[3], and the like. Tests asked for things like the size of each planet—in other words, to regurgitate facts that one can find in two seconds on Google, which is how I found out what the Kuiper Belt is again. The professor teaching it no longer appeared to have a firm grasp of his mental faculties; I think he was in his 80s. At least it was relatively easy: the only worse thing would’ve been having to take, say, chemistry, or a real science class.

That astronomy class was probably the most useless class I took, and Clark’s tuition at that time was something like $22,000. I received a scholarship toward tuition, room, and board, so my effective tuition was probably closer to $16,000, or $8,000 per semester. Undergrads took four classes per semester, so the useless astronomy class cost around $2,000. Would I have rather taken another English class, or computer science class, or a myriad of other subjects? You bet. But I couldn’t, and if I didn’t take some kind of science class, I wouldn’t have been able to graduate, no matter the uselessness of the class.
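For concreteness, the back-of-the-envelope arithmetic above can be sketched in a few lines of Python (the dollar figures are the rough estimates from the paragraph, not exact tuition numbers):

```python
# Rough per-class cost, using the estimates above:
# ~$16,000/year effective tuition, two semesters per year,
# four classes per semester.
tuition_per_year = 16_000
semesters_per_year = 2
classes_per_semester = 4

cost_per_class = tuition_per_year / (semesters_per_year * classes_per_semester)
print(cost_per_class)  # 2000.0
```

Seen that way, a useless required class is a $2,000 line item, which is part of why it stings.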

What should I major in?

I have a theory that virtually everything you learn in universities (and maybe life) is the substance or application of two (or three, depending on how you wish to count) abilities: math and reading/writing. Regardless of what you major in, work on building those two skills.

In the liberal arts, that most often means philosophy, English, and history; other majors vary by university, but those requiring a lot of reading and writing are almost always better than those that don’t. In the hard sciences and economics you’ll be left to develop your reading and writing skills on your own. And this does apply to you, whether you realize it or not. As software company founder and rich guy Joel Spolsky wrote:

Even on the small scale, when you look at any programming organization, the programmers with the most power and influence are the ones who can write and speak in English clearly, convincingly, and comfortably. Also it helps to be tall, but you can’t do anything about that.

The difference between a tolerable programmer and a great programmer is not how many programming languages they know, and it’s not whether they prefer Python or Java. It’s whether they can communicate their ideas. By persuading other people, they get leverage.

So if you want leverage, learn how to write. And if liberal arts majors don’t want to be bamboozled by statistics, they better learn some math.

In short, I have no idea what you should major in. But you probably shouldn’t major in business, communication, sociology, or criminal justice, all of which are worthy subjects that, for most undergraduates, are sufficiently watered down that you’re unlikely to challenge yourself much. Odds are that you’ll even make more money as a philosophy major than a business management major (“Salary Increase by Major”).

Paul Graham wrote:

Thomas Huxley said “Try to learn something about everything and everything about something.” Most universities aim at this ideal.

But what’s everything? To me it means, all that people learn in the course of working honestly on hard problems. All such work tends to be related, in that ideas and techniques from one field can often be transplanted successfully to others. Even others that seem quite distant. For example, I write essays the same way I write software: I sit down and blow out a lame version 1 as fast as I can type, then spend several weeks rewriting it.

The reality is that your specific major probably doesn’t matter nearly as much as your tenacity, ability to learn, and the consistent application of that ability to learn to specific problems. One way people—friends, employers, graduate schools, colleagues, etc.—measure this is by measuring the way you speak and write, which together are a proxy for how much and how deeply you’ve read.

A great deal of college is about teaching you how to learn, and reading is probably the fastest way to learn. Once you’ve mastered the art of reading, you’ll be set for life, provided you keep exercising the skills you develop at a university. Keep that in mind as you search for majors: those that assign more reading, more writing, and more math are probably more worthwhile than those that don’t.

Many people have many opinions about what you should major in, and most of them are probably wrong. This one included. As I said previously, it probably doesn’t matter in the long run, so don’t worry much about what to major in—worry about finding something you’re passionate about and something you love. In Prelude to Mathematics, W.W. Sawyer wrote: “An activity engaged in purely for its consequences, without any pleasure for the activity itself, is likely to be poorly executed” (16–17). If possible, find a major you enjoy for itself, or one you can learn to enjoy for itself.

Regardless of what you major in, let me reiterate something I wrote in the introduction: you are or should be most responsible for your own learning. This is true not only in school but in your entire life. You will get some bad teachers, some bad bosses, some bad clients, and some bad situations in your life. Nonetheless it is your responsibility to keep learning, to overcome obstacles, and to help yourself.

Students often want to be spoon-fed everything, but that’s not how the world works. People generally pay other people to solve their problems. Your goal is to develop the skills it takes to solve the problems other people have, so that they pay you. Let’s look at some professions and how, in an ideal world, each profession solves a problem:

  • Cop: Solves the need for public safety.
  • Scientist: Solves the need for learning how things actually work, and, tangentially to that, how to turn ideas and facts into products.
  • Petroleum Engineer: Solves the need for energy, which people require to get from point A to B via car, plane, or train, and for electricity.
  • Teacher: Solves the need for education, and helps turn economically useless children into productive adults (Senior).
  • Social Media Analyst: Solves the need to advertise through numerous electronic platforms.

You can occasionally find situations in which it’s possible to get paid without solving someone’s problem, but they’re rare. There are also important jobs that are nonetheless illegal but can be analyzed through the same method as the bullets above (for example, prostitutes solve the need for sex, and drug dealers solve the need for different experiences). People on the cutting edge of technology and social change often solve needs for themselves—Mark Zuckerberg needed a way to communicate with others online before most people really noticed that need.

Your teachers and professors, including me, are often not that good at identifying such needs.

Finally, note that you often can’t predict what will be useful and what won’t be. It’s also possible that the people designing your curriculum know more about the subject than you do.

How do I get an A?

One thing you shouldn’t do is say that all you want to do is get an A: as stated above, most professors are completely and utterly invested in their subject. When you ask how you get an “A,” they’re likely to be annoyed because you’re indicating you don’t care about learning, which is the best way to earn an A. Instead, you care about the badge. It’s like asking how you become poet laureate, as Ebenezer Cooke does in The Sot-Weed Factor: the question itself is wrong, because the right question is how you become a poet, and the laureateship will follow (Barth 73). If you ask professors how to get an A, they’ll also tell you what you already know: work hard at the class, show up, read the book(s) and related materials, form study groups, etc.

Another grad student in English said that she’s almost relieved when students say they just want to get an A, because it means she doesn’t have to worry about them or their grade. Paradoxically, when you say that you just want an A/B/C, you lower the probability that you’ll actually get it.

To get that A/B/C, demonstrate that you’re interested in the material, do all the reading, and show up to class every day. Go to the professor’s office hours to ask intelligent questions—like whether you’re on the right track regarding a paper—or what you could’ve done better on a quiz. By doing so, you’re showing that you’re interested in doing better, rather than saying you are. Novelists have a saying: “show, don’t tell,” which means that you should show what a character is thinking and why they are acting in a certain way rather than telling the reader. Readers are smart and will figure it out for themselves. Your professors will be able to figure out in a million ways whether you’re interested in a subject, and when you ask how you get an A, they’ll know you aren’t.

Oh, and don’t fear the library—it’s the big place with the books. If you conduct research with books, your professors will be impressed. And learn to use the online journals. If you don’t know what this means, ask a librarian, who will assist you. They very seldom bite and are there to help, and most schools also conduct library help sessions at the beginning of each year. Indeed, almost everyone at a university is there to help you learn; you just need to a) want to learn and b) ask. Many students never get to point a, and of those who do, more should get to point b.

Reflection

I wrote this now because I’m old enough to, I think, have some perspective on universities while still being young enough to remember the shock and bewilderment of the first semester of my freshman year. This document reflects my academic training and preoccupation: it contains allusions and references to other work and is structured in such a way that you can skip easily from section to section. As a trade-off for its detail, however, weaker or uninterested students might lose interest in it before they come to the end, which is unfortunate because it describes the world they will largely be inhabiting for somewhere between one week and six (if not more) years.

Anecdotes from my own academic experience are included because discovering facts about the incentives in university life didn’t occur all at once for me. No one gave me a document like this; I was expected either to already know or to understand most of what you just read, and as a result, I spent years drawing a mental map of universities. The professors and graduate students had spent long enough in the university atmosphere that they knew how universities were structured with the thoroughness with which you know your native language. I’ve written this in the hope that it will better explain to you (in the plural sense) what I’ve explained to many individuals.

My natural impetus, when I find myself repeating the same things over and over again, is to consider how I might convey them to a large number of people, and then to write them down so that they might be read, since reading is a vastly more efficient information-transfer mechanism than speech. Nonetheless, I realize that this document and my explanations are probably not perfect, so if you’ve read this to the best of your ability and still have questions, don’t be afraid to ask them. One thing universities should inculcate is inquisitiveness, and I hope I do so as a teacher and as a person.

Notice that this document has a version number in the upper-right corner: as time goes on and I receive questions or comments, I’ll probably change this document to reflect new concerns. When you ask questions, you’re not only helping yourself discover something: you’re helping the person you’re asking better understand the subject at hand and the nature of what they’re trying to say. By asking me questions about this document, you might help me improve it, and ultimately help those who read it in the future. If there is one cultural advantage universities should impart more than any other, it is the ability to ask questions about even the most fundamental things; confusion and uncertainty are often the sources of new knowledge.

As Paul Krugman, who won the 2008 Nobel Prize for Economics, said of his own research (which led him to the prize):

The models I wrote down that winter and spring were incomplete, if one demanded of them that they specify exactly who produced what. And yet they told meaningful stories. It took me a long time to express clearly what I was doing, but eventually I realized that one way to deal with a difficult problem is to change the question — in particular by shifting levels.

He also has a section called “question the question,” in which he recursively asks himself whether the question he has asked is the right one. For him, as for many people, questions are at the center of the learning universe, and if you learn to ask them promiscuously and then seek the answers, whether from me, your other professors, or from books, you’ll be better equipped to find the answers, do well in college, and do well in life. One challenge is often learning enough to be able to formulate the right questions, and with this in mind, I hope you know how to ask important questions about the institution you’re attending.

As noted previously, you can also download this essay in .pdf form.

Works Cited [4]

Barth, John. The Sot-Weed Factor. New York: Anchor Books, 1987.

Graham, Paul. “Undergraduation.” Personal website. March 2005. Accessed 7 December 2008. <http://paulgraham.com/college.html>

Deresiewicz, William. Excellent Sheep: The Miseducation of the American Elite and the Way to a Meaningful Life. Free Press, 2014.

Krugman, Paul. “How I Work.” Personal website. Accessed 11 November 2008. <http://web.mit.edu/krugman/www/howiwork.html>

“Salary Increase by Major.” The Wall Street Journal. Undated. Accessed 7 December 2008. <http://online.wsj.com/public/resources/documents/info-Degrees_that_Pay_you_Back-sort.html?mod=googlenews_wsj>

Sawyer, W.W. Prelude to Mathematics. New York: Dover Publications, 1982.

Sperber, Murray. Beer and Circus: How Big-Time College Sports Is Crippling Undergraduate Education. New York: Henry Holt and Company, 2001.

Spolsky, Joel. “Advice for Computer Science College Students.” Personal website. 2 January 2005. Accessed 7 December 2008. <http://joelonsoftware.com/articles/CollegeAdvice.html>

“tenure.” The New Oxford American Dictionary. 2010. Mac OS X 10.6 Operating System.


[1] One useful study tip: if you read or hear a word you don’t know, look it up. You’ll expand your vocabulary and, concomitantly, the range of your thinking.

[2] In the hard sciences, for example, it’s often wise to ask professors if you can join their research labs, where you’ll gain valuable experience and make important connections. But most undergraduates don’t seem to realize that the first thing they have to do is ask. The second thing they need to do is show their professors that they won’t be a waste of time.

[3] A bunch of icy rocks beyond Neptune’s orbit, for those of you wondering.

[4] Writers include works cited pages so others can draw on the sources used to construct an argument. Contrary to popular belief among freshmen, they’re not just pointless hoops teachers set up, and they become progressively more important as you advance through school.

Where Good Ideas Come From – Steven Berlin Johnson’s new book

I already pre-ordered Steven Berlin Johnson’s new book, Where Good Ideas Come From: The Natural History of Innovation, but if I hadn’t, this video would have convinced me to:

Sounds like an excellent complement to Mihaly Csikszentmihalyi’s Flow: The Psychology of Optimal Experience, since both are about structuring lives and minds around ideas and their implementation. This is an obvious topic of interest to novelists and academics, since both require a) lots of ideas and b) even more implementation of those ideas.

One thing I’ll be watching for closely in the book: around minute 3:30, the video says that the Internet isn’t going to make us more distracted in a bad way—it will make us more interconnected so that hunches can combine into ideas faster. That implies Nicholas Carr’s The Shallows is mostly wrong, an argument I’m skeptical of: I suspect that we need both the quiet, contemplative space the Internet is driving out and the recombination of ideas from various sources. If one side becomes too lopsided, the creativity equation fails.

To be sure, it’s unwise to judge a book before reading it, and I want to see how the debate plays out.

Regular readers probably already know Johnson through my repeated references to his essay Tool for Thought, which is about Devonthink Pro and changed the way I work. I regularly tell my better students as well as friends to read this essay and use DTP in the way Johnson describes if they’re at all interested in ideas and writing.

So you wanna be a writer: What Anthony Bourdain can tell you even when he’s not talking about writing

There’s a great essay called “So You Wanna Be a Chef” by Anthony Bourdain, who wrote Kitchen Confidential. Based on “So You Wanna Be a Chef,” culinary schools sound rather like MFA programs. Money drives both decisions, even when artistry is supposed to:

But the minute you graduate from school—unless you have a deep-pocketed Mommy and Daddy or substantial savings—you’re already up against the wall. Two nearly unpaid years wandering Europe or New York, learning from the masters, is rarely an option. You need to make money NOW.

You could replace “cooking” with “writing” and “being a chef” with “being a writer” in Bourdain’s essay and have more or less the same outcome. Going into the “hotels and country clubs” side of the business is like getting tenure as a professor. There are a few differences between the fields—you’re never too old to be a writer—but similarities proliferate. Like this:

Male, female, gay, straight, legal, illegal, country of origin—who cares? You can either cook an omelet or you can’t. You can either cook five hundred omelets in three hours—like you said you could, and like the job requires—or you can’t. There’s no lying in the kitchen.

You can either sit (or stand) at a computer for years, producing words, or you can’t. There’s no lying at the keyboard. If you want to be a writer, the keyboard is where you’re going to spend a lot of your time (Michael Chabon on book tour in Seattle for The Yiddish Policemen’s Union: “If you want to write a novel you have to sit on your ass.” I can testify that the same is true of writing a blog). All the chatter in the world about how you prefer early Ian McEwan to late Ian McEwan isn’t going to help you produce words.

As with many disciplines, what’s important is not just being good or adequate—it’s being amazing. “There is, as well, a big difference between good work habits (which I have) and the kind of discipline required of a cook at Robuchon.” There is a big difference between good work habits and being an artist: a surprisingly large number of people can crap out a novel if given sufficient time and motivation. Milan Kundera in The Curtain:

Every novel created with real passion aspires quite naturally to a lasting aesthetic value, meaning to a value capable of surviving its author. To write without having that ambition is cynicism: a mediocre plumber may be useful to people, but a mediocre novelist who consciously produces books that are ephemeral, commonplace, conventional—thus not useful, thus burdensome, thus noxious—is contemptible.

This overstates the case: an indifferent or “mediocre” novel by a “mediocre novelist” does not tangibly hurt anyone, and its most likely fate is to be ignored—which is the most likely fate of any novelist. But the writer needs to aspire “to a lasting aesthetic value,” which means that merely existing and producing something isn’t enough. Hence my derogatory phrase: “crap out a novel.”

Instead of traveling to “Find out how other people live and eat and cook,” as Bourdain tells the chef to do, the writer must read widely and voraciously and omnivorously. If you’re writing in a genre, read the classics. If you’re a literary novelist, read some of the better genre fiction (it’s out there). Read books about writing. Read books not about writing to learn how the world works. Get out of your literary comfort zone with some frequency. You’ll need it.

Also wise: “Treating despair with drugs and alcohol is a time-honored tradition—I’d just advise you to assess honestly if it’s really as bad and as intractable a situation as you think.” Stephen King writes in On Writing about his own problems with drugs. He points out that drinking or taking drugs doesn’t make you a writer—if you’re a writer, you might drink or take drugs, but skipping straight to the drugs doesn’t do anything for you.

The bottom line: creative fields and top performers in many disciplines appear to have more in common than not. From what I’ve read, the same basic dynamic described by Bourdain applies not just to cooking and writing, but to software hacking, most kinds of research, athletics, architecture, music, and most forms of art. Don’t pursue these fields unless you want to master them. And you probably don’t. And if you do, you might be better off not realizing how difficult they are before you start, because you might never start.
