We lack perspective: notes from Alain de Botton’s The Pleasures and Sorrows of Work

Yet our world of abundance, with seas of wine and alps of bread, has hardly turned out to be the ebullient place dreamt of by our ancestors in the famine-stricken years of the Middle Ages. The brightest minds spend their working lives simplifying or accelerating functions of unreasonable banality. Engineers write theses on the velocities of scanning machines and consultants devote their careers to implementing minor economies in the movements of shelf-stackers and forklift operators. The alcohol-inspired fights that break out in market towns on Saturday evenings are predictable symptoms of fury at our incarceration. They are a reminder of the price we pay for our daily submission at the altars of prudence and order – and of the rage that silently accumulates beneath a uniquely law-abiding and compliant surface.

1) A lot of engineers like their jobs and look at them as solving a series of puzzles: “theses on the velocities of scanning machines” are only as banal as you make them. In addition, even if you do find them banal, if you can make a faster scanning machine and sell it for a lot of money, you may not care once you retire to paint watercolors for the rest of your life.

2) Fights say more about the dumb fighters than about the human condition.

3) Humans might simply never be, as a group, overtly happy in whatever conditions we experience; realizing this might release us from unreasonable expectations. A cultural fixation on happiness might paradoxically prevent us from experiencing what we think or imagine we most want or desire.

4) Related to #3, people who leave work to drink on the weekends are probably intentionally looking for fights: I doubt the behavior can be blamed solely on alcohol. Many people seem to undergo a two-step process: they consciously drink so they can unconsciously act out in the ways they’d actually like to. My question is simple: why not go straight to step two and be intellectually honest with ourselves?

5) Stumbling on Happiness discusses how and why we feel unhappy when we compare ourselves to others. Most of us don’t compare ourselves to people in the “Middle Ages;” we compare ourselves to our wives’ sisters’ husbands, to paraphrase that famous aphorism (switch gender roles as appropriate to you, the reader, and your gender / sexual orientation).

6) We submit “at the altars of prudence and order” because the alternative is often worse. That being said, I think Western society underestimates the power and importance of trance, ecstasy, transcendence, atë—all things that, denied and repressed, seem to manifest themselves in unusual ways (see The Secret History for more on this). Still, if the alternative to prudence and order is chaos, no iPhone, longer commutes, and living in a dicey part of town, prudence and order sound pretty good—as does self-imposed “incarceration.”

7) The Pleasures and Sorrows of Work is, like much of de Botton’s work, nicely balanced between readability and intellectual engagement, reasoned and learned without being pedantic. These are harder notes to strike than may be obvious at first.

Thoughts on the first 100 pages of Jeffrey Eugenides' The Marriage Plot

1) I would have stopped reading The Marriage Plot if it weren’t also related to some of my academic work. It captures the feel of slogging through a 19th Century novel. As you might imagine, this isn’t a compliment.

2) Until about 100 pages in, no characters have real problems. They have fake, rich-college-student problems. I’m not opposed to such problems for the people experiencing them—I remember having similar ones and thinking they were significant at the time, too—but the real problem in the form of Leonard’s psychotic breakdown should arrive closer to page 40 or 50. Madeleine’s minor undergraduate affairs are much less interesting and hilarious than Karen Owen’s “An education beyond the classroom: excelling in the realm of horizontal academics” (which is a PowerPoint document). Owen’s work feels more honest.

3) If you want a better but less hyped novel about the undergraduate experience in an Ivy League setting, try Tom Perrotta’s Joe College. Notice that you can also get the hardback for $4, shipped, from Amazon. Notice too how Danny in that novel has real problems: he’s a fish out of water, his father’s business might be falling apart, and his actions have real consequences for him and those around him. He has to master a skill (being a lunch-truck driver) and understand that skill. Failure may result in his ejection from Edenic Yale. So far no one in The Marriage Plot has a real job; they’re like characters in Jane Austen. There may be consequences coming in the latter sections, but based on the dust jacket (a trip to India to find one’s self, a possible stint in grad school), I’m not optimistic.

4) Eugenides’ earlier novels both have major conflicts and problems from the beginning: Middlesex asks how to survive and adapt as a transsexual (a group that still faces major problems in contemporary society, compared to average heterosexuals) and how to flee dictator-encumbered countries, while The Virgin Suicides (probably my favorite of Eugenides’ work) asks what really happened to the Lisbon sisters—and, because of the very clever narrative structure, we can never really find out. It’s teasing yet effective, melancholy and happy, a meditation on how we understand the past, deal with love, grow up, don’t grow up, and much more. That last bit sounds grandiose and stupid, but in the context of the novel it’s not.

5) Given the timeline in the section I’ve read so far—late 1970s, early 1980s—I keep thinking about the most consequential thing happening in the world at that time: the personal computer revolution in Silicon Valley. Jobs, Wozniak, Gates, and millions of other, less famous names were building the future. This is an insanely unfair criticism of a novel, but it’s stuck in my mind anyway, like a background process that occasionally pops an alert into my consciousness: some people are doing real things. I dismiss the alert, but it’s set to go off occasionally anyway, and I don’t have the heart to sudo kill -9 it.

EDIT: I was reading Hacker News this morning and found this:

The offices of Zelnick Media were packed on a recent evening for #DigitalWes, an alumni gathering for the graduates of Wesleyan University who had made their way from jam bands and cultural theory to the warp-speed world of Silicon Alley. Guests nibbled shrimp and steak skewers while taking in a sumptuous view of midtown Manhattan from the roof deck. The hosts were Strauss Zelnick and his partner, Jim Freidlich, both class of ’79, whose Take Two Interactive has produced some of the best-selling and most controversial video games of the past decade.

Same demographic, same timeline, note the mention of “cultural theory.”

6) Reading The Game has spoiled me on excessive beta-male behavior. Watching Mitchell around the beautiful and distant Madeleine mostly makes me want to tell him what he’s doing wrong. The Game was published in 2005, so saying this about a novel set before The Game’s publication isn’t fair, but the book still crystallized for me a) what not to do, b) how to eliminate certain kinds of obviously unsuccessful mating behavior, and c) how to think systematically about useful principles for men dealing with women. Being a whiny hanger-on to a person with relatively high dating market value is not good for Mitchell or for Madeleine, the object of his desire. Note that this is not limited to men: I also have low tolerance for women who spend long periods of time throwing themselves at distant alpha males who at best hook up with and then dump them. Don’t want to be hooked up with and dumped? Don’t chase alpha males whose primary attraction appears to be their unattainability. I don’t love novels whose characters’ primary problems could be solved with a simple, one-line piece of advice.

7) Nineteenth-century novels are not good guides to behavior in the 21st century. Hell, they’re not even good guides to behavior at Brown in the 1979–1983 period. This is as true for Madeleine as for others. Literary theory is also a pretty crappy guide to real life, which may be part of the reason theory’s hold on English departments has loosened in the last 30 years. Still, perhaps the most hilarious and best scene involves Madeleine throwing Roland Barthes’ A Lover’s Discourse (which alleges that there is no such thing as love, only the speaking of love) at the boy she loves.

8) I can follow the inside-baseball parts of literary theory (Barthes, Derrida, and other English-department heroes appear, mostly as signals of what various characters believe), but I doubt such things would be of great interest to anyone not in English departments. This relates to #5: it turns out that the really important stuff happening in this time period is happening among tech people, not among grad students in the humanities. A novel about someone who jumps from the one to the other might be interesting, and it could dramatize events with real consequences that don’t automatically revolve around sex and death. Intellectual curiosity is an underutilized motivation in fiction.

9) Another book to read if you want campus-war stuff: Richard Russo’s Straight Man, which is also much funnier.

EDIT: 10) See my full review here.

Process, outcomes, and random discoveries

I was listening to a Fresh Air interview with Brad Pitt, the guy who plays Billy Beane in the Moneyball movie, and Pitt said something very interesting: Billy Beane realized that baseball is mostly about “process” and maximizing your odds. A single pitch or a single at-bat is basically random; a terrible player could homer, a great one strike out. But if you have faith in the process and fidelity to it, you’ll maximize your chance of success over time. Notice those words: “maximize your chance of success.” You won’t automatically succeed in whatever the endeavor might be, but we live in a chaotic, random world where no one is guaranteed anything.
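To make the point concrete, here’s a tiny simulation. The batting averages and season length below are made-up numbers, chosen only to illustrate the idea: any single at-bat looks like a coin flip, but the better process wins out over enough repetitions.

```python
import random

random.seed(0)

def season_hits(hit_prob, at_bats=500):
    """Count hits across a season of independent at-bats (each one a near coin flip)."""
    return sum(random.random() < hit_prob for _ in range(at_bats))

# Hypothetical players: a .300 hitter ("good process") and a .230 hitter.
good, mediocre = 0.300, 0.230

# Any single at-bat is basically random: the weaker hitter sometimes connects
# while the stronger hitter whiffs.
print("one at-bat:", "hit" if random.random() < mediocre else "out",
      "vs.", "hit" if random.random() < good else "out")

# Over many seasons, though, the better process almost always comes out ahead.
seasons = 10_000
wins = sum(season_hits(good) > season_hits(mediocre) for _ in range(seasons))
print(f"the .300 hitter out-hits the .230 hitter in {wins / seasons:.0%} of seasons")
```

The single at-bat is noise; the repetition is where the process shows up.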

So I heard this interview about a week ago. Since then, I’ve seen a bunch of similar stuff, which keeps reappearing as though, if I were a person who wasn’t convinced things are random, the world were trying to tell me something. Here’s a description of Steve Jobs: “What was important to Jobs was not making money per se, but the process of creation.” That word, “process,” appears again: get the process right and the money will follow. When a Playboy interviewer asks Justin Timberlake “Why [. . .] some celebrities crack and fade and others, like you, just keep on keeping on? Have you figured that out?,” Timberlake says he doesn’t know but will speculate, and he goes on to say:

I think it’s about process. If you care about the process of what you’re doing, you can care about the actual work. You’ll stick around. The other thing is, you always need to be learning something new. In whatever I’ve done, I’ve always looked at myself as a beginner. Hopefully I can continue to do that for the next 30 years as I grow into an older man.

He’s trying to do with music what Billy Beane is trying to do with baseball and what Steve Jobs was trying to do with consumer technology. Or what Alain de Botton describes in The Pleasures and Sorrows of Work, in which the author sees a worker in a Belgian biscuit factory whose “manner drew attention away from what he was doing in favour of how he was doing it.” If you attend to how you do something, the outcome will tend to improve more than it would from fixating on the outcome itself. It seems like a lot of experts, a lot of people who can do good work year after year, are really focused on refining their process. This might map to “experimental” and “conceptual” artists, to use Galenson’s terminology in Old Masters and Young Geniuses: The Two Life Cycles of Artistic Creativity. As I read more about what makes artists, scientists, and others succeed, I increasingly realize that a focus on process is essential, if not the essential thing.

And it’s something I’m noticing over and over again, in a variety of contexts. When I started grad school, I began going to the University of Arizona’s Ballroom Dance Club. This is hilarious: if you asked a girl who had the misfortune of going with me to high school dances about what I’d be like a couple years later, I doubt any would’ve guessed, “Dancing.” Fewer still would’ve guessed, “At least being a competent dancer.” To aspire to “good” or “masterful” is probably unwise, but “competent” is well within my reach—and within almost anyone’s reach, really, if you have the desire. And ballroom club is all about the fundamentals too: here’s how you should move. Here’s how you isolate a single part of your body. The overall look, feel, and flow of any dance is composed of individual motions and a dancer’s control over those individual motions, which eventually come to appear to be a single, fluid motion. But it isn’t. It’s the result of the dancer breaking down each individual part and practicing it until it becomes part of him.

One time, a guy who’d been dancing for about a decade had us spend about half an hour of an hour-long class on spins. Skilled dancers can perform nearly perfect 360-degree spins every time. I can’t. I usually end up ten to fifty degrees off. I can’t get my body, shoes, and motion harmonized sufficiently to ensure that I can perform perfect spins. But I keep working on it, in the hopes of improving this seemingly simple but actually complex activity. I’m doing in dancing what Billy Beane is doing in baseball, Justin Timberlake is doing in music, Steve Jobs was doing in technology, and you should probably be doing in your own field or fields.

And if your practice isn’t as good as it should be this time, focus on improving your process so you’ll be better next time. As you, the reader, might imagine, the same principle applies to other things. Like classes. Since I now teach and take them, I have a lot of experience with students who want to fight about grades. I don’t budge, but every semester students want to fight, either during the semester or at the end. I try to convey to them that grades are imperfect and that classes are really about learning; concentrate on learning and the achievement, whether in grades or another form, will eventually follow.

Most of them don’t believe me. This is unfortunate, since most students also don’t know that, as Paul Graham writes, there are really Two Kinds of Judgment:

Sometimes judging you correctly is the end goal. But there’s a second much more common type of judgement where it isn’t. We tend to regard all judgements of us as the first type. We’d probably be happier if we realized which are and which aren’t.

The first type of judgement, the type where judging you is the end goal, include court cases, grades in classes, and most competitions. Such judgements can of course be mistaken, but because the goal is to judge you correctly, there’s usually some kind of appeals process. If you feel you’ve been misjudged, you can protest that you’ve been treated unfairly.

Nearly all the judgements made on children are of this type, so we get into the habit early in life of thinking that all judgements are.

But in fact there is a second much larger class of judgements where judging you is only a means to something else. These include college admissions, hiring and investment decisions, and of course the judgements made in dating. This kind of judgement is not really about you.

To be fair, I am trying to judge them correctly. But the second class of judgments bleeds into grading: the grade is a means of trying to get students to become better writers. When they want to fight about grades, they haven’t fully internalized that I’m trying to get them into a process-oriented mode despite the school setting. The grades are outcomes and a necessary evil—and, besides, some students are simply more skilled than others.

But if students have fidelity to the process—to becoming, in my classes, better writers, or in other classes, better at whatever the class is attempting to impart—they’re going to maximize the probability of long-term success. And I wonder if students internalize the outcome-oriented mode of school—”My worth depends on my grades”—and then find themselves shocked when they’re plunged into the process-oriented real world, where no one grades you, success or failure can’t be measured via GPA, and even people who do everything “right” may still fail for reasons outside their control.

This is probably doubly painful because students are used to type-one judgments, not type-two, and instructors don’t do much to disabuse them. We don’t do enough to encourage resilience, and maybe we should, or at least more than we do now.

By the way, I’m not just climbing the mountain and shouting at the unwashed masses below. I tell myself the same thing about writing fiction (or blog posts): I’ve probably gotten dozens of requests from agents for partial or full manuscripts. None have panned out; some still have pieces of the latest novel. But I tell myself that a) I’m going to write a better novel next time and b) if I maintain fidelity to the craft of writing itself, I will eventually succeed. Alternately, I might simply start self-publishing, but that’s an issue for another post. The point here is about writing—and about what I’m doing right now.

I keep writing this blog not because it brings me fame and fortune—alas, it doesn’t—but because I like to write, I think through writing, and some of the writing on this blog is or will be useful to others. And I like to think this blog makes me a better writer not only of blog posts but also in other contexts. I’m focused on the process of improvement more than the outcome of conventional publication. Which isn’t to say I don’t want that outcome—I do—but I understand that the outcome is, paradoxically, a result of attention to something other than the outcome.

Desktop PCs aren’t going anywhere, despite the growth of phones and tablets, because they’re cheap

Articles like “As PCs Wane, Companies Look to Tablets” are both true and bogus. PCs aren’t going anywhere because they’re cheap. You can buy them reasonably close to cost. If you want the least expensive means of computing possible, you can’t beat PCs now and won’t be able to for years, at the very earliest. Sure, “making them has not been a great business for most American companies for almost a decade,” but that’s because consumers are deriving so much surplus from PCs. PCs are close to commodities, which is great for buyers, if not sellers.

The industry, the reporters who cover the industry, bloggers, and other people with a stake in the action want you to believe “TABLETS TABLETS TABLETS ARE COOL!!!!” because they want you to buy relatively high-margin tablets (and they need something to write about). Current tablets are high-margin because they combine commodity hardware with OS lock-in. The industry wants to move closer to Apple’s model, since Apple gets away with what it does because a) it has great design and b) for a long time, and maybe up to the present, OS X was more fun and in some respects better designed than Windows. Lock-in and high margins? What’s not to love from a business perspective?

It’s not very much fun for journalists and bloggers who drive these stories about PCs to write, “Area man continues to derive immense intellectual, social, and efficiency value from the PC he bought five years ago and which continues to meet his needs adequately.” I wouldn’t read that story or post either. The tech press needs to find hype and trends. Tablets and cell phones are of course genuinely big deals and their impact will continue to reverberate—but just because one sector is waxing doesn’t mean another is automatically waning. Especially when that sector offers a lot of value for the money.

So: every time you see a call for tablet computing, regardless of its source, you should remember that somewhere behind it, there’s a manufacturer who wants to sell you more stuff at higher prices. Paul Graham calls such beasts “the Submarine,” and if you want to understand how you’re being marketed to, you should read that essay. The PC manufacturer can’t really sell you more stuff in PC laptops and desktops these days because they’re too inexpensive and interchangeable. Apple can sell you design and an unusual operating system.

Maybe Lenovo can charge above-average prices because of the ThinkPad’s reputation for durability, but that’s it. Everyone else is scrambling because consumers dominate producers when it comes to PCs. So we get stories like the one above; and if, as Tyler Cowen speculates in this example, the U.S. economic model moves closer to Japan’s and capital depreciates, expect to see even more calls for tablets and so forth. Anything to avoid acknowledging that an existing stock of capital is Good Enough.

And you can expect to see misleading headlines like the one above. It’s frustrating to read stuff like this:

Computer makers are expected to ship only about 4 percent more PCs this year than last year, according to IDC, a research firm. Tablets, in contrast, are flying off store shelves. Global sales are expected to more than double this year to 24.1 million, according to Forrester Research.

How is an increase in the absolute number and percentage of PCs sold an indication of waning? I think that means computer makers will ship over a hundred million units, compared to a quarter as many tablets. I checked out Dell’s website, and one can buy a very nice Inspiron desktop with a dual-core AMD processor, 3 GB of RAM, and a 1 TB hard drive for about $400. Get a cheapie 20″ monitor, and you’ve got a very competent machine that’ll run Windows passably well for under $600. Get a sweet 24″ IPS monitor as good as or better than the one in my 2007 24″ iMac for another $500, and you’re still under $1,000. That’s why desktops aren’t going anywhere, and all this blah blah blah about tablets is important but also overrated by tech sites chasing the new shiny, sites that assume everyone has, if not an unlimited budget, then at least a very substantial one for technical toys. Given my work, it’s probably not surprising that I have a higher-than-average budget for technical toys and tools, since I use my computer every day and often for very long stretches, but for people who aren’t writers, hackers, day traders, pornographers, and the like, having an expensive computer and a tablet and a phone is, if not overkill, then at least overpriced.
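To make the arithmetic explicit, here’s a quick tally using the rough prices above (the desktop and IPS monitor figures come from the post; I never quoted a price for the cheap monitor, so the sketch just shows what’s left under the cap):

```python
# Rough tally of the two desktop builds described above (approximate prices
# from Dell's site at the time; the cheap monitor's price is whatever fits).
desktop = 400          # Inspiron: dual-core AMD, 3 GB RAM, 1 TB hard drive
ips_monitor = 500      # 24-inch IPS panel

budget_cap = 600
print(f"room left for a cheap 20-inch monitor: ${budget_cap - desktop}")   # $200

print(f"desktop plus IPS monitor: ${desktop + ips_monitor}")               # $900, under $1,000
```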

Some people get this—here’s a Time story as an example—but too many don’t, especially in the press, which follows the tech industry like a marketing arm instead of an independent evaluator.

One more point: PCs are still better for some tasks. Maybe not for browsing Facebook and YouTube, but anything that requires a keyboard isn’t just better on a computer—it’s way better. Maybe students are going to write papers on iPads or iPad-like devices, but I’m skeptical, and even if one has a couple of substantial text-writing efforts a year, it’s going to be tempting to keep a keyboard around. I could be crazy; people are apparently writing novels on cell phones in Japan and now other countries, but producing a novel on a phone doesn’t sound appetizing from the perspective of either the writer, who can’t really get in the zone over the course of a hundred words, or the reader, who has to endure writing from someone who doesn’t appear to, say, go back and edit their novel as a coherent whole. Most people don’t seem to much like 19th Century novels that were published serially, and “lack of editing” and “lack of brevity” might be two reasons. The first will probably haunt cell phone novelists.

Then again, looking at the bestseller lists, maybe there’s nowhere to go but down.

PCs and other form factors are going to coexist. Coexistence is a less sexy story than death, but it’s truer. In one Hacker News comment thread “jeffreymcmanus” observed, “People don’t stop buying the old stuff just because there’s new stuff. See also: horses, bicycles, cars.” Well, people have mostly stopped buying horses, because cars offer superior functionality in virtually all circumstances, but the point remains. Another commenter, “mcantelon,” said:

Yeah, which is why the “post-PC” terminology has a propaganda tone. It’s not going to be “post-PC”: more like “pop computing” or “computing lite”.

He’s right. Which is okay: I have nothing against tablets or cell phones. Use whatever works. Just don’t pretend PCs are going away or automatically declining.

EDIT 2015: As of this edit I’m using a 27″ Retina iMac. The hardware is incredible. The best is yet to come.


See also this post on whether you should buy a laptop or desktop and this related post on the reliability of each form factor.


Steve Jobs passes and the Internet speaks

I’ve never felt sad at the death of a famous person or someone I didn’t know. The recent news, however, does make me sad—probably because it seems like Steve Jobs’s personality infused everything Apple made. Maybe that’s just Apple’s marketing magic working on me, but if so, I’m still impressed, and I’m still not sure how to analyze a feeling of sadness about a person I never met, or how to go beyond what others have said about the loss of someone whose life’s work is so insanely great.

Like so many people writing about Jobs, I feel compelled to mention the hardware on which I’m writing: a 27″ iMac with an impressively fast SSD and an incredibly small footprint given the monitor’s size. Since getting an aluminum PowerBook in 2004, each subsequent Mac has been more amazing than the one preceding it—especially because I didn’t think it was possible to be more amazed than I was by the one preceding it. There’s an iPhone sitting nearby, and in the near future that might become an iPhone 4S. So few devices feel so right, and I think people respond to Apple because it understands the link between technological function and feelings as few others do or can.

I look around to see what else I use and think about whether I know anything about the people behind those things: certainly not the Honda Civic I drive. Not the tea kettle I use to boil water. Not the Dell secondary monitor, whose badge could be stripped and another appended with absolutely no one noticing. I know a little about Jeff Weber, who designed the Embody chair with Bill Stumpf, but that’s mostly because of wonky interest on my part. Try as I might, I can’t think of anyone else remotely like Jobs in achievement, fame, importance, and ubiquity. That person might be out there, but I don’t know who he is; his work is anonymous in a way Jobs’s has never been. Jobs made stuff with character in a world where so much stuff utterly lacks it.

Take the Apple logo off the iMac, and you’ll still have a machine that makes one stop and take notice. And those improvements! Jobs offers lessons to the ambitious: Good is never good enough; you can always go further; done is never done enough; and, even if those things aren’t true, time will make them true. I wouldn’t be surprised if, 200 years from now, Jobs is still taken to be one of the pillars of his age, known to some extent by non-specialists, like Edison or Ford.

The Internet is saying a lot about Jobs. People are linking to the text version of his 2005 Stanford graduation speech. The Atlantic is explaining Why We Mourn Steve Jobs. Here’s someone almost as obscure as me writing Steve Jobs, 1955 – 2011: “Today marked the end of an era. Each of the quotes below is linked to a eulogy or collection of reflections on the passing of Steve Jobs.” Stephen Wolfram of Mathematica and A New Kind of Science fame remembers Jobs and Jobs’s encouragement too. There are probably more tributes and commentaries than anyone could read, even if they had the inclination. Let me add to the pile, and to the pile of people saying they feel a strange connection to the man, however ridiculous that feeling might be. It’s ridiculous, but it’s real, like that connection between person and tool, user and computer. The connection is real in part because Jobs helped make it real.


EDIT: See also Megan McArdle on the Jobs speech:

The problem is, the people who give these sorts of speeches are the outliers: the folks who have made a name for themselves in some very challenging, competitive, and high-status field. No one ever brings in the regional sales manager for a medical supplies firm to say, “Yeah, I didn’t get to be CEO. But I wake up happy most mornings, my kids are great, and my golf game gets better every year.”

In addition, I usually hate watching videos on the Internet because most are overrated, but Colbert on Jobs is not. Also available in the funny / genuine vein: “Last American Who Knew What The Fuck He Was Doing Dies,” courtesy of the Onion.

Here’s the tech columnist Walt Mossberg on Jobs.

Student choice, employment skills, and grade inflation

Edward Tenner’s Atlantic post asks, “Should We Blame the Colleges for High Unemployment?” and mostly doesn’t answer the question, instead focusing on employer hiring behavior. But I’m interested in the title question and would note that the original story says, “Fundamentally, students aren’t learning [in college] what they need to compete for the jobs that do exist.”

That may be true. But colleges and universities, whatever their rhetoric, aren’t bastions of pure idealistic knowledge; they’re also businesses, and they respond to customer demand. In other words, student demand. Students choose their own major, and it isn’t exactly news that engineers, computer scientists, mathematicians, and the like tend to make much more money than other majors, or that people in those disciplines are much more likely to find jobs. Students, however, by and large don’t choose them: they choose business, communications (“comm” for the university set), and sociology—all majors that, in most forms in most places, aren’t terribly demanding. I’ve yet to hear an electrical engineering major say that comm was just too hard, so she switched to engineering instead. As Richard Arum and Josipa Roksa show in Academically Adrift: Limited Learning on College Campuses, those majors aren’t, on average, very hard either, and they don’t impart much improvement in verbal or math skills. So what gives?

The easiest answer seems like the most right one: students aren’t going to universities primarily to get job skills. They’re going for other reasons: signaling; credentialing; a four-year party; to have fun; choose your reason here. And universities, eager for tuition dollars, will cater to those students—and to students who demand intellectual rigor. The former get business degrees and comm, while the latter get the harder parts of the humanities (like philosophy), the social sciences (like econ), or the hard sciences. It’s much easier to bash universities, with the implication of elaborately educated dons letting their product be watered down or fail, than it is to realize that universities are reacting to incentives, just as it’s much easier to bash weak politicians than it is to acknowledge that politicians give voters what they want—and voters want more services and lower taxes, without wanting to pay for them. Then people paying attention to universities or politics notice, write articles and posts pointing out the contradiction, but fail to ask why the contradiction exists.

You may also notice that most people don’t appear to choose schools based on academics. They choose schools based on proximity, or because their sports teams are popular. Indeed, another Atlantic blogger points out that “Teenagers [. . .] are apt to assemble lists of favored colleges through highly non-scientific methods involving innuendo, the results of televised football games, and what their friend’s older brother’s girlfriend said that one time at the mall.” Murray Sperber especially emphasizes sports in his book Beer and Circus: How Big-Time Sports Is Crippling Undergraduate Education.

By the way, this does bother me at least somewhat, and I’d like to imagine that universities are going to nobly hold the line against grade and credential inflation, against the desires of the people attending them. But I can also recognize the gap between my ideal world and the real world. I’m especially cognizant of the issue because student demand for English literature degrees has fallen for decades, as Louis Menand shows in The Marketplace of Ideas:

In 1970–71, English departments awarded 64,342 bachelor’s degrees; that represented 7.6 percent of all bachelor’s degrees, including those awarded in non-liberal arts fields, such as business. The only liberal arts category that awarded more degrees than English was history and social science, a category that combines several disciplines. Thirty years later, in 2000–01, the number of bachelor’s degrees awarded in all fields was 50 percent higher than in 1970–71, but the number of degrees in English was down both in absolute numbers—from 64,342 to 51,419—and as a percentage of all bachelor’s degrees, from 7.6 percent to around 4 percent.

Damn. Students, for whatever reason, don’t want English degrees as much as they once did. As a person engaged in English Literature grad school, this might make me unhappy, and I might argue for the importance of English lit. Still, I can’t deny that more people apparently want business degrees than English degrees, even if Academically Adrift demonstrates that humanities degrees actually impart critical thinking and other kinds of skills. I could blame “colleges” for this, as Tenner does; or I could acknowledge that colleges are reflecting demand, and the real issue isn’t with colleges—it’s with the students themselves.
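As a sanity check, the “around 4 percent” figure follows directly from the other numbers in the Menand quote:

```python
# Back-of-the-envelope check using only the figures quoted from Menand.
english_1971 = 64_342            # English BAs awarded in 1970-71
english_share_1971 = 0.076       # 7.6% of all bachelor's degrees that year

total_1971 = english_1971 / english_share_1971    # ~847,000 degrees in all fields
total_2001 = total_1971 * 1.5                     # total was 50% higher in 2000-01

english_2001 = 51_419
print(f"implied 2000-01 English share: {english_2001 / total_2001:.1%}")   # ~4.0%
```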

U and I: A True Story — Nicholson Baker

There’s something weirdly winsome about U and I, but it’s definitely an acquired taste; much as you wouldn’t recommend a friend who’d never eaten fish start off by trying raw eel, I wouldn’t recommend a friend read U and I unless I already knew they were a) quirky, b) at least moderately well-read, and c) interested in the process of writing. U and I is like—I keep resorting to similes because, really, I don’t know what else to do—the best written, longest blog post you’ve ever read.

It’s a meditation on memory that shouldn’t be taken too seriously (sample: Nabokov “detailed his three-by-five method of fictional composition so comprehensively that Gore Vidal said in some essay that he was sick of hearing about it”). And Baker has a sense of the absurd, which I find absurd and love; he gets academia too well: “I count myself fortunate in being able to extract all the pretend-scholarly pleasure I want out of my method without urging it on anyone else.” Actual scholars appear to get real pleasure by inflicting their method on others. “Urging” is too light a word for the things I’ve seen. Baker is very polite to use “urging,” and he’s polite in general, for all his opinions.

If he’s retained that politeness, House of Holes ought to be a rather unusual book given its reputedly pornographic and hallucinatory premise (a copy is sitting on my table, waiting for me to get around to it, while I slog through The Condition of Postmodernity—which is a definitive infliction of an academic system and the kind of book that ought to be paired with Grand Pursuit: The Story of Economic Genius so that one will at least come out the other side knowing where many problems in the first book lie). Sorry for the preceding paragraph. U and I makes me more digressive than usual. If you read it, you’ll understand. It’s an acquired taste, as I said, one that sometimes needs a bit more sugar and olive oil, but one I rather liked, though I can’t recommend it except to book obsessives, writers, bibliophiles, or the people foolish enough to want to understand them. Which probably covers a fair number of readers of this blog, but still. The warning is part of being polite.

He must not be an academic at heart: academics love to apply their theories to others, with as much intellectual violence as necessary to make them stick. It turns out that Baker hadn’t read all of Updike’s books, and, with many of them, he’d only read parts, which he doesn’t remember entirely. In fact, the book isn’t really about Updike all that much at all: it’s more about artistic neuroses, learning how to write, and playing with that fickle memory beast. For example: “Once you decide on a profession, you riffle back through your past to find early random indications of a leaning toward your chosen interest and you nurture them into a false prominence: so it was naturally very important to me, as a writer on the make, to have this sixth-grade vocabularistic memory in its complete form.” Baker wants us to know that we create a narrative of success and set up retrospective waypoints that make success seem foreordained, when it probably isn’t. Even the less successful among us might think so: I remember my parents being astonished that I was going to major in English. They told me they expected business or econ. Now I better understand why. But in the long run, I’m not sure it matters. There were other possibilities one could’ve guessed based on my past. But I picked one and rolled with it. This is an example of me trying not to do intellectual violence to an idea: instead of saying, “Everyone works this way!” I posit some possibilities and move on from there.

Baker mentions “early Updike, whose boy-heroes are sometimes more sensitive and queasier-stomached than one wants them to be.” But he doesn’t go on to explain. He doesn’t really explain anything. He leaves the explaining to the reader; you get what he’s doing, or you don’t. In this respect, he’s the least academic of all: instead of wanting to elaborate us to death, he wants to let us be. I know what he means about boy-heroes; sometimes you want a giant animal to attack Rabbit and see what he’s made of, or for aliens to invade in Couples, offering Piet an opportunity to do something more than carpentry and cuckoldry. Not that there’s anything wrong with those things, precisely, but, well, you hope for a bit more at times. Baker also gives good sentence, though his sentences are often long ones, like this:

[. . .] many of the novels that I’ve liked lately (The Beautiful Room is Empty, The Swimming-Pool Library, A Single Man) have been so directly premised on gaiety: you feel their creators’ exultation at having so much that wasn’t sayable finally available for analysis, and you feel that the sudden unrestrained scope given to the truth-telling urge in the Eastern homosphere has lent energy and accuracy to these artists’ nonsexual observations as well [. . .]

Notice the ellipses on either end. Notice too Baker’s use of the funny word “homosphere” with the funnier adjective “Eastern.” Is there a “Western” homosphere? If so, how is it different? More tans, fewer references to ascots? And what is an ascot anyway? I’ve never known it save for the butt of a joke, and the word “butt” should be funny here in the context of the “homosphere.” Kind of, anyway. Like I was saying—Baker does go on. But that’s the pleasure in him. With him. Through him. Whatever. Still, this is enough quote for now.

No, actually, I change my mind. Writing about Updike’s book Of the Farm, Baker says that “A photographer would not so directly use his professional equipment in the metaphors he applied to his immediate surroundings—he would use it sometimes, but not in the first paragraph of the story he told. Film and f-stops are huge real presences to him, and can’t be so easily manipulated as tokens of comparison.” Not necessarily. Consider all the writers who use book and writing metaphors; I think our profession does get into our minds deeply enough that we might reach to professions for our first metaphors. Paul Graham’s writing is full of metaphors involving software and computers. That’s part of what makes it so rich.

It bugs me when I read books about doctors or lawyers or hookers or whatever and find characters who don’t think about the world in terms of their profession. I mean, a hooker probably doesn’t need to see every interaction as like something with a john, and a lawyer doesn’t need to view every interaction as adversarial or use terms like “estoppel” on every page, but once in a while, you know, it’d be nice. It’d work. I haven’t read Of the Farm, however, so I can’t comment on it. The problem with being a reader is that you’ll never have enough time to read everything you should. So you rely on memory, that uncertain beast, more than you should, and you end up being either a scholarly pedant or a scatterbrained essayist. A false binary, but roll with it. On average, the latter seem funnier, and, in my own view, when in doubt, go funny.

For all U and I’s weirdness, I’m keeping the book instead of giving it away or reselling it. Maybe in a couple years it’ll say something new to me. I only worry that, instead of seeing it as weird, I’ll see it as normal.