Mac OS 10.7 is out today, and I don’t care because “In the Beginning was the Command Line”

A few days ago, I was reading Neal Stephenson’s incredible essay “In the Beginning was the Command Line,” which you can download for free at the link. His work can’t really be summarized because the metaphors he develops are too potent and elaborate to flatten into a single line that describes what he does with them; by the time you finish summarizing, you might as well recreate the whole thing. Despite the folly in attempting summarization, I want to note that he’s cottoned on to the major cultural differences between Windows, Macs, and Unixes like Linux, and by the time you’re done with his essay you realize the fundamental divide in the world isn’t between right and left or religious and secular, but between contemporary “Morlocks” and “Eloi,” the former being the ones who run things and the latter being the ones who mostly consume them (you can see similar themes running through Turn On, Tune In, Veg Out and Anathem). In the meantime, visual culture has become a poorly understood but highly developed global force bathing virtually everyone in its ambiance, and that might not be such a bad thing most of the time. The last issue doesn’t have that much to do with this particular post, but if you want to understand it, and hence an aspect of the world, go read “In the Beginning was the Command Line.”

He wrote the essay in 1999, and the problem with operating systems, or “OSes” in nerd parlance, was that none of them were very good. They crashed frequently or were incredibly hard to use, especially for Eloi, or both. In the last ten years, they’ve gotten much less crashy (OS X, Windows) or much easier to use (Linux) or both, to the point where a random user who wants to write e-mails, watch YouTube videos, browse for adult material, and check Facebook status updates probably won’t notice the quirks of each operating system. Games are a major difference, since OS X and Windows have lots of modern games available and Linux doesn’t, but if you don’t care about games either—and I don’t—you’ll want to discount those.

Some of the major technical differentiators have shrunk: on OS X, you can now communicate with your machine using the Terminal; on mine, I’ve changed the color scheme to trendy green-on-black. Windows has a system called PowerShell, and Linux has various ways to hide the stuff underneath it. But the cultural differences remain. Windows machines still mostly come festooned with ugly stickers (“These horrible stickers are much like the intrusive ads popular on pre-Google search engines. They say to the customer: you are unimportant. We care about Intel and Microsoft, not you”) and a lot of crapware installed. OS X machines look like they were designed by a forward-thinking 1960s science fiction special effects person for use by the alien beings who land promising peace and prosperity but actually want to build a conduit straight into your mind and control your thoughts. Linux machines still sometimes want you to edit files in /etc to get your damn wireless network working. Given the slowness of cultural change relative to technical change, it shouldn’t be surprising that many of Stephenson’s generalizations hold up even though many technical issues have changed.

This throat clearing leads to the subject of today’s much-hyped launch of Apple’s latest operating system, which is an incremental improvement to the company’s previous operating system. I’ve been using Macs since 2004. I started with an aluminum PowerBook that you can see in this appropriately messy picture. In that time, I’ve steadily upgraded from 10.3 to 10.6, but the move from 10.5 to 10.6 didn’t bring any tangible benefits to my day-to-day activities. It did, however, mess up some of the programs I used and still use regularly, which made me more gun-shy about OS updates than I had been previously. Now 10.7 is out, and you can read the best review of it here. It’s got a bunch of minor new features, most of which I won’t use and which are overhyped by Apple’s ferocious marketing department, which most people call “the press.”

I’ve looked at those features and found nothing, or nothing compelling. Many are aimed at laptops, but I don’t use a laptop as my primary computer or have a trackpad on my iMac, and it seems like the “gestures” that are now part of OS X, while useful, aren’t all that useful. Apple is also integrating various Internet services into the operating system, but I don’t really care about them either and don’t want to pay for iCloud. I don’t see the point for the kinds of things I do, which mostly tend towards various kinds of text manipulation and some messing around with video. It’s not that I can’t afford the upgrade—Apple is only charging $30 for it. I just don’t need it and simultaneously find it annoying that Apple will only offer it through their proprietary “app store,” which means that when I need to reinstall because the hard drive dies I won’t be able to use disks to start the machine.

Still, those are all quibbles of the kind that start boring flame wars among nerds on the Internet. I’ve saved the real news for the very bottom of the page: it’s not about Apple’s OS upgrade, which, at one point, I would’ve installed on Day 1. I remember when OS X 10.4 came out, offering Spotlight, and I was blown away. Full-text search anywhere on your machine is great. It’s magical. I use it every day. Even 10.5 finally brought integrated backup software. But 10.6 had a lot of developer enhancements I don’t use directly. Now, 10.7 has improved things further, but in a way that’s just not important to me. The real news is about how mature a lot of computer technology has become. By far the most useful hardware upgrade I’ve seen in the last ten years is a solid state drive (SSD), which makes boot times minimal and applications launch quickly. Even Word and Photoshop, both notorious resource hogs, launch in seconds. New OS versions used to routinely offer faster day-to-day operation as libraries were improved, but it’s not important to move from “fast enough” to “faster.” The most useful software upgrades I’ve seen were moving from the insecure early versions of Windows XP to OS X, and the move from 10.3 to 10.4. The move to 10.7 is wildly unexciting. So much so that I’m going to skip it.

If you look at the list of features in 10.7, most sound okay (like application persistence) but aren’t essential. I rather suspect I’m going to skip a lot of software and hardware upgrades in the coming years. Why bother? The new iterations of OSes aren’t likely to enable me to do something substantial that I wasn’t able to do before, which, in my view, is what computers are supposed to do—like most of the things we make or buy. If you’re an economist, you could call this something like the individual production possibility curve. Installing Devonthink Pro expanded mine. Scrivener might have too. Mac Freedom definitely has, and I’m going to turn it on shortly after I post this essay. The latest operating system, though? Not so much. The latest software comes and goes, but the cultural differences—and discussions of what those differences mean—endure, even as they shrink over time.

EDIT: Somewhat relevant:

Anathem — Neal Stephenson

I read Anathem when it came out and tried it again recently because I’m a literary masochist. It concerns a giant graduate school/university where really smart people gather in seclusion from the rest of humanity, who are busy running around distracted by cell phones (now called “jeejahs”), futuristic TV, and religious-style demagogues. Erasmus (get it?) is in the middle of this and realizes something bad is going to happen. He’s a low-ranked “avout” who lives in one of those cloisters, which demands a kind of autarky of ideas, a bit like Vermont without Internet access. There are a lot of passages like this, taken from the beginning:

Guests from extramuros, like Artisan Flec, were allowed to come in the Day Gate and view auts from the north nave when they were not especially contagious and, by and large, behaving themselves. This had been more or less the case for the last century and a half.

It’s unfair to take this out of context and not explain what the hell is going on. But for the first quarter to half of the book, there is no context until you’ve created your own.

Confused yet? Hopefully not too much; if you pick up Anathem, you will be. The novel famously comes with a glossary, which reads like code with too many GOTOs in it. And if I make the novel sound ridiculous, I’m doing so intentionally and picking up the flavor of those lofty New Yorker reviews whose greatest tactic against the manufactured noise and lights that sometimes pass as popcorn movies is ridicule. In Stephenson’s case, the noise is highbrow and intellectual, or maybe pseudo-intellectual, but noise nonetheless, regardless of the number of philosophic references put in it.

The biggest problem with the silly vocabulary is that it a) makes the novel harder to take seriously, even in a humorous way, and b) makes it more likely that readers will abandon the novel before finishing it, and in turn badmouth it to their friends (and on their blogs, as I’m doing). I wanted to like the novel, but Neal Stephenson is beginning to feel like Melville: someone who peaked before he stopped writing novels, to the detriment of his readers, but who nonetheless still writes a lot of unconventional and interesting stuff. Stephenson’s Moby Dick is Cryptonomicon, a novel still justifiably beloved, and his earlier novels The Diamond Age and Snow Crash are both unusually strong science fiction.

By now, one gets the sense that no one restrains Stephenson’s grandest impulses: a long, well-done novel is a unique beauty, but a poorly done long novel is more likely to be abandoned than finished, and one could say the same, all the more, of a poorly done series of long novels like The Baroque Cycle, which is destined never to be collected in a single physical volume thanks to its heft.

In Further Fridays, John Barth writes of great thick books that “One is reminded that the pleasures of the one-night stand, however fashionable, are not the only pleasures. There is also the extended, committed affair; there is even the devoted, faithful, happy marriage. One recalls, among several non-minimalist Moderns, Vladimir Nabokov seconding James Joyce’s wish for ‘the ideal reader with the ideal insomnia.’ ” Neal Stephenson answers this call for heft and then some: Cryptonomicon is a marvelous book that would demand more than a single night of insomnia to read, and yet none of it seems extraneous, or at least not in a way that deserves to be cut. Even the several-page description of how one should eat Cap’n Crunch seems apt to the minds of the hackers and proto-hackers Stephenson follows. So it is again with Anathem, a novel whose demands are much greater.

Stephenson has made steadily greater demands of his readers, and I wonder if those demands were most justified for Cryptonomicon. Midway through Quicksilver I gave up, and what The Baroque Cycle demands in sheer length, Anathem demands in depth. As has often been mentioned in reviews, it has a glossary, and the dangers of that are well expressed by this graph:

(I will note, however, that one of my favorite novels of all time has not just made-up words but an entire made-up language embedded in it: The Lord of the Rings. So it’s important to note that the probability of a book being good descends but never reaches zero, at least as far as we can tell from this graph.)

One other point: as Umberto Eco said of The Name of the Rose:

But there was another reason [beyond verisimilitude to the perspective of a 14th C. monk] for including those long didactic passages. After reading the manuscript, my friends and editors suggested I abbreviate the first hundred pages, which they found very difficult and demanding. Without thinking twice, I refused, because, as I insisted, if somebody wanted to enter the abbey and live there for seven days, he had to accept the abbey’s own pace. If he could not, he would never manage to read the whole book. Therefore those first hundred pages are like penance or an initiation, and if someone does not like them, so much the worse for him. He can stay at the foot of the hill.
Entering a novel is like going on a climb in the mountains: you have to learn the rhythm of respiration, acquire the pace; otherwise you stop right away.

Or, worse, you might think you get to the mountain’s summit and then intellectually die during the descent (and yes, the link embedded in this sentence is highly relevant to the issue at hand).

In the novel, Stephenson is dealing with the potential for an increasingly bifurcated society with supernerds on one side and proles on the other. You can see the same ideas in 800 words instead of 120,000 in his essay Turn On, Tune In, Veg Out, which I assign to my freshmen every semester and which almost none of them really get.

Still, the concern that smart people are going to rule others does have a certain pedigree, and the idea of a cerebral superclass detached from the material world is hardly a new one; monks were an expression of it in a religious context for centuries if not longer. H.G. Wells thought of something not dissimilar in his idea of an “Open Conspiracy,” through which leading scientists and philosophers would form a benevolent world government. In The Making of the Atomic Bomb, Richard Rhodes describes physicist Leo Szilard’s similar conception of an improbably super-competent elite ruling the world:

[…] we could create a spiritual leadership class with inner cohesion which would renew itself on its own.

Members of this class would not be awarded wealth or personal glory. To the contrary, they would be required to take on exceptional responsibilities, “burdens” that might “demonstrate their devotion.”

Sounds great. Keep them away from me.

People who like second-hand philosophy and who need a superiority complex, or to feed one that’s developing, might like Anathem. Mastering it is perhaps as esoteric as being able to quote at will from Hegel’s The Phenomenology of Spirit, and about as fun. I’ve barely talked about the novel, the text, and the story because the story feels like a skeleton for the novel’s concerns. Again, like Melville, Stephenson seems to have forgotten about the pull of story in his later work.

Umberto Eco, in contrast, is another writer of enormous books filled with ideas, and his two best—The Name of the Rose and Foucault’s Pendulum—contrast with weaker efforts like The Island of the Day Before and The Mysterious Flame of Queen Loana, which become obsessed with how the story is told rather than the story itself. The “how” is a fine subject to address in novels, as many postmodernist novels do, but the “what” can’t be subjugated to it—otherwise one isn’t writing a novel; one is writing literary criticism. Trying to shoehorn the latter into the former isn’t going to create anything but boredom, with characters who aren’t characters but Vessels of Great Meaning. Erasmus in Anathem isn’t a person—he’s a convenient way to explore ideas. I’d like a character who explores the idea of why ideas must be integrated into characters rather than vice-versa.

Is there something wrong with story? For a novel to work, its meaning has to be at most equal to, but more likely subsumed beneath, its story and the language used to convey that story. But Anathem is too busy preening to let that happen. I’m reminded of something Philip Pullman said regarding His Dark Materials: for every page he wrote he threw five away, and he concentrated ceaselessly on moving the story along. That trilogy opens with our hero, Lyra, hiding in a closet because she’s broken a rule; from it she sees someone attempt to poison her father, a returning hero. The novel moves ever faster from there.

It’s a beginning so forceful that I’m recalling it from memory. Where does Anathem begin again? I can’t remember, and I look at the tome on my desk and consider finding out. If I were to force myself to remember, I would be doing so with all the joy of memorizing for school. His Dark Materials, in contrast, I remember with pure joy, and for its impact.

This is, to be sure, an overlong post, but it suits an overlong novel. Let this serve as a warning regarding, and a substitute for, Anathem.
