Apple finally releases new laptops:

Apple finally released new laptops, about four to six months after they should’ve. Still, the upgrades are impressive and if you’ve been on the fence or otherwise waiting, now’s the time. As usual, Apple’s hard drives are too small and their hard drive upgrade prices are usurious. People who want a Mac laptop and don’t want to pay full price should see sweet discounts on used and refurbished models in the next couple weeks.

I’m still using an iMac as a primary computer, so the announcements don’t affect me much. And my most important piece of Mac-only software is Devonthink Pro, which I still use according to a variation on this scheme, originally conceived by Steven Berlin Johnson (though he no longer uses DTP).

Otherwise, many writers swear by Scrivener. I wrote The Hook in Scrivener, and due to the structure of that novel it was extremely useful. But for most novels I don’t find it essential; I don’t think most scenes can (or should) be reshuffled at will, which probably limits its utility.

Scrivener’s appeal for nonfiction projects is much more apparent to me. A couple months ago I finished the Grant Writing Confidential book manuscript (details to follow), and if most of the book hadn’t already existed in the form of blog posts I would’ve used Scrivener as an organization tool.

In other laptop news, Dell has been producing Linux-native XPS laptops for a couple years, as does a smaller manufacturer called Purism. Given Apple’s lack of interest in non-smartphone products, it’s not a bad idea for Mac users to keep an eye on what everyone else is doing.

Briefly noted: The Intel Trinity: How Robert Noyce, Gordon Moore, and Andy Grove Built the World’s Most Important Company — Michael S. Malone

The Intel Trinity is another of these books that one wishes were better written but that remains interesting throughout nonetheless. For one thing, I bought the party line about Intel being essentially indomitable from its inception to the present. It wasn’t (“Frankly, the evidence argues that Intel may have been the most successful technology company ever founded by a dysfunctional start-up team—certainly by such a team that stayed together”). The book also injects ideas about history and legacy into a very contemporary culture:

Thanks to the amnesia of Silicon Valley and the digital world, [Robert Noyce] had almost been forgotten by each subsequent generation of techies as they elbowed their way to fame at a thousand dot-com companies, and then at Google, Facebook, and Twitter.

Over time, “Silicon Valley had finally and unexpectedly become history,” at least to its founders. Over a long enough time period anything becomes history, although Alan Kay famously said, “I also happen to believe in history. The lack of interest, the disdain for history is what makes computing not-quite-a-field. [. . . Computing is] complete pop culture. I’m not against pop culture. Developed music, for instance, needs a pop culture.”

It will take many more books and maybe many more years before it becomes less of a pop culture, if it ever does. The Intel Trinity is history, but the paper quality of the hardcover is oddly poor, as if the publishers themselves imagined the book to be non-essential and disposable. That contrast between the historical comments and the quality of the physical object itself is odd.

Still, it does trace the story of Intel from its beginnings and from the Fairchild exodus. Fairchild employees were apparently more dissolute and fun than modern tech company workers, or at least compared to the image of modern tech workers:

Just to repeat the anecdotes of the era is insufficient. There were endless after-work drunken gatherings at the nearby Wagon Wheel saloon, where women employees were hustled, marriages were broken, feuds were fed, and in time, employees were stolen by competitors.

Intel itself, however, was apparently less exciting, at least for most people.

The Intel Trinity also points to some of the reasons why Silicon Valley has been so hard to dislodge as a startup capital (much as NYC is still a publishing capital, despite outrageously high housing costs): “between 1967 and 1973, [hundreds of new companies] created not only a vast and prosperous business-technology community, but also the greatest start-up company incubator the world had ever seen” (59). At a time when most people were focused on free love and the word “start-up” as a way to connote a new company didn’t really exist, northern California was blossoming. That’s a tremendous head start that’s hard to overcome today and will be hard to overcome tomorrow, despite San Francisco’s completely insane politics.

I haven’t yet got to the stories of Noyce, Moore and Grove. The latter has the most flabbergasting story, since he barely escaped World War II-era Hungary, when Hungary first fought on the side of the Germans, then was occupied by Germany, then was occupied by the Soviets, and eventually had its fledgling democracy crushed by the Soviets. Grove was Jewish and his survival story is amazing enough to have made me order his memoir, Swimming Across, which Malone draws from.

One has to wonder why the Intel trinity is less famous than, say, Bill Gates. I think the answer is simple: there were three of them. Larry Page and Sergey Brin are less famous than Gates, though their company is at least as important, largely because I think the press needs a single person to deify and to hate. Multiple founders make the process of adulation and despair harder to conduct. Mark Zuckerberg is perhaps the most famous modern startup founder because he stands alone in the spotlight. This reading owes something to Zero to One, which is as much about culture as it is about startups. Culture is everywhere, even in technology, as Malone reminds us.

5K / “Retina” iMac and Mac OS X “Yosemite” thoughts

Note added Jan. 29: Yosemite update 10.10.2 seems to have solved the software problems noted below. The hardware continues to be excellent. The next version of OS X, El Capitan (10.11), is said to be “a return to slow-but-steady improvement after Yosemite’s upheaval.” Let’s hope so.

I’ve had a Retina iMac with Yosemite installed for close to a month, which is long enough to form a mostly positive impression—except for the bad software situation. The iMac’s “Retina” screen is as beautiful as advertised and is a tangible reminder of progress. If you use it you won’t want to go back, any more than you’d want to go back to celibacy after losing your virginity. But Yosemite crashes about once a day for reasons mysterious to me and to Apple support. Yosemite is a step backwards.

Let’s start with the headline feature: the screen difference between the preceding iMac and this one is real, but it’s not nearly as great as the difference between the low-resolution Macbook Pro and the Retina MacBook Pro. Retina MacBook Pros are not just better but insanely, can’t-go-back better than their predecessors. The Retina iMac is an improvement, a real improvement, but not as startling. If you’re in the market for an iMac-like computer and can afford the price premium, the better display is worth the cost. As noted above, you won’t want to go back. Standalone versions of the screen aren’t even available as of this writing; when they become available, they’ll likely cost $2,000 in and of themselves.

People who bought iMacs in the last two to three years should probably wait for the next iteration because it should be cheaper than this one. My last iMac came from 2010 or thereabouts and I was due for an upgrade, especially because I now do more with video than I previously had. Fast computers are to video what oil is to transportation: It’s almost impossible to have enough.

This iMac is not as quiet as my previous iMac, though it also isn’t noisy enough to make the fuss that would be necessary to return mine and try again. It has an annoying, intermittent buzz that’s noticeable in quiet rooms. I can imagine myself standing in an Apple store, most of which sound like an airplane engine is prepping for takeoff, and trying to get the support person to do what I want them to do, while the support person looks at me like I’m crazy.

In software terms the OS jumped from 10.6 to 10.10. The only day-to-day difference I notice is the addition of Messages, which lets me text to phones from my keyboard. Phone keyboards have always bothered me—which may be a sign of age—and Messages is a significant enhancement. Obviously there are numerous differences in the APIs and libraries available, which developers have leveraged or are leveraging, but they’re not as salient to me as the “headline” features. Still, I think about how every version of OS X from 10.0 to 10.6 saw major feature improvements that made my life better in an immediate, tangible way. I’m not seeing that in the 10.6 to 10.10 jump.

The major OS X features I love and constantly use were introduced years ago: Spotlight, Time Machine, Exposé. Messages could probably have been backported to 10.6 but wasn’t. Some cool new programs like Pixelmator seem to have been enabled by the backend OS work, but those programs don’t have a huge effect on my day-to-day life. Spotlight is harder to use and doesn’t seem to work as well as it did. Boolean operators also still appear to be absent.

The migration process from old iMac to new was painful and manual; Migration Assistant is notoriously unreliable and works terribly if a person, like me, is moving from two hard drives to a single drive. Permissions problems appeared, as they so often do. I could solve them with time and patience, but they might have stymied less sophisticated users. The general reluctance to upgrade any working computer is reasonable based on my experiences.

Once I had the basic migration done, I spent a lot of time turning off various “features.” Threaded mail messages in Mail.app don’t work very well for me, and the animations were annoying and slowed things down. Some pretty but non-functional animations could only be turned off through the command line. It would be nice to have a “fuck animations” checkbox in system preferences somewhere. Speed is of paramount importance.
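The command-line switches in question are `defaults write` invocations. A minimal sketch, as Python rather than a shell one-liner so the list is easy to extend; note that these preference keys are community-documented ones for damping Dock, Finder, and window animations, not an official Apple list, so verify them on your own system before trusting them:

```python
import subprocess

# Community-documented `defaults` keys for turning off OS X animations.
# Treat these as hypothetical until verified; Apple does not publish them.
ANIMATION_PREFS = [
    ("com.apple.finder", "DisableAllAnimations", "-bool", "true"),
    ("com.apple.dock", "expose-animation-duration", "-float", "0.1"),
    ("NSGlobalDomain", "NSAutomaticWindowAnimationsEnabled", "-bool", "false"),
]

def build_commands(prefs=ANIMATION_PREFS):
    """Return the `defaults write` invocations, plus the process restarts
    needed for Dock and Finder to pick up the changes, as argv lists."""
    cmds = [["defaults", "write", domain, key, kind, value]
            for domain, key, kind, value in prefs]
    cmds += [["killall", "Dock"], ["killall", "Finder"]]
    return cmds

def apply_commands(cmds):
    """Actually run the commands (only meaningful on a Mac)."""
    for cmd in cmds:
        subprocess.run(cmd, check=True)
```

Building the command list separately from running it makes it easy to print the commands first and eyeball them before letting anything touch your preferences.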

Four years ago I wrote about my uninterest in upgrading from OS X 10.6 to 10.7, and recent experiences have borne out that reluctance. Most of the Apple operating systems since 10.6 appear to be less stable than 10.6. This Hacker News thread is full of stories about problems with Yosemite. Right now Yosemite is only on its .1 release, so the .2 release may solve some of the problems, but the stability issue should have been solved in beta. In addition to the WindowServer crashes linked to above, Flash crashes routinely in the browser. Final Cut Pro X is unusable because it crashes when attempting to import any new video. This will necessitate yet another call to Apple support. Finder has crashed a couple times for mysterious reasons—despite the fact that the hardware in this machine is by every conceivable metric faster, better, and more tolerant than the hardware in the preceding machine.

What gives? I’m not the only one having these problems. In addition to the Hacker News thread, consider “AppleCore Rot,” another piece on Apple’s apparent quality problems. This in particular resonates: “Stuff that worked for years breaks, while new visual crapware is piled on endlessly.” I would like Apple to make it work first and make it pretty second.

In an email my Dad observed this about Yosemite and the linked Hacker News thread:

The first post [in the Hacker News thread] complains about the phone calling feature. About 22 years ago, I routinely made calls from the IBM PC 1 using Lotus Organizer 1.0 to dial from the address book with Windows 3.1 and the world’s slowest modem. But then again, Organizer was the best PIM [Personal Information Manager] ever and in some respects AmiPro and WordPro were better 20 years ago than Word is today.

I used to use Lotus WordPro too, and it was much stabler than Word today (although Word has been relatively stable in this version and the last version). Some software was better a decade ago than it is today.

That being said, many of the random waits involved in exporting photos from Lightroom or, to a lesser extent, videos from Final Cut Pro X have either disappeared altogether in the case of the former or shrunk dramatically in the case of the latter. That’s on the few occasions FCP X has actually worked, however. Some simple scripts run faster. Manipulating EXIF data with ExifTool is faster. Web browsers no longer choke and sputter (that they did on a relatively modern machine could be the subject of another cranky post). Fast user switching is now genuinely fast. The advance of hardware, and what it enables, is not to be underestimated.

Still, there are other caveats. The larger hard drive is good, but Apple should really be offering four terabyte Fusion drives. I’m puzzled as to why it doesn’t, apart from simple cheapness. Large hard drives are one of the (dwindling number of) reasons people still use desktops.

My previous iMac had a 256 GB SSD and a 2 TB conventional drive; this one has 128 GB and 3 TB, respectively, which is an anemic improvement in capacity terms (though the SSD is much faster and the interconnect between the SSD and CPU is also much faster). Still, 4 TB hard drives are widely available and there’s no good reason why Apple shouldn’t offer bigger hard drives. There are other nice-to-haves: USB 3 ports. Thunderbolt ports. External hard drive speeds are much faster. I’ve gone from having about 2 TB readily available to having 9 TB readily available, between a Drobo and the internal drives.

Aesthetics are a wash. From the front, the new iMac is visually indistinguishable from the old, and photos don’t convey the screen’s beauty. The profile shots show a much slenderer machine. Also notable in the pictures is the Elevation Stand, which began as a Kickstarter project that I backed. It’s by far the best iMac stand I’ve found, and I’ve tried various solutions ranging from wood blocks to books to an expensive stand festooned with screws sold through the Apple Store. The Elevation Stand is expensive but it works. The space between the top and bottom of the stand will also fit external hard drives.

My desk is somewhat ugly and I’m perpetually telling myself I’ll clean it off and make it very pure and functional seeming, but for whatever reason it rarely actually happens and as soon as it does I end up re-covering it with books and cameras and cords and other crap. This is obviously not part of the iMac review.

The iMac itself is still beautiful. If you use a computer too much, as I do, you’ll want one.


Here is Farhad Manjoo’s paean to the iMac. Most tech reviewers are ecstatic, perhaps disproportionately to how ecstatic they should be, but they also know computers well and know that the screen is amazing given how much 4K and 5K screens cost. I have a 23″ side monitor and no desire or need to upgrade it.

Here is a Reddit thread that hits many of the same points I’ve made about OS stability. Here is Geoff Wozniak on why he quit OS X. You’ll recognize the themes I hit.

How to update Letterbox for Mac OS X after the latest 10.6.8 security patch:

When Apple released the most recent 10.6.8 security patch, that patch broke Letterbox (see also here), an insanely useful Mail.app plugin that allows all three Mail.app panels to be viewed vertically. This view maximizes screen real estate, which is very important for those of us on widescreen displays—which is to say, virtually all Mac users. But this 2010 OS X Daily post describes how to work around the last breakage caused by an Apple update. These are their instructions, except for the addition of two new UUIDs that I found for the latest version of 10.6.8:

* From the Finder, hit Command+Shift+G and enter ~/Library/Mail/ then hit Go
* Open Bundles (Disabled) rather than Bundles – note: if you have already opened Mail, the plugin has been moved to Bundles (Disabled); if you haven’t opened Mail yet, it will still be in Bundles
* Right-click on Letterbox.mailbundle and select “Show Package Contents”
* Now open the “Contents” folder inside the Letterbox.mailbundle contents
* Using a text editor, open Info.plist (you can use TextEdit, don’t use Word)
* Scroll to the bottom of the Info.plist file and look for “SupportedPluginCompatibilityUUIDs”, which is surrounded by key tags; below that will be a bunch of hex strings surrounded by string tags
* Add the following two strings to the bottom of the list (inside the array tags):

<string>064442B6-53C0-4A97-B71B-2F111AE4195B</string>
<string>588FF7D1-4310-4175-9980-145B7E975C02</string>

That’s the important part. The rest is fairly simple:

* Save these changes to the Info.plist file
* Go back to the Mac OS X desktop and hit Command+Shift+G again, then enter ~/Library/Mail/
* You’ll see those two folders again: Bundles and Bundles (Disabled). Move the Letterbox.mailbundle plugin from the (Disabled) folder to the Bundles folder by dragging the file from one folder window to the other
* Relaunch Mail.app

You can also navigate to the folder ~/Library/Mail/Bundles on your own, without using the “Go” command.
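For anyone who would rather not hand-edit XML in TextEdit, the plist-editing steps above can be sketched in a few lines of Python using the standard library’s plistlib. This is my own sketch, not part of the original instructions: the function name and the commented-out path are mine, and you still need to move the bundle back into Bundles and relaunch Mail afterward.

```python
import plistlib
from pathlib import Path

# The two UUIDs from the latest version of 10.6.8 (see the list above).
NEW_UUIDS = [
    "064442B6-53C0-4A97-B71B-2F111AE4195B",
    "588FF7D1-4310-4175-9980-145B7E975C02",
]

def add_compatibility_uuids(info_plist: Path, uuids=NEW_UUIDS) -> int:
    """Append any missing UUIDs to SupportedPluginCompatibilityUUIDs.

    Returns how many UUIDs were actually added, so running it twice
    is harmless: the second run adds zero."""
    with open(info_plist, "rb") as f:
        plist = plistlib.load(f)
    supported = plist.setdefault("SupportedPluginCompatibilityUUIDs", [])
    added = 0
    for uuid in uuids:
        if uuid not in supported:
            supported.append(uuid)
            added += 1
    with open(info_plist, "wb") as f:
        plistlib.dump(plist, f)
    return added

# Hypothetical invocation; adjust the path to wherever your copy lives:
# add_compatibility_uuids(Path.home() /
#     "Library/Mail/Bundles (Disabled)/Letterbox.mailbundle/Contents/Info.plist")
```

Because the function only appends UUIDs that aren’t already present, it won’t duplicate entries if Apple’s next update forces you to run it again with a new UUID added to the list.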

A lot of people—especially the nerds likely to use Letterbox—have probably already moved to 10.7 or 10.8, though I still haven’t and am unlikely to in the foreseeable future.

The GeekDesk / writing space 2012 post:

Since my 2010 writing space post, quite a bit has changed. Here’s the new setup, viewed from a couple of angles, with an explanation below the photos:

Those of you who looked carefully, or even not very carefully, probably noticed something unusual: the desk is at two different heights. That’s because I’ve been using a GeekDesk for long enough to form an opinion on it, which is that I’d be reluctant to go back to a regular desk or a purely standing desk. I’ll write a longer review when I have more time, but the preceding sentence tells you most of what you need to know.

The other salient upgrade is from a 24″ iMac to a 27″ iMac with an SSD and a conventional hard drive. This machine inspired me to write “Mac OS 10.7 is out today, and I don’t care because ‘In the Beginning was the Command Line,’” because computers have now, finally, become “fast enough” and “good enough” for my purposes. It’s taken a long time! I keep meaning to get a better stand than a pair of books, but that’s the sort of project that’s very easy to delay, indefinitely, until tomorrow.

The keyboard remains the same, and it’s hard to see what could make me replace the Kinesis Advantage. Its keys still feel new. The speakers aren’t very interesting, although they are external and thus better than the built-in ones, but they’re probably wasted on me because I don’t listen to music all that often, and they’re overkill for movies or TV shows. The external monitor is a 23″ Dell with an IPS display, although I can’t remember the model number and don’t feel like looking it up. It’s a fine panel, but not very interesting. The lights on the back of the iMac are cheap Antec Halo LED lights, which are supposed to reduce eyestrain in dark rooms. Not sure if they actually do. I suspect turning the iMac down from “blinding” to “tolerable” would have as strong an effect.

You can see a Canon s100, which usually rides in my pocket. Sony now makes a better version of the s100—the RX100—but the RX100 is also $300 more. In a couple of shots there’s also a boring iPhone. If I weren’t on a family plan, I’d probably get a cheap Android phone, because I use maybe 5% of its features. Unless you’re doing a LOT of sexting, I don’t think I see the point in getting a more expensive “smart” phone over a less expensive one.

There’s also an Aeron, which is better for me than their recent Embody. Reasons for why I say that will follow when I have more time.

Desktop PCs aren’t going anywhere, despite the growth of phones and tablets, because they’re cheap

Articles like “As PCs Wane, Companies Look to Tablets” are both true and bogus. PCs aren’t going anywhere because they’re cheap. You can buy them reasonably close to cost. If you want the least expensive means of computing possible, you can’t beat PCs now and won’t be able to for years, at the very earliest. Sure, “making them has not been a great business for most American companies for almost a decade,” but that’s because consumers are deriving so much surplus from PCs. PCs are close to commodities, which is great for buyers, if not sellers.

The industry, the reporters who cover the industry, bloggers, and other people with a stake in the action want you to believe “TABLETS TABLETS TABLETS ARE COOL!!!!” because they want you to buy relatively high-margin tablets (and they need something to write about). Current tablets are high-margin because they combine commodity hardware with OS lock-in. The industry wants to move closer to Apple’s model, since Apple gets away with what it does because a) it has great design and b) for a long time, and maybe up to the present, OS X was more fun and in some respects better designed than Windows. Lock-in and high margins? What’s not to love from a business perspective?

It’s not very much fun for journalists and bloggers who drive these stories about PCs to write, “Area man continues to derive immense intellectual, social, and efficiency value from the PC he bought five years ago and which continues to meet his needs adequately.” I wouldn’t read that story or post either. The tech press needs to find hype and trends. Tablets and cell phones are of course genuinely big deals and their impact will continue to reverberate—but just because one sector is waxing doesn’t mean another is automatically waning. Especially when that sector offers a lot of value for the money.

So: every time you see a call for tablet computing, regardless of its source, you should remember that somewhere behind it, there’s a manufacturer who wants to sell you more stuff at higher prices. Paul Graham calls such beasts “the Submarine,” and if you want to understand how you’re being marketed to, you should read that essay. The PC manufacturer can’t really sell you more stuff in PC laptops and desktops these days because they’re too inexpensive and interchangeable. Apple can sell you design and an unusual operating system.

Maybe Lenovo can charge above-average prices because of the ThinkPad’s reputation for durability, but that’s it. Everyone else is scrambling because consumers dominate producers when it comes to PCs. So we get stories like the one above; and, if, as Tyler Cowen speculates in this example, the U.S. economic model moves closer to Japan and capital depreciates, expect to see even more calls for tablets and so forth. Anything to avoid acknowledging that an existing stock of capital is Good Enough.

And you can expect to see misleading headlines like the one above. It’s frustrating to read stuff like this:

Computer makers are expected to ship only about 4 percent more PCs this year than last year, according to IDC, a research firm. Tablets, in contrast, are flying off store shelves. Global sales are expected to more than double this year to 24.1 million, according to Forrester Research.

How is an increase in both the absolute number and the percentage of PCs sold an indication of waning? I think that means computer makers will ship over a hundred million units, compared to a quarter as many tablets. I checked out Dell’s website, and one can buy a very nice Inspiron desktop with a dual-core AMD processor, 3 GB of RAM, and a 1 TB hard drive for about $400. Get a cheapie 20″ monitor, and you’ve got a very competent machine that’ll run Windows passably well for under $600. Get a sweet 24″ IPS monitor as good or better than the one in my 2007 24″ iMac for another $500, and you’re still under $1,000. That’s why desktops aren’t going anywhere, and all this blah blah blah about tablets is important but also overrated by tech sites chasing the new shiny, sites that also think everyone has, if not an unlimited budget, then at least a very substantial one for technical toys. Given my work, it’s probably not surprising that I have a higher-than-average budget for technical toys and tools, since I use my computer every day and often for very long stretches, but for people who aren’t writers, hackers, day traders, pornographers, and the like, having an expensive computer and a tablet and a phone is, if not overkill, then at least overpriced.

Some people get this—here’s a Time story as an example—but too many don’t, especially in the press, which follows the tech industry like a marketing arm instead of an independent evaluator.

One more point: PCs are still better for some tasks. Maybe not for browsing Facebook and YouTube, but anything that requires a keyboard isn’t just better on a computer—it’s way better. Maybe students are going to write papers on iPads or iPad-like devices, but I’m skeptical, and even if one has a couple of substantial text-writing efforts a year, it’s going to be tempting to keep a keyboard around. I could be crazy; people are apparently writing novels on cell phones in Japan and now other countries, but producing a novel on a phone doesn’t sound appetizing from the perspective of either the writer, who can’t really get in the zone over the course of a hundred words, or the reader, who has to endure writing from someone who doesn’t appear to, say, go back and edit their novel as a coherent whole. Most people don’t seem to much like 19th Century novels that were published serially, and “lack of editing” and “lack of brevity” might be two reasons. The first will probably haunt cell phone novelists.

Then again, looking at the bestseller lists, maybe there isn’t much to go but down.

PCs and other form factors are going to coexist. Coexistence is a less sexy story than death, but it’s truer. In one Hacker News comment thread “jeffreymcmanus” observed, “People don’t stop buying the old stuff just because there’s new stuff. See also: horses, bicycles, cars.” Well, people have mostly stopped buying horses, because cars offer superior functionality in virtually all circumstances, but the point remains. Another commenter, “mcantelon,” said:

Yeah, which is why the “post-PC” terminology has a propaganda tone. It’s not going to be “post-PC”: more like “pop computing” or “computing lite”.

He’s right. Which is okay: I have nothing against tablets or cell phones. Use whatever works. Just don’t pretend PCs are going away or automatically declining.

EDIT 2015: As of this edit I’m using a 27″ Retina iMac. The hardware is incredible. The best is still yet to come.


See also this post on whether you should buy a laptop or desktop and this related post on the reliability of each form factor.

Desktop PCs aren’t going anywhere, despite the growth of phones and tablets—because they’re cheap

I’m tired of articles like “As PCs Wane, Companies Look to Tablets” You know why PCs aren’t going anywhere? Because they’re cheap. You can buy them reasonably close to cost. If you want the least expensive means of computing possible, you can’t beat PCs now and won’t be able to for years, at the very earliest. Sure, “making them has not been a great business for most American companies for almost a decade,” but that’s because consumers are deriving so much surplus from PCs. They’re not close to commodities. Which is great for buyers, if not sellers.

The industry, the reporters who cover the industry, bloggers, and other people with a stake in the action want you to believe “TABLETS TABLETS TABLETS ARE COOL!!!!” because they want you to buy relatively high-margin tablets. Those tablets are high-margin because they combine commodity hardware with OS lock-in. The industry wants to move closer to Apple’s model, since Apple gets away with what it does because a) it has great design and b) for a long time, and maybe up to the present, OS X was more fun and in some respects better designed than Windows. Lock-in and high margins? What’s not to love from a business perspective?

It’s also not very much fun for journalists and bloggers who drive these stories about PCs to write stories that say, “Area man continues to derive immense intellectual, social, and efficiency value from the PC he bought five years ago and which continues to meet his needs adequately.” I wouldn’t read that story or post either. The larger tech press needs to find something to hype. In this case, of course, tablets and cell phones are genuinely big deals and their impact will continue to reverberate—but just because one sector is waxing doesn’t mean another is automatically waning. Especially when that sector offers a lot of value for the money.

So: every time you see a call for tablet computing, regardless of its source, you should remember that somewhere behind it, there’s a manufacturer who wants to sell you more stuff at higher prices. Paul Graham calls such beasts “the Submarine,” and if you want to understand how you’re being marketed to, you should read that essay. The PC manufacturer can’t really sell you more stuff in PC laptops and desktops these days because they’re too inexpensive and interchangeable. Apple can sell you design and an unusual operating system. Maybe Lenovo can charge above-average prices because of the Thninkpad’s reputation for durability, but that’s it. Everyone else is scrambling because consumers dominate producers when it comes to PCs. So we get stories like the one above; and, if, as Tyler Cowen speculates in this example, the U.S. economic model moves closer to Japan and capital depreciates, expect to see even more calls for tablets and so forth. Anything to avoid acknowledging that an existing stock of capital is Good Enough.

And you can expect to see misleading headlines like the one above. It’s frustrating to read stuff like this:

Computer makers are expected to ship only about 4 percent more PCs this year than last year, according to IDC, a research firm. Tablets, in contrast, are flying off store shelves. Global sales are expected to more than double this year to 24.1 million, according to Forrester Research.

How does an increase in the absolute number and the percentage of PCs sold an indicating of waning? I think that means computer makers will ship over a hundred million units, compared to a quarter as many tablets. I checked out Dell’s website, and one can buy a very nice Inspiron desktop with a dual-core AMD processor, 3 GB of RAM, and a 1 TB hard drive for about $400. Get a cheapie 20″ monitor, and you’ve got a very competent machine that’ll run Windows passable well for under $600. Get a sweet 24″ IPS monitor as good or better than the one in my 2007 24″ iMac for another $500, and you’re still under $1,000. That’s why desktops aren’t going anywhere and all this blah blah blah about tablets is important but also overrated by tech sites chasing the new shiny but who also think that everyone has, if not an unlimited budget, then at least a very substantial one for technical toys. Given my work, it’s probably not surprising that I have a higher-than-average budget for technical toys and tools, since I use my computer every day and often for very long stretches, but for people who aren’t writers, hackers, day traders, pornographers, and the like, having an expensive computer and a tablet and a phone is, if not overkill, then at least overpriced.

Some people get this—here’s a Time story that’s as an example—but too many don’t, especially in the press, which follows the tech industry like a marketing arm instead of an independent evaluator.

One more point: PCs are still better for some tasks. Maybe not for browsing Facebook and YouTube, but anything that requires a keyboard isn’t just better on a computer—it’s way better. Maybe students are going to write papers on iPads or iPad-like devices, but I’m skeptical, and even if one has a couple of substantial text-writing efforts a year, it’s going to be tempting to keep a keyboard around. I could be crazy; people are apparently writing novels on cell phones in Japan and now other countries, but producing a novel on a phone doesn’t sound appetizing from the perspective of either the writer, who can’t really get in the zone over the course of a hundred words, or the reader, who has to endure writing from someone who doesn’t appear to, say, go back and edit their novel as a coherent whole. Most people don’t seem to much like 19th Century novels that were published serially, and I think “lack of editing” and “lack of brevity” might be two reasons, and the first will probably come back to haunt cell phone novelists.

Then again, looking at the bestseller lists, maybe there’s nowhere to go but down.

PCs and other form factors are going to coexist. Again, it’s not as sexy a story, but it’s a truer one. In one Hacker News comment thread “jeffreymcmanus” observed, “People don’t stop buying the old stuff just because there’s new stuff. See also: horses, bicycles, cars.” Well, people have mostly stopped buying horses, because cars offer superior functionality in virtually all circumstances, but the point remains. Another commenter, “mcantelon,” said:

Yeah, which is why the “post-PC” terminology has a propaganda tone. It’s not going to be “post-PC”: more like “pop computing” or “computing lite”.

He’s right. Which is okay: I have nothing against tablets, cell phones, and so forth. Use whatever works. Just don’t pretend PCs are going away or automatically declining.


See also this post on whether you should buy a laptop or desktop and this related post on the reliability of each form factor.

Steve Jobs passes and the Internet speaks

I’ve never felt sad at the death of a famous person or someone I didn’t know. The recent news, however, does make me sad—probably because it seems like Steve Jobs’s personality infused everything Apple made. Maybe that’s just Apple’s marketing magic working on me, but if so, I’m still impressed, and I’m still not sure how to analyze a feeling of sadness about a person I never met, or how to go beyond what others have said about the loss of someone whose life’s work is so insanely great.

Like so many people writing about Jobs, I feel compelled to mention the hardware on which I’m doing it: a 27″ iMac with an impressively fast SSD and an incredibly small footprint given the monitor’s size. Since getting an aluminum PowerBook in 2004, each subsequent Mac has been more amazing than the one preceding it—especially because, each time, I didn’t think it was possible to be more amazed than I already was. There’s an iPhone sitting nearby, and in the near future that might become an iPhone 4S. So few devices feel so right, and I think people respond to Apple because it understands the link between technological function and feelings as few others do or few others can.

I look around to see what else I use and think about whether I know anything about the people behind those things: certainly not the Honda Civic I drive. Not the tea kettle I use to boil water. Not the Dell secondary monitor, whose badge could be stripped and another appended with absolutely no one noticing. I know a little about Jeff Weber, who designed the Embody with Bill Stumpf, but that’s mostly because of wonky interest on my part. Try as I might, I can’t think of anyone else remotely like Jobs in achievement, fame, importance, and ubiquity. That person might be out there, but I don’t know who he is, and his work is anonymous in a way Jobs’s has never been. Jobs made stuff with character in a world where so much stuff utterly lacks it.

Take the Apple logo off the iMac, and you’ll still have a machine that makes one stop and take account. And those improvements! Jobs offers lessons to the ambitious: Good is never good enough; you can always go further; done is never done enough; and, even if those things aren’t true, time will make them true. I wouldn’t be surprised if, 200 years from now, Jobs is still taken to be one of the pillars of his age, known to some extent by non-specialists, like Edison or Ford.

The Internet is saying a lot about Jobs. People are linking to the text version of his 2005 Stanford graduation speech. The Atlantic is explaining Why We Mourn Steve Jobs. Here’s someone almost as obscure as me writing Steve Jobs, 1955 – 2011: “Today marked the end of an era. Each of the quotes below is linked to a eulogy or collection of reflections on the passing of Steve Jobs.” Stephen Wolfram of Mathematica and A New Kind of Science fame remembers Jobs and Jobs’s encouragement too. There are probably more tributes and commentaries than anyone could read, even if they had the inclination. Let me add to the pile, and to the pile of people saying they feel a strange connection to the man, however ridiculous that feeling might be. It’s ridiculous, but it’s real, like that connection between person and tool, user and computer. The connection is real in part because Jobs helped make it real.


EDIT: See also Megan McArdle on the Jobs speech:

The problem is, the people who give these sorts of speeches are the outliers: the folks who have made a name for themselves in some very challenging, competitive, and high-status field. No one ever brings in the regional sales manager for a medical supplies firm to say, “Yeah, I didn’t get to be CEO. But I wake up happy most mornings, my kids are great, and my golf game gets better every year.”

In addition, I usually hate watching videos on the Internet because most are overrated, but Colbert on Jobs is not. Also available in the funny / genuine vein: “Last American Who Knew What The Fuck He Was Doing Dies,” courtesy of the Onion.

Here’s the tech columnist Walt Mossberg on Jobs.

Mac OS 10.7 is out today, and I don’t care because “In the Beginning was the Command Line”

A few days ago, I was reading Neal Stephenson’s incredible essay “In the Beginning was the Command Line,” which you can download for free at the link. His work can’t really be summarized because the metaphors he develops are too potent and elaborate to flatten into a single line that describes what he does with them; by the time you finish summarizing, you might as well recreate the whole thing. Despite the folly in attempting summarization, I want to note that he’s cottoned on to the major cultural differences between Windows, Macs, and Unixes like Linux, and by the time you’re done with his essay you realize the fundamental divide in the world isn’t between right and left or religious and secular, but between contemporary “Morlocks” and “Eloi,” the former being the ones who run things and the latter being the ones who mostly consume them (you can see similar themes running through Turn On, Tune In, Veg Out and Anathem). In the meantime, visual culture has become a poorly understood but highly developed global force bathing virtually everyone in its ambiance, and that might not be such a bad thing most of the time. The last issue doesn’t have that much to do with this particular post, but if you want to understand it, and hence an aspect of the world, go read “In the Beginning was the Command Line.”

He wrote the essay in 1999, and the problem with operating systems, or “OSes” in nerd parlance, was that none of them were very good. They crashed frequently or were incredibly hard to use, especially for Eloi, or both. In the last ten years, they’ve gotten much less crashy (OS X, Windows) or much easier to use (Linux) or both, to the point where a random user who wants to write e-mails, look at YouTube videos, browse for adult material, and read Facebook status updates probably won’t notice the quirks of each operating system. Games are a major difference, since OS X and Windows have lots of modern games available and Linux doesn’t, but if you don’t care about games either—and I don’t—you’ll want to discount those.

Some of the major technical differentiators have shrunk: on OS X, you can now communicate with your machine using the Terminal; on mine, I’ve changed the color scheme to trendy green-on-black. Windows has a system called PowerShell, and Linux has various ways to hide the stuff underneath it. But the cultural differences remain. Windows machines still mostly come festooned with ugly stickers (“These horrible stickers are much like the intrusive ads popular on pre-Google search engines. They say to the customer: you are unimportant. We care about Intel and Microsoft, not you”) and a lot of crap-ware installed. OS X machines look like they were designed by a forward-thinking 1960s science fiction special effects person for use by the alien beings who land promising peace and prosperity but actually want to build a conduit straight into your mind and control your thoughts. Linux machines still sometimes want you to edit files in /src to get your damn wireless network working. Given the slowness of cultural change relative to technical change, it shouldn’t be surprising that many of Stephenson’s generalizations hold up even though many technical issues have changed.
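For readers who have never opened the Terminal, here’s a small taste of what “communicating with your machine” looks like; these are standard Unix utilities that behave the same way on OS X and Linux (the exact output, of course, varies by machine):

```shell
# Ask the machine what it is: prints the kernel name,
# e.g. "Darwin" on OS X or "Linux" on a Linux box.
uname -s

# Show which shell program is interpreting your commands.
echo "$SHELL"

# Count the entries in your home directory by piping
# one small program's output into another.
ls ~ | wc -l
```

That last pipe is the heart of the command-line culture Stephenson describes: small tools composed on the fly, rather than features picked off a menu.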

This throat clearing leads to the subject of today’s much-hyped launch of Apple’s latest operating system, which is an incremental improvement on the company’s previous operating system. I’ve been using Macs since 2004. I started with an aluminum PowerBook that you can see in this appropriately messy picture. In that time, I’ve steadily upgraded from 10.3 to 10.6, but the move from 10.5 to 10.6 didn’t bring any tangible benefits to my day-to-day activities. It did, however, mess up some of the programs I used and still use regularly, which made me more gun-shy about OS updates than I had been previously. Now 10.7 is out, and you can read the best review of it here. It’s got a bunch of minor new features, most of which I won’t use and which are overhyped by Apple’s ferocious marketing department, which most people call “the press.”

I’ve looked at those features and found nothing, or nothing compelling. Many are aimed at laptops, but I don’t use a laptop as my primary computer or have a trackpad on my iMac, and it seems like the “gestures” that are now part of OS X, while useful, aren’t all that useful. Apple is also integrating various Internet services into the operating system, but I don’t really care about them either and don’t want to pay for iCloud. I don’t see the point for the kinds of things I do, which mostly tend towards various kinds of text manipulation and some messing around with video. It’s not that I can’t afford the upgrade—Apple is only charging $30 for it. I just don’t need it and simultaneously find it annoying that Apple will only offer it through their proprietary “app store,” which means that when I need to reinstall because the hard drive dies I won’t be able to use disks to start the machine.

Still, those are all quibbles of the kind that start boring flame wars among nerds on the Internet. I’ve saved the real news for the very bottom of the page: it’s not about Apple’s OS upgrade, which, at one point, I would’ve installed on Day 1. I remember when OS X 10.4 came out, offering Spotlight, and I was blown away. Full-text search anywhere on your machine is great. It’s magical. I use it every day. Even 10.5 finally had integrated backup software. But 10.6 had a lot of developer enhancements I don’t use directly. Now, 10.7 has improved things further, but in a way that’s just not important to me. The real news is about how mature a lot of computer technology has become. By far the most useful hardware upgrade I’ve seen in the last ten years is a solid state drive (SSD), which makes boot times minimal and applications launch quickly. Even Word and Photoshop, both notorious resource hogs, launch in seconds. New OS versions used to routinely offer faster day-to-day operation as libraries were improved, but it’s not important to move from “fast enough” to “faster.” The most useful software upgrades I’ve seen were moving from the insecure early versions of Windows XP to OS X, and the move from 10.3 to 10.4. The move to 10.7 is wildly unexciting. So much so that I’m going to skip it.

If you look at the list of features in 10.7, most sound okay (like application persistence) but aren’t essential. I rather suspect I’m going to skip a lot of software and hardware upgrades in the coming years. Why bother? The new iterations of OSes aren’t likely to enable me to do something substantial that I wasn’t able to do before, which, in my view, is what computers are supposed to do—like most of the things we make or buy. If you’re an economist, you could call this something like the individual production possibility curve. Installing Devonthink Pro expanded mine. Scrivener might have too. Mac Freedom definitely has, and I’m going to turn it on shortly after I post this essay. The latest operating system, though? Not so much. The latest software comes and goes, but the cultural differences—and discussions of what those differences mean—endure, even as they shrink over time.

EDIT: Somewhat relevant:

