“Going Nowhere Really Fast, or How Computers Only Come in Two Speeds” is half-right. Here’s the part that’s right:
[…] it remains obvious that computers come in just two speeds: slow and fast. A slow computer is one which cannot keep up with the operator’s actions in real time, and forces the hapless human to wait. A fast computer is one which can, and does not.
Today’s personal computers (with a few possible exceptions) are only available in the “slow” speed grade.
So far so good: I wish I didn’t have to wait as long as I do for Word to open and save documents, or for OS X to become responsive after a reboot. But then there’s the reason offered for why computers feel subjectively slower in many respects than they once did:
The GUI of my 4MHz Symbolics 3620 lisp machine is more responsive on average than that of my 3GHz office PC. The former boots (into a graphical everything-visible-and-modifiable programming environment, the most expressive ever created) faster than the latter boots into its syrupy imponade hell.
This implies that the world is filled with “bloat.” But such an argument reminds me of Joel Spolsky’s Bloatware and the 80/20 myth. He says:
A lot of software developers are seduced by the old “80/20” rule. It seems to make a lot of sense: 80% of the people use 20% of the features. So you convince yourself that you only need to implement 20% of the features, and you can still sell 80% as many copies.
Unfortunately, it’s never the same 20%. Everybody uses a different set of features.
Exactly. And he goes on to quote Jamie Zawinski saying, “Convenient though it would be if it were true, Mozilla [Netscape 1.0] is not big because it’s full of useless crap. Mozilla is big because your needs are big. Your needs are big because the Internet is big. There are lots of small, lean web browsers out there that, incidentally, do almost nothing useful.”
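A toy simulation makes Spolsky’s “it’s never the same 20%” point concrete. Assuming (purely for illustration; the numbers are made up) that each user needs a random 20 of 100 features, shipping any fixed 20% of the features fully satisfies essentially nobody:

```python
import random

random.seed(42)
NUM_FEATURES = 100
FEATURES_PER_USER = 20   # each user needs 20% of the features
NUM_USERS = 10_000

# Ship only a fixed 20% of the features (features 0..19).
shipped = set(range(FEATURES_PER_USER))

fully_served = 0
for _ in range(NUM_USERS):
    # Each user needs a different random 20-feature subset.
    needs = set(random.sample(range(NUM_FEATURES), FEATURES_PER_USER))
    if needs <= shipped:  # user is happy only if every need is covered
        fully_served += 1

print(f"Users fully served by a fixed 20% of features: {fully_served}/{NUM_USERS}")
```

Under these assumptions the count comes out to zero: a random 20-feature need set almost never fits inside any one fixed 20-feature subset, which is exactly why the lean 20%-product rarely finds its promised 80% of buyers.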
That’s correct; Stanislav’s 4MHz Symbolics 3620 lisp machine was, and no doubt still is, a nice computer. But modern, ultra-responsive computers don’t exist, and not because people like bloat: they don’t exist because people in the aggregate choose trade-offs that favor a very wide diversity of uses. Not enough people are willing to make the trade-offs that fast responsiveness implies for there to be a market for such a computer.
Nothing is stopping someone from making a stripped-down version of, say, Linux that will boot “into a graphical everything-visible-and-modifiable programming environment, the most expressive ever created” faster than Windows boots into its “syrupy imponade hell.” But most people evidently prefer the features that modern OSes and programs offer. Or, rather, they prefer that modern OSes support THEIR pet feature and make everything as easy to accomplish as possible, at the expense of speed. Take out their favorite feature and you can keep your superfast response time: they’ll stick with Windows.
To his credit, Stanislav responded to a version of what I wrote above, noting some of the possible technical deficiencies of Linux:
If you think that a static-language-kernel abomination like Linux (or any other UNIX clone) could be turned into a civilized programming environment, you are gravely mistaken.
That may be true: my programming skill and knowledge end around simple scripting and CS 102. But whatever the weaknesses of Linux, OS X, and Windows, taken together they represent uncounted hours of programming and debugging effort. For those of you who haven’t tried it, I can only say that programming is an enormous challenge. To try to replicate all that modern OSes offer would be hard, and probably effectively impossible. If Stanislav wants to do it, though, I’d be his first cheerleader. But the history of computing is also rife with massive rewrites of existing software and paradigms that fail: see GNU/Hurd for a classic example. It has been in development since 1990. Did it fail for technical or social reasons? I have no idea, but the history of new operating systems, however technically advanced, is not a happy one.
Stanislav goes on to say:
And if only the bloat and waste consisted of actual features that someone truly wants to use.
The problem, as Joel Spolsky points out, is that one man’s feature is another’s bloat, and vice versa. That’s why the computer experience looks like it does today: people hate bloat, unless it’s their bloat, in which case they’ll tolerate it.
He links to a cool post on regulated utilities as seen in New York (go read it). But I don’t think the power grid metaphor is a good one because transmission lines do one thing: move electricity. Computers can be programmed to do effectively anything, and, because users’ needs vary so much, so does the software. You don’t have to build everything from APIs to photo manipulation utilities to web browsers on top of power lines.
Note the last line of Symbolics, Inc.: A failure of heterogeneous engineering, which is linked to in Stanislav’s “About” page:
Symbolics is a classic example of a company failing at heterogeneous engineering. Focusing exclusively on the technical aspects of engineering led to great technical innovation. However, Symbolics did not successfully engineer its environment, custormers [sic], competitors and the market. This made the company unable to achieve long term success.
That kind of thinking sounds, to me, like the kind of thinking that leads one to lament how “slow” modern computers are. They are—from one perspective. From another, they enable things that the Lisp machine didn’t have (like, say, YouTube).
However, I’m a random armchair quarterback, and code talks while BS walks. If you think you can produce an OS that people want to use, write it. But when it doesn’t support X, where “X” is whatever they want, don’t be surprised when those people don’t use it. Metcalfe’s Law is strong in computing, and there is a massive amount of computing history devoted to the rewrite syndrome; for another example, see Dreaming in Code, a book that describes how an ostensibly simple task became an engineering monster.
Ironically, people have built stripped-down versions of Linux that boot faster by being less flexible and more hardware-specific (e.g. MeeGo and the work done to boot it in 5 seconds). With a LOT of tweaking, my EeePC 900 with a cheap gen-0 SSD boots to a working desktop in about 15 seconds, and I get a web browser in about another 5. Not fantastic (newer machines with hard disks can boot faster!), but a darn sight faster than waiting the 10-15 minutes it took to load a game from a cassette tape on my Acorn Electron in 1985.
Generally, though, if you hate Linux/Unixy environments nothing will (or should) convince you otherwise. Visual Studio and its ilk are a credit to Windows and the hard work that was put into them.
Yes and no. This is a thoughtful, articulate post, but I think you’ve only got half the story here. In addition to (or in some cases, mitigation of) the points you make above, I would add the following.
You make it sound like the programmers at Microsoft, etc., are doing all this for the customer: adding all the features customers have clamored for, and so on. Unfortunately, that hasn’t been true for a very long time. Rather, their market has become saturated, their competition scant, and most of their products have long since reached the point where 99% of the features users want and need are already available.
But this poses a serious problem: if all of your customers have already bought your product, how do you ever make any more money? Two tempting answers: (1) charge some kind of recurring subscription fees, and/or (2) add features nobody asked for, redesign the GUI in the putative interest of “increased efficiency”, etc. #1 is harder to get away with, so we see a great deal of #2 (pun intended). At this point, software companies are usually not delivering something new that people need, just something new for the sake of having something else to sell.
You also missed the fact that computers are slow because of legacy code. Marriage to the idea of backward compatibility at almost all costs has led to millions of lines of legacy code in today’s operating systems and other applications. This is a completely different kind of bloat, one that is less visible to end users, but it has a huge effect.
PS. “code walks while BS walks” … Did you mean code talks …?
A quick response for now…
“You make it sound like the programmers at Microsoft, etc., are doing all this for the customer.”
Yeah, and people keep buying it. I keep buying the new versions of Office, hoping this one won’t crash (no luck so far). For that matter, I haven’t used any word processor that does everything I want consistently and doesn’t crash, partially because for the last seven years “what I want” has always included perfect opening / saving of Word files, since that’s what everyone else has…
Part of the reason is vendor lock-in, but if that were a greater problem than backward compatibility and so on, I think we’d see real growth in alternatives (like Linux, or the OS it sounds like Stanislav is working on, if it gets released). But we don’t, at least on mainstream desktops, and I doubt all the blame for that can be laid on nefarious companies. Not that you do, but still. And people are responding to this: in the U.S., Windows 7’s greatest competitor is apparently not OS X or Linux but Windows XP, which is “good enough,” despite whatever flaws it might have. I love the Ars Technica graphs that have shown Windows XP losing about 0.5% market share per month since 2007.
“PS. “code walks while BS walks” … Did you mean code talks …?”
I did. Fixed. Thanks! And I think it’s true: the Internet is filled with armchair quarterbacks like me. And not just in hacking, either, but in writing and many other fields…
When in doubt, if you can do something that works your voice immediately gets 100x louder than people like me who pontificate.