Thoughts on Steve Jobs — Walter Isaacson

I don’t think Steve Jobs, seen as a whole package, holds much of a lesson for us mortals, as Gary Stix argues here. Nonetheless, Steve Jobs the book is as fascinating as one should expect. The broad contours of his life and the book’s contents are well known, so I won’t repeat them here; I will note a few things:

1) As early as 1980, Jobs was “thrashing about for ways to produce something more radically different. At first he flirted with the idea of touchscreens, but he found himself frustrated. At one demonstration of the technology, he arrived late, fidgeted awhile, then abruptly cut off one of the engineers in the middle of their presentation [. . . .]” Notice how early he was thinking about technology that didn’t make it into shipping products until 2007. But I’m not that interested in touchscreens because, at least so far, they’re lousy for typing and other kinds of content creation. More than anything else I’m a writer, and I don’t see much use for iPads beyond checking Facebook, reading e-mails, and watching YouTube videos. Maybe they’d be useful as menus and such too. Charlie Stross gets this, and he a) actually has one and b) explains more about their uses and limitations in “Why I don’t use the iPad for serious writing.”

2) Not all of the book’s writing is great—phrases and ideas are too often repeated, and Isaacson shies away from figurative or hyperbolic language, like a 13-year-old not quite ready to approach the opposite sex. Nonetheless, the book has enough evocative moments to balance its stylistic plodding, as in this one: “Randy Wigginton, one of the engineers, summed it up: ‘The Apple III was kind of like a baby conceived during a group orgy, and later everybody had this bad headache, and there’s this bastard child, and everyone says, “It’s not mine.”’”

I have yet to see an “individual orgy,” as opposed to a “group orgy,” but the metaphor nonetheless resonates.

3) Jobs didn’t think the same way most of us do about a wide array of topics. He didn’t think like the idiotic managers who believe that anything that can’t be measured must have no value. One can see non-standard thinking that works all over the book—it would be interesting to look too at people whose non-standard thinking fails—and I noticed this moment, at a Stanford class, where Jobs took business questions for a while:

When the business questions tapered off, Jobs turned the tables on the well-groomed students. ‘How many of you are virgins?’ he asked. There were nervous giggles. ‘How many of you have taken LSD?’ More nervous laughter, and only one or two hands went up. Later Jobs would complain about the new generation of kids, who seemed to him more materialistic and careerist than his own. ‘When I went to school, it was right after the sixties and before this general wave of practical purposefulness had set in,’ he said. ‘Now students aren’t even thinking in idealistic terms, or at least nowhere near as much.’

Students today are too easily shocked, and by the time they get to me they’re too often well behaved in a dull way. I’ve mentioned weed in class, and the students are usually astonished. But I remember being a freshman, and most of that shock is unwarranted. I went to school at Clark University, where mentions of pot smoking and LSD seemed fairly normal.

“Practical purposefulness” can be impractical when it blinds one to alternative possibilities that the well-mannered simply cannot or will not imagine.

4) The last four paragraphs of the book are perfect.

5) Here’s Steven Berlin Johnson on the book; notice:

After devouring the first two-thirds of the book, I found myself skimming a bit more through the post-iPod years, largely because I knew so many of the stories. (Though Isaacson has extensive new material about the health issues, all of which is riveting and tragic.) At first, I thought that the more recent material was less compelling for just that reason: because it was recent, and thus more fresh in my memory. But it’s not that I once knew all the details about the battle with Sculley or the founding of NeXT and forgot them; it’s that those details were never really part of the public record, because there just weren’t that many outlets covering the technology world then.

This reminded me of a speech I gave a few years ago at SXSW, that began with the somewhat embarrassing story of me waiting outside the College Hill bookstore in 1987, hoping to catch the monthly arrival of MacWorld Magazine, which was just about the only conduit for information about Apple back then. In that talk, I went on to say:

If 19-year-old Steven could fast-forward to the present day, he would no doubt be amazed by all the Apple technology – the iPhones and MacBook Airs – but I think he would be just as amazed by the sheer volume and diversity of the information about Apple available now. In the old days, it might have taken months for details from a John Sculley keynote to make it to the College Hill Bookstore; now the lag is seconds, with dozens of people liveblogging every passing phrase from a Jobs speech. There are 8,000-word dissections of each new release of OS X at Ars Technica, written with attention to detail and technical sophistication that far exceeds anything a traditional newspaper would ever attempt. Writers like John Gruber or Don Norman regularly post intricate critiques of user interface issues. (I probably read twenty mini-essays about Safari’s new tab design.) The traditional newspapers have improved their coverage as well: think of David Pogue’s reviews, or Walt Mossberg’s Personal Technology site. And that’s not even mentioning the rumor blogs.

So in a funny way, the few moments at the end of Steve Jobs where my attention flagged turned out to be a reminder of one of the great gifts that the networked personal computer has bestowed upon us: not just more raw information, but more substantive commentary and analysis, in real-time.

Except I’m native to this environment: by the time I came to be cognizant of the world, this was already, if not a given, then at least very close to one. The later sections of the book had the feel of stuff I’ve already seen on the Internet, and much of the most interesting work analyzing Steve Jobs’ personality, predilections, and power had been done earlier.

To some extent, it’s always easier to chart rises than plateaus, and this is certainly true in Jobs’ case. The very end of Steve Jobs describes the steps Jobs took to try to ensure that Apple remains a company capable of producing great stuff—unlike most companies, which slowly come to be ruled by bean-counters and salarymen. Japanese companies like Sony are instructive here: Akio Morita’s departure from the company coincided with its stagnation, most evident in its failure to see the iPod coming.

6) There are many subtle lessons that would be easy to miss in Steve Jobs and from Steve Jobs.

The Steve Jobs Biography

Like everyone else, I started Walter Isaacson’s Steve Jobs biography today. It’s wonderful. In the first pages, Isaacson gives a sense of how Jobs both viewed himself and was viewed in his place at Apple: “When he was restored to the throne at Apple [. . . .]” How many companies could see their CEOs as occupying thrones? Almost no one has or had the medieval level of control Jobs did over Apple. But he didn’t exercise that control capriciously: he used it to make things people want. Lower on the same page, Isaacson describes his initial unwillingness to write about Jobs, but he says that he “found [him]self gathering string on the subject” of Apple’s early history. “Gathering string:” it’s something I do all the time, using the methods Steven Berlin Johnson describes in this essay about DevonThink Pro. One imagines the string eventually being knit into a sweater, but first one has to have the material.

A page later, Isaacson says “The creativity that can occur when a feel for both the humanities and the sciences combine in one strong personality was the topic that most interested me in my biographies of Franklin and Einstein, and I believe that it will be a key to creating innovative economies in the twenty-first century.” By now, such an assertion is almost banal, but that doesn’t mean it isn’t right and doesn’t mean it shouldn’t be asserted. Whenever you hear someone creating the false binary C. P. Snow deconstructs in The Two Cultures, point them to Jobs, who is merely the most salient example of why there aren’t two or more cultures—there’s one. You can call it creative, innovative, human-centered, discovery-oriented, bound by makers, or any number of other descriptions, but it’s there. It’s not just “a key to creating innovative economies in the twenty-first century,” either. It’s a key to being.

One more impression: while discussing the Apple II and the role of marketing guy Mike Markkula, Isaacson describes the three principles Markkula adopts: “empathy,” “focus,” and, most interesting to this discussion, the “awkwardly named [. . .] impute.” The last principle “emphasized that people form an opinion about a company or product based on the signals that it conveys. ‘People DO judge a book by its cover,’ he wrote.” He’s right, and that brings up this book as a physical object: it’s beautiful. A single black-and-white picture of Jobs as an older man, still looking vaguely like a rapscallion, dominates the cover. Another picture of him, this time as a younger man, dominates the back. The pages themselves are very white, and the paper quality is high; ink doesn’t bleed through easily, and the paper resists feathering. Jobs agreed not to meddle with the text; Isaacson says “He didn’t seek any control over what I wrote.” He did, however, meddle around the text: “His only involvement came when my publisher was choosing the cover art. When he saw an early version of a proposed cover treatment, he disliked it so much that he asked to have input in designing a new version. I was both amused and willing, so I readily assented.”

Good. I wonder if Jobs had “input” in the paper quality too. Sometimes I wonder if publishers are themselves trying to encourage people to adopt eBooks through the use of lousy paper stock and cheap spines, especially in hardcovers. Take Steven Berlin Johnson’s excellent book, Where Good Ideas Come From. The cover is black, with yellow text shaped like a lightbulb. Excellent design. But the pages themselves are a brownish gray, like newsprint, and the glued binding feels flimsy. The paperback is probably worse. It’s not the kind of book one would imagine Steve Jobs allowing, but the state of Johnson’s book as a physical object indicates what publishers value: cutting corners, making things cheap, and subtly conveying to readers that the publisher doesn’t care enough to make it good.

Publishers, in other words, are ruled by accountants who probably say that you can save $.15 per book by using worse paper. Apple was ruled by a megalomaniac with a persnickety attention to detail. People love Apple. No one, not even authors, loves publishers. The reasons are legion, but when I think about what a lot of recent books “impute” to the reader, I think about how Steve Jobs would make them do it differently if he could. If you’re reading this in the distant future, the idea of reading words printed on dead trees is probably as strange to you as riding a carriage would be to me, but for now it matters. And, more importantly, I think books will continue to exist as physical art objects as well as repositories for knowledge as long as the Jobses and Isaacsons of the world make them.

I’m not far into the biography and feel the call of other responsibilities. But I leave Steve Jobs reluctantly, which happens with too few books of any genre. And I have a feeling that thirty years from now I’ll be reading an interview with some inventor or captain of industry who cites Steve Jobs and Steve Jobs as inspirations for whatever that inventor accomplishes.

Process, outcomes, and random discoveries

I was listening to a Fresh Air interview with Brad Pitt, the guy who plays Billy Beane in the Moneyball movie, and Pitt said something very interesting: Billy Beane realized that baseball is mostly about “process” and maximizing your odds. A single pitch or a single at-bat is basically random; a terrible player could homer, a great one strike out. But if you have faith in the process and fidelity to it, you’ll maximize your chance of success over time. Notice those words: “maximize your chance of success.” You won’t automatically succeed in whatever the endeavor might be, but we live in a chaotic, random world where no one is guaranteed anything.
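
To make “maximize your chance of success over time” concrete, here is a minimal simulation sketch in Python. The on-base probabilities (0.300 versus 0.270) and the at-bat counts are invented purely for illustration, taken from neither Moneyball nor the interview; the only point is statistical: a single at-bat is mostly noise, while over a season’s worth of at-bats the better process comes out ahead almost every time.

```python
import random

# A toy model of "process vs. outcome": two hypothetical hitters whose
# per-at-bat success probabilities are invented for illustration only.
GOOD_PROCESS = 0.300   # the better "process"
WORSE_PROCESS = 0.270  # a slightly worse one

def successes(at_bats: int, p: float) -> int:
    """Count successes over independent at-bats, each succeeding with probability p."""
    return sum(random.random() < p for _ in range(at_bats))

def better_process_win_rate(at_bats: int, trials: int = 10_000) -> float:
    """Fraction of trials in which the better process strictly outperforms the worse one.

    Ties (common over a single at-bat, since both players usually fail) count as non-wins.
    """
    wins = sum(
        successes(at_bats, GOOD_PROCESS) > successes(at_bats, WORSE_PROCESS)
        for _ in range(trials)
    )
    return wins / trials

if __name__ == "__main__":
    # A single at-bat is mostly noise; a season's worth of at-bats is not.
    for n in (1, 10, 600, 6000):
        print(f"{n:>5} at-bats: better process comes out ahead in "
              f"{better_process_win_rate(n):.0%} of simulated matchups")
```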

So I heard this interview about a week ago. Since then, I’ve seen a bunch of similar stuff, which keeps reappearing as though, were I not already convinced that things are random, the world were trying to tell me something. Here’s a description of Steve Jobs: “What was important to Jobs was not making money per se, but the process of creation.” That word, “process,” appears again: get the process right, and the money will follow. When a Playboy interviewer asks Justin Timberlake “Why [. . .] some celebrities crack and fade and others, like you, just keep on keeping on? Have you figured that out?,” Timberlake says he doesn’t know but will speculate, and he goes on to say:

I think it’s about process. If you care about the process of what you’re doing, you can care about the actual work. You’ll stick around. The other thing is, you always need to be learning something new. In whatever I’ve done, I’ve always looked at myself as a beginner. Hopefully I can continue to do that for the next 30 years as I grow into an older man.

He’s trying to do with music what Billy Beane is trying to do with baseball and what Steve Jobs was trying to do with consumer technology. Or what Alain de Botton describes in The Pleasures and Sorrows of Work, in which the author sees a worker in a Belgian biscuit factory whose “manner drew attention away from what he was doing in favour of how he was doing it.” If you attend to how you do something, the outcome will tend to improve more than if you fixate on the outcome alone. It seems like a lot of experts, a lot of people who can do good work year after year, are really focused on refining their process. This might map onto “experimental” and “conceptual” artists, to use Galenson’s terminology in Old Masters and Young Geniuses: The Two Life Cycles of Artistic Creativity. As I read more about what makes artists, scientists, and others succeed, I increasingly realize that a focus on process is essential, if not the essential thing.

And it’s something I’m noticing over and over again, in a variety of contexts. When I started grad school, I began going to the University of Arizona’s Ballroom Dance Club. This is hilarious: if you asked a girl who had the misfortune of going with me to high school dances about what I’d be like a couple years later, I doubt any would’ve guessed, “Dancing.” Fewer still would’ve guessed, “At least being a competent dancer.” To aspire to “good” or “masterful” is probably unwise, but “competent” is well within my reach—and within almost anyone’s reach, really, if you have the desire. And ballroom club is all about the fundamentals too: here’s how you should move. Here’s how you isolate a single part of your body. The overall look, feel, and flow of any dance is composed of individual motions and a dancer’s control over those individual motions, which eventually come to appear to be a single, fluid motion. But it isn’t. It’s the result of the dancer breaking down each individual part and practicing it until it becomes part of him.

One time, a guy who’d been dancing for about a decade had us spend about half an hour of an hour-long class on spins. Skilled dancers can perform nearly perfect 360-degree spins every time. I can’t. I usually end up ten to fifty degrees off. I can’t get my body, shoes, and motion harmonized sufficiently to ensure that I can perform perfect spins. But I keep working on it, in the hopes of improving this seemingly simple but actually complex activity. I’m doing in dancing what Billy Beane is doing in baseball, Justin Timberlake is doing in music, Steve Jobs was doing in technology, and you should probably be doing in your own field or fields.

And if your practice isn’t as good as it should be this time, focus on improving your process so you’ll be better next time. As you, the reader, might imagine, the same principle applies to other things. Like classes. Since I now both teach and take them, I have a lot of experience with students who want to fight about grades. I don’t budge, but every semester students want to fight, either during the semester or at the end. I try to convey to them that grades are imperfect and exist in service of learning; concentrate on the learning, and the achievement, whether in the form of a grade or something else, will eventually follow.

Most of them don’t believe me. This is unfortunate, since most students also don’t know that, as Paul Graham writes, there are really Two Kinds of Judgment:

Sometimes judging you correctly is the end goal. But there’s a second much more common type of judgement where it isn’t. We tend to regard all judgements of us as the first type. We’d probably be happier if we realized which are and which aren’t.

The first type of judgement, the type where judging you is the end goal, includes court cases, grades in classes, and most competitions. Such judgements can of course be mistaken, but because the goal is to judge you correctly, there’s usually some kind of appeals process. If you feel you’ve been misjudged, you can protest that you’ve been treated unfairly.

Nearly all the judgements made on children are of this type, so we get into the habit early in life of thinking that all judgements are.

But in fact there is a second much larger class of judgements where judging you is only a means to something else. These include college admissions, hiring and investment decisions, and of course the judgements made in dating. This kind of judgement is not really about you.

To be fair, I am trying to judge them correctly. But the second class of judgments bleeds into grading: the grade is a means of trying to get students to be better writers. When they want to fight about grades, they haven’t fully internalized that I’m trying to get them into a process-oriented mode despite the school setting. The grades are outcomes and a necessary evil—and, besides, some students are simply more skilled than others.

But if students have fidelity to the process—to becoming, in my classes, better writers, or in other classes, better at whatever the class is attempting to impart—they’re going to maximize the probability of long-term success. And I wonder if students internalize the outcome-oriented mode of school—“My worth depends on my grades”—and then find themselves shocked when they’re plunged into the process-oriented real world, where no one grades you, success or failure can’t be measured via GPA, and even people who do everything “right” may still fail for reasons outside their control.

This is probably doubly painful because students are used to type-one judgments, not type-two, and instructors don’t do much to disabuse them of the habit. Instructors don’t do enough to encourage resilience, and maybe we should, or at least should do more than we do now.

By the way, I’m not just climbing the mountain and shouting at the unwashed masses below. I tell myself the same thing about writing fiction (or blog posts): I’ve probably gotten dozens of requests from agents for partial or full manuscripts. None have panned out; some agents still have pieces of the latest novel. But I tell myself that a) I’m going to write a better novel next time and b) if I maintain fidelity to the craft of writing itself, I will eventually succeed. Alternatively, I might simply start self-publishing, but that’s an issue for another post. The point here is about writing—and about what I’m doing right now.

I keep writing this blog not because it brings me fame and fortune—alas, it doesn’t—but because I like to write, because I think through writing, and because some of the writing on this blog is or will be useful to others. And I like to think this blog makes me a better writer not only of blog posts but also in other contexts. I’m focused on the process of improvement more than the outcome of conventional publication. Which isn’t to say I don’t want that outcome—I do—but I understand that the outcome is, paradoxically, a result of attention to something other than the outcome.

Steve Jobs passes and the Internet speaks

I’ve never felt sad at the death of a famous person or someone I didn’t know. The recent news, however, does make me sad—probably because it seems like Steve Jobs’s personality infused everything Apple made. Maybe that’s just Apple’s marketing magic working on me, but if so, I’m still impressed, and I’m still not sure how to analyze a feeling of sadness about a person I never met, or how to go beyond what others have said about the loss of someone whose life’s work is so insanely great.

Like so many people writing about Jobs, I feel compelled to mention the hardware on which I’m doing it: a 27″ iMac with an impressively fast SSD and an incredibly small footprint given the monitor’s size. Since getting an aluminum PowerBook in 2004, each subsequent Mac has been more amazing than the one preceding it—especially because I didn’t think it was possible to be more amazed than I was by the one preceding it. There’s an iPhone sitting nearby, and in the near future that might become an iPhone 4S. So few devices feel so right, and I think people respond to Apple because it understands the link between technological function and feelings as few others do or can.

I look around to see what else I use and think about whether I know anything about the people behind those things: certainly not the Honda Civic I drive. Not the tea kettle I use to boil water. Not the Dell secondary monitor, whose badge could be stripped and another appended with absolutely no one noticing. I know a little about Jeff Weber, who designed the Embody chair with Bill Stumpf, but that’s mostly because of wonky interest on my part. Try as I might, I can’t think of anyone else remotely like Jobs in achievement, fame, importance, and ubiquity. That person might be out there, but I don’t know who he is. His work is anonymous in a way Jobs’s has never been. He makes stuff with character in a world where so much stuff utterly lacks it.

Take the Apple logo off the iMac, and you’ll still have a machine that makes one stop and take notice. And those improvements! Jobs offers lessons to the ambitious: Good is never good enough; you can always go further; done is never done enough; and, even if those things aren’t true, time will make them true. I wouldn’t be surprised if, 200 years from now, Jobs is still taken to be one of the pillars of his age, known to some extent by non-specialists, like Edison or Ford.

The Internet is saying a lot about Jobs. People are linking to the text version of his 2005 Stanford graduation speech. The Atlantic is explaining Why We Mourn Steve Jobs. Here’s someone almost as obscure as me writing Steve Jobs, 1955 – 2011: “Today marked the end of an era. Each of the quotes below is linked to a eulogy or collection of reflections on the passing of Steve Jobs.” Stephen Wolfram of Mathematica and A New Kind of Science fame remembers Jobs and Jobs’s encouragement too. There are probably more tributes and commentaries than anyone could read, even if they had the inclination. Let me add to the pile, and to the pile of people saying they feel a strange connection to the man, however ridiculous that feeling might be. It’s ridiculous, but it’s real, like that connection between person and tool, user and computer. The connection is real in part because Jobs helped make it real.


EDIT: See also Megan McArdle on the Jobs speech:

The problem is, the people who give these sorts of speeches are the outliers: the folks who have made a name for themselves in some very challenging, competitive, and high-status field. No one ever brings in the regional sales manager for a medical supplies firm to say, “Yeah, I didn’t get to be CEO. But I wake up happy most mornings, my kids are great, and my golf game gets better every year.”

In addition, I usually hate watching videos on the Internet because most are overrated, but Colbert on Jobs is not. Also available in the funny / genuine vein: “Last American Who Knew What The Fuck He Was Doing Dies,” courtesy of the Onion.

Here’s the tech columnist Walt Mossberg on Jobs.

And how does this apply to writers? Steve Jobs and the idea of “Ma”

From “How Steve Jobs ‘out-Japanned’ Japan:”

That ability to express by omission holds a central place in Jobs’s management philosophy. As he told Fortune magazine in 2008, he’s as proud of the things Apple hasn’t done as the things it has done. “The great consumer electronics companies of the past had thousands of products,” he said. “We tend to focus much more. People think focus means saying yes to the thing you’ve got to focus on. But that’s not what it means at all. It means saying no to the hundred other good ideas.” . . .

Jobs’s immersion in Zen and passion for design almost certainly exposed him to the concept of ma, a central pillar of traditional Japanese aesthetics. Like many idioms relating to the intimate aspects of how a culture sees the world, it’s nearly impossible to accurately explain — it’s variously translated as “void,” “space” or “interval” — but it essentially describes how emptiness interacts with form, and how absence shapes substance. If someone were to ask you what makes a ring a meaningful object — the circle of metal it consists of, or the emptiness that that metal encompasses? — and you were to respond “both,” you’ve gotten as close to ma as the clumsy instrument of English allows.

I think of the various things I have that might have “ma:” a pretentious Moleskine notebook, a Go board, certain books. But where do objects end and the internalization of an idea begin?

Steve Jobs’ prescient comment

“The desktop computer industry is dead. Innovation has virtually ceased. Microsoft dominates with very little innovation. That’s over. Apple lost. The desktop market has entered the dark ages, and it’s going to be in the dark ages for the next 10 years, or certainly for the rest of this decade.”

(Emphasis added.)

—That’s from a 1996 interview with Jobs, and he was completely right: little of interest happened to the desktop interface virtually everyone uses until around 2003 or 2004, when OS X 10.3 was released. The first major useful change in desktops that I recall during the period was Spotlight in OS X 10.4, which was, not coincidentally, around the time I got a PowerBook.

Charlie Stross on the Real Reason Steve Jobs hates Flash (and how lives change)

Charlie Stross has a typically fascinating post about the real reason Steve Jobs hates Flash. The title is deceptive: the post is really about the future of the computing industry, which is to say, the future of our day-to-day lives.

If you read tech blogs, you’ve read a million people in the echo chamber repeating the same things to one another over and over again. Some of that stuff is probably right, but even if Stross is wrong, he’s at least pulling his head more than six inches off the ground, looking around, and saying “what are we going to do when we hit those mountains up ahead?”

And I don’t even own an iPad, or have much desire to be in the cloud for the sake of being in the cloud. But the argument about the importance of always-on networking is a strong one, even if, to me, it also points to the greater importance of being able to disconnect from distraction.

In the meantime, however, I’m going back to the story that I’m working on. Stories have the advantage that they’ll probably always be popular, even if the medium through which one experiences them changes. Consequently, I’m turning Mac Freedom on and Internet access off.

More on the long-predicted demise of reading

* Steve Jobs thinks reading is dead; Timothy Egan disagrees. If it’s dying, would it just hurry up?

* In the same vein: a paean to the departed past where civilization dwells. The Wonderful Past, redux.

* Philippa Gregory writes a novel with no literary merit anyway and still gets the Hollywood treatment. Compare to Philip Pullman, whose books have literary merit.

* Terry Teachout quoting T.S. Eliot. Compare and contrast to Richard Russo in Straight Man: “Virtually everybody in the English department has a half-written novel squirreled away in a desk drawer. I know this to be a fact because before they all started filing grievances against me, I was asked to read them. Sad little vessels all. Scuffy the Tugboat, lost and scared on the open sea. All elegantly written, all with the same artistic goal—to evidence a superior disposition.”
