Apple finally releases new laptops:

Apple finally released new laptops, about four to six months after they should’ve. Still, the upgrades are impressive and if you’ve been on the fence or otherwise waiting, now’s the time. As usual, Apple’s hard drives are too small and their hard drive upgrade prices are usurious. People who want a Mac laptop and don’t want to pay full price should see sweet discounts on used and refurbished models in the next couple weeks.

I’m still using an iMac as a primary computer, so the announcements don’t affect me much. And my most important piece of Mac-only software is Devonthink Pro, which I still use according to a variation on this scheme, originally conceived by Steven Berlin Johnson (though he no longer uses DTP).

Otherwise, many writers swear by Scrivener. I wrote The Hook in Scrivener, and due to the structure of that novel it was extremely useful. But for most novels I don’t find it essential; I don’t think most scenes can (or should) be reshuffled at will, which probably limits its utility.

Scrivener’s appeal for nonfiction projects is much more apparent to me. A couple months ago I finished the Grant Writing Confidential book manuscript (details to follow), and if most of the book hadn’t already existed in the form of blog posts I would’ve used Scrivener as an organization tool.

In other laptop news, Dell has been producing Linux-native XPS laptops for a couple years, as has a smaller manufacturer called Purism. Given Apple’s lack of interest in non-smartphone products, it’s not a bad idea for Mac users to keep an eye on what everyone else is doing.

APE: Author, Publisher, Entrepreneur — Guy Kawasaki and Shawn Welch

For decades, books got published something like this: you, the writer, wrote and polished your book; you submitted a query letter and perhaps sample chapters to literary agents; an agent read the full manuscript; an agent took you on; the agent pitched your book to large publishing houses in New York; the editor, or ideally more than one editor, made an offer; the agent negotiated; and you got a book deal. This system worked kind of okay, and there wasn’t a better way to do it, but a lot of writers, including me, got hung up in the “an agent took you on” step.

Now, self-publishing has a realistic chance of success—defined as getting your work to readers and getting some amount of money from those readers—which offers opportunities and headaches. Big publishers know change is coming. The opportunities are obvious, and the headaches stem from having to learn a lot of stuff that publishers used to do, like cover design, knowing what a “widow” is, and figuring out how to hire a copy editor. APE: Author, Publisher, Entrepreneur wants to explain the new world, and it’s a book for a very specific group: people who are, for whatever reason, deeply interested in the publishing industry, and people who want to write a book, have written a book, or want to publish the book they’ve written. If you’re sure you don’t fall into those categories and aren’t likely to, stop reading. You’re probably wasting your time. If you want to know, keep going.

A few months ago I noted this, from Tim Parks’s “Does Money Make Us Write Better?”, in a links post:

When they are starting out writers rarely make anything at all for what they do. I wrote seven novels over a period of six years before one was accepted for publication. Rejected by some twenty publishers that seventh eventually earned me an advance of £1,000 for world rights. Evidently, I wasn’t working for money. What then? Pleasure? I don’t think so; I remember I was on the point of giving up when that book was accepted. I’d had enough. However much I enjoyed trying to get the world into words, the rejections were disheartening; and the writing habit was keeping me from a “proper” career elsewhere.

These kinds of stories infect writer interviews, as do tales of heroic perseverance. John Barth and William Goldman almost quit writing too. But more interesting still are the dark matter writers, the ones we don’t hear about because they gave up and aren’t being interviewed or writing introductions to reprints of their older books. I don’t want to be one of them. And I bet I can make more than £1,000, though I don’t know how long ago Parks began writing: adjusted for inflation, £1,000 might be a lot of money.

Kawasaki and Welch explain how to avoid being a dark matter writer. They say, “Will your book add value to people’s lives? This is a severe test, but if your answer is affirmative, there’s no doubt that you should write a book.” Still, people write books for all sorts of reasons, though I suspect the major reasons are related and twofold: the book they’d like to read doesn’t already exist, and they have something to say. Answers like “to add value to people’s lives” are good reasons to write a book, and good reasons to do many things. There is still some doubt. Writing a book can consume all your mental energy. It might add value to, say, two people’s lives, which might not justify the costs. Not everyone has the impetus towards book writing; to get through the difficulties of writing a book, I think that writing itself has to be fun, or fun at times (more on that later).

But the number of people who could write books and aren’t, in part because of the daunting publishing process, is much larger than the number who do write books. And that pool is getting larger. One challenge is that writers are going to have to think more like publishers, and publishers are going to have to think more like entrepreneurs. APE is about these transformations, and it takes its place near J.A. Konrath and Jack Kilborn’s The Newbie’s Guide to Publishing (Everything A Writer Needs To Know) and Kristine Kathryn Rusch’s Surviving the Transition: How Writers Can Thrive in the New World of Publishing (one thing writers evidently do, once they spend the painful time learning to self-publish, is write guides so that others can learn the same).

How useful APE will be to you depends on how much other reading you’ve done in the how-to-be-a-writer genre. I have trouble resisting it, and so sections of APE are less useful; some, like chapter 6, were fun but already well-known to me. The later ones, on the finer points of Kindle, Nook, and iBooks publishing, were exceedingly useful. I follow digital publishing closely, because I’m going to do it, but I still learned things: for example, I didn’t realize that Google Play exists. Google Play might not matter for me, or for you, but uploading to it requires little time beyond the effort necessary for iBooks and Barnes and Noble’s Nook.

Kawasaki and Welch also have overly strong views on tools (which may make sense given Kawasaki’s background: “For four years I evangelized Macintosh to software and hardware developers and led the charge against world-wide domination by IBM;” the word “evangelized” is key here, implying religious fervor that’s been transferred from God to Mac). I’ve learned some about photography in the last two years, perhaps a reaction against the extreme amount of reading and writing I’ve done, and in cameras there’s a continual debate. One camp wants the newest, coolest gear and argues that the latest equipment enables shots they couldn’t have gotten before. Their intellectual adversaries argue that the most important tool is between the photographer’s ears and that composition, subject matter, and skill with what you have matter more than the newest cameras and the best lenses.

I’ve read impassioned pleas from both sides, and agreed fully with one side, then read the opposite, and agreed fully with them. There isn’t a right answer. One cliche in the photography community holds that every image you’ve admired was captured with worse gear than what you’ve got. Yet there’s also no reason to ignore the tools you’re using and the potential that new tools may unlock.

Kawasaki and Welch write, “In our book (again, pun intended), you should use a Macintosh. No computer makes you more creative and productive, because a Macintosh becomes part of you whereas you need to overcome other operating systems.” I don’t think it matters that much, which is somewhat funny because I’m writing this on an iMac. But pretty much any computer made in the last ten years will do, because the most important parts of the writing process are a) a word processor and b) there is no b.

There are some nifty tools I use extensively, like Devonthink Pro, and some nifty tools that I’ve used less extensively but still helpfully at times, like Scrivener. Nonetheless, 95% of the real “work” of writing still happens on the level of the sentence and paragraph (though Kawasaki and Welch say of Scrivener, “I pride myself in having an organized mind, but my mind isn’t this organized”)*. A Mac is not going to give you great sentences. Neither is Windows or Linux or the tea you drink or the cafe you write at or the hot literary groupie offering you head or the pen you use. Great sentences, like change, come from within.

They also say, “We have never met anyone who regretted buying a Macintosh.” I have—like those who need perfect Exchange synchronization, or people who are seduced by the Mac’s cool factor, only to realize that the paying-the-rent factor is even more important. These are quibbles. Still, in one chapter the writers quote Zoe Winters, and I would repurpose her advice to apply to technology: “There is no shortcut to awesome.” Writing well is always a longcut, not a shortcut, and self-publishing arguably makes the road longer. There’s no real alternative, through software, hardware, or anything else.

The road may be long, but one can find comfort and encouragement along the way. Kawasaki and Welch write, “If You Want to Write by Brenda Ueland [. . .] changed my life by empowering me to write even though I didn’t consider myself a writer.” This is a common feeling, but it’s also one that’s long puzzled me: I spend very little, if any, time considering whether or not I’m “a writer.” I just do it. I didn’t need permission to be a writer, and neither do you. Alternatively, if you do need permission, let me bestow it on you: a random stranger on the Internet has now dubbed thee a writer. Feel better?

You should. You should also realize that writing may be lonely in the moment, but it’s a way of bringing people together over time. This tension is implied in moments like this one: “Authors who write to impress people have difficulty remaining true to themselves. A better path is to write what pleases you and pray that there are others like you.” I would also add that few people are likely to be impressed anyway, and those who might be impressed will be more impressed if your book is written, at least some of the time, because you’re having fun and seeing where things go. Think about your favorite sexual experiences: few of them probably arose because you were putting a lot of pressure on yourself or your partners to have a Great Sexual Experience. Most of them probably arose because you and your partner(s) were relaxed and ready to have a good time by seeing where things go. So too with writing, and many other activities.

Sometimes writing will be painful, as Kawasaki and Welch note. I won’t deny it. But parts should be fun, and the fun will show in the final product.

In a few places, I’d like to see better writing in a book about writing. One chapter begins, “This section explains how to take a manuscript and turn it into a book. We assume that you have a rock-solid draft of your book.” “Rock-solid” turns up 74 million hits on Google. It’s a cliche. A book about writing should itself be impeccably written. This one is close—very close. Perhaps the next update will fix that.

Elsewhere, the writers say, “For example, The Schmoe Way by Joe Schmoe from Schmoe Press doesn’t cut it.” And “Pure text posts don’t cut it in the highly visual world of social media.” And “While printed books may never die (an ebook of Annie Leibovitz’s photographs won’t cut it) [. . . .]” What does “cut it” mean, and what is being cut? All of these could be improved: for example, “an ebook of Annie Leibovitz’s photographs is as useful as sheet music for someone who wants to hear Beethoven’s Fifth.” Maybe that’s a little clunky too, but it’s still an improvement because the metaphor is fresh. One could say, “Pure text posts in the highly visual world of social media make more sense than a pure text movie, but both are improved by images.”

Some words are wasted. The last sentence in this paragraph:

Undaunted, [Amanda] Hocking decided to self-publish her novels with Kindle Direct Publishing to pay for the $300 trip. She started with My Blood Approves, and by October 2010, she made over $20,000. Over the next twenty months, she made $2.5 million. The rest, as the saying goes, is history.

could be removed. I can only think of two similar nonfiction books that had no wasted words: Rework (the 37signals book, and one of the few books I’ve read that should be expanded) and Derek Sivers’ Anything You Want (where Sivers even talks about brevity and clarity in “You should feel pain when unclear“—”Writing that email to all customers would take me all day, carefully eliminating every unnecessary word, and reshaping every sentence to make sure it could not be misunderstood”). The best writing advice I’ve ever received is “omit unnecessary words.” Almost everyone is guilty of this crime at times, including me, in this post, in this blog, and in my other writing.

Their advice on serial commas is askew; Kawasaki and Welch favor serial commas (“A serial comma (or Oxford comma, as they say across the pond) prevents confusion when you are listing several items”), but serial commas can also create ambiguity.

These are minor issues, but I bring them up because nonfiction should aspire to be art. Kawasaki and Welch agree—they say, “Metaphors and similes beat the crap out of adjectives and adverbs, so use them when you can. For example, rather than saying, ‘Hockey is very violent,’ you could say, ‘Hockey is war on ice.'” Perhaps I’m overly fastidious about the War Against Cliche. Others who are highly attuned to language will notice too.

Some sections of APE linger in the mind long after they’re read, like this:

There are two kinds of people: eaters and bakers. Eaters think the world is a zero-sum game: what someone else eats, they cannot eat. Bakers do not believe that the world is a zero-sum game because they can bake more and bigger pies. Everyone can eat more. People trust bakers and not eaters.

It expresses a sentiment I’ve discussed in many contexts, but in a way I hadn’t conceived. My closest approximation came in “How to think about science and becoming a scientist:”

while society needs a certain number of lawyers to function well, too many lawyers leads to diminishing returns as lawyers waste time ginning up work by suing each other over trivialities or chasing ambulances.

By contrast, an excess of scientists and engineers means more people who will build the stuff that lawyers then litigate over. Scientists and engineers expand the size of the economic pie; lawyers mostly work to divide it up differently. Whenever possible, work to be a person who creates things, instead of a person who tries to take stuff created by someone else.

Kawasaki and Welch are bakers. They’re creators. They want to help you be one too. Still, according to them, you have to be the kind of writer who wants to “take control of their fate and embrace the ideas here in order to maximize their success.” A fair number of writers don’t appear to care about being able to “maximize their success” as measured by sales and finances, and in some literary circles cachet comes from not marketing one’s book, or appearing not to market it; sometimes not marketing becomes marketing, as examples like J. D. Salinger and Cormac McCarthy show.

This underlying model of success can seem claustrophobic, and, just as I gave you permission to be a writer above, I give you permission to be selective with social networks here: plans for Facebook, Twitter, LinkedIn, e-mail, Google+, and more would leave me with less writing time. I want to do things that really interest me, and that’s mostly long-form writing. Facebook and Twitter aren’t interesting, and I want the mental space they would otherwise occupy to be occupied by better things. I’m also reluctant to trust Facebook and Google+ because that gives those companies so much control over what I do and who I talk to. There was a recent kerfuffle when Facebook “turned down the volume” of businesses that had Facebook pages. That’s good for Facebook’s users but terrible for anyone who spent time and money encouraging people to interact on Facebook.

Facebook is, of course, where the people are. Using it is good advice, but it might also be useful to ask what you can say no to. In Anything You Want, Derek Sivers has a chapter called “No more yes. It’s either HELL YEAH! or no,” where he says that your reaction to most propositions should be one of those two extremes. To me, Facebook, Google+, and Twitter are in the lukewarm middle. Kawasaki and Welch “recommend using Google+ as a blogging platform.” Does it let you export nicely formatted XML so that you can switch easily, if necessary? That’s a prerequisite, at least to me.

Kawasaki and Welch might be overly enamored of social media, and I not enamored enough, but unless you want a Salinger-like existence you probably need to do something. There are few alternatives to social media, e-mail, and other promotional efforts, and those efforts are a boon to outsiders. The authors say, “I’ve never come across an author who was happy with the marketing efforts of his publisher.” That might be because publishers have one thing that can’t be replicated by outsiders: distribution. Publishers are set up for a world where they control distribution. That advantage is eroding over time.

The chapters about social networking show you how to make sure you have access to new advantages.

The downside is that learning the business consumes time like rockets consume fuel. At the moment, however, APE is a relatively easy, comprehensible way of learning about all the steps that one should take to move from “guy with a story” or “guy with a long document” to “guy who writes books that other people value and read.”


* I’ve only used Scrivener for one novel, called THE HOOK, that has different, named narrators at different times, like Tom Perrotta’s Election, Anita Shreve’s Testimony, or William Faulkner’s As I Lay Dying. Scrivener was an ideal tool for this task because it made rearranging sections easy, and it made reading each speaker’s full narrative, in order, easy. I can also see it being very useful for non-narrative nonfiction, dissertations, and academic books (James Fallows is a convert). For most fiction, I think the bigger problem is making the story cohere, not rearranging it.

How to update Letterbox for Mac OS X after the latest 10.6.8 security patch:

When Apple released the most recent 10.6.8 security patch, that patch broke Letterbox (see also here), an insanely useful Mail.app plugin that allows all three Mail.app panels to be viewed vertically. This view maximizes screen real estate, which is very important for those of us on widescreen displays—which is to say, virtually all Mac users. But this 2010 OS X Daily post describes how to work around the last breakage caused by an Apple update. These are their instructions, except for the addition of two new UUIDs that I found for the latest version of 10.6.8:

* From the Finder, hit Command+Shift+G and enter ~/Library/Mail/ then hit Go
* Open Bundles (Disabled) rather than Bundles – note: if you have already opened Mail, the plugin will have been disabled; if you haven’t opened Mail yet, it will still be in Bundles
* Right-click on Letterbox.mailbundle and select “Show Package Contents”
* Now open the “Contents” folder inside the Letterbox.mailbundle contents
* Using a text editor, open Info.plist (you can use TextEdit, don’t use Word)
* Scroll to the bottom of the Info.plist file and look for “SupportedPluginCompatibilityUUIDs” which is surrounded by key tags, below that will be a bunch of hex strings surrounded by string tags
* Add the following two strings to the bottom of the list (inside the array tags):

<string>064442B6-53C0-4A97-B71B-2F111AE4195B</string>
<string>588FF7D1-4310-4175-9980-145B7E975C02</string>

That’s the important part. The rest is fairly simple:

* Save these changes to the Info.plist file
* Go back to the Mac OS X desktop and hit Command+Shift+G again, then enter ~/Library/Mail/
* You’ll see these two folders again: Bundles and Bundles (Disabled). Move the Letterbox.mailbundle plugin from the (Disabled) folder to the Bundles folder by dragging the file from one folder window to the other.
* Relaunch Mail.app

You can also navigate to the folder ~/Library/Mail/Bundles on your own, without using the “Go” command.
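If you’d rather script the edit than poke at Info.plist by hand, here’s a minimal Python sketch of the same change. It’s a hypothetical helper, not part of the original instructions: it assumes the bundle is still sitting in the Bundles (Disabled) folder and that the two UUIDs above are the ones you need, and it uses the standard-library plistlib module (Python 3.4 or later). Adjust the path if your copy already lives in Bundles.

import plistlib
from pathlib import Path

# Path to the disabled Letterbox bundle's Info.plist (assumption: adjust if yours is in Bundles)
plist_path = Path.home() / "Library/Mail/Bundles (Disabled)/Letterbox.mailbundle/Contents/Info.plist"

# The two compatibility UUIDs for the latest 10.6.8 update, copied from the list above
new_uuids = [
    "064442B6-53C0-4A97-B71B-2F111AE4195B",
    "588FF7D1-4310-4175-9980-145B7E975C02",
]

with plist_path.open("rb") as f:
    info = plistlib.load(f)

# Append any UUIDs that aren't already in the compatibility list
uuids = info.setdefault("SupportedPluginCompatibilityUUIDs", [])
for uuid in new_uuids:
    if uuid not in uuids:
        uuids.append(uuid)

with plist_path.open("wb") as f:
    plistlib.dump(info, f)

Either way, you still need to move the bundle back into Bundles and relaunch Mail.app, as described above.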

A lot of people—especially the nerds likely to use Letterbox—have probably already moved to 10.7 or 10.8, though I still haven’t and am unlikely to in the foreseeable future.

The Steve Jobs Biography

Like everyone else, I started Walter Isaacson’s Steve Jobs biography today. It’s wonderful. In the first pages, Isaacson gives a sense of how Jobs both viewed himself and was viewed in his place at Apple: “When he was restored to the throne at Apple [. . . .]” How many companies could see their CEOs as occupying thrones? Almost no one has or had the medieval level of control Jobs did over Apple. But he didn’t exercise that control capriciously: he used it to make things people want. Lower on the same page, Isaacson describes his unwillingness to write on Jobs at first, but he says that he “found myself gathering string on the subject” of Apple’s early history. “Gathering string:” it’s something I do all the time, using the methods Steven Berlin Johnson describes in this essay about DevonThink Pro. One imagines the string eventually being knit into a sweater, but first one has to have the material.

A page later, Isaacson says “The creativity that can occur when a feel for both the humanities and the sciences combine in one strong personality was the topic that most interested me in my biographies of Franklin and Einstein, and I believe that it will be a key to creating innovative economies in the twenty-first century.” By now, such an assertion is almost banal, but that doesn’t mean it isn’t right and doesn’t mean it shouldn’t be asserted. Whenever you hear someone creating the false binary C. P. Snow deconstructs in The Two Cultures, point them to Jobs, who is merely the most salient example of why there aren’t two or more cultures—there’s one. You can call it creative, innovative, human-centered, discovery-oriented, bound by makers, or any number of other descriptions, but it’s there. It’s not just “a key to creating innovative economies in the twenty-first century,” either. It’s a key to being.

One more impression: while discussing the Apple II and the role of marketing guy Mike Markkula, Isaacson describes the three principles Markkula adopts: “empathy,” “focus,” and, most interesting to this discussion, the “awkwardly named [. . .] impute.” The last principle “emphasized that people form an opinion about a company or product based on the signals that it conveys. ‘People DO judge a book by its cover,’ he wrote.” He’s right, and that brings up this book as a physical object: it’s beautiful. A single black and white picture of Jobs as an older man, still looking vaguely like a rapscallion, dominates the cover. Another picture of him, this time as a younger man, dominates the back. The pages themselves are very white, and the paper quality is high; ink doesn’t bleed through easily, and the paper resists feathering. Jobs agreed not to meddle with the text; Isaacson says “He didn’t seek any control over what I wrote.” But he did meddle around the text: “His only involvement came when my publisher was choosing the cover art. When he saw an early version of a proposed cover treatment, he disliked it so much that he asked to have input in designing a new version. I was both amused and willing, so I readily assented.”

Good. I wonder if Jobs had “input” in the paper quality too. Sometimes I wonder if publishers are themselves trying to encourage people to adopt eBooks through the use of lousy paper stock and flimsy spines, especially in hardcovers. Take Steven Berlin Johnson’s excellent book, Where Good Ideas Come From. The cover is black, with yellow text shaped like a lightbulb. Excellent design. But the pages themselves are a brownish gray, like newsprint, and the glued binding feels flimsy. The paperback is probably worse. It’s not the kind of book one would imagine Steve Jobs allowing, but the state of Johnson’s book as a physical object indicates what publishers value: cutting corners, making things cheap, and subtly conveying to readers that the publisher doesn’t care enough to make it good.

Publishers, in other words, are ruled by accountants who probably say that you can save $.15 per book by using worse paper. Apple was ruled by a megalomaniac with a persnickety attention to detail. People love Apple. No one, not even authors, loves publishers. The reasons are legion, but when I think about what a lot of recent books “impute” to the reader, I think about how Steve Jobs would make them do it differently if he could. If you’re reading this in the distant future, the idea of reading words printed on dead trees is probably as strange to you as riding a carriage would be to me, but for now it matters. And, more importantly, I think books will continue to exist as physical art objects as well as repositories for knowledge as long as the Jobs and Isaacsons of the world make them.

I’m not far into the biography and feel the call of other responsibilities. But I leave Steve Jobs reluctantly, which happens with too few books of any genre. And I have a feeling that thirty years from now I’ll be reading an interview with some inventor or captain of industry who cites Steve Jobs the man and Steve Jobs the biography as inspirations in whatever that inventor accomplishes.

Steve Jobs passes and the Internet speaks

I’ve never felt sad at the death of a famous person or someone I didn’t know. The recent news, however, does make me sad—probably because it seems like Steve Jobs’s personality infused everything Apple made. Maybe that’s just Apple’s marketing magic working on me, but if so, I’m still impressed, and I’m still not sure how to analyze a feeling of sadness about a person I never met, or how to go beyond what others have said about the loss of someone whose work and life’s work is so insanely great.

Like so many people writing about Jobs, I feel compelled to mention the hardware on which I’m doing it: a 27″ iMac with an impressively fast SSD and an incredibly small footprint given the monitor’s size. Since I got an aluminum PowerBook in 2004, each subsequent Mac has been more amazing than the one preceding it—especially because I didn’t think it was possible to be more amazed than I was by the one preceding it. There’s an iPhone sitting nearby, and in the near future that might become an iPhone 4S. So few devices feel so right, and I think people respond to Apple because it understands the link between technological function and feelings as few others do or few others can.

I look around to see what else I use and think about whether I know anything about the people behind those things: certainly not the Honda Civic I drive. Not the tea kettle I use to boil water. Not the Dell secondary monitor, whose badge could be stripped and another appended with absolutely no one noticing. I know a little about Jeff Weber, who designed the Embody with Bill Stumpf, but that’s mostly because of wonky interest on my part. Try as I might, I can’t think of anyone else remotely like Jobs in achievement, fame, importance, and ubiquity. That person might be out there, but I don’t know who he is. His work is anonymous in a way Jobs’s has never been. He makes stuff with character in a world where so much stuff utterly lacks it.

Take the Apple logo off the iMac, and you’ll still have a machine that makes one stop and take account. And those improvements! Jobs offers lessons to the ambitious: Good is never good enough; you can always go further; done is never done enough; and, even if those things aren’t true, time will make them true. I wouldn’t be surprised if, 200 years from now, Jobs is still taken to be one of the pillars of his age, known to some extent by non-specialists, like Edison or Ford.

The Internet is saying a lot about Jobs. People are linking to the text version of his 2005 Stanford graduation speech. The Atlantic is explaining Why We Mourn Steve Jobs. Here’s someone almost as obscure as me writing Steve Jobs, 1955 – 2011: “Today marked the end of an era. Each of the quotes below is linked to a eulogy or collection of reflections on the passing of Steve Jobs.” Stephen Wolfram of Mathematica and A New Kind of Science fame remembers Jobs and Jobs’s encouragement too. There are probably more tributes and commentaries than anyone could read, even if they had the inclination. Let me add to the pile, and to the pile of people saying they feel a strange connection to the man, however ridiculous that feeling might be. It’s ridiculous, but it’s real, like that connection between person and tool, user and computer. The connection is real in part because Jobs helped make it real.


EDIT: See also Megan McArdle on the Jobs speech:

The problem is, the people who give these sorts of speeches are the outliers: the folks who have made a name for themselves in some very challenging, competitive, and high-status field. No one ever brings in the regional sales manager for a medical supplies firm to say, “Yeah, I didn’t get to be CEO. But I wake up happy most mornings, my kids are great, and my golf game gets better every year.”

In addition, I usually hate watching videos on the Internet because most are overrated, but Colbert on Jobs is not. Also available in the funny / genuine vein: “Last American Who Knew What The Fuck He Was Doing Dies,” courtesy of the Onion.

Here’s the tech columnist Walt Mossberg on Jobs.

And how does this apply to writers? Steve Jobs and the idea of “Ma”

From “How Steve Jobs ‘out-Japanned’ Japan:”

That ability to express by omission holds a central place in Jobs’s management philosophy. As he told Fortune magazine in 2008, he’s as proud of the things Apple hasn’t done as the things it has done. “The great consumer electronics companies of the past had thousands of products,” he said. “We tend to focus much more. People think focus means saying yes to the thing you’ve got to focus on. But that’s not what it means at all. It means saying no to the hundred other good ideas.” . . .

Jobs’s immersion in Zen and passion for design almost certainly exposed him to the concept of ma, a central pillar of traditional Japanese aesthetics. Like many idioms relating to the intimate aspects of how a culture sees the world, it’s nearly impossible to accurately explain — it’s variously translated as “void,” “space” or “interval” — but it essentially describes how emptiness interacts with form, and how absence shapes substance. If someone were to ask you what makes a ring a meaningful object — the circle of metal it consists of, or the emptiness that that metal encompasses? — and you were to respond “both,” you’ve gotten as close to ma as the clumsy instrument of English allows.

I think of the various things I have that might have “ma:” a pretentious Moleskine notebook, a Go board, certain books. But where do objects end and the internalization of an idea begin?

How to buy a Mac

Apple updates their computers every nine to fifteen months or so. If you buy at the beginning of the “product cycle,” you usually get really good bang for your buck: fast components at reasonable prices. Toward the end of the product cycle, deals aren’t as good.

People often ask for advice about whether they should buy that MacBook or iMac; this is especially common on the Ars Technica Mac Board, and I’ve realized that there’s a relatively simple algorithm to determine whether you should buy now or wait. One person, “masonk,” made this handy flow chart, which is explained in words below and sketched in code after the list:

The standard advice:

If you don’t have a working, usable computer and need one, buy it.

Check the MacRumors Buyer’s Guide. Has the computer been updated within the last six months? If so, buy it: an upgrade in the near future is unlikely.

If not, are we within six weeks of the Worldwide Developers Conference, Macworld (or whatever January event might replace it), or a “special media event”? Can you wait the six weeks—that is, do you have a computer that’s still usable? If so, wait, as there’s a good chance of product updates.

If we’re not within six weeks of a major event, buy it anyway, as you don’t know when an update might appear.
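For the programmatically inclined, here’s the same decision tree as a short Python sketch. The function and its argument names are mine, not anything official; the “months since the last update” and “weeks until the next Apple event” are numbers you’d look up yourself on the Buyer’s Guide and Apple’s events page.

def should_buy_now(have_usable_computer, months_since_update, weeks_until_event=None):
    """Sketch of the buy-now-or-wait logic described above."""
    # No working, usable computer? Buy one now; waiting isn't realistic.
    if not have_usable_computer:
        return True
    # Updated within the last six months: an imminent refresh is unlikely, so buy.
    if months_since_update <= 6:
        return True
    # Within six weeks of WWDC, the January event, or a "special media event,"
    # and you can limp along on what you have: wait for possible updates.
    if weeks_until_event is not None and weeks_until_event <= 6:
        return False
    # Otherwise buy anyway, since no one knows when the next update will appear.
    return True

# Example: still-usable machine, last refresh eight months ago, WWDC in three weeks
print(should_buy_now(True, 8, 3))  # False: wait for the event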

Steve Jobs’ prescient comment

“The desktop computer industry is dead. Innovation has virtually ceased. Microsoft dominates with very little innovation. That’s over. Apple lost. The desktop market has entered the dark ages, and it’s going to be in the dark ages for the next 10 years, or certainly for the rest of this decade.”

(Emphasis added.)

—That’s from a 1996 interview with Jobs, and he was completely right: little of interest happened to the desktop interface virtually everyone uses until around 2003 or 2004, when OS X 10.3 was released. The first major useful change in desktops that I recall during the period was Spotlight in OS X 10.4, which was, not coincidentally, around the time I got a PowerBook.

Writing space: 2010

About three years ago I posted a picture of my then-writing space. Now, this Hacker News post inspires me to do the same:

The setup: a 24″ iMac with a 20″ Dell monitor to the side. The most important part of the desk is the Humanscale keyboard tray mounted underneath a Maxon 1000 Series desk. The keyboard: a Kinesis Advantage.