“The Internationalists” and making war illegal

At Astral Codex Ten, there’s a great review essay on The Internationalists, a book about “the 1928 Kellogg-Briand Peace Pact” (I hadn’t heard of it either), which sought to “declare war illegal.” There are some obvious ways in which war has continued, but the thrust of The Internationalists and the essay seems to be that things have overall been moving in the right direction. Even authoritarian countries like Russia work to play down their warfare and conquest aims, particularly to their own populations. Part of the reason countries appear to have historically gone to war is to get rich by stealing things from other people, and to get more “land” for one’s people. These reasons haven’t made sense for many decades, if they ever did; today, the largest companies in the world are tech companies, and you can’t steal Apple, Google, Microsoft, or Amazon through invasion. Even if these companies were in Ukraine, attempting to “steal” them through invasion wouldn’t work because the vast majority of their value is in their people and systems, who would flee (in the case of people) and which would disintegrate (in the case of systems) in the event of invasion.

China has gotten rich in the last few decades by making stuff people want, not by attempting to forcibly steal things through invasion. China might change this strategy through invading Taiwan, and in the process destroy companies like TSMC, but it’s almost certainly not going to get richer in the process, and will likely achieve the opposite. In many countries, including the United States, we could immediately become vastly richer by changing some of our laws, rather than invading other countries: Hsieh and Moretti, for example, “quantify the amount of spatial misallocation of labor across US cities and its aggregate costs. Misallocation arises because high productivity cities like New York and the San Francisco Bay Area have adopted stringent restrictions to new housing supply, effectively limiting the number of workers who have access to such high productivity. Using a spatial equilibrium model and data from 220 metropolitan areas we find that these constraints lowered aggregate US growth by 36 percent from 1964 to 2009.”

36 percent! That’s a huge amount of forgone growth—imagine decades of growth running 36 percent faster than it actually did. Like a lot of countries (though not Japan), we can dramatically increase aggregate wealth by liberalizing land-use laws. Essentially all countries have plenty of “space” for people—if we choose to let landowners do what they want with their land. We’ve decided to be collectively poorer by not doing so, which seems unwise to me, but I’m one guy.
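If the arithmetic behind that figure feels slippery, here is a toy illustration (a sketch only: the output numbers are round placeholders rather than Hsieh and Moretti’s data, and “lowered aggregate growth by 36 percent” is read here as cumulative growth over 1964–2009 coming in 36 percent below the counterfactual):

```python
# Toy arithmetic for the Hsieh-Moretti figure quoted above. Assumptions:
# the output numbers are round placeholders (not the paper's data), and
# "lowered aggregate US growth by 36 percent from 1964 to 2009" is read as
# cumulative growth over the period being 36% below the counterfactual.

start = 1.0            # normalized 1964 output
actual_2009 = 4.0      # placeholder: output roughly quadruples by 2009

actual_growth = actual_2009 / start - 1              # 300% cumulative growth
counterfactual_growth = actual_growth / (1 - 0.36)   # ~469% without the constraints
counterfactual_2009 = start * (1 + counterfactual_growth)

shortfall = 1 - actual_2009 / counterfactual_2009
print(f"2009 output comes in {shortfall:.0%} below the counterfactual")
# With these placeholders: roughly 30% below, i.e. decades of compounding at a
# meaningfully faster rate, not a one-time 36% raise.
```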

In most countries, too, birthrates are now at or below replacement levels. We’re not collectively reproducing ourselves, let alone needing to go find more “space” for others. Polling consistently shows American women want two or three kids, but most are having one or two, perhaps because they feel they can’t afford more. Maybe we should try to make the cost of living lower, so that more people can enjoy it—that is, the “living.” Instead, we’re perversely doing the opposite. “Perversity” may be the theme of this essay.

The anonymous reviewer says that “The US keeps starting or engaging in wars, like in Libya, Afghanistan, and Iraq,” but he or she doesn’t go further: There’s an interesting counterfactual history of the United States in which we don’t invade Iraq and don’t spend the roughly $2 trillion (“trillion” with a “t”) the war cost. Let’s say we spend 10% of that, or $200 billion, on other things, such as true energy independence. Although Iraq wasn’t really about “stealing” Iraqi oil, Iraq—like Russia and Iran—wouldn’t have the money to create globally significant mischief without selling oil. What could we have done instead of invading Iraq? We could have invested substantially in battery technology and manufacturing, thus driving down the cost of batteries for car applications five to ten years earlier than actually happened—and we could’ve cut gas and oil usage far faster than we did. We’d get environmental benefits, too, on top of the geopolitical ones.

There are arguments like this around nuclear fusion power plants:

“Fusion is 30 years away and always will be.”

What happened? Why has fusion failed to deliver on its promise in the past?

By the 1970s, it was apparent that making fusion power work is possible, but very hard. Fusion would require Big Science with Significant Support. The total cost would be less than the Apollo Program, similar to the International Space Station, and more than the Large Hadron Collider at CERN. The Department of Energy put together a request for funding. They proposed several different plans. Depending on how much funding was available, we could get fusion in 15-30 years.

How did that work out?

[Chart: actual US fusion funding compared with the proposed plans]

Along with the plans for fusion in 15-30 years, there was also a reference: ‘fusion never’. This plan would maintain America’s plasma physics facilities, but not try to build anything new.

Actual funding for fusion in the US has been less than the ‘fusion never’ plan.

The reason we don’t have fusion already is because we, as a civilization, never decided that it was a priority. Fusion funding is literally peanuts: In 2016, the US spent twice as much on peanut subsidies as on fusion research.

We’ve been consistently spending less on fusion than we did in the ’70s. The largest fusion project, the International Thermonuclear Experimental Reactor (ITER), is now going to cost around $21 billion—or about half of the $40 billion in weapons we’re shipping to Ukraine (Russia is a petro state and, without income from oil and gas sales, it would be unlikely to be able to fund a true war effort). $21 billion is also about 1% of what we’ve spent on the Iraq war. Maybe we’d not have working, commercially viable nuclear fusion here in 2022, but we’d be far closer than we are. Instead of investing in true energy independence, we’ve been investing in warfare, which seems like a bad trade-off. mRNA vaccines have made the world billions if not trillions of dollars richer, apart from saving a million lives in the United States alone. Maybe we should do more of that (I’m using the word “maybe” with some archness).
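A quick back-of-the-envelope check of those comparisons, using only the rough dollar figures quoted above (approximations, not precise budget data):

```python
# Rough check of the cost comparisons above; figures in billions of US dollars,
# taken from the approximate numbers quoted in the text.
iter_cost = 21          # ITER's estimated total cost
ukraine_weapons = 40    # weapons shipped to Ukraine, as quoted above
iraq_war = 2_000        # roughly $2 trillion spent on the Iraq war

print(f"ITER / Ukraine weapons: {iter_cost / ukraine_weapons:.0%}")  # about half
print(f"ITER / Iraq war:        {iter_cost / iraq_war:.1%}")         # about 1%
```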

There’s a world in which we take the long view in an attempt to stop funding authoritarian regimes and stop invading them, and we instead focus on trying to get to the future faster. Most of the wars involving the United States in the last 30 years have been at least partially traceable to oil and gas (Saudi Arabia being the home of 15 of the 19 9/11 attackers, and being a putative ally of the U.S. but not exactly the good guys). Instead of saying, “Hey, maybe we ought to think about this relationship between warfare and gas,” we’ve decided to keep fighting random wars piecemeal. As of this writing, we’re not fighting Russia directly, but we’re not not fighting Russia. Simultaneously, had Germany invested heavily in conventional nuclear fission plants, it would’ve imported billions less in gas from Russia, and it would be poised to switch to electric vehicles. Russia’s warfare capabilities would likely be far worse than they are. Germany’s emissions could be far lower than they are. (France, to its credit, gets most of its electricity from nuclear sources: contrary to stereotype, the country isn’t composed entirely of Houellebecqian bureaucrats, sex workers, and waiters.)

Making war illegal is good, but making it uneconomical is also good, and the latter may help encourage the former. War is dumb and people get richer without it—one hopes the Chinese Communist Party (CCP) sees this, as we did not during 2001–2003. Making war even more uneconomical than it is now requires a civilization that thinks further than a few months into the future. Maybe we should get on that. Things that are illegal and dumb aren’t very enticing.

Life: The artists and the analysts edition

“One advantage of thinking about psychoanalysis as an art, instead of a science, is that you don’t have to believe in progress.”

—Adam Phillips, “The Art of Nonfiction No. 7” in The Paris Review. Compare to “Politics repeats itself while science and art make it new.”

Links: The writer, the adjunct, the technology

* Professors, we need you! (Maybe.)

* This is probably fake but definitely hilarious and true to my own teaching experience.

* “Do We Really Need Negative Book Reviews?” I tend to answer “Yes, with qualifications,” and indeed I write many fewer negative reviews than I once did. Then again I write many fewer reviews in general than I once did.

* “Is Paying Adjuncts Crap Killing Technological Innovation?” Hat tip and further commentary: Dean Dad.

* Technological Progress Isn’t GDP Growth and, relatedly, Tyler Cowen: “Robert Gordon’s sequel paper on the great stagnation.”

* Inside DuckDuckGo, Google’s Tiniest, Fiercest Competitor, which I use as my primary search engine:

How could DuckDuckGo, a tiny, Philadelphia-based startup, go up against Google? One way, he wagered, was by respecting user privacy. Six years later, we’re living in the post-Snowden era, and the idea doesn’t seem so crazy.

* “Why Is Academic Writing So Academic?”, which is to say, bad?

Why little black books instead of phones and computers

“Despite being a denizen of the digital world, or maybe because he knew too well its isolating potential, Jobs was a strong believer in face-to-face meetings.” That’s from Walter Isaacson’s biography of Steve Jobs. It’s a strange way to begin a post about notebooks, but Jobs’ views on the power of a potentially anachronistic practice apply to other seemingly anachronistic practices. I’m a believer in notebooks, though I’m hardly a luddite and use a computer too much.

The notebook has an immediate tactile advantage over the phone: it isn’t connected to the Internet. It’s intimate in a way computers aren’t. A notebook has never interrupted me with a screen that says, “Wuz up?” Notebooks are easy to use without thinking. I know where I have everything I’ve written on the go over the last eight years: in the same stack. It’s easy to draw on paper. I don’t have to manage files and have yet to delete something important. The only way to “accidentally delete” something is to leave the notebook submerged in water.

A notebook is the written equivalent of a face-to-face meeting. It has no distractions, no pop-up icons, and no software upgrades. For a notebook, fewer features are better and fewer options are more. If you take a notebook out of your pocket to record an idea, you won’t see nude photos of your significant other. You’re going to see the page where you left off. Maybe you’ll see another idea that reminds you of the one you’re working on, and you’ll combine the two in a novel way. If you want to flip back to an earlier page, it’s easy.

The lack of editability is a feature, not a bug, and the notebook is an enigma of stopped time. Similar writing in a computer can function this way but doesn’t for me: the text is too open and too malleable. Which is wonderful in its own way, and that way opens many new possibilities. But those possibilities are different from the notebook’s. It’s become a cliche to argue that the technologies we use affect the thoughts we have and the way we express those thoughts, but despite being cliche the basic power of that observation remains. I have complete confidence that, unless I misplace them, I’ll still be able to read my notebooks in 20 years, regardless of changes in technology.

In Distrust That Particular Flavor, William Gibson says, “Once perfected, communication technologies rarely die out entirely; rather, they shrink to fit particular niches in the global info-structure.” The notebook’s niche is perfect. I don’t think it’s a coincidence that Moleskine racks have proliferated in stores at the same time everyone has acquired cell phones, laptops, and now tablets.

In The Shallows, Nicholas Carr says: “The intellectual ethic is the message that a medium or other tool transmits into the minds and culture of its users.” Cell phones subtly change our relationship with time. Notebooks subtly change our relationship with words and drawings. I’m not entirely sure how, and if I were struggling for tenure in industrial design or psychology I might start examining the relationship. For now, it’s enough to feel the relationship. Farhad Manjoo even cites someone who studies these things:

“The research shows that the type of content you produce is different whether you handwrite or type,” says Ken Hinckley, an interface expert at Microsoft Research who’s long studied pen-based electronic devices. “Typing tends to be for complete sentences and thoughts—you go deeper into each line of thought. Handwriting is for short phrases, for jotting ideas. It’s a different mode of thought for most people.” This makes intuitive sense: It’s why people like to brainstorm using whiteboards rather than Word documents.

I like to write in notebooks despite carrying around a smartphone. Some of this might be indicative of the technology I grew up with—would someone familiar with smartphone touchscreens from age seven have sufficiently dexterous fingers to be faster than they would be with paper?—but I think the obvious answer to “handwriting or computer?” is “both, depending.” As I write this sentence, I have a printout of a novel called ASKING ANNA in front of me, covered with blue pen, because editing on the printed page feels different to me than editing on the screen. I write long-form on computers, though. The plural of anecdote is not data. Still, I have to notice that using different mediums appears to improve the final work product (insert joke about low quality here).

There’s also a shallow and yet compelling reason to like notebooks: a disproportionate number of writers, artists, scientists, and thinkers like using them too, and I suspect that even contemporary writers, artists, scientists, and thinkers realize that silence and not being connected are sometimes useful, like quiet and solitude.

In “With the decline of the wristwatch, will time become just another app?”, Matthew Battles says:

Westerners have long been keenly interested in horology, as David Landes, an economic historian, points out in Revolution in Time, his landmark study of the development of timekeeping technology. It wasn’t the advent of clocks that forced us to fret over the hours; our obsession with time was fully in force when monks first began to say their matins, keeping track of the hours out of strict religious obligation. By the 18th century, secular time had acquired the pressure of routine that would rule its modern mode. Tristram Shandy’s father, waiting interminably for the birth of his son, bemoans the “computations of time” that segment life into “minutes, hours, weeks, and months” and despairs “of clocks (I wish there were not a clock in the kingdom).” Shandy’s father fretted that, by their constant tolling of the hours, clocks would overshadow the personal, innate sense of time—ever flexible, ever dependent upon mood and sociability.

The revolution in electronic technology is wonderful in many ways, but its downsides—distraction, most obviously—are present too. The notebook combats them. Notebooks are an organizing or disorganizing principle: organizing because one keeps one’s thoughts in one place, but disorganizing because one cannot rearrange, tag, and structure thoughts in a notebook as one can on a screen (Devonthink Pro is impossible to replicate in the real world, and Scrivener can be approximated only with a great deal of friction).

Once you try a notebook, you may realize that you’re a notebook person. You might realize it without trying. If you’re obsessed with this sort of thing, see Michael Loper / Rands’ Sweet Decay, which is better at explaining why a notebook is important than at evaluating the notebooks at hand. It was also written in 2008, before Rhodia updated its Webbie.

Like Rands, I’ve never had a sewn binding catastrophically fail. As a result, notebooks without sewn bindings are invisible to me. I find it telling that so many people are willing to write at length about their notebooks and use a nominally obsolete technology.

Once you decide that you like notebooks, you have to decide which one you want. I used to like Moleskines, until one broke, and I began reading other stories online about the highly variable quality level.

So I’ve begun ranging further afield.

I’ve tested about a dozen notebooks. Most haven’t been worth writing about. But by now I’ve found the best reasonably available notebooks, and I can say this: you probably don’t actually want a Guildhall Pocket Notebook, which is number two. You want a Rhodia Webnotebook.

Like many notebooks, the Guildhall starts off with promise: the pages do lie flat more easily than alternatives. Lines are closely spaced, maximizing writable area, which is important in an expensive notebook that shouldn’t be replaced frequently.

I like the Guildhall, but it’s too flimsy and has a binding that appears unlikely to withstand daily carry. Mine is already bending, and I haven’t even hauled it around that much. The Rhodia is somewhat stiffer. Its pages don’t lie flat quite as easily. The lines should go to the end of each page. But its great paper quality and durability advantage make it better than the alternatives.

The Rhodia is not perfect. The A7 version, which I like better than the 3.5 x 5.5 American version, is only available in Europe and Australia, which entails high shipping costs. The Webbie’s lines should stretch to the bottom of the page and be spaced slightly closer together. The name is stupid; perhaps it sounds better in French. The notebook’s cover extends slightly over its paper instead of aligning perfectly. Steve Jobs would demand perfect alignment. To return to Isaacson’s biography:

The connection between the design of a product, its essence, and its manufacturing was illustrated for Jobs and Ive when they were traveling in France and went into a kitchen supply store. Ive picked up a knife he admired, but then put it down in disappointment. Jobs did the same. ‘We both noticed the tiny bit of glue between the handle and the blade,’ Ive recalled. They talked about how the knife’s good design had been ruined by the way it was being manufactured. ‘We don’t like to think of our knives as being glued together,’ Ive said. ‘Steve and I care about things like that, which ruin the purity and detract from the essence of something like a utensil, and we think alike about how products should be made to look pure and seamless.’

I wish the Rhodia were that good. But the Rhodia’s virtues are more important than its flaws: the paper quality is the highest I’ve seen, and none of the Rhodias I’ve bought have broken. If anyone knows of a notebook that combines the Rhodia’s durability with the qualities it lacks, by all means send me an e-mail.


More on the subject: The Pocket Notebooks of 20 Famous Men.

EDIT: See also Keith Devlin’s The Death of Mathematics, which is about the allure of math by hand, rather than by computer; though I don’t endorse what he says, in part because it reminds me so much of Socrates decrying the advent of written over oral culture, I find it stimulating.

Shaping Things and Bruce Sterling's technoculture

Design is hard to do. Design is not art. But design has some of the requirements of art. The achievement of greatness in art or design requires passionate virtuosity. VIRTUOSITY means thorough mastery of craft. PASSION is required to focus human effort to a level that transcends the norm. Some guitarists have passion, especially young ones. Some have virtuosity, especially old ones. Some few have both at once, and during some mortal window of superb achievement, they are great guitarists.

That’s from Bruce Sterling’s Shaping Things, and I admire the distinction between design and art, which overlap to some extent but not totally; his point about “passionate virtuosity” is one I’ve seen elsewhere but is worth repeating, because it seems like so many seemingly different fields require the same thing. Certainly writing does, and one sees too many people with the passion or the virtuosity but not both.

Another sample:

I do write a great deal about technology. That became my theme as an artist. The human reaction to technological change—nothing interests me more. I want and need to know all about it. I want to plumb its every aspect. I even want to find new words for aspects of it that haven’t as yet been described.

I would guess artists, especially in the narrative arts, are going to have to pay steadily more attention to technology: it informs too many lives too much to ignore, and people have as many disparate responses “to technological change” as they do to love.

The book itself—Shaping Things—is interesting without being captivating. It needs more examples and case studies, and fewer grand pronouncements; it resembles a lot of literary theory in this way. If you get a physical copy, you’ll also find terrible design, with all kinds of doodads, weird fonts, random backgrounds, and so forth, all of which distract from readability in the name of being weird (those capitalizations in the blockquote above are in the text). It’s a kind of anti-Apple product.

The book’s design is distinctive, but distinctive isn’t automatically good, and as a mechanism for transferring ideas via text Shaping Things isn’t optimal because of those distractions. Nonetheless, the idea density is high, and I’m going to keep my copy, at least for the time being. Like Sterling, I’ve become steadily more interested in design and what design says about people and culture. I’m not sure how that’ll work into my fiction, but long-simmering ideas and interests tend to emerge in unpredictable ways. For example: I’ve thought about a novel in which a camera shows an emotionally stunted photographer—along the Conrad and Houellebecq lines—who thinks in the language of photography itself what the photographer takes to be the future. Or is it? Photographers have a rich array of metaphors to draw on, and they have to be attuned to light, shapes, and the interplay of things and colors. Cameras themselves are technologies, and in the last 15 years they’ve become computers, with rapid advancements from year to year and all of the technolust that implies.

I don’t know where this idea might go, or if it will go at all, but I’ve been mulling it for a long time. A character like the one or ones I’m imagining would be reacting to technological change. I won’t say “nothing interests me more,” as Sterling does, but human reaction to technology is certainly up there, as I increasingly think it has to be, for people in virtually any field, if one wants any real shot at understanding what’s going on.

From the Department of "No Shit:" technology and computers are not silver bullets for education

File this New York Times article under “no shit:”

But to many education experts, something is not adding up — here and across the country. In a nutshell: schools are spending billions on technology, even as they cut budgets and lay off teachers, with little proof that this approach is improving basic learning.

This conundrum calls into question one of the most significant contemporary educational movements.

There is no silver bullet and no New Jesus for education. There never will be, but the search goes on, because it’s easier to search for the magic methodology that will solve seemingly intractable problems than it is to admit the thing Google, Facebook, and Microsoft have realized about software engineers: talent, motivation, and tenacity vary greatly among individuals, and you can’t merely take someone who lacks all three, put them through some kind of system or give them some kind of tool, and expect everyone to be equally good on the other side. That kind of thing works passably well if you’re building widgets on an assembly line, but it works terribly for any kind of creative or intellectual work.

Nonetheless, it’s much easier to search for that magic methodology to improve very old skills that are surprisingly resistant to methodology: reading, writing, and math. None of those fundamental skills has changed much in the last century. Yet we keep searching for that formula that doesn’t exist, because teaching and learning are inherently hard, like software engineering, math, writing, and any number of other complex but vital skills. Technologies—now we have special blackboards! They blink! They light up! They’re new!—might mask the essential difficulty of the task, but they can’t remove it, much as some programming IDEs try to hide some of the essential difficulty of coding. But fields that are essentially difficult can’t be mastered by sleight-of-hand or new whizzy gadgets.

They can only be mastered by people who are dedicated to the craft and to continuous self-improvement. The people who, because they believe they can, can. The ones whose tenacity is boundless and who aren’t willing to blame external circumstances.

You need a weird set of skills to teach effectively: you need to empathize with your students—to be able to see from their point of view—without becoming mired in their point of view. You need to master your field, but not in such a way that you lose the beginner’s mindset required to master the field in the first place. You need the stoic’s attitude of realizing you can’t control everything while still having the achiever’s mindset that you must strive to do the best you can, no matter what. You need to be willing to try new things and ideas while not leaving behind the old ones that work. You need to remember not everyone is interested in the things you’re interested in, and you need to do whatever it takes to make subjects more interesting than they would be otherwise. You need to find that profitable zone of challenge for most students—something hard enough to make them struggle but not so hard that it’s impossible to accomplish. It’s reasonable to expect college freshmen to be able to read a story or article on their own, but it’s not reasonable to expect them to pick up and digest every nuance on their own. Some will. Most won’t. You need to be enthusiastic, because enthusiasm is as contagious as boredom, but your job isn’t to be a cheerleader and enthusiasm can’t substitute for knowledge. You need, in other words, a bunch of paradoxical traits that balance each other.

You also need to realize that students need things broken down in steps, and need to learn by example and through discussion. Last week I taught Neal Stephenson’s 2005 New York Times opinion piece, “Turn On, Tune In, Veg Out.” Whenever I do, I let the class talk for a while at the beginning; when discussion dies, I ask students to do a simple activity: write the essay’s main point in a sentence or two. Then I come around and look at the sentences.

It should be simple, right? Read the piece, find the main point. But it’s not simple. It’s actually quite hard, and most people are bad readers (myself included). When I go around and look at sentences, lots of students get caught up on the distinction between geeking and vegging out. Others think the piece is primarily about Star Wars. Only a few—usually around five of fifty—get the essential elements of the main point.

Stephenson basically says, twice, that he’s using Star Wars as a metaphor: once in the third paragraph: “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” and once more in the last paragraph: “If the “Star Wars” movies are remembered a century from now, it’ll be because they are such exact parables for this state of affairs” (emphasis added). But most students haven’t learned how to think metaphorically, as writers do. Metaphor is one of those essential ways of thinking that people need to be effective writers. In On Writing Stephen King says:

The use of simile and other figurative language is one of the chief delights of fiction—reading it and writing it, as well. When it’s on target, a simile delights us in much the same way meeting an old friend in a crowd of strangers does. By comparing two seemingly unrelated objects—a restaurant bar and a cave, a mirror and a mirage—we are sometimes able to see an old thing in a new and vivid way. Even if the result is mere clarity instead of beauty, I think writer and reader are participating together in a kind of miracle. Maybe that’s drawing it a little strong, but yeah—it’s what I believe.

In How Fiction Works, James Wood says:

“Metaphor is analogous to fiction, because it floats a rival reality. It is the entire imaginative process in one move. If I compare the slates on a roof to an armadillo’s back, or – as I did earlier – the bald patch on the top of my head to a crop circle (or on very bad days, to the kind of flattened ring of grass that a helicopter’s blades make when it lands in a field), I am asking you to do what Conrad said fiction should make you do – see. I am asking you to imagine another dimension, to picture likeness. Every metaphor or simile is a little explosion of fiction within the larger fiction of the novel or story.

Again: that’s hard. And technology isn’t going to make it any easier to start thinking about metaphors, which is probably a precursor to writing in a way that uses metaphor deftly. Before you can do that, you’re probably going to need to recognize when other writers are doing it, and yet, while Stephenson says that he is twice, most students don’t pick up on it. This isn’t to blame them, by the way—a lot of my graduate seminars are still about what the writer actually says. Some of you are probably getting caught up on this discussion of metaphor and think that I’m really writing about how important it is for students to learn metaphor, when this is only a subsidiary point supporting my main point about the place of technology in classrooms. Here’s Wood again on the subject of learning to read:

You only have to teach literature to realise that most young readers are poor noticers. I know from my own old books, wantonly annotated twenty years ago when I was a student, that I routinely underlined for approval details and images and metaphors that strike me now as commonplace, while serenely missing things which now seem wonderful. We grow, as readers, and twenty-year-olds are relative virgins.

If he wasn’t a good noticer at 20, what hope is there for the rest of us? And how is having a laptop going to help someone become a better noticer? Consider too one other thing to notice: in “Turn On, Tune In, Veg Out,” Stephenson isn’t using any complex or convoluted vocabulary. His sentence structure isn’t very complex; there aren’t lots of nasty nested clauses you have to mentally sort out to figure out what’s being talked about, as there often are in abstruse literary theory and philosophy. His piece isn’t hard to read. But it’s still evidently hard for many freshmen to understand. So I spend a lot of time working towards understanding, towards reading for detail, towards asking, “Where do you see that?” Technology isn’t going to help that process very much. It may even hurt it by offering a proliferating number of distractions: if you interrupt your reading of “Turn On, Tune In, Veg Out” four times for text messages and once for an e-mail, are you going to remember how Stephenson said “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” by the end?

I’m teaching honors students, which is easier than teaching standard classes, which is in turn easier than teaching in tough inner-city schools. So I don’t face the same challenges as some of the teachers mentioned in the NYT article. But sometimes I think about Daniel Singal’s Atlantic article, “The Other Crisis in American Education: A college professor looks at the forgotten victims of our mediocre educational system–the potentially high achievers whose SAT scores have fallen, and who read less, understand less of what they read, and know less than the top students of a generation ago.” As the subtitle implies, he argues that the best students aren’t as challenged as they once were. I can’t tell if he’s right or if he’s hearkening back to a mythical golden age, but I do think about his work sometimes when I see what’s going on around me: other grad students and professors want to watch movies in class, or they aren’t even focused on imparting and enhancing basic reading and writing skills—the same ones pretty much everyone needs. Are the strongest students really getting something out of their classes? Is the technology really helping? If not, could it be part of what’s actually causing “our mediocre educational system?” I’m not saying it does, but I am saying it’s worth pondering.

Still, I think the strongest thinkers and learners—the ones who are now running Google and Facebook, the ones who are now partners in law firms and building their own businesses—are doing fine. Better than ever, maybe. Generation X was supposed to be the slacker generation, but its members built large blocks of the Internet—the same set of technologies you’re almost certainly using to read this. But I wonder if there’s not a growing bifurcation between the people who are doing really well and the ones who aren’t. In income terms, that’s certainly true, but I wonder if it’s happening in intellectual terms too. Stephenson thinks so: “Nothing is more seductive than to think that we, like the Jedi, could be masters of the most advanced technologies while living simple lives: to have a geek standard of living and spend our copious leisure time vegging out.” But the people who spend all that time vegging out aren’t going to create whatever the next iPod and Facebook will be. And they won’t reap those rewards, either. They’re the ones who might be “learning less,” as Singal has it. The people who make the next iPod and Facebook will be the ones who are focused on “geeking out” regarding important topics. The ones who will, I hope, have teachers—whether in honors or not—who are focused on the essential questions that imparting knowledge involves.

By the way, I’m not trying to beat up college freshmen—if I were, I wouldn’t have the empathy necessary to be good. A lot of college seniors are little better than my freshmen, which I found out by working for Steven Klein at the Steven Klein LSAT Company. The LSAT is mostly a test of reading. If you can read effectively, you’ll do pretty well. But a lot of 22 – 24-year-old college graduates had a lot of trouble on reading comprehension because they couldn’t or hadn’t been trained to look at every word, evaluate it in relation to other words and in relation to the context of the passage, and understand what it means. I think back to those experiences when I read books like Richard Arum and Josipa Roksa’s Academically Adrift: Limited Learning on College Campuses or articles like the one about cool whizzy tech stuff in classrooms. The whizzy tech stuff isn’t going to help readers when they’re facing the LSAT.

A second “by the way” is in order: I’m neither trying to denigrate technology nor be a luddite—I say so as the guy typing on a fancy keyboard, ergonomic chair, and 27″ iMac, with a bunch of Textmate windows open. Computers make the mechanical process of writing easier, so that the hard stuff—the stuff that goes on in the mind—can dominate. Technology is great—in its place. The University of Arizona has computers and projectors and other neat stuff in many classrooms, and if that neat stuff is available, I use it.

But technology complements other skills; it doesn’t substitute for them. You can only use computers effectively to the extent you can read, write, and do simple math effectively—try programming without algebra. Or try to extract information from man pages without strong reading comprehension skills; hell, I like to imagine myself as being at least moderately literate, and I find some of them tough. So this is not one of those tedious essays in which Old Man Withers shakes his cane and complains about the kids with those damn beeping gizmos and sending those darned pictures of each others’ privates around and get off my damn lawn. Plus, I’m too young to shake my cane; I ran a modest but real number of miles yesterday. Even when I do have a cane someday, I hope 1) that it has a hidden sword, because that kind of thing is cool, and 2) that I haven’t ossified to the point where I’m not willing to learn new things.

But this is an essay that points out how basic skills and the means of imparting those basic skills haven’t changed so much, as Amanda Ripley’s Atlantic article, “What Makes a Great Teacher?” makes clear in its discussion of what great teachers do:

First, great teachers tended to set big goals for their students. They were also perpetually looking for ways to improve their effectiveness. [. . .] Superstar teachers had four other tendencies in common: they avidly recruited students and their families into the process; they maintained focus, ensuring that everything they did contributed to student learning; they planned exhaustively and purposefully—for the next day or the year ahead—by working backward from the desired outcome; and they worked relentlessly, refusing to surrender to the combined menaces of poverty, bureaucracy, and budgetary shortfalls.

Notice the thing absent from this list: use computers, iPads, and so forth. Sure, those great teachers could use technology, but they don’t need to. And the technology is not going to automatically make an indifferent teacher set big goals or recruit families or maintain focus or plan. Used poorly, it’s just going to provide some flash and pizazz and some distractions. Check out this Marginal Revolution discussion of a study looking at how introducing computers in poor households actually decreased student grades because students spent more time playing games on them than doing homework:

Not surprisingly, with all that game playing going on, the authors find that the voucher program actually resulted in a decline in grades although there was also some evidence for an increase in computer proficiency and perhaps some improvement in a cognitive test.

See also “Computers at Home: Educational Hope vs. Teenage Reality:” “Students posted significantly lower math test scores after the first broadband service provider showed up in their neighborhood, and significantly lower reading scores as well when the number of broadband providers passed four.” These reports should give technology cheerleaders pause: you aren’t going to get better results simply by lashing a computer on a teacher’s back and telling him to use it.*

To be a good teacher, you still need that weird skill set and mindset mentioned above. If you don’t have them or aren’t willing to develop them, I doubt anything else imposed on an individual teacher from the outside, like mandates to use technology, is going to do much for that teacher or for his or her students. If you want to really improve teaching, you’ll need to take an approach similar to the one Facebook and Google take to hiring hackers, which means a relentless focus not on degrees that offer dubious value in predicting achievement but on finding the best people and making sure they stay. Finding the best teachers is different from finding programmers—you probably can’t tell who’s going to be a good teacher before they hit the classroom—but you can at least acknowledge that you’re not going to get people who are good merely by saying, “use iPads in the classroom.” Steve Jobs and Bill Gates didn’t have iPads in their classrooms growing up, and maybe that’s part of what made Jobs able to have the vision necessary to Turn On, Geek Out, and make the iPad.


* I had a computer in middle and early high school that I used to master Starcraft and various other computer games, until I somehow realized I was wasting my life and smashed my Starcraft disks in the driveway. I sometimes use this analogy when I explain the situation to friends: some people can handle snorting the occasional line of coke without getting addicted; it’s just a fun way of spending a Saturday night. Some people can handle computer games in the same way. I discovered, at the time, that I’m not one of them, and, worse, I’ll never get those three or so wasted years back. Now I tend to find video games boring on average and can’t play for longer than half an hour to an hour at a stretch, while I’ve trained myself up to being able to write effectively for three to six hours at a time. The first draft of this essay, for example, took me about two hours.

From the Department of “No Shit:” technology and computers are not silver bullets for education

File this New York Times article under “no shit:”

But to many education experts, something is not adding up — here and across the country. In a nutshell: schools are spending billions on technology, even as they cut budgets and lay off teachers, with little proof that this approach is improving basic learning.

This conundrum calls into question one of the most significant contemporary educational movements.

There is no silver bullet and no New Jesus for education. There never will be, but the search goes on, because it’s easier to search for the magic methodology that will solve seemingly intractable problems than it is to admit the thing Google, Facebook, and Microsoft have realized about software engineers: talent, motivation, and tenacity vary greatly among individuals and that you can’t merely take someone who lacks all three, put them through some kind of system or give them some kind of tool, and expect everyone to be equally good on the other side. That kind of thing works passable well if you’re building widgets on an assembly line, but it works terribly for any kind of creative or intellectual work.

Nonetheless, it’s much easier to search for that magic methodology to improve very old skills that are surprisingly resistant to methodology: reading, writing, and math. None of those fundamental skills has changed much in the last century. Yet we keep searching for that formula that doesn’t exist, because teaching and learning are inherently hard, like software engineering, math, writing, and any number of other complex but vital skills. Technologies—now we have special blackboards! They blink! They light up! They’re new!—might mask the essential difficulty of task, but they can’t remove it, much as some programming IDEs try to hide some of the essential difficulty of coding. But fields that are essential difficult can’t be mastered by sleight-of-hand or new whizzy gadgets.

They can only be mastered by people who are dedicated to the craft and to continuous self-improvement. The people who, because they believe they can, can. The ones whose tenacity is boundless and who aren’t willing to blame external circumstances.

You need a weird set of skills to teach effectively: you need to empathize with your students—to be able to see from their point of view—without becoming mired in their point of view. You need to master your field, but not in such a way that you lose the beginner’s mindset required to master the field in the first place. You need the stoic’s attitude of realizing you can’t control everything while still having the achiever’s mindset that you must strive to do the best you can, no matter what. You need to be willing to try new things and ideas while not leaving behind the old ones that work. You need to remember not everyone is interested in the things you’re interested in, and you need to do whatever it takes to make subjects more interesting than they would be otherwise. You need to find that profitable zone of challenge for most students—something hard enough to make them struggle but not so hard that it’s impossible to accomplish—it’s reasonable to expect college freshmen to be able to read a story or article on their own, but it’s not reasonable to expect them to pick up and digest every nuance on their own. Some will. Most won’t. You need to be enthusiastic, because enthusiasm is as contagious as boredom, but your job isn’t to be a cheerleader and enthusiasm can’t substitute for knowledge. You need, in other words, a bunch of paradoxical traits that balance each other.

You also need to realize that students need things broken down in steps, and need to learn by example and through discussion. Last week I taught Neal Stephenson’s 2005 New York Times opinion piece, “Turn On, Tune In, Veg Out.” Whenever I do, I let the class talk for a while at the beginning; when discussion dies, I ask students to do a simple activity: write the essay’s main point in a sentence or two. Then I come around and look at the sentences.

It should be simple, right? Read the piece, find the main point. But it’s not simple. It’s actually quite hard, and most people are bad readers (myself included). When I go around and look at sentences, lots of students get caught up on the distinction between geeking and vegging out. Others think the piece is primarily about Star Wars. Only a few—usually around five of fifty—get the essential elements of the main point.

Stephenson basically says, twice, that he’s using Star Wars as a metaphor: once in the third paragraph: “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” and once more in the last paragraph: “If the “Star Wars” movies are remembered a century from now, it’ll be because they are such exact parables for this state of affairs” (emphasis added). But most students haven’t learned how to think metaphorically, as writers do. Metaphor is one of those essential ways of thinking that people need to be effective writers. In On Writing Stephen King says:

The use of simile and other figurative language is one of the chief delights of fiction—reading it and writing it, as well. When it’s on target, a simile delights us in much the same way meeting an old friend in a crowd of strangers does. By comparing two seemingly unrelated objects—a restaurant bar and a cave, a mirror and a mirage—we are sometimes able to see an old thing in a new and vivid way. Even if the result is mere clarity instead of beauty, I think writer and reader are participating together in a kind of miracle. Maybe that’s drawing it a little strong, but yeah—it’s what I believe.

In How Fiction Works, James Wood says:

“Metaphor is analogous to fiction, because it floats a rival reality. It is the entire imaginative process in one move. If I compare the slates on a roof to an armadillo’s back, or – as I did earlier – the bald patch on the top of my head to a crop circle (or on very bad days, to the kind of flattened ring of grass that a helicopter’s blades make when it lands in a field), I am asking you to do what Conrad said fiction should make you do – see. I am asking you to imagine another dimension, to picture likeness. Every metaphor or simile is a little explosion of fiction within the lager fiction of the novel or story.

Again: that’s hard. And technology isn’t going to make it any easier to start thinking about metaphors, which is probably a precursor to writing in a way that uses metaphor deftly. Before you can do that, you’re probably going to need to recognize when other writers are doing it, and yet, while Stephenson says that he is twice, most students don’t pick up on it. This isn’t to blame them, by the way—a lot of my graduate seminars are still about what the writer actually says. Some of you are probably getting caught up on this discussion of metaphor and think that I’m really writing about how it’s important for students to learn, when this is only a subsidiary point supporting my main point about the place of technology in classrooms. Here’s Wood again on the subject of learning to read:

You only have to teach literature to realise that most young readers are poor noticiers. I know from my own old books, wantonly annotated twenty years ago when I was a student, that I routinely underlined for approval details and images and metaphors that strike me now as commonplace, while serenely missing things which now seem wonderful. We grow, as readers, and twenty-year-olds are relative virgins.

If he wasn’t a good noticer at 20, what hope is there for the rest of us? And how is having a laptop going to help someone become a better noticer? Consider too one other thing to notice: in “Turn On, Tune In, Veg Out,” Stephenson isn’t using any complex or convoluted vocabulary. His sentence structure isn’t very complex; there aren’t lots of nasty nested clauses you have to mentally sort out to figure out what’s being talked about, as there are often are in abstruse literary theory and philosophy. His piece isn’t hard to read. But it’s still evidently hard for many freshmen to understand. So I spend a lot of time working towards understanding, towards reading for detail, towards asking, “Where do you see that?” Technology isn’t going to help that process very much. It may even hurt it by offering a proliferating number of distractions: if you interrupt your reading of “Turn On, Tune In, Veg Out” four times for text messages and once for an e-mail, are you going to remember how Stephenson said “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” by the end?

I’m teaching honors students, which is easier than teaching standard classes, which is in turn easier than teaching in tough inner-city schools. So I don’t face the same challenges as some of the teachers mentioned in the NYT article. But sometimes I think about Daniel Singal’s Atlantic article, “The Other Crisis in American Education: A college professor looks at the forgotten victims of our mediocre educational system–the potentially high achievers whose SAT scores have fallen, and who read less, understand less of what they read, and know less than the top students of a generation ago.” As the subtitle implies, he argues that the best students aren’t as challenged as they once were. I can’t tell if he’s right or if he’s hearkening back to a mythical golden age, but I do think about his work sometimes when I see what’s going on around me: other grad students and professors want to watch movies in class, or they aren’t even focused on imparting and enhancing basic reading and writing skills—the same ones pretty much everyone needs. Are the strongest students really getting something out of their classes? Is the technology really helping? If not, could it be part of what’s actually causing “our mediocre educational system?” I’m not saying it does, but I am saying it’s worth pondering.

Still, I think the strongest thinkers and learners—the ones who are now running Google and Facebook, the ones who are now partners in law firms and building their own businesses—are doing fine. Better than ever, maybe. Generation X was supposed to be the slacker generation, but its members built large blocks of the Internet—the same set of technologies you’re almost certainly using to read this. But I wonder if there’s not a growing bifurcation between the people who are doing really well and the ones who aren’t. In income terms, that’s certainly true, but I wonder if it’s happening in intellectual terms too. Stephenson thinks so: “Nothing is more seductive than to think that we, like the Jedi, could be masters of the most advanced technologies while living simple lives: to have a geek standard of living and spend our copious leisure time vegging out.” But the people who spend all that time vegging out aren’t going to create whatever the next iPod and Facebook will be. And they won’t reap those rewards, either. They’re the ones who might be “learning less,” as Singal has it. The people who make the next iPod and Facebook will be the one who are focused on “geeking out” regarding important topics. The ones who will, I hope, have teachers—whether in honors or not—who are focused on the essential questions that imparting knowledge involves.

By the way, I’m not trying to beat up college freshmen—if I were, I wouldn’t have the empathy necessary to be good. A lot of college seniors are little better than my freshmen, which I found out by working for Steven Klein at the Steven Klein LSAT Company. The LSAT is mostly a test of reading. If you can read effectively, you’ll do pretty well. But a lot of 22 – 24-year-old college graduates had a lot of trouble on reading comprehension because they couldn’t or hadn’t been trained to look at every word, evaluate it in relation to other words and in relation to the context of the passage, and understand what it means. I think back to those experiences when I read books like Richard Arum and Josipa Roksa’s Academically Adrift: Limited Learning on College Campuses or articles like the one about cool whizzy tech stuff in classrooms. The whizzy tech stuff isn’t going to help readers when they’re facing the LSAT.

A second “by the way” is in order: I’m not trying to denigrate technology, and I’m no Luddite—I say so as the guy typing at a fancy keyboard, in an ergonomic chair, in front of a 27″ iMac with a bunch of TextMate windows open. Computers make the mechanical process of writing easier, so that the hard stuff—the stuff that goes on in the mind—can dominate. Technology is great—in its place. The University of Arizona has computers and projectors and other neat stuff in many classrooms, and when that neat stuff is available, I use it.

But technology complements other skills; it doesn’t substitute for them. You can only use computers effectively to the extent that you can read, write, and do simple math effectively—try programming without algebra. Or try extracting information from man pages without strong reading comprehension skills; hell, I like to imagine myself as at least moderately literate, and I find some of them tough. So this is not one of those tedious essays in which Old Man Withers shakes his cane and complains about the kids with those damn beeping gizmos and those darned pictures of each other’s privates, and tells them to get off my damn lawn. Plus, I’m too young to shake my cane; I ran a modest but real number of miles yesterday. Even when I do have a cane someday, I hope 1) that it has a hidden sword, because that kind of thing is cool, and 2) that I haven’t ossified to the point where I’m not willing to learn new things.

But this is an essay that points out how basic skills, and the means of imparting them, haven’t changed much, as Amanda Ripley’s Atlantic article “What Makes a Great Teacher?” makes clear in its discussion of what great teachers do:

First, great teachers tended to set big goals for their students. They were also perpetually looking for ways to improve their effectiveness. [. . .] Superstar teachers had four other tendencies in common: they avidly recruited students and their families into the process; they maintained focus, ensuring that everything they did contributed to student learning; they planned exhaustively and purposefully—for the next day or the year ahead—by working backward from the desired outcome; and they worked relentlessly, refusing to surrender to the combined menaces of poverty, bureaucracy, and budgetary shortfalls.

Notice what’s absent from this list: using computers, iPads, and so forth. Sure, those great teachers could use technology, but they don’t need to. And technology is not going to automatically make an indifferent teacher set big goals, or recruit families, or maintain focus, or plan. Used poorly, it just provides some flash and pizzazz and some distractions. Check out this Marginal Revolution discussion of a study finding that introducing computers into poor households actually decreased student grades, because students spent more time playing games on them than doing homework:

Not surprisingly, with all that game playing going on, the authors find that the voucher program actually resulted in a decline in grades although there was also some evidence for an increase in computer proficiency and perhaps some improvement in a cognitive test.

See also “Computers at Home: Educational Hope vs. Teenage Reality”: “Students posted significantly lower math test scores after the first broadband service provider showed up in their neighborhood, and significantly lower reading scores as well when the number of broadband providers passed four.” These reports should give technology cheerleaders pause: you aren’t going to get better results simply by lashing a computer to a teacher’s back and telling him to use it.*

To be a good teacher, you still need the weird combination of skills and mindset mentioned above. If you don’t have it or aren’t willing to develop it, I doubt anything imposed on an individual teacher from the outside, like a mandate to use technology, is going to do much for that teacher or for his or her students. If you want to really improve teaching, you’ll need to take an approach similar to the one Facebook and Google take to hiring hackers: a relentless focus not on degrees, which have dubious value in predicting achievement, but on finding the best people and making sure they stay. Finding the best teachers is different from finding programmers—you probably can’t tell who’s going to be a good teacher before they hit the classroom—but you can at least acknowledge that you’re not going to get good people merely by saying, “use iPads in the classroom.” Steve Jobs and Bill Gates didn’t have iPads in their classrooms growing up, and maybe that’s part of what gave Jobs the vision necessary to Turn On, Geek Out, and make the iPad.


* I had a computer in middle school and early high school that I used to master Starcraft and various other computer games, until I somehow realized I was wasting my life and smashed my Starcraft disks in the driveway. I sometimes use this analogy when I explain the situation to friends: some people can handle snorting the occasional line of coke without getting addicted; it’s just a fun way of spending a Saturday night. Some people can handle computer games in the same way. I discovered, at the time, that I’m not one of them, and, worse, that I’ll never get those three or so wasted years back. Now I tend to find video games boring and can’t play for longer than half an hour to an hour at a stretch, while I’ve trained myself up to writing effectively for three to six hours at a time. The first draft of this essay, for example, took me about two hours.

Will we ever find out what happened to Flip Video?

According to the San Francisco Chronicle, “Cisco Killed The Flip Cam A Day Before It Was Going To Get A Cool New Live Broadcast Feature.” Which is pretty frustrating: why kill the product right before a major upgrade whose development costs are presumably already sunk? The WSJ has one possible answer in “After Cisco Sacrifices His Baby to the Gods of Wall Street, Flip Founder Jon Kaplan Speaks!”, where Kara Swisher says that axing Flip was an “effort to assure Wall Street that it was no longer serious about its wacky foray into the consumer market.” But did it have to be so public? So symbolic?

And it is symbolic: Arik Hesseldahl points out that Cisco lumps the revenue from Flip into an “other” category on its financial statements. He then goes on: “This ‘other revenue’ totaled $2.6 billion in Cisco’s fiscal 2010, up from $1.6 billion in fiscal 2009. The biggest single factor for that billion-dollar boost was $317 million in Flip camera sales. You read that right: Cisco just shut down a business that brought in $317 million in sales in its last fiscal year.”

He says, “Make no mistake, the Flip was and is a culturally significant product.” It was, and, as regular readers know, I almost never write about consumer gadgets because most of the time there’s no point and people who write about them are just wasting their breath. But the Flip was fun in that shocking, surprising way the original iPods were. Gadgets rarely have that effect—they’re as rare as, or maybe rarer than, a book that really speaks to me. But a book is forever, while gadgets come and go.

I think it’s the pointlessness of closing Flip that annoys me so much: Flip’s team made a fun product, and a corporate leviathan is killing it just because it can. Unfortunately, posts like this one aren’t likely to have much of an effect. There’s a Facebook page devoted to saving Flip, but it only has 407 members as of this writing, which in Cisco terms is indistinguishable from zero.

Still, David Pogue’s post “The Tragic Death of the Flip” has 13 pages of comments, most from people who had the same reaction I did. Killing a beloved product is counterproductive, considering how hard it is to develop and sell one, and I still wonder why Cisco axed the business instead of selling it. A hundred million dollars is presumably better than zero. But I’m not sure we’ll ever find out.

EDIT: Some readers point out that still-video hybrid cameras like Panasonic’s will likely take over Flip’s market. Could be, but I think the two serve different people: the Panasonic cameras are a lot more expensive and in key ways less fun to use. I have a Canon camera for pictures, and while it’s great for what it is, Flips are more approachable and more portable.

The world is getting better, In the Plex edition

From Steven Levy’s In the Plex: How Google Thinks, Works, and Shapes Our Lives, an astonishingly good and detailed book that, as of page 146, doesn’t feel padded:

[. . .] the founders themselves embraced ‘Don’t be evil’ as a summation of their own hopes for the company. That was what Google was about: two young men who wanted to do good, gravitated to a new phenomenon (the Internet) that promised to be a history-making force for good, developed a solution that would gather the world’s information, level the Tower of Babel, and link millions of processors into a global prosthesis for knowledge. And if the technology they created would make the world a better place, so would their company; Google would be a shining beacon for the way corporations should operate: an employee-centric, data-driven leadership pampering a stunningly bright workforce that, for its own part, lavished all its wit and wizardry on empowering users and enriching advertising customers. From those practices, the profits would roll in. Ill intentions, flimflammery, and greed had no role in the process. If temptation sounded its siren call, one could remain on the straight path by invoking Amit Patel’s florid calligraphy on the whiteboards of the Googleplex: ‘Don’t be evil.’ Page and Brin were good, and so must be the entity they founded.

Ambition linked to knowledge of how to execute is evident throughout the book, but especially here: the company’s major players aren’t content merely with being big—they want to be big and good, with a presumably evolving definition of what “good” means. This is a bit like the United States itself, which isn’t collectively content merely to be—there’s a very long cultural strain of wanting to be an icon or role model. That desire often leads the country into unfortunate lurches, though they mostly seem to get corrected as time goes on.

Reading the news on a day-to-day basis often gives one a sense of doom and disaster. Reading a book like In the Plex reminds one that the world is going places even if politicians and the politics they make don’t realize it. The world is big and strange, and it’s getting more so over time—if one takes the time to notice. Google may or may not “be a shining beacon,” but its goals are hard not to admire, even if they’re cloaked in religious language (“the straight path”). I use Google most days without thinking about all the thought behind the company, which is busy making the world a different place very fast.

It helps that Levy is telling the story; much as in Insanely Great: The Life and Times of Macintosh, the Computer That Changed Everything, he manages to compress a great deal of information and personality into a small space. He imparts some of the sense of magic Google itself is supposed to inculcate—notice the reference to “wit and wizardry”—and some of the sense of optimism that we can do things if we really want to.