Video Games Live — concert review

A friend and I saw Video Games Live, the concert featuring primarily music from video games; the show was emphatically so-so, mostly because the music kept being interrupted for banal reasons, chiefly related to defending the idea of video games as an art form. The structure of the concert went like this: the musicians would play for five to ten minutes, then a guy would show up to declare that video games are ART, DAMMIT! or run a contest, or show a video game, or pick his nose, or whatever. Then the music would resume. But is a show devoted to the music of games really an ideal venue for arguing that video games are art? At other concerts I’ve been to, no one comes out to defend Beethoven or The Offspring as art: it’s merely assumed. You’ll know video games are art when people stop claiming they are and merely assume that they are.

I feel the worst for the musicians themselves, who presumably haven’t spent upwards of 10,000 hours in practice just to play underdeveloped pieces that, to highly trained ears, probably sound bombastic or manipulative, the way bad romance novels seem to literary critics. You could see them looking at one another when the conductor/showman stopped to extol the virtues of video games and drench himself in glory for putting the show together.

You may notice that I haven’t mentioned much about the music: that’s because the show wasn’t really about music. Some video game music is interesting and deserves serious attention; Final Fantasy is particularly famous for its soundtracks, and the Mario theme has become a pop-culture cliché. But you won’t find attention to music at Video Games Live: look elsewhere for that.

Without being able to discuss much of the music, someone writing about the concert is left to discuss what the nominal concert really engages. Tyler Cowen, who asks similar questions about a dizzying array of phenomena, has taken up the status of video games as art, which he engages a little bit here regarding a New York Times piece and also here. Salon.com is asking the same questions, but is more rah-rah about video games. I don’t think anyone has argued that video games don’t “matter,” whatever that means in this context. It seems unlikely to me that games will have a strong claim to art until they can deal with sexuality in a mature way—which paintings, novels, poetry, and movies have all accomplished.

We’ll know video games are art when their defenders stop saying that video games are art and merely assume they are while going about their business. This change happened in earnest with novels around the late nineteenth and early twentieth centuries, as Mark McGurl argues in The Novel Art: Elevations of American Fiction after Henry James. Maybe it’s happening now with video games. If so, I don’t think Video Games Live is helping.

One good thing: my friend won tickets. So the only cost of the show was opportunity, not money.

Careers—and careerism—in academia and criticism

Careers in criticism examines what D.G. Myers thinks can be done about the possible problem of lousy literary criticism. It’s worth reading, but I suspect that the other problem, which goes undiscussed in his post, is the difficulty of deciding what is good criticism: many people complain that lots of academic and other criticism is bad (I probably count myself in their ranks much of the time), but they tend to disagree about what would be good in its stead. Deciding is particularly hard in a field where wildly divergent ideas of what constitutes quality exist. Therefore you get… gridlock, high school politics, and so forth.

How to solve this? Myers says:

[Elberry] thinks that I am suggesting that “critics should write about less well-known books,” but I suggest this only as a method, a practical expedient, for undertaking their real responsibility: namely, to contribute to literary knowledge. The demand upon critics (in the university and out) must be, not to “write something new and different,” but to add something new and different to the store of human understanding.

I bet that most people who are writing just to “write something new and different” would argue they are adding to the store of human knowledge. I definitely agree with Myers’ formulation on a high level but am not sure how to implement it on a lower level. The best ideas I can come up with resolve issues in academic publishing: right now, it can take years to publish an essay in a peer-reviewed journal, which then locks it behind paywalls on the Internet. The lag raises the obvious and uncomfortable question: if it takes three years to publish a paper, is the paper really that important? That this process takes forever is hardly new; Lucky Jim mocked it in the 1950s.

My solution: have peer-reviewed journals “publish” online, and have publication be a link to the author’s paper on the author’s website. The journal’s editor could also copy that paper to the journal’s own site after anonymous peer review. That way, the information is freely available, especially to people in countries where most universities can’t afford journal subscriptions under the present model; the theoretical “size” of a journal could be limitless, although the practicalities of reading would probably still limit that size; there would still be a recognized body of work that makes up, say, “Modern Fiction Studies”; and the journal could still issue a print edition every n months or years for those who prefer it. The journal would lose the subscription revenue that currently flows through publishers, but that stream seems small enough that universities could replace it in return for the prestige of housing the journal. Alternately, the exceedingly low cost of web publishing—one could buy server hosting with 200GB+ per month transfer limits and so forth for $100/month—could obviate the (relatively) high cost structures that journals currently have while reducing barriers to entry.

Current top-notch journals have no incentive to adopt this model, as it would challenge their hegemony, but if lesser journals began adopting it and scholars preferred it, the quality of such wiki-like journals would rise, and competition might force top-notch journals to adopt the same strategies if they want to retain their position. Since publishing in English lit seems mostly a prestige and influence game, this strategy has few drawbacks I can perceive. If anyone knows of a reputable journal (which is to say: one backed by a university, with at least a few years of regular publication) that’s already doing this, I’d love to hear about it.

The other change is one I read about on the Freakonomics blog: require peer reviewers to say publish or don’t publish on each paper and to give comments, rather than giving comments with the implication that, if they’re not incorporated, the paper will automatically be rejected. Instead of a three- to four-draft round-robin time-waster of questionable benefit, a peer reviewer would have to say yes or no to the paper in its current condition on the first iteration, and the reviewer’s comments would be an option rather than a requirement. This structural change seems less important than the one above.

Anyway, given that I’m in grad school for English lit, expect more on this topic in the future, since I’m now tasting the peer review that many others have called bitter and find that they’re mostly right.


EDIT: Myers has a follow-up post, with a response to some of my comments, here.

Life: Critics and artists edition

Stolen from Terry Teachout:

“A man who tells me my play is very bad, is less my enemy than he who lets it die in silence. A man, whose business it is to be talked of, is much helped by being attacked.”

Samuel Johnson (quoted in James Boswell, Journal of a Tour to the Hebrides)

More words of advice for the writer of a negative review

Nigel Beale quotes Helen Gardner:

“Critics are wise to leave alone those works which they feel a crusading itch to attack and writers whose reputations they feel a call to deflate. Only too often it is not the writer who suffers ultimately but the critic…”

Beale asks: “Which is great and poetic and all, however, is silence enough?”

To me, the chief function of the critic ought to be to explore a work as honestly as possible and to illuminate it to the best of her abilities. This means openness, and it means being willing to say that a work is weak (and why), as well as showing how it is weak. In other words, you should be able to answer the who, what, where, when, why, and how of it, with an emphasis on the last two.

One should squelch “a crusading itch to attack and writers whose reputations they feel a call to deflate” if one is attacking merely to attack, or merely because someone’s balloon is overinflated. For example, Tom Wolfe seems a frequent and, to my mind, unfair object of ridicule among critics. But if you’re rendering a knowledgeable opinion that happens to be negative, you’re doing what you should be doing, and what I strive to do. Often this means writing about why a book fails—perhaps too frequently.

Good reviews and Updike

Every attempt at review and criticism ought to be good—but that doesn’t mean positive. A review that is “good” in the sense of well done and engaging might be a negative one. In an ideal world, the book should decide that as much as the critic.

John Updike’s rules for reviewing are worth following to the extent possible. I would emphasize three of them:

1. Try to understand what the author wished to do, and do not blame him for not achieving what he did not attempt.

2. Give him enough direct quotation–at least one extended passage–of the book’s prose so the review’s reader can form his own impression, can get his own taste.

5. If the book is judged deficient, cite a successful example along the same lines, from the author’s oeuvre or elsewhere. Try to understand the failure. Sure it’s his and not yours?

In the end, I think such rules are designed to keep the reviewer as honest as the reviewer can be. I keep coming back to the word “honesty” because it so well encapsulates the issues raised by Beale, Updike, Orwell, and others.

I especially like the “direct quotation” rule because there are no artificial word limits on the web, meaning that you should give the reader a chance to disagree with your assessment through direct experience. Quoting a sufficient amount of material gives others a chance to make their own judgments. Merit can be argued but not proven: thus, a critic can avoid silence and unfair attack.

As the above shows, I like Beale’s answer—”no”—which seems so obvious as to barely need stating. I’d rephrase Gardner’s assertion to this: “beware of relentlessly and thoughtlessly attacking.”

The Aeron, The Rite of Spring, and Critics

In Blink: The Power of Thinking Without Thinking, Malcolm Gladwell quotes Bill Dowell, the lead researcher for Herman Miller during the development and release of the now-famous Aeron in the early 1990s; I’m sitting in one as I type this. The Aeron eventually sold fantastically well and became a symbol of boom-era excess, aesthetic taste, ergonomic control, excessive time at computers, and probably other things as well. But Dowell says that the initial users hated the chair and expressed their displeasure in focus groups and at testing sites. According to him, “Maybe the word ‘ugly’ was just a proxy for ‘different.’ ”

That’s a long wind-up for an analogy: Helen Gardner might be telling us that when we instinctively dislike something, we may be reacting against its novelty rather than judging its real merit, as critics and listeners notoriously did at the premiere of Stravinsky’s The Rite of Spring. She’s wise to warn us about that danger, because it’s how people who pride themselves on taste and knowledge become conservative, stuffy critics. If we’re saying something is “bad” merely because it’s “different,” then we’ve effectively died aesthetically, because we’re no longer able to expand what “good” means. One thing I like about Terry Teachout’s criticism and his blog, About Last Night, is that he has strong opinions but still very much retains aesthetic suppleness.

But the Aerons and Ulysses of the world are exceedingly rare. Dune and Harry Potter aren’t among them. Joseph O’Neill’s Netherland at least might be, which I concede obliquely in my post about it.

Most works of art are, by definition, average.

The question is: to what extent is that a bad thing? Maybe none at all: an average novel doesn’t cause the death or disfigurement of children, or propagate social inequality, or do any number of other pernicious things. Its chief ill is that it wastes the time of the person who reads it and perceives it as average (as opposed to the person who reads it and judges it extraordinary, as many Harry Potter readers evidently have).

Milan Kundera thinks otherwise—in The Curtain, he writes, “… a mediocre plumber may be useful to people, but a mediocre novelist who consciously produces books that are ephemeral, commonplace, conventional—thus not useful, thus burdensome, thus noxious—is contemptible.” He gives himself a key out here: the word “consciously.” I doubt many writers consciously set out to produce commonplace books, and so they may be rescued from the burden of Kundera’s scorn. Like the criminal justice system, Kundera separates those who knowingly commit a crime from those who do so accidentally.

You need to have read widely, however, to be capable of telling the average from the incredible, and those whose effusive praise for Harry Potter and Dan Brown splatters the web show they haven’t. Hence, perhaps, the hesitance many Amazon reviewers show toward low scores, which one of Beale’s commenters observes.

The Aerons of Art

I now look at the Aeron as beautiful, and to me the over-stuffed office chairs that used to symbolize lawyerly and corporate status look as quaint as black and white photos of Harvard graduation classes without women or minorities. If we’re open to seeing the new, I think we’ll be safe enough in condemning the indifferent and pointing towards the genuinely astonishing works that are very much out there.

Edit: The Virginia Quarterly Review weighs in.

The chronic fear of reading’s demise

As if you needed more on reading and its benefits (as I discuss here, here, here, and here), see People of the Screen from the New Atlantis. It’s a long article worth reading in full, but these paragraphs stand out:

Whether one agrees with the NEA or with Bloom, no one can deny that our new communications technologies have irrevocably altered the reading culture. In 2005, Northwestern University sociologists Wendy Griswold, Terry McDonnell, and Nathan Wright identified the emergence of a new “reading class,” one “restricted in size but disproportionate in influence.” Their research, conducted largely in the 1990s, found that the heaviest readers were also the heaviest users of the Internet, a result that many enthusiasts of digital literacy took as evidence that print literacy and screen literacy might be complementary capacities instead of just competitors for precious time.

[…]

Just as Griswold and her colleagues suggested the impending rise of a “reading class,” British neuroscientist Susan Greenfield argues that the time we spend in front of the computer and television is creating a two-class society: people of the screen and people of the book. The former, according to new neurological research, are exposing themselves to excessive amounts of dopamine, the natural chemical neurotransmitter produced by the brain. This in turn can lead to the suppression of activity in the prefrontal cortex, which controls functions such as measuring risk and considering the consequences of one’s actions.

Writing in The New Republic in 2005, Johns Hopkins University historian David A. Bell described the often arduous process of reading a scholarly book in digital rather than print format: “I scroll back and forth, search for keywords, and interrupt myself even more often than usual to refill my coffee cup, check my e-mail, check the news, rearrange files in my desk drawer. Eventually I get through the book, and am glad to have done so. But a week later I find it remarkably hard to remember what I have read.”

[…]

But the Northwestern sociologists also predicted, “as Internet use moves into less-advantaged segments of the population, the picture may change. For these groups, it may be that leisure time is more limited, the reading habit is less firmly established, and the competition between going online and reading is more intense.” This prediction is now coming to pass: A University of Michigan study published in the Harvard Educational Review in 2008 reported that the Web is now the primary source of reading material for low-income high school students in Detroit. And yet, the study notes, “only reading novels on a regular basis outside of school is shown to have a positive relationship to academic achievement.”

I realize the irony of sharing this on the Internet, where it’s probably being read on the same screens criticized by the study, and perhaps demonstrating the allegedly rising divide between screen readers and book readers.

Compare the section above to my post on Reading: Wheaties, marijuana, or boring? You decide, which discusses the innumerable articles on reading’s decline (or maybe not). Alan Jacobs has an excellent post on Frum and Literature in which he observes that reading, especially of real books, has probably always been a minority taste and probably always will be. Orwell opens his 1936 essay “In Defence of the Novel” by saying “It hardly needs pointing out that at this moment the prestige of the novel is extremely low, so low that the words ‘I never read novels,’ which even a dozen years ago were generally uttered with a hint of apology, are now always uttered in a tone of conscious pride.” The whole piece is available in the collection Essays.

Finally, consider From Books, New President Found Voice in the New York Times, which I’m sure every book/lit blogger has already linked to by now:

Much has been made of Mr. Obama’s eloquence — his ability to use words in his speeches to persuade and uplift and inspire. But his appreciation of the magic of language and his ardent love of reading have not only endowed him with a rare ability to communicate his ideas to millions of Americans while contextualizing complex ideas about race and religion, they have also shaped his sense of who he is and his apprehension of the world.

Mr. Obama’s first book, “Dreams From My Father” (which surely stands as the most evocative, lyrical and candid autobiography written by a future president), suggests that throughout his life he has turned to books as a way of acquiring insights and information from others — as a means of breaking out of the bubble of self-hood and, more recently, the bubble of power and fame. He recalls that he read James Baldwin, Ralph Ellison, Langston Hughes, Richard Wright and W. E. B. Du Bois when he was an adolescent in an effort to come to terms with his racial identity and that later, during an ascetic phase in college, he immersed himself in the works of thinkers like Nietzsche and St. Augustine in a spiritual-intellectual search to figure out what he truly believed.

Without his experience in books, Obama probably wouldn’t be where he is, and millions of others must silently share the same condition of having achieved what they have largely thanks to their learning. But they seldom get a voice in the pronouncements about reading’s decline, and those articles seldom acknowledge that, while society might lose a great deal from the allegedly decreasing literacy of its members, those members will lose vastly more on an individual level, and few will even realize what they’ve lost.

(Hat tip Andrew Sullivan.)


Why are so many movies awful?

The short answer: they’re ruled by marketing, not by art, feeling, or emotion, to the extent that those characteristics can’t be captured by marketing.

The longer answer comes from Tad Friend’s article in the January 19, 2009 issue of The New Yorker, “The Cobra: Inside a movie marketer’s playbook,” which describes how movies get made. Today, that answer is nearly identical to the answer to how movies get marketed. My favorite quote comes a little less than midway through:

” ‘Studios now are pimples on the ass of giant conglomerates,’ one studio’s president of production says. ‘So at green-light meetings it’s a bunch of marketing and sales guys giving you educated guesses about what a property might gross. No one is saying, “This director was born to make this movie.” ‘ “

“Pimples on the ass of giant conglomerates”: it’s a great metaphor that conveys precisely how much vast corporations care about art, as well as the relative power of those working within studios. Creativity isn’t dead, even among major studios’ presidents of production, but neither is cynicism, as the article shows in too many places to enumerate. “Cynical” might be too light a word—if Julie Salamon’s The Devil’s Candy: The Bonfire of the Vanities Goes to Hollywood is somewhat cynical, then nothing except perhaps nihilism describes the Hollywood marketer’s mind as portrayed by Friend.

Read the whole article for more: it never comes out and baldly states what’s obvious, as I have. This blog only occasionally strays into territory dealing with movies; this analysis of Cloverfield is my only extended treatment of one, although this post discusses movie versions of Ian McEwan’s Atonement and George Crile’s Charlie Wilson’s War. Perhaps it isn’t a coincidence that the movies I tend to pay the most attention to are based on books; according to Friend’s article, such movies are “‘pre-awareness’ titles: movies like ‘Spider-Man’ whose stories the audience already knew from another medium […]”, like virtually all the movies that have made extraordinary amounts of money in the last decade. Movies also tend to raise a book’s profile enough to encourage me to read it when I otherwise wouldn’t; the movie version of Bernhard Schlink’s The Reader is an example.

I suppose the same question regarding why so many are so bad could be applied to books too, but books are often less obvious: critics seem to have (slightly) more power, and the sheer number of books makes the bad ones easier to ignore. Call it strength in diversity. Movies are noisier, and because there are fewer of them, each one collects more attention. But because they cost so much to make, they become a numbers game; I care vastly more about aesthetic worth than opening weekends. But, at least as shown in this article, Hollywood cares about those numbers.

It shows in their product.


EDIT: Wynton Marsalis, by way of Alex Ross:


At the root of our current national dilemmas is an accepted lack of integrity. We are assaulted on all sides by corruption of such magnitude that it’s hard to fathom. Almost everything and everyone seems to be for sale. Value is assessed solely in terms of dollars. Quality is sacrificed to commerce and truthful communication is supplanted by marketing.

In addition, see my comments on Julie Salamon’s The Devil’s Candy: The Bonfire of the Vanities Goes to Hollywood for more on how the way movies are made affects the movies that are made.

Further comments on John Barth’s Further Fridays

(See my initial laudatory post here.)

John Barth’s Further Fridays continued to delight till the end, and it hovers ceaselessly around literary questions about form, character, ways of telling, and meaning. Do those sound boring? Maybe when I list them, but when they become part of Barth’s stories—and the Further Fridays pieces feel more like stories than essays—they come alive like a Maryland blue crab. Consider this great big chunk of quote—appropriate, maybe, for someone who often delivers great big chunks of novel—which also shows some of Barth’s gift at the level of sentence and idea:

I confess to having gotten increasingly this way [as in, insisting on just the facts, whatever those are] myself over the years—an occupational side effect, I believe, in the case of those of us for whom the experience of fiction can never be innocent entertainment. We’re forever sizing it up, measuring ourselves against its author, watching to see how the effects are managed and whether all the dramaturgical pistols that were hung on the wall in act one get duly fired in act three. We’re like those musicians who can’t abide background music: They can’t listen except professionally, and if they’re not in the mood to do that, they prefer conversation, street noise, silence—anything but music.

Right: notice the quick metaphor of the dramaturgical pistols—alluding to the idea that a gun seen in an early chapter should be fired in a later one—and the slightly more developed metaphor of the musician. The musician idea is particularly relevant to Barth, who played as a young man—more on that later—but it also expresses one of the central themes in his work: that innocence prolonged is detrimental to the person holding it, and that naive readings eventually give way to sophisticated and experienced ones. Such readings show the growth not just of the critic, writer, or reader, but also of the individual, whose early actions and impressions should be tempered by experience. But some attempt to prolong naiveté foolishly, while others forget to try to see the perspective of the innocent, or the childlike joy that can lead to great art. So what is one to do? Muddle along as best one can, Barth seems to argue, and learn as much as one can about that imperfect state we call life and the reactions of other smart or wise people to it.

I realize that the above paragraph sounds almost like self-help lite, but it would be a mistake to see Barth that way, and he discusses far more than the nature of a particular story. Elsewhere, he deals with literary categorization, which has never been among my favorite subjects because it often seems to generate vastly more noise than music, and its combatants often mistake that cacophony for a symphony. Barth does a reasonably good job—which is to say, as good a job as one can, given the subject matter and the persnickety pedants likely to be interested—of not being caught in its brambles. Adding sufficient qualification makes for fewer explosions but greater harmony; as Barth says of Roland Barthes’ Writing Degree Zero:

“the whole of literature,” [as Barth quotes Barthes] “from Flaubert to the present day, becomes the problematics of language.” If only he had been content to say that “the problematics of language”—indeed, the problematics of every aspect of the medium of literature, not language alone—becomes one of several prominent field-identification marks of our literature after “Flaubert.” But that kind of reasonable modification, I suppose, de-zings such zingers.

Given the choice of being mostly right and demure or mostly wrong and provocative, Barth takes the mostly right path. Still, he’s not “demure” as in boring, and his essays are filled with unusual zest. Sometimes the footnotes are the best parts; the blockquote above is one, and he sneaks another comment into a footnote, though it’s reiterated elsewhere in the body text: “As for twentieth-century literary Postmodernism, I date it from when many of us stopped worrying about the death of the novel (a Modernist worry) and began worrying about the death of the reader—and of the planet—instead.” The sentiment has its tongue far enough in cheek not to be taken completely seriously, and yet it’s accurate enough to merit further consideration. Maybe in jokes we tell the greatest truths that could never slide by as bald assertions.

The piece that definition of Postmodernism comes from was published in the 1980s, although it reprises arguments from 1968 and 1979, about which one can read more in The Friday Book. But its concerns are still germane: global climate change fears fuel cataclysmic scenarios that aren’t implausible, as do fears about the death of reading. Reading’s demise seems to be greatly exaggerated—what do most of us do online and via e-mail if not read, as Steven Berlin Johnson argues in Dawn of the Digital Natives—but the quality of reading seems to diminish apace online. Still, websites with global reach and many visitors seem fairly literate, and the only well-known, sub-literate blog I can think of is Mark Cuban’s, which I won’t dignify with a link. Then again, Cuban is also sitting on such a giant pile of cash that I doubt he cares about literacy, or Postmodernism.

Like Barth, I seem to have wandered a bit, and also like him, I’d like to circle back round to the main point of this post, which is to emphasize how good Further Fridays is. Sections repeat and reiterate earlier ideas, but I think of the repetitions more as variations in different keys than as irritants, and I think Barth would like that metaphor: he played jazz as a teenager and writes of going to Juilliard only to discover he had little or no talent for music (my own musical talent, if I had any to begin with, has probably become undetectable thanks to lack of exercise). Milan Kundera also took up writing after music, and I wonder whether other good examples of musicians-turned-writers exist aside from Alex Ross, who turned from music to write about music. Barth is as self-referentially modest about his musical abilities as he is about his other points, almost cloaking himself in faux humility when he writes, for instance: “My modest point is that the story of your life might be told as a series of career moves, or love affairs, or intellectual friendships, or houses lived in, or ideologies subscribed to (even magazines subscribed to), or physical afflictions suffered, or what have you, and that every one of those series might be recounted from very different perspectives, to very different effect.” Indeed: and we appreciate that, and the way it implicitly makes the case for reading. He preaches like a native of a religion that he nonetheless realizes fewer people practice:

If you happen to be a refugee from the Dorchester County tide marshes… as I was and remain, and particularly if you aspire to keep one foot at least ankle deep in your native bog while the other foot traipses through the wider world, it is well to have such an off-the-cart smorgasbord [of reading] under your belt, for ballast.

Incidentally, I’m fascinated with the catastrophic view of reading and its discontents: consider Jonathan Franzen’s introduction to How to Be Alone:

I used to consider it apocalyptically [there’s that end-times terminology again] worrisome that Americans watch a lot of TV and don’t read much Henry James. I used to be the kind of religious nut who convinces himself that, because the world doesn’t share his faith (for me, a faith in literature), we must be living in End Times.

I wonder too, as this blog probably demonstrates. Still, I’d argue that you can’t avoid keeping one foot in your native bog, regardless of whether that metaphorical bog is the boring suburbs of Bellevue, Washington, as it was for me, or the foothills of the Himalayas, or New York City; you might as well do so in a way that makes you part of the wider rather than the narrower world, and reconcile the two as best you can. The most efficient way to do so, it seems to me, is the way Barth recommends: promiscuous and wild reading, ideally of books as interesting as Further Fridays.

John Barth’s Further Fridays is Recommended

I’m about halfway through Further Fridays, John Barth’s second “essay, lecture, and other nonfiction” collection, and find it as pleasurable and intelligent as The Friday Book, his first. Perhaps my favorite essay thus far is “A Few Words About Minimalism,” which is anything but minimalist and contains this gem:

But at least among those of our aspiring writers promising enough to be admitted into good graduate writing programs… the general decline in basic language skills over the past two decades is inarguable enough to make me worry in some instances about their teaching undergraduates. Rarely in their own writing, whatever its considerable other merits, will one find a sentence of any syntactic complexity, for example; inasmuch as a language’s repertoire of other-than-basic syntactical devices permits its users to articulate other-than-basic thoughts and feelings, Dick-and-Jane prose tends to be emotionally and intellectually poorer than Henry James prose.

(Link (obviously) added by me.)

That second sentence is delicious: perhaps Barth overindulges in other-than-basic syntax to make a point, but the way the sentence’s structure reinforces the point its content conveys is what makes it so impressive. Not only that, but it makes a case without over-making it: the key word “tends” gives Barth enough wiggle room to concede that one can find emotionally and intellectually powerful writing in relatively simple prose, and he never states that complex prose must be emotionally and intellectually more powerful.

That virtue of statement and qualification is present throughout; Further Fridays is the rare collection that doesn’t overstate its claims (I’m still thinking of you, John Armstrong), although it does so at the cost of necessary complexity. You can’t make nuanced arguments about the nature of literary categorization, or movements, or literature, in soundbites and slogans, and it’s also hard to do so from a dogmatic political or philosophical position. Fortunately, Barth seems to occupy none—or, as he might say, his lack of position is his position—and the result is a feeling, no doubt illusory, that I am reading from the perspective of someone who simply likes to read and likes stories. And I learn from him: I can throw in that “no doubt illusory” comment to protect myself from obvious criticisms while still making the overall point about the nature of criticism.

Expect more on Barth shortly.

How to Read and Why — Harold Bloom

Harold Bloom’s How to Read and Why is mostly an exercise in close reading that tries to show how to learn by doing. The particular works Bloom chooses, ranging from Shakespeare to Borges to Proust, seem less important than the act of criticism itself; unlike most criticism, however, this book makes explicit the moral and other lessons it wants you to take away. In some ways, How to Read and Why is a cheerleader for the personal critic inside all of us, like a book about eating that’s really for amateur restaurant reviewers on Yelp.com. How to Read and Why could also be a broader version of Shakespeare: The Invention of the Human, with short essays on a variety of authors instead of one.

Bloom passes judgment—in a very “judicious” sense of the word—on authors and works, as when he says that “Absorbing as Crime and Punishment is, it cannot be absolved of tendentiousness, which is Dostoevsky’s invariable flaw.” That Bloom didn’t say “crime” in lieu of “flaw” shows his seriousness as a writer, and maybe also his unwillingness to seize a terrible but obvious pun. Elsewhere, some of Bloom’s analysis manages the difficult trifecta of being subtle, meaningful, and non-obvious, as when he writes that “Turgenev, like Henry James, learned something subtler from Shakespeare: the mystery of the seemingly commonplace, the rendering of a reality that is perpetually augmenting.” The word “augmenting” is perhaps off-key, but we understand what Bloom meant. Although I don’t know whether she learned it from Shakespeare, Virginia Woolf might have accomplished the same thing.

These insights, or descriptions, or banal commentary, depending on your perspective, are sprinkled throughout the book. In each section—“chapter” is too large a word for them—Bloom follows essentially the same formula, moving through short stories, poems, novels, plays, and then novels again: he gives a close reading of the work, states what he thinks is unusual about its style or content, then draws a lesson or lessons. Some “lessons” are negative, in that they show what not to aspire to, while others are positive; still others occupy the nebulous middle, like this passage about Chekhov’s “The Student”:

Nothing in ‘The Student,’ except what happens in the protagonist’s mind, is anything but dreadfully dismal. It is the irrational rise of impersonal joy and personal hope out of cold and misery, and the tears of betrayal, that appears to have moved Chekhov himself.

In weaker hands, such a comment might be merely sentimental and, worse, fatuous. But here it feels supported—organic—although showing how would require pages and pages of quotation. It shows the acknowledgment of cold and misery and the reality of those things through a single word: “irrational.” With it, Bloom nods at reality and then transcends it, as “The Student” does.

Nonetheless, not everything in How to Read and Why is flawless. Bloom writes that “[…] short stories, whether of the Chekhovian or Borgesian kind, constitute an essential form.” Essential form? What the hell does that mean? What’s a non-essential form? Regardless of their essentiality or lack thereof, I still don’t care much for short stories because, as I’ve often explained to friends amused at this reasoning, by the time I’m into one, it ends. It takes a novel to really hold me and to make me want to invest in it. He makes, however, as strong an intellectual and academic case for short stories as one is likely to find, although Francine Prose, James Wood, and others also argue in their favor. Regardless of their defenses, I still don’t like them.

Bloom also doesn’t, and perhaps can’t, explain the pleasures of reading except in terms of themselves, and perhaps that’s for the best: such sensations are difficult if not impossible to convey, but to his credit they are implied. For Bloom, pleasure mingles with duty, one becoming the other in the mature mind; as he writes, “I want to contrast Shakespeare’s abandonment of the work [toward ceaselessly reinventing consciousness] with Tarphon’s [a Rabbi of the same generation as the more famous Akiva] insistence that we are not free to abandon it.” The two differ perhaps for religious reasons; of Shakespeare’s inclinations we know little, but it seems that he probably had no God looking over his shoulder, while Tarphon had the possibility of disappointing God with him at every moment. The contrast between the two men is hardly surprising; it has been claimed that the novel arose to take the place of God, meaning that a specialized form of imaginative narrative art overtook belief in literal manifestations of a deity beyond time and space, and there is even a book with the very deliberate and appropriate title The Secular Scripture, which studies romance.

I’ve focused primarily on the short story section of How to Read and Why, and it’s emblematic of the strengths and weaknesses of the book as a whole. The major problem with Bloom’s approach is that sophisticated readers already do this, and they might even read critics who help them do it better. People who don’t read, or who seldom read, probably won’t be interested. That leaves naive readers who would like to learn more, but I can’t imagine that a vast number of them are waiting for Harold Bloom’s instruction in the art of reading. It’s possible some exist, to be sure, but it seems more likely that someone interested in becoming a sophisticated reader will have already done so, and someone uninterested is unlikely to read a book to learn more about reading. How many people occupy the marginal space of seekers who haven’t found much yet? Some, perhaps—the cover proclaims that How to Read and Why was a New York Times bestseller, for whatever that’s worth. Still, I could see How to Read and Why being an excellent gift book, or an excellent rejoinder to those who ask, “why bother reading?”