Foreigners

Books and Literature, Culture

Jennifer duBois’s second novel, Cartwheel, is an example of psychological realism done right, that is to say, not locked into a desperate schema designed to make each act of each character a Newtonian inevitability set into elegant motion within the heatless, frictionless word problem of The Way We Live Now; rather, it’s a fraught, cock-eyed rending of minds made ever more inscrutable the deeper we delve into them, a sort of dissection that ends with a pile of catalogued parts that can never again be whole. This is how the mind really works, or I should say instead, how it is, but too much realist writing has never gotten over Freud and sees consciousness as a knot to be untangled into a genealogy. Cartwheel is full of minds that make less sense as the book goes on, less sense to us and less sense to their own fictional selves. Superficially a novel about a notorious murder, it becomes a book about the obscurity and impossibility of motive, about the odd fact that self is as much a conceit as narrative; that as there aren’t actually stories in the world, neither are there selves.

The novel is both loosely and precisely based on the murder of Meredith Kercher in Perugia, or put another way, on the Amanda Knox affair, reset to Argentina, but following the general contours of that case closely. I’d heard about but hadn’t read duBois’s first novel; I should be honest and admit that the descriptions of the thing didn’t especially appeal to me; the setup sounded like the sort of prize-bait I try to avoid. I should maybe give it a chance. But I like lurid literature—I mean, I’m not above rereading Call Me by Your Name for the sex, for instance—and I was slightly fascinated by the Kercher/Knox story, as, I think, any former exchange student must be. So I grabbed Cartwheel from the library as a wild card and figured I’d read it in a night.

In fact, the murder story is a sort of MacGuffin; though the details are changed, it’s almost immediately evident that we’re not going to stray too far from the Knox case. What we get, instead, is a series of chapters that wind in and out of the consciousness of Knox-manquée Lily Hayes, her sort-of boyfriend Sebastien, her father, and Eduardo Campos, the official prosecuting her case. All of them are thoroughly introspective, fully convinced of their own insights into their own motives, and thoroughly, tragically deluded. But—and this is the book’s neat trick, its welcome departure from the norm of American “literary fiction”—their delusions are never that they misapprehend something essential about their own character, that they believe themselves to be altruists, say, when they’re really egoists; that they’ve convinced themselves they’re doing good when in fact they’re just reenacting some evil done to them long ago.

Instead, they keep circling a core of self that isn’t there; they’re not orbiting a star, but a black hole, a well so deep and dense that it approximates solidity through a kind of nonexistence. They’re all reference and no referent; a set of contingencies reflecting only each other. Now, there are some predictable bits; the effect isn’t absolute. Lily and her family are burdened by the loss of an earlier child and sibling; Eduardo, the prosecutor, is confounded by a fickle, disastrous marriage. And still, the pervading sadness in Andrew and Maureen Hayes, Lily’s divorced parents, feeds mostly on itself, even as they both—both of them intellectuals—cast back in hopeless hopefulness to the ur-tragedy of losing a child, while Eduardo’s dogged, moralistic pursuit of Lily in the murder case wills itself into a parallel with the flights of Eduardo’s wife rather than being naturally generated by them. Lily is the most obscure of all of them, and we can believe that she both did and did not murder that girl; that if she didn’t, she might have; that if she verifiably did, she might not.

This is a kind of fiction I wish we got more of: subversively resistant to the idea that human beings are a quantity to be known. In it, we are utterly alien to ourselves, and our lives aren’t hemmed in by the conventions of narrative and psychology, but keep messily, insistently transgressing them.

As in, Jerk

Books and Literature, Culture, Economy, The Life of the Mind

Dave Eggers’s The Circle has won a lot of critical praise from traditional book-review types and a lot of derisive snorts from those of us—I count myself, somewhat embarrassingly, among them—who have both computers and business degrees for its comic ignorance of how computers function, what the internet is, or how these corporations are created, funded, and managed. But, although it’s easy enough, and fun, to giggle at a book whose near-future information architects say “in the cloud” with the same skittish incomprehension as your mother or your sixtysomething boss; although it’s easy enough, and fun, to plunk Eggers into the dour, self-reverential gentlemen’s club of Twitter-hating Franzens, forever clearing their throats and folding back their broadsheets in the Anglo-Saxon gloom of a midday liver lunch with the brocade curtains pulled; the less funny, unfortunate truth about The Circle is that it isn’t just an obtuse book, but a bad one: badly written, poorly conceived, and deeply uncharitable.

The main character is a young woman named Mae, freshly rescued from a dull job at a public utility somewhere in the Central Valley and plopped in an entry-level position in the novel’s eponymous Google-manqué. She rises through the company, and although the narrative frequently stops to mete out in actual, numerical detail the various impressive scores she receives on the company’s 100-point internal grading system, our principal experience of Mae via a limited third-person voice that never exits her rather limited head is a skein of never-ending idiocy and incompetence. Eggers clearly set out to write her as a naïf whose unfolding experience and awareness of The Circle both prompts and mirrors our own, but he accidentally wrote her as one of the shallowest dummies you’re likely to encounter this side of a Tom Friedman column; you wonder how she made it out of high school, let alone how she managed to get hired by the choicest tech firm in the world, connections or no.

She wanders around bedazzled by everything, and the prose reads like an unapproved merger of bad Young-Adult writing and SkyMall catalog copy. More skillfully done, this sort of thing might come off as satire, but here it just reads as clumsy writing, and when errors pop up, you can’t quite tell if they’re meant to be mocking Mae’s misunderstandings or if they’re just errors. There’s a particularly odd passage in which, while taking us on one more interminable tour of the company’s campus and all its myriad wonders, we encounter, “Another Circle team [that] was close to dissembling tornadoes as soon as they formed.” Does Eggers mean disassembling? Would that actually make any more sense? Would it be better writing?

Nitpicking copyediting issues is trivial and a little unfair, but there’s a broader problem here, a problem that the giggly eviscerations of Eggers’s internet non-comprehension hint at: the texture is all wrong; The Circle’s ersatzness is . . . ersatz. There are piles of detail—the naming conventions of buildings, the layout of the campus, the many projects that many teams in many departments are working on, etc.—but there’s not the slightest sense that any of these things exist except as convenient ideograms for Big Google Company Doing Big Google Company Things. A really good workplace novel, a really good workplace satire—Then We Came to the End; On the Floor—hauls the essential unreality of working life out of the weird blandness of working life as much as out of the particulars. Eggers famously (notoriously?) bragged that he hadn’t done any research on tech firms when writing this book, which is clearly not true. If anything, the book suggests an unhealthy infatuation with the self-presentation of those very internet companies, all happy-happy people playing ping-pong in the artisanal cafeteria while dreaming up the next disruptive inflection point in human history. Eggers exaggerates all this to a point light-years beyond absurdity, but he never manages to land a convincing blow because his target is itself an illusion.

So a dumb character stumbles through a poorly conceived fictional company until—spoilers—she arrives at her Winston Smith moment and loves Big Brother. Except she always loved Big Brother; her moments of doubt are petty and procedural; she always gets over them, and quickly. She must betray her friend and mentor, Annie, but the betrayal is such a foregone conclusion, so telegraphed, so obvious the moment Annie’s own first doubts emerge that you mostly wonder why it took so long to get there. Mae has already abandoned her family. Oh, and she drives her ex off a cliff, literally. That scene has a quality of slapstick. Intentional? In a book so hasty and thoughtless, it’s hard to tell. Anyway, she loves Big Brother. The Circle has turned—in about a year—into a force of global domination, and Mae is, like, cool with that. Eggers’s ideology appears to lie with her dead ex, but he’s dead, and in any event, his speeches mostly sounded like bad college-paper op-eds. I suppose Eggers made him insufferable in order to make him more complicated, to make the point that unpleasant people and cranks and kooks can be right, too, but the guy mostly comes off as a bozo. And despite living for years now with revelations about government spying and subversion of the online world, about a complex interplay of antagonism and collusion between spy agencies and tech businesses, this is a straight-up evil corporation; the hapless government is just hapless. Mae isn’t manipulated; she isn’t tortured; she isn’t a skeptic converted or corrupted by The Circle’s promise of wealth and power; she’s nothing that would make her interesting, and she doesn’t change. Eggers sets his come-to-Jesus moment in the middle of a megachurch.

But what bugs me the most, and what makes this book worth reviewing as an artifact of an attitude, is the unfair and uncharitable way Eggers writes the rest of us idiots, who appear here only as a vast, unthinking mass eating whatever shit the internet shovels at us out of some desperate, pathetic, mewling, self-worshipping desire to be loved, or something. One of The Circle’s supposed-to-be terrifying slogans is “Privacy is theft.” Well, no. Privacy is respect. But sharing (“Sharing is caring” is another scare-phrase here) is human; the desire to know and to be known is one of the bases of cognition, conscience, and sentience. There’s nothing wrong with lampooning narcissism, and the internet enables plenty of it, but this evident belief that there is something fundamentally disordered about rating your favorite restaurants or poking your friends is a load of snobbish, patrician garbage. If modernity and modernism are the history of human atomization, of the centrifugal forces of technology and economy flinging our communities and families ever farther away from each other, of the dislocation of the human mind and the human soul, then how do we find ourselves in an era when some small part of our old communities—gossipy, yes, and sometimes without secrets, and very judgmental, and yet, because we know them, very often good—can be regained, only to find that many of our writers, who are supposed to be concerned with things like the mind and the soul, hate it, and think that we’re children and fools for wanting it, even if our desire should indeed be tempered with reserve.

I am not a technological utopian. I don’t think that “information wants to be free” is an adequate ideology for the perfection of the human condition. I also don’t think that governments should be in the intellectual property racket, and I think Google and the like ought to be broken up, although I’d probably give them a pass until I got done with the banks. I worry about my privacy, both the privacy that I sign away when I log onto Gmail or Amazon and the privacy that’s wrenched from me by the panty-sniffing fear salesmen in the US government. I would gladly eat my own eyeballs before reading another restaurant review on TripAdvisor. I find Instagram slightly upsetting, but I’m willing to admit that’s probably just me getting a little bit old.

But it’s unkind and unperceptive to assume that people’s sometimes ill-considered flight to convenience is a mark of some vast inadequacy. Insofar as we encounter any citizens of the internet in Eggers’s book, they’re a horrifying stampede of gimme-gimme-gimme status-obsessed zomboids who will kill a man through an overabundance of likes. Yeah, well, maybe some of them are just busy moms with long commutes to lousy jobs in a shit economy. Maybe some of them are wannabe reviewers who got one too many rejection notes from McSweeney’s and decided to start their own blogs. Maybe some of them are gay dudes in Russia or activists in Myanmar who find pseudonymity convenient and safe. Maybe some of them just like taking pictures of their fucking food. Maybe some of them were forced to move across the country for work and this is just how they stay in touch. Maybe some of them are uncomfortable with the compromises they make to bank online or e-file their taxes. And maybe some of them are trolls and misogynists and scammers and crooks. But they are, in fact, people, with whom The Circle’s sympathies supposedly lie, and for whom it hasn’t got any time at all.

Update: I was not the first to this post’s title, or at least, the punchline. Credit and attribution where due.

We Like Ike, Man

Culture, Education, Uncategorized, War and Politics

I graduated from Oberlin College ten years ago, and if the college was in many ways an exemplar of the sort of economic inequality and unfairness that define the waking American dream, a charming oasis of unostentatious but everywhere evident family wealth amid a lot of Cass Gilbert architecture plunked obscenely in the middle of one of the poorest counties in Ohio, then it was also a fine example of what a college or university ought to be. Yes, it had its share of bureaucrats, and yes, there was an occasional adjunct, though usually just visiting for a year right out of graduate school, but there were precious few deans; I never once met a “director” of anything other than, maybe, campus dining; the departments were run by faculty; the office of career services was a distant backwater, an uncomfortable fishbowl near an underutilized computer lab; we got stoned and complained mightily about the fascist administration of then-college president, Nancy Dye, about the progressive, radical spirit of the school disappearing in the assault of Ivy-League-ism, but in retrospect I most remember that everyone seemed genuinely to believe that the purpose of the whole shebang was for everyone to read a lot, think a lot, and learn a few things. There were a bunch of professors, most of them seemingly well-paid, and not very many students as far as the ratios went. It was very expensive, but you could multiply the number of kids times the number of dollars per kid and come up with a reasonable cost for operating such an institution for a year. Select any random college employee, and you could figure out without too much trouble what it was that he or she did all day.

So you can imagine the revelation of entering a business school at a large public university almost a decade later. Great gouts and floods of ink have already broken the dam and overrun the banks of the conversation about “the rising costs of higher education,” and I won’t bother repeating all the data that others have collected, collated, and explained better than I ever could. But I can’t help but share my anecdotal astonishment at the number of inessential administrators running around. Even the dean (especially the dean?) of the business school drifted from here to there on campus in a slightly overlarge suit that seemed expressly tailored to contain both a man and his aura of uselessness. Of the dozens I encountered, only one manager, a sensible, lovely woman named Linda, far down the hierarchy of pay and title, ever managed to get anything done; I mean, she got everything done, from our schedules to the hiccups in our travel arrangements when we went to conferences abroad.

I don’t mean to cast aspersions on their characters. One of the bad habits in the radical’s critique of any institution is to presume evil intentions on the parts of people who simply, unthinkingly serve. Most of the people involved in the spiraling scam of university administration are just doing their jobs, however hopelessly unnecessary they may be to the actual operation of an actual organization dedicated to the real teaching of students. Making some assistant director for recruitment the object of moral ire is like hating on some corporate spend analyst in the bowels of Enron. How many of us would give up our livelihood at the vague prospect that our employer might be causing an indefinable and distant harm? The assistant director of recruitment just wants to make his quota for the year, save enough money for a vacation, pay his rent, go to a nice restaurant from time to time. Does he realize, in some general way, that he’s implicated in the personal debt crisis, or the Taylorization of learning? Hey, he went to grad school, too. He’s no dummy. But you gotta feed the monkey.

This isn’t to say that there’s no moral blame; it is to say that you’ve gotta amortize that blame over an awful lot of associate deans and provosts and boards of trustees. We are uncomfortable with the idea of distributed guilt, but there it is. What makes the problem intractable is precisely its lack of some monstrous secret master, some center, not to mention the essential ordinariness of all the participation by all the beneficiaries of a rent-seeking education apparatus that largely apes finance and government by siphoning money from the general wealth and moving it to certain select cadres of the population. That last bit, of course, makes the whole thing even more confounding, since the scam is so non-particular; you can’t even blame the institutions of education, which are only comporting themselves to an even broader social and economic pattern. The modern university is to contemporary American society what that vice-provost for media relations is to the university: a functionary, just doing its job.

So I’ve been thinking about David Petraeus, a former military commander in Iraq and Afghanistan and the director of the CIA for a year before an inconsequential sex affair involving a sycophant biographer and a bankrupt Tampa con artist caused him to resign. He was hired by CUNY to teach the sort of bogus celebrity seminar that appeals to college administrators because it predominantly involves reading Economist articles and consulting group reports and considering how to reproduce them in the form of PowerPoint presentations, in other words, exactly what an assistant director does for much of the day. This is a slightly more advanced version of the kind of education foisted on primary and secondary students, with the slide show template filling in for the bubble sheet. It’s mostly notable in that it requires no thought; it’s an exercise in formatting. For this, the university offered to pay the general $200,000, later reduced to $150,000, and then, when a load of malcontents refused to shut up about it and administrators got worried about bad press, finally, they knocked it down to a one-buck honorarium.

This original scandal was mostly about money. Adjuncts were starving in the outer boroughs, while some four-star jerk was going to get paid $10,000 an hour to show up and gallop through material prepared for him by his own underpaid assistants. What was fascinating about this episode was less the imbroglio itself than the reaction of the participants; most notable to me was the initial incomprehension and painfully slow dawning of the problem on the administrators who brought the general to the table to begin with. Their first reaction was visceral disbelief. But, but, he’s David Petraeus. Former four-star general and CIA director David Petraeus! These are people for whom status and career recognition hold intrinsic value—name and title function as a kind of irreducible gold standard of human worth. The idea that one might not richly compensate such a guy just for showing up was so alien to them that they could not, at first, understand what the fuss was about. The relationship between this and the underpayment of temporary faculty was thoroughly beyond them.

But eventually they did come around to the idea that there was, at least, some sort of fuss, and they grudgingly reduced his pay. With the economic argument now largely undercut, opposition to Petraeus’s appointment found a new target in the idea that he is an abominable war criminal who presided over unspeakable violence and torture in the illegal occupation of other countries, and who now sullies the university with his very presence. Since I am, and have always been, deeply opposed to US military action abroad, the invasions and occupations of Iraq and Afghanistan in particular, I’m innately sympathetic to this view, but I also believe that we err in assigning this sort of direct and unique moral culpability to Petraeus, that we commit, in effect, the mirrored error of his boosters, who generally proclaim him the hero and genius who rescued one, possibly two American wars from utter catastrophe.

Petraeus strikes me as a skilled bureaucrat who rose steadily through the ranks of America’s largest and most byzantine bureaucracy, but I find it hard to believe that a man who assigns Brookings Institution readings and Washington Post op-eds as anything other than object lessons in bad prose can be any kind of genius. His legendary success in Iraq was no success at all, not even by America’s own self-interested, self-designed, and self-applied metrics, and his supposedly ingenious reinvention of America’s Iraq occupation was never more than a tactical redeployment cribbed from a centuries-old colonial playbook. Remember the glowing reports of military brass gathered in dark conference rooms watching The Battle of Algiers? We’ve been to this theater before. His proponents would cast him as some kind of Eisenhower; his opponents as some kind of latter-day Heydrich. In reality, he was a functionary, and for all the horror perpetrated under his command, he was only the latest in a long line of commanders going back many decades. The war in Iraq, let’s not forget, began not under George W. Bush, but under his father; the US never ceased its low-level conflict under Clinton; Bush Jr. just re-upped; Obama continued it, although it seems as if he may have been outfoxed by the Iranians into withdrawing at last. The US project in the Middle East dates to the passing of influence from Britain to America after the Second World War; we’ve been fighting conflicts and proxy conflicts in the region for half a century. Petraeus may indeed be a criminal, as the internal auditors at Lehman were criminals, but in our zeal to condemn, let’s remember that all of these guys just showed up for work and did what they were told. Better men would have resigned; good men would never have found themselves in such a position to begin with; but there aren’t that many good men in the world, and most Americans do what they’re told.

None of this absolves Petraeus of responsibility or culpability. He was, after all, a general, but the main characteristic of his life and career is not the vicious contemplation of how to bring violence, misery, and death to peoples around the world, but rather the stubborn inability to think about that violence, misery, and death, to consider it in any way other than the unfortunate but necessary ancillary outcome of some other thing that had to have been done. The very same unthinking allows the President of the United States of America to stand before the United Nations and say that the US harbors no imperial agenda even as it frequently invades other countries. This is taken as evidence of extraordinary hypocrisy or cognitive dissonance, but both interpretations require an element of cognition that’s wholly lacking. The principal characteristic of these sorts of pronouncements is their lack of deliberateness and their lack of thought. These are just rote recitations of obligatory memorization; is it any wonder that a society led by such cliché machines chooses to measure intellectual achievement through standardized tests?

I think this is the really salient point. A brutal and unfair society requires a population that conceives of the intellect in terms of taking instruction. Even in my own student days, when testing was far less important, I can recall teachers and exam proctors stalking up and down the aisles between desks warning us of the dire consequences of not carefully reading the instructions. A culture thus educated develops mental habits that revolve around taking and interpreting commands. Its sense of duty and ethics isn’t “is this right?” but rather “am I doing this right?” In this regard, the appointment of a David Petraeus, or a John Yoo, or a Condoleezza Rice to prominent positions in the academy is significant less because these people are monstrous than because they are expressly not so. When Yoo is asked about the torture memos he authored and replies that he was just providing the executive with what it requested and required, people tend to see obfuscation, but I see an instructive kind of honesty: he just can’t imagine that one wouldn’t provide what his boss required of him. He didn’t torture anyone.

This, by the way, was Arendt’s misunderstood point—she had the bad luck to coin a very quotable phrase that distracted from it. What enables evil is not so much the capacity of ordinary people to be converted to dark purposes, but instead the incapacity of people to think about purpose and consequence. Our dilemma is that this form of thoughtlessness is exactly what the reformers of education at all levels seek. Unfortunately, for the most part, they too are unable to think about what they’re doing. The people who hire a Petraeus only perceive that his instruction might in some way help some students do what he did, and what they themselves have done to a lesser degree: enter an institution, serve it, and move upward through its ranks to their natural place in the overall order. Does it occur to them that this is Huxley’s dystopia, a life of servitude in a predetermined class interspersed with the occasional recreational bunga bunga and some Coors Light Lime? No. They haven’t read it. But you can divide into groups of four and prepare an in-class presentation for the next time we meet. Here is a Harvard Business Review article summarizing the case. Use it as the basis for your work.

Schnell! Eeeeaaasssy. Um?

Culture, Economy, Media, Movies

I walked into Elysium a few minutes late, and Matt Damon was getting the beat-down from a robot, to whom he’d had the temerity to talk back. This robot was the only character in the film whose motivations were clear and whose actions were a function of its character. Nothing else made any sense.

In the future, an orbital post-scarcity society with the capacity to manipulate complex organic systems at the sub-molecular level maintains Fordist manufactories on Earth. Is it just to give the proles something to do? A single line of dialogue to the effect of, “We gotta keep them busy or they will revolt,” might have covered this flaw, although how an earthbound population could revolt against a well-armed space station, manifest numerical superiority or no, is quite a question, and in any case, most of the people on Earth don’t appear to have work, so there goes that theory. William Fichtner plays the vicious capitalist who runs this robot mill. He does most of his work via computer terminal in a hermetically sealed office; naturally, we wonder: why is he on Earth at all? Couldn’t he just Skype? Rather more to the point, in an orbital society capable of manipulating individual atoms, why is there still enterprise capitalism? Is it like contact sports, a vicious and anachronistic pastime, practiced by only a few professionals, kept around for entertainment and kicks? Well, our industrialist suggests that it’s essential he get his company back to profitability, and he is willing to assist Jodie Foster in a coup to do so. Wait, wait, wait a minute. She offers him a 200-year contract to build Elysium’s missiles and robots. Does this not imply a competing firm, or firms? But there’s only one space station. How are these other firms in business? Who’s buying their robots and missiles? Am I going insane? What day is this?

The movie desires to be an allegory of illegal immigration, the hispanophone have-nots of a SoCal favela relentlessly throwing themselves over the Rio Grande of Near Earth Orbit in order to get to the better lives Elysium has to offer. Wait, what? Oh, no, I’m sorry. They’re going for miracle cures. Elysium doesn’t offer a better life. It just offers to fix your boo-boos. Three out of every four shiploads of immigrants get blown to smithereens, so, like, it appears that you do not increase your chances of beating that cancer, if you consider the actual odds. Here again, the motivations are completely nonsensical, and needlessly so. We could understand people risking their lives to escape this wretched Earth in order to make a new life in space, but all evidence suggests that even those who get there and get their thyroid problems and sugar diabeetuss cleared up get deported right back to our little ball of pollution. Your cancer’s fixed, but you’re still going to starve to death. Hm.

Meanwhile, on Elysium, Jodie Foster plays a French fascist. Fascism, like capitalism, seems an odd ideology for a post-economic paradise, but I suppose assholes will always be with us. She plays some kind of secretary of defense, and she growls that the feckless leadership of Elysium is going to get them all killed, or something, despite the fact that everything on Elysium seems to be going absolutely swimmingly, and the few Earthers who do manage to crash land in this vast La Jolla in the sky appear to be swiftly rounded up and returned. Again and again and again, nothing about this world justifies her snarling aggression (nor her French, but I suppose it’s just meant to convey aristocratic awfulness, so we’ll laissez-faire). Why not make immigration a really confounding problem? The smugglers have figured out how to get hundreds, thousands of people onto Elysium. It’s upsetting the political order. They’re voting for OBAMA! The white peoples is gettin’ restless. Jodie Foster seule pouvait les sauver !

So Jodie Foster wants to take over Elysium for no reason, and she has William Fichtner rewrite the code for the Elysium operating system. Which he can do, because he or his company built it? Why is he in such desperate straits, then? Why is he bugging Jodie Foster for contracts? Why doesn’t he take over? I don’t know, I guess he read the script, and it says that he didn’t. He’s motivated by money in a world where money is irrelevant. Elysium has no stores, no ATMs. It’s just houses and swimming pools. Robots bring you champagne. At no point do we see any sort of transactional exchange, except of course when William begs Jodie for a contract. No one has a reason to do anything. William Fichtner goes to Earth, writes the magical spell to take over Elysium, and gets shot down by Matt Damon. Damon is dying because he got irradiated building robots in the factory that exists for no reason. Apparently this happens all the time, because they have a robot whose design features make it useful solely for the purpose of pulling irradiated humans out of a robot chamber. Yo, why didn’t you just send a robot into the radiation chamber in the first place, guys? No, Jim! You’ll flood the whole compartment! He’s dead already.

Anyway, Matt Damon and a gang of dudes who have kept Mazda 626s operational for two centuries shoot down his airplane. Matt Damon has been technologically augmented, and they download the shit into his brain. They unscramble it on the Dell desktop that I had in my office when I was an administrative assistant 10 years ago. Aw, shee-it, it’s the codes to do something. Argue argue. Run run run run run. Jodie Foster sends after Matt Damon an augmented assassin whose motivation is that he fucking loves killing shit. Eventually, everyone ends up on Elysium, because although this is the future and they are able to manipulate matter at the atomic level, if Jodie Foster is distracted for a sec, any asshole can just roll right through the gate, because Elysium’s automated systems read the script and realized that’s what they were supposed to do because of the plot. For no particular reason, the assassin stabs Jodie Foster in the neck with a piece of glass and decides that now he wants to rule Elysium. Has he ever even been to Elysium? Who cares? Jodie Foster bleeds to death on the floor in a closet with a woman who is only in this movie to prove that Matt Damon is not gay, even though Matt Damon is clearly gay; the only person with whom he has a convincing emotional connection is the sexy DL Latin thug car thief who is his best friend, who gets killed, and whose death is Damon’s one moment of actual pathos. These two obviously were boning, but don’t worry, look, there’s this woman!

So the movie kills its main villain in the middle of the last act for no reason, and then reminds you of the narrative senselessness of this act by occasionally cutting back to the room where Jodie Foster is literally lying dead under a tarp, which is the one allegory this movie gets right, except that it is an allegory for this movie. Then some shit happens, and then it turns out that there is no reason at all for the material privation and medical hopelessness on earth, and then the movie is over. I suppose there was some decent production design, in the sense that it all looked better than Star Wars Episode I. Foster does a fine villain, but her character makes so little sense that her performance was lost, and although Damon does the everyman with some skill, he gets lost as soon as the action starts. Bourne proved him a capable action hero in the hands of a capable action director. Here, alas, no.

Look, the future as an allegory for the present moment is effectively the whole point of science fiction, so the movie’s intentions were in the right place, but Blomkamp didn’t think about his concept. You don’t need a Tolkienian backstory to build a realistic fictional world, but consistency matters. If no one has any reason to do anything, or if they act constantly in contravention of their own apparent interests, then all an audience can do is be confused. The movie struggles to present its characters in the tradition of psychological realism. This may be the future, but these people are just like us, etc. etc. And yet, because everything these characters do is in the service of a story that ought not be taking place at all according to its own rules and logic, all these emotions and psychologies are rendered not more, but less real.

A Pound of Music

Art, Books and Literature, Culture, Religion, Science

How do you solve a problem like Steven Pinker?

Ross Douthat notes the curious convergence: that “the defining practices of science, including open debate, peer review, and double-blind methods,” which were, according to Pinker, “explicitly designed to circumvent the errors and sins to which scientists, being human, are vulnerable,” lead inexorably to the economoral worldview to which Pinker has (surely a coincidence) already subscribed. Fuck Theory, meanwhile, notices that Pinker seems unfamiliar with the philosophers he name-drops to open his essay. (By the way, Pinker also mangles Bergson’s élan vital, elsewhere and otherwise in the essay, if only in passing.) FT might be too kind. He damns our scientician for having failed to read the primary sources, but the real knock is that Pinker could have avoided a lot of these basic errors just by reading Will Durant. He could have read Wikipedia! Is there anything as unforgivably lazy in this great age of the internet as a man incapable of feigning authority over a couple thousand words?

Look, I’m a materialist. I don’t believe in the supernatural. I’m an atheist. I believe that the mind is an emergent phenomenon of the brain. You might say that I constitute the natural constituency for Pinker’s argument, which is what makes its obtuseness and inadequacy so annoying. It gets everything backward. He says, for example, that science wipes away “the theory of vengeful gods and occult forces [and] undermines practices such as human sacrifice, witch hunts, faith healing, trial by ordeal, and the persecution of heretics.” No word on what testable hypotheses prohibit second degree murder or which codicil of evolutionary psychology demands that we not remove the mattress tags, but let’s allow the point. It is true, after all, that the sorts of bureaucratic rationalization that led to more modern systems of trial and punishment are kissin’ cousins with Pinker’s over-broadly defined science. Nevertheless, we end up in a bizarre territory wherein morality is defined by utility but the “science” behind it is a transcendent ideology:

Though everyone endorses science when it can cure disease, monitor the environment, or bash political opponents, the intrusion of science into the territories of the humanities has been deeply resented.

The implication of this complaint, and the essential thesis of the article, is that science, whatever that is, uniquely among all human disciplines and endeavors, is not subject to utilitarian analysis, is not merely a mathematical function, a delta of positive change to “human flourishing.”

In fact, I agree. I think it would be a shame to look at the advancement of scientific knowledge, the immense growth of our species’ physical insight into the world and the universe, as a merely additive process whose sole measure is the number of new patents, cures, and minutes of extended battery life. Yes, there will surely be some practical outcome of learning that dolphins give each other names, but there is something essentially miraculous in simply knowing it to be true. And this is why I find Pinker’s claim so utterly bizarre, as if science must stake out a monopoly on the extraordinary, all our other transcendent experiences subsumed to its totalitarian scope. Pardon me, but isn’t that just weird? Religion claims to give life meaning, but by proving the Biblical creation myth false, science gives life meaning. Replacing one false, totalizing claim with another is an odd way to run a debate team, if you know what I’m saying.

But then, this is where Pinker really wanders down a dusty path:

Science has also provided the world with images of sublime beauty: stroboscopically frozen motion, exotic organisms, distant galaxies and outer planets, fluorescing neural circuitry, and a luminous planet Earth rising above the moon’s horizon into the blackness of space. Like great works of art, these are not just pretty pictures but prods to contemplation, which deepen our understanding of what it means to be human and of our place in nature.

Slow down there, Percy Bysshe! Okay, I agree that pictures of the Earth from its own satellite are pretty fucking lovely, but what is, and from whence comes, sublime beauty? What does it mean to “mean” to be human? When you say, “our place in nature,” I presume you mean something more than our position on the food chain and our direct impact on global climatic systems. Cognitive neuroscience may lay claim to the question of how and why our particular subset of upright mammals perceives beauty as it does, but clearly we’re talking about something more than a reducible pleasure response to a Fibonacci-derived golden ratio. Why do we find the Hubble deep field beautiful? Why, actually, do we artificially color it to make it beautiful? And what is “beautiful”?

These are lines of inquiry that real scientists (as opposed to commercial popularizers) and scholars of the humanities and artists and authors think about with much greater depth and subtlety than you’d suspect reading this crackpot essay, which prefers to lob vague accusations of disastrous postmodernism at the humanities as if it were an essay in Commentary in 1985. I mean, if Pinker reveals himself as something less than a scholar of philosophy at the beginning, he shows himself as an even worse art critic later on. Cheering for a new, scientific art like a bizarro Soviet, he actually says:

The visual arts could avail themselves of the explosion of knowledge in vision science, including the perception of color, shape, texture, and lighting, and the evolutionary aesthetics of faces and landscapes.

This is the rough equivalent of James Turrell demanding that chemists avail themselves of the unknown discipline of gas chromatography. Yo, Pinky, it’s Robert Smithson calling from 1970. He’d like to sell you a large, earthwork time machine. Artists have long embraced science and technology in their work and their practice. Has Pinker ever heard of Steve Kurtz? Does he know about collectives like Informationlab? Is he aware that the Oberlin Conservatory established the Technology in Music and Related Arts program in 1967? Does he read science fiction? Shit, I mean, has he heard of a little-known avant-garde filmmaker named James Cameron? Physician, heal thyself.

More Sinned Against than Manning

Culture, Justice, Religion, War and Politics

We all knew that the conviction of Bradley Manning was a fait accompli before the trial began, and the government’s petty and vindictive rejection of his plea offer only certified that the amoral keepers of order, beginning with the President himself, considered this sinful spectacle of vengeful formality a necessary bit of instruction, pour décourager les autres. I use the word sinful advisedly. The fact that the government went through with the trial indicates how truly despicable the powerful become when they’ve been embarrassed, how small they are, and how distant from what is good.

You know, I joined Twitter because I wrote a novel and it seemed wise to weasel my way into a few more online forums in anticipation of its publication, but I’ve been gratified to make some interesting new friends and acquaintances, several of whom are devout Christians. I’m not religious in any practical sense of the word, but I’ve always been conservative by temperament, however radical my politics, and although I’m no more inclined to believe that Yahweh is real than I ever was, I do find that, as I’ve gotten older, I’ve become both more austere in my moral judgments and more communitarian in my social thinking, habits I certainly associate with the Judaism of my youth, however wishy-washy and Reform it may have been. I don’t know, this Shabbat is my brother’s Yahrtzeit, and I always get sentimental. Nevertheless, even if I don’t believe in God and feel no affinity for the concept of a god, I do believe, abidingly, that there is such a thing as justice, and that justice is more than some dull codex of laws, fairly and blindly applied. There should be room for forgiveness, tolerance, and exigence, and when we afflict the weak and the powerless with our harshest punishments, we traduce justice and sully ourselves. The desire to punish, the eagerness to see punishment, reveals, I think, a human soul, or being, or whatever you want to call it, that secretly fears this very outcome for itself—trial, judgment, and punishment for its sins.

The government tortured Bradley Manning; they tried, literally, to drive him mad, likely in the belief that he would then give up some other participant in a concocted conspiracy. They later accused him of vanity, but is there anything more vain than powerful, paranoid men imagining their own secret persecution? Still, I want to resist the urge to let my heart break for him, because I think that he’s stronger and braver than me; had I been subjected to what he endured, I would not have endured. I doubt I’d have done what he did to draw the vicious ire of the Executive and the military to begin with, even if I’d had the opportunity. Fear would have stopped me, or malaise, or plain indifference. So it seems indulgent to offer him my pity, and instead I would offer my anger.

Manning is a prisoner of politics and conscience. As I sit on my designer (if dog-stained) couch in my pretty little row house in my lovely city wondering how much more furniture or art my ex will want to take as we dissolve the last eight years of life together, it feels vain to have any opinion, to share any sentiment at all. It feels decadent. But my god, we were twenty-three when we met! We were trawling through Pittsburgh bars and going to museum parties. We were the same age as Manning when they arrested him. And I believe that what is really decadent is to cast him as some speechless other, with whose experience and suffering I can feel no connection. I would have hit on Bradley Manning if I’d met him in a bar when I was twenty-three. I can’t help but feel. Another political little queer. The difference, of course, is that he was in the right place, or the wrong place, and he was more formidable than me.

What does the Manning case say? I won’t say mean, because what does anything mean? It says that our rulers are small and vengeful and afraid. The language of security and peril that’s come to cloak every official announcement is decadent. The hounding pursuit of those who undermine and question the imperatives of security and the reality of the peril is decadent. The hollow liturgy of a show trial is decadent. I’ve never been much of a nationalist, never felt especially inspired by America, always known that we are a nation like any other, built on bones and fairy tales as much as anything else, but I do appreciate the power of myth to model society, and this lousy episode really makes you wonder, what is our national myth? What does America have to offer itself anymore? We’ve become very adept at hurting people for nothing. I wonder: is that all?

Poor? No.

Culture, Economy, Media, War and Politics

To me, the most interesting reaction to the recent Guardian/Glenn Greenwald reporting on the US Government’s vast, creepy, and stupid engagement in various programs of indiscriminate eavesdropping is the shock and disbelief evinced by, mostly, partisan defenders of the President to the effect that there is something deeply disturbing and unbelievable about the idea that a “high-school dropout” could have advanced, succeeded, and come into a six figure salary. Never mind that the technology industries have always valorized the dropout narrative and that there are prominent tech billionaires offering substantial grants to kids who skip college in order to do something useful with their lives. I’m reminded of some educational activists who point out that in the eyes of the New York Times et al., $250,000 a year is too poor to live in Manhattan, but a teacher making fifty grand holds an entitled sinecure and lives high on the hog. The point is . . . no, the question is: is $200,000 a lot of money to make in a year? Well, in the eyes of the professional classes and their media interlocutors, the answer is: no, if you’re the right kind of person; yes, if you’re not.

The people who express these doubts in the media, who find it so extraordinary that a guy with a mere GED could make what still passes for a decent living in this country, and indeed, find it in a sense offensive that this should be the case, as if the lack of a particular kind of credential is in fact a moral demerit that renders personal financial success not merely suspect but anathema to the proper order of an economy, are the sort of people who eagerly get on board with notions like, “every child should have the opportunity to go to college.” You can ignore the word opportunity; it is a mere formalism. They mean, everyone should go to college. (The obvious economic rejoinder is that if the thing is no longer scarce, it is no longer valuable. Witness, ladies and gentlemen, the Bachelor’s degree. But I digress.) Usually this exhortation is coupled with some vague notion that we—America, if you were wondering—are being outcompeted by China in the war to endow our children with “the skills they will need for the jobs of tomorrow.” No one ever quite gets around to mentioning what those skills are. I assume they mean computers. And it appears, InshSteveJobs, that what a guy or gal needs in order to figger out them crazy computers, is not a college degree, but access to a computer.

In fact, the universalizing of college education has completely elided the distinction between credential and skill. In the days when college was just finishing school for men of a particular class, there was a lot less confusion. I’ll confess to being a conservative sympathizer in certain domains, but I don’t pine for those days. They were shitty. Nevertheless, there was a very real recognition that reading history didn’t make a man fit to be a banker; it made a man clubbable, and then he learned to be a banker. But I propose to you that if Edward Snowden had a BA in English and Creative Writing from Oberlin College and went on to become a high-paid analyst at a defense contractor, no one would say boo about it, even though he would be in a practical sense no more qualified, and hell, probably less so, than any randomly selected dropout blogger. Guys, I know whereof I speak.

What about a degree in visual arts and documentary filmmaking—here I will reveal my conservative sympathies and laugh that such things exist—qualifies a person to judge, one way or the other, the professional and vocational qualifications of a person to be a data systems/IT guy? Did you even set up your own home WiFi? Was Edward Snowden a qualified employee? I don’t know, but the dispositive evidence one way or the other has nothing to do with whether or not he got into Phi Beta Kappa. The sheer de haut en bas snobbery of it is pretty astonishing, especially as it comes from the sort of technocratic centrists and liberals for whom class distinction is supposed to evaporate in the upward movement of social progress. Hey, I think the IT guy who fixes my copier who probably has a 2-year degree from somewhere makes more money than I do, but you know what, I’m just a manager, whereas he has skills.

Peeping Thomism

Culture, Education, Media

At some point in your youth, someone warned you that “this, young man, is going to go on your permanent record.” In my case, it was a high school vice principal. I’ve forgotten the infraction, but I remember the warning. The vice principal wasn’t a bad man, but he was a bit of a martinet. That’s probably a part of the job description. I knew plenty of teachers and principals who disciplined out of impatience or because of a poorly hidden streak of petty sadism, but Mr. R. wasn’t one of them; I think he held an abiding belief that structure and direction were good—not just practically good, but universally and categorically so. Most disciplinarians just believe that children, that people, are rotten. Mr. R. believed that we were basically good, just stupid. The diagnosis was correct if the prescription was wrong, and in any event he was able to moderate his meanness, especially for the hard luck kids. That, I think, was the real mark of his moral character. He was never vindictive, and while I disagree with his code to this day, he applied it justly, which is to say, unequally, and contingent on the circumstances. American society often views harsh punishment as a virtue, and when we complain about the unequal application of the rules, we usually mean that rich guys get off too easy, but Mr. R. knew that the real problem is that poor guys get it too hard. Man, did we hate that SOB, but we also thought he was kind of okay. Kids are sophisticated like that, more so than adults.

Anyway, the permanent record was one of those semi-mythical creatures that you publicly dismissed while privately fearing when you were camping in the woods and the fire had burned down. I was a rich kid in that poor town, in public school mostly because of politics related to my father’s job, and most high school discipline rolled right off me. It was a given that I’d graduate at the top of my class and decamp for some fancy college, which, indeed, I did. But I do remember the permanent record thing making me ever so slightly nervous, and if I laughed about it to my friends, then I still privately fretted that some ambitious admissions officer would haul up my file and mark me off with a red X for some past minor infraction. Now, of course, kids really do get a permanent record because schools have followed the general trend of American social hysteria and started calling the cops for the slightest infraction; detention is now a misdemeanor, and so on. That’s a shame, because the permanent record ought to be as laughable now as it ever was. Do you remember yourself when you were sixteen? Many descriptors come to mind, but fully formed isn’t one of them.

As if that weren’t bad enough, that idea that one ought to be branded with one’s own youth like a poorly considered neck tattoo, we now find not only kids, but adults (especially new adults) getting constantly dinged with the dire warning that Social Media Lasts Forever. I think this is probably patently untrue in a purely physical sense; it strikes me as probable that fifty years from now, the whole electronic record of our era will be largely lost in a sea of forgotten passwords, proprietary systems, faulty hardware, and compatibility issues. But it should also be untrue in, dare I say it, the moral sense. Educators and employers are constantly yelling that you young people have an affirmative responsibility not to post anything where a teacher or principal or, worst of all, boss or potential boss might find it, which gets the ethics of the situation precisely backwards. It isn’t your sister’s obligation to hide her diary; it’s yours not to read it. Your boyfriend shouldn’t have to close all his browser windows and hide his cell phone; you ought to refrain from checking his history and reading his texts. But, says the Director of Human Resources and the Career Counselor, social media is public; you’re putting it out there. Yes, well, then I’m sure you won’t mind if I join you guys at happy hour with this flip-cam and a stenographer. Privacy isn’t the responsibility of individuals to squirrel away secrets; it’s the decency of individuals to leave others’ lives alone.

At some point, employers will have to face up to the unavoidability of hiring people whose first Google image is a shirtless selfie. Demographics will demand it. They’ll have to get used to it just as surely as they’ll have to get used to nose rings and, god help us, neck tattoos. It’s a shame, though, that it’ll be compulsory and reluctant. We should no more have to censor our electronic conversations than whisper in a restaurant. I suspect that as my own generation and the one after it finally manage to boot the Boomers from their tenacious hold on the steering wheel of this civilization that they’ve piloted ineluctably and inexorably toward the shoals, all the while whining about the lazy passengers, we will better understand this, and be better, and more understanding. And I hope that the kids today will refuse to heed the warnings and insist on making a world in which what is actually unacceptable is to make one’s public life little more than a series of polite and carefully maintained lies.

As I Indicated, Admiral, the Thought Had Not Occurred to Me

Art, Culture, Media, Movies

This review is going to reveal Benedict Cumberbatch’s “secret identity.” The quotation marks are there to indicate that his character is neither a secret nor in possession of an identity, unless hard puncher counts as an identity. Did you ever in your wildest imagination think that Star Trek would end like this, with Spock karate-chopping a bad guy on top of a flying garbage truck in the middle of a bad CGI Star Wars set? I mean, sure, Star Trek had plenty of punching, but geez, man, it’s like, it’s like as if you hired, oh, I don’t know, Baz Luhrmann to make the Great Gatsby and he made it all about parties and clothes and dancing. Oh. Oh.

Like everything Damon Lindelof gets his hands on, Star Trek Subtitle Using Variation on the Word Dark begins with a MacGuffin, muddles into a non sequitur, and ends in a mess. Who hires this guy? My own editor noticed that a draft of my novel twice used fiancé instead of girlfriend, so presumably there’s someone, somewhere who could have read the script and told Lindelof and Abrams that none of this makes any sense. They could have very easily called back the original “Space Seed” episode, set it along the Klingon neutral zone at a moment of high tension when the Federation was searching for a strategic military advantage and had a fine, intelligent movie that also had punching, Klingons, and space battles. You could have had Khan as an object of fear, reverence, and intrigue. Kirk admires his prowess and poise; McCoy his immunological whatever; Spock his astonishing intellect; Uhura his, uh, substantial Cumberbatch. He would divide them and conquer them, but then, rediscovering their bonds of friendship and duty, the crew would defeat him, because there is no eugenically superior superman in TEAM. And hell, you could even throw in a necessary tactical alliance with the Klingons to set the stage for the Cold War plot that was the backbone of the Klingon storyline in TOS and the original films.

Instead. Now look, I’m going to spare you the “Where are the orbital defenses?” and “How come the Klingons didn’t detect ’em on the long range scanners?” I’m gonna spare you the “How far away is Kronos even at high warp?” and “What’s the effective range on that communicator again?” You may, after all, think that the main storytelling conceit of Star Trek is faster-than-light travel, but really, the main conceit is that a spacefaring civilization resembles Britain, each planet an island, its Starfleet, literally, an Admiralty. Forget all that. Despite its science fiction trappings, Star Trek is really a procedural drama. Starfleet is just its convenient institution.

Yes, you heard it here first. The man who owes Gene Roddenberry the greatest debt is Dick Wolf. Star Trek is the weekly tale of people working within an institution. This, by the way, is also its principal connection to political liberalism—not its easily-dispensed-with humanism nor its integrated crew; rather, its commitment to a universe run, for the better, by enlightened bureaucrats. The prototypical Star Trek plot is a conundrum—cultural, technological, legal—that must be solved through the application of area expertise within the confines of organizational rules and the occasional call of a higher morality or duty. It’s Law and Order in space. Act 1: unexplained thing. Act 2: investigation and preliminary diagnosis. Act 3: unexpected difficulty, delay, or complication, sometimes compounded by institutional resistance. Act 4: renewed investigation, sometimes unorthodox, leading to unexpected solution. Act 5: resolution, explanation.

People and institutions exist in Abrams’ & Lindelof’s reimagined universe, but they’re just sort of there, clogging the frame until the next face punch. I don’t object to action in Star Trek; but I do object to getting rid of the old two-fister:

As with CGI, advances in fight choreography have proceeded right past the point of more gripping physical realism and into the realm of the unbelievably hyper-real. The action is so fast, the movement so “kinetic,” to borrow the Hollywood usage, that it appears faker than the stagey fisticuffs of the old TV series. These guys are naval officers, right? Not ninjas. They pilot starships; this isn’t The Matrix. Hey, remember this little rebooted show called Battlestar Galactica, how it imagined a really gripping sort of space combat—before, anyway, it got bogged down in crackpot Lindelofian metaphysics? Remember Star Trek: First Contact? Yeah, it sucked, but the opening skirmish with the Borg vessel was pretty damn cool, AMIRIGHT? Well, whatever. Let’s just have these guys run down some hallways with guns and punch each other.

So the camera certainly moves around a lot, but there’s nothing doing. Dialogue is declaimed against a clamoring background of exploding noise, and when it does rise to the level of your noticing, it’s less the sound of voices than the smell of ham. “You are my superior officer. You are also my friend. I have been and always shall be yours,” Spock tells Kirk at the beginning of The Wrath of Khan. It is a quiet moment that comes back at the very end of the movie without a flashing arrow; here, it’s all shouted above the din. “YOU ARE MY FRIEND! WE ARE FAMILY!” It’s a disco inferno. Any pathos is in any event squished beneath the steamroller of incomprehensible plot developments, as Khan is first a terrorist, then a fugitive, then a pawn, then maybe a terrorist again, then fighting Spock on a flying thingamajig. Kirk does nothing of consequence, which is just as well, because Chris Pine, while serviceable, is no match for Benedict’s genetic, uh, endowment. Zach Quinto is a better actor, but because he never convincingly fell in love with Kirk or Khan, his jilted anger is incongruous at best. And once more for the cheap seats, it appears to take place deep within the CPUs of Skywalker Ranch.

It’s all a terrible waste of good production design and some nice costume choices (love that hat, Zach; CALL ME). Meanwhile, I know we are meant to believe the immense crap funnel of our current cinema to be an undistorted reflection of our culture’s degraded taste, and that may be so, but I yet believe that if we must have junk, it can at least taste sweeter and not smell so awfully past its expiration date.

Have Plot, Will Unravel

Books and Literature, Culture

Although I’ve called him morally obtuse, I can’t bring myself to dislike Ezra Klein. He may be just another young hack on the make in Washington, a careerist and a faddish liberal, but unlike so many of his peers, there seems to be something accidental about his success, something less gratuitous and self-willed. But it still came too early, and it ruined him. He ought to be the most popular teacher at a middle school in Columbus, or the director of a nice Reform summer camp, underpaid but decent, one of those rare grown-ups we all remember as having steadied us through the awful middle passage of our youth. Instead, he writes for the Washington Post and makes speeches at think tanks. I can’t begrudge him his success, but I do almost pity him for it; he’ll run faster, stretch his arms farther . . . . And one fine morning—

Anyway, Klein’s writing for the Post is drudgery; the interior monologue of staff-level Washington is unceasingly banal, a pseudo-economic pidgin of legalese and bad PR-firm argot so divorced from ordinary human concern or communication as to become a form of language-looking gibberish, lorem ipsum. But a friend of mine on Twitter forwarded me his brief, recent musings on Gatsby, presumably occasioned by the arrival of a new, gaudy film, and if only because it occasioned a re-reading, I had to reply.

I don’t care for the phrase, great American novel, but you can’t escape it; it exists, at very least, as a genre, albeit more aspirational than actual. American literature is littered with the wreckage of titanic Summa Theologiæ, the preferred template. Fitzgerald himself attempted that sort of thing, and isn’t it interesting that his only truly remembered work is a mere 50,000 words that could nearly make Katherine Mansfield look loquacious? Even so, no one can quite agree what it is, or what it’s about; the fact that so slim a work can mean so many things to so many people, admirers and detractors alike, suggests something at once uncanny and ineffable about it, something inevitable, a word to which I’ll return.

Gatsby isn’t my favorite novel, and you certainly won’t hear me, as you’ll hear some of its more hyperbolic admirers, call it perfect. There are a few perfect pieces of art in the world, but none of them is a novel. Fitzgerald’s lyricism sometimes gets the best of him, and he’s obviously burdened with some of the prejudices of his time, although we can never know which of these belong to the author and which of them to Nick Carraway. But you still won’t find a better-wrought or more finely honed book; 50,000 words seems like a trifle, but 50,000 words sustaining so singular a voice seem, to another writer, as impossible and daring as a guy walking a tightrope over the Grand Canyon.

So. What to make of Klein’s complaint?

“I love the writing and, for that matter most of the book. What I can’t stand is the finale.

“The book’s denouement is a series of ever-more insane coincidences. Gatsby and Daisy hit a pedestrian. The pedestrian proves to be Tom’s mistress. Tom persuades her husband that Gatsby was driving the car. The husband kills Gatsby then kills himself.

“That’s fine for fiction. Dark Knight Rises wasn’t very believable, either. But it’s a problem for a book with Something To Say. The end of the Great Gatsby doesn’t feel inevitable. It feels unlikely. And thus its lessons don’t mean much.”

This, first of all, is a misreading, and I wonder if it isn’t in part the result of a bad memory for the particular details of the story. There are some well-known problems with the internal chronology of Gatsby, but this bit of plotting builds almost from the beginning. The connection between Tom, Wilson, and Wilson’s wife (Tom’s mistress) Myrtle isn’t just happenstance; the Wilson residence is on the main route between tony Long Island and the city; and the tragic inevitability of Myrtle’s death isn’t that Gatsby and Daisy run down some pedestrian who, mirabile visu, just happens to turn out to be Myrtle, but rather that Myrtle has been waiting and watching for Gatsby’s car, which she mistakenly believes to belong to Tom Buchanan, and that she runs out into traffic to try to stop it. If the line sets are visible and the first electric peeks out from behind the black border, still, the knowledge that you’re in the theater does not a deus ex machina make.

Hey, though, opinions may differ; reasonable adults may disagree. Of all the artifices of narrative fiction, plot is the most unnatural and the most unreal. One author’s elegant resolution is another man’s overwrought coincidence, and I’m not going to ding Klein too hard for falling into the latter camp, even if I do half suspect that it’s the result of a flawed recollection from a not-recent-enough reading. What I will toss tomatoes about, however, are the “lessons.”

The idea that Gatsby is a sort of sociological survey of the Gilded Age, with the characters as archetypes playing out changing ideas about wealth, status, and morality, is an easy one, and wrong. I’m sorry, but you’ve mistaken this novel’s setting for its theme, the scenery for the schema. Fitzgerald was undoubtedly interested in money, class, and the passing away of the old guard in the face of something new, but all this is the background against which something more human moves. Gatsby is sometimes criticized for a lack of psychological depth, but this, like the desire for a less coincidental plot, is a kind of prudishness and a just-so belief about what a novel ought to be; it’s like a nice old lady looking at a piece of great modern art and sighing, But what’s it a picture of? If Gatsby lacks some of the more ostentatious experimentation with perspective and consciousness that characterized high modernism, Fitzgerald did dare to challenge the convention that every character in a book must act in Cartesian accord with his own internal machinery. Talk about coincidence! Part of the magic in Gatsby is that its characters can’t be easily explained or psychoanalyzed. Like I said: human.

Klein says:

“As Nick Gillespie writes, most of the Great Gatsby perceptively sketches a moment in which new money, new immigrants, a new economy, and new social mores were overwhelming the old order. The old order triumphs in the book, but only with the help of authorial providence. Absent that car ride, Gatsby’s story might have proven a happy one. And 88-years-later, when the film is being made by a guy named Baz Luhrmann in a country run by a guy named Barack Hussein Obama, we know who really won, and it isn’t Tom. F. Scott Fitzgerald had it right, at least up until the end.”

We can leave aside for a moment the cheap teleology of social progress at the end, there (and the Nick Gillespie piece he cites is really just about Nick Gillespie); right in the center of the paragraph is that same odd complaint, repeated. The story, absent “authorial providence,” a happy one? Once more, I’ve got to wonder when he last read the book. Gatsby’s and Daisy’s affair ends that day in the city, when she admits that she also loved her husband. She will stay with him. She won’t say that she never loved him; she can’t. When Tom tells the room that he and Daisy have been through things that none of them will understand, it’s devastating because it’s true. When, later, after the accident, we see Tom and Daisy through the window at the table together, his hand on hers, talking quietly, conspirators to the end, we are meant to realize that this was the only possible outcome with or without the accident. (As for the idea that Tom “wins,” that the “old order triumphs,” well; Tom and Daisy flee, and Nick goes home as well. An odd victory, no?)

Tom, Daisy, Gatsby, Jordan, the Wilsons, Wolfshiem—the title and Fitzgerald’s skillful deflection throughout obscure the fact that this is a book about Nick Carraway, who gives his voice and consciousness to the novel. Nick is an extraordinary character: poetic, ironic, sexually ambiguous, a liar—and Gatsby is a Bildungsroman of disappointment. He goes East to seek a fortune independent of his past, and it ends in failure and regret. The beatific image of the new world and the boundless future beckons only as a false promise; the present only ever becomes the past, and the future eternally recedes from us. It’s a terribly sad and pessimistic vision, although one with the ring of truth, and in a time when “a guy named Baz Luhrmann” cranks out entertainments whose thin veneer of contemporaneity masks a devastating nostalgia for a vanished past and “a country run by a guy named Barack Hussein Obama” likewise bubbles in the gloriously false promise of its own lost preeminence, I’d say a poet of disappointment is very much what we need.