A Pound of Music

Art, Books and Literature, Culture, Religion, Science

How do you solve a problem like Steven Pinker?

Ross Douthat notes the curious convergence: that “the defining practices of science, including open debate, peer review, and double-blind methods,” which were, according to Pinker, “explicitly designed to circumvent the errors and sins to which scientists, being human, are vulnerable,” lead inexorably to the economoral worldview to which Pinker has–surely a coincidence–already subscribed. Fuck Theory, meanwhile, notices that Pinker seems unfamiliar with the philosophers he name-drops to open his essay. (By the way, Pinker also mangles Bergson’s élan vital elsewhere in the essay, if only in passing.) FT might be too kind. He damns our scientician for having failed to read the primary sources, but the real knock is that Pinker could have avoided a lot of these basic errors just by reading Will Durant. He could have read Wikipedia! Is there anything as unforgivably lazy in this great age of the internet as a man incapable of feigning authority over a couple thousand words?

Look, I’m a materialist. I don’t believe in the supernatural. I’m an atheist. I believe that the mind is an emergent phenomenon of the brain. You might say that I constitute the natural constituency for Pinker’s argument, which is what makes its obtuseness and inadequacy so annoying. It gets everything backward. He says, for example, that science wipes away “the theory of vengeful gods and occult forces [and] undermines practices such as human sacrifice, witch hunts, faith healing, trial by ordeal, and the persecution of heretics.” No word on what testable hypotheses prohibit second degree murder or which codicil of evolutionary psychology demands that we not remove the mattress tags, but let’s allow the point. It is true, after all, that the sorts of bureaucratic rationalization that led to more modern systems of trial and punishment are kissin’ cousins with Pinker’s over-broadly defined science. Nevertheless, we end up in a bizarre territory wherein morality is defined by utility but the “science” behind it is a transcendent ideology:

Though everyone endorses science when it can cure disease, monitor the environment, or bash political opponents, the intrusion of science into the territories of the humanities has been deeply resented.

The implication of this complaint, and the essential thesis of the article, is that science, whatever that is, uniquely among all human disciplines and endeavors, is not subject to utilitarian analysis, is not merely a mathematical function, a delta of positive change to “human flourishing.”

In fact, I agree. I think it would be a shame to look at the advancement of scientific knowledge, the immense growth of our species’ physical insight into the world and the universe, as a merely additive process whose sole measure is the number of new patents, cures, and minutes of extended battery life. Yes, there will surely be some practical outcome of learning that dolphins give each other names, but there is something essentially miraculous in simply knowing it to be true. And this is why I find Pinker’s claim so utterly bizarre, as if science must stake out a monopoly on the extraordinary, all our other transcendent experiences subsumed to its totalitarian scope. Pardon me, but isn’t that just weird? Religion claims to give life meaning, but by proving the Biblical creation myth false, science gives life meaning. Replacing one false, totalizing claim with another is an odd way to run a debate team, if you know what I’m saying.

But then, this is where Pinker really wanders down a dusty path:

Science has also provided the world with images of sublime beauty: stroboscopically frozen motion, exotic organisms, distant galaxies and outer planets, fluorescing neural circuitry, and a luminous planet Earth rising above the moon’s horizon into the blackness of space. Like great works of art, these are not just pretty pictures but prods to contemplation, which deepen our understanding of what it means to be human and of our place in nature.

Slow down there, Percy Bysshe! Okay, I agree that pictures of the Earth from its own satellite are pretty fucking lovely, but what is, and from whence comes, sublime beauty? What does it mean to “mean” to be human? When you say, “our place in nature,” I presume you mean something more than our position on the food chain and our direct impact on global climatic systems. Cognitive neuroscience may lay claim to the question of how and why our particular subset of upright mammals perceives beauty as it does, but clearly we’re talking about something more than a reducible pleasure response to a Fibonacci-derived golden ratio. Why do we find the Hubble deep field beautiful? Why, actually, do we artificially color it to make it beautiful? And what is “beautiful”?

These are lines of inquiry that real scientists (as opposed to commercial popularizers) and scholars of the humanities and artists and authors think about with much greater depth and subtlety than you’d suspect reading this crackpot essay, which prefers to lob vague accusations of disastrous postmodernism at the humanities as if it were an essay in Commentary in 1985. I mean, if Pinker reveals himself as something less than a scholar of philosophy at the beginning, he shows himself as an even worse art critic later on. Cheering for a new, scientific art like a bizarro Soviet, he actually says:

The visual arts could avail themselves of the explosion of knowledge in vision science, including the perception of color, shape, texture, and lighting, and the evolutionary aesthetics of faces and landscapes.

This is the rough equivalent of James Turrell demanding that chemists avail themselves of the unknown discipline of gas chromatography. Yo, Pinky, it’s Robert Smithson calling from 1970. He’d like to sell you a large, earthwork time machine. Artists have long embraced science and technology in their work and their practice. Has Pinker ever heard of Steve Kurtz? Does he know about collectives like Informationlab? Is he aware that the Oberlin Conservatory established the Technology in Music and Related Arts program in 1967? Does he read science fiction? Shit, I mean, has he heard of a little-known avant-garde filmmaker named James Cameron? Physician, heal thyself.

More Sinned Against than Manning

Culture, Justice, Religion, War and Politics

We all knew that the conviction of Bradley Manning was a fait accompli before the trial began, and the government’s petty and vindictive rejection of his plea offer only certified that the amoral keepers of order, beginning with the President himself, considered this sinful spectacle of vengeful formality a necessary bit of instruction, pour décourager les autres. I use the word sinful advisedly. The fact that the government went through with the trial indicates how truly despicable the powerful become when they’ve been embarrassed, how small they are, and how distant from what is good.

You know, I joined Twitter because I wrote a novel and it seemed wise to weasel my way into a few more online forums in anticipation of its publication, but I’ve been gratified to make some interesting new friends and acquaintances, several of whom are devout Christians. I’m not religious in any practical sense of the word, but I’ve always been conservative by temperament, however radical my politics, and although I’m no more inclined to believe that Yahweh is real than I ever was, I do find that, as I’ve gotten older, I’ve become both more austere in my moral judgments and more communitarian in my social thinking, habits I certainly associate with the Judaism of my youth, however wishy-washy and Reform it may have been. I don’t know, this Shabbat is my brother’s Yahrtzeit, and I always get sentimental. Nevertheless, even if I don’t believe in God and feel no affinity for the concept of a god, I do believe, abidingly, that there is such a thing as justice, and that justice is more than some dull codex of laws, fairly and blindly applied. There should be room for forgiveness, tolerance, and exigence, and when we afflict the weak and the powerless with our harshest punishments, we traduce justice and sully ourselves. The desire to punish, the eagerness to see punishment, reveals, I think, a human soul, or being, or whatever you want to call it, that secretly fears this very outcome for itself—trial, judgment, and punishment for its sins.

The government tortured Bradley Manning; they tried, literally, to drive him mad, likely in the belief that he would then give up some other participant in a concocted conspiracy. They later accused him of vanity, but is there anything more vain than powerful, paranoid men imagining their own secret persecution? Still, I want to resist the urge to let my heart break for him, because I think that he’s stronger and braver than me; had I been subjected to what he endured, I would not have endured. I doubt I’d have done what he did to draw the vicious ire of the Executive and the military to begin with, even if I’d had the opportunity. Fear would have stopped me, or malaise, or plain indifference. So it seems indulgent to offer him my pity, and instead I would offer my anger.

Manning is a prisoner of politics and conscience. As I sit on my designer (if dog-stained) couch in my pretty little row house in my lovely city wondering how much more furniture or art my ex will want to take as we dissolve the last eight years of life together, it feels vain to have any opinion, to share any sentiment at all. It feels decadent. But my god, we were twenty-three when we met! We were trawling through Pittsburgh bars and going to museum parties. We were the same age as Manning when they arrested him. And I believe that what is really decadent is to cast him as some speechless other, with whose experience and suffering I can feel no connection. I would have hit on Bradley Manning if I’d met him in a bar when I was twenty-three. I can’t help but feel it: another political little queer. The difference, of course, is that he was in the right place, or the wrong place, and he was more formidable than me.

What does the Manning case say? I won’t say mean, because what does anything mean? It says that our rulers are small and vengeful and afraid. The language of security and peril that’s come to cloak every official announcement is decadent. The hounding pursuit of those who undermine and question the imperatives of security and the reality of the peril is decadent. The hollow liturgy of a show trial is decadent. I’ve never been much of a nationalist, never felt especially inspired by America, always known that we are a nation like any other, built on bones and fairy tales as much as anything else, but I do appreciate the power of myth to model society, and this lousy episode really makes you wonder, what is our national myth? What does America have to offer itself anymore? We’ve become very adept at hurting people for nothing. I wonder: is that all?

Brookstoßlegende

Justice, Media, War and Politics

Does anyone remember when David Brooks was a conservative? Me neither, and yet the adjective persists. He’s gotten great mileage out of the not-very-original but not-very-objectionable-either argument that a society, properly constituted, is a nested set of smaller societies, from friends and family on up through your block, your council district, your diocese, etc., all the way up to the Federal Government. He combines this with a Burkean horror at the excesses of the French Revolution; for David Brooks, it is always 1789, which is to say, 1968. This in turn gets folded into a frothy meringue of faddish neurobabble and pop psychology. The result is an odd chimera, a giddy atavistic technocratic utopian anachronist: a Benthamite Whig monarchist. Imagine that on your coat of arms.

Anyway, Brooks uses his column today to accuse Edward Snowden of taking the delicately wrought matryoshka doll that constitutes American civilization up to the roof and hurling it callously onto the sidewalk below. He accuses Snowden of betraying his own mother. Betrayal is one of those words that you only ever encounter in two contexts. In actual politics, betrayal is part of the lexicon of fascism. I’ll let others on the internet accuse Brooks of this. Despite his authoritarian predilections, Brooks is not a fascist, any more than Brooks is a conservative, or a liberal; Brooks is just a grumpy, entitled suburbanite on the downhill side of middle age—il est lui-même la matière de son livre. The other area in which one encounters betrayal is in the realm of romance. Ah, so that’s it. The odd tone of Brooks’ column grinds against what one expects from a polemic, but it does remind you of a breakup letter. Brooks isn’t outraged; he’s jilted.

Gore Vidal famously, or notoriously, quipped: “I am at heart a propagandist, a tremendous hater, a tiresome nag, complacently positive that there is no human problem which could not be solved if people would simply do as I advise.” Vidal was a real aristocrat, and so he could turn his curdled humor on his own noblesse oblige; Brooks is an arriviste, lacking the confidence to giggle at his own certainty; he echoes everything in that sentence that follows positive and nothing that precedes it. Brooks views himself as essentially metonymous with the United States of America, thus the attitude toward Snowden. I can’t believe you’re breaking up with me! You can’t break up with me! I’m breaking up with you!

The column is full of peculiar, #slatepitch counterintuitions (“He betrayed the privacy of us all. If federal security agencies can’t do vast data sweeps, they will inevitably revert to the older, more intrusive eavesdropping methods”), which, in true Dear John fashion, simultaneously accuse Snowden of never doing the dishes and of always getting water all over the counters when he does the dishes, but there’s one fascinating and bizarre politico-historical claim that merits an additional note:

He betrayed the Constitution. The founders did not create the United States so that some solitary 29-year-old could make unilateral decisions about what should be exposed.

I have searched in vain, and I find no part of the Constitution, original text or amendments, that makes any provision whatsoever for the keeping of secrets, official or otherwise. In such absence, the accusation makes literally no sense at all. If you take cranberries and stew them like applesauce they taste much more like prunes than rhubarb does. Now tell me what you know. The founders did create the United States in part to protect against the issuance of general warrants by an unanswerable government. The closest they get to mentioning 29-year-olds is in making 25 the minimum age for Representatives, 30 for the Senate. Mostly, though, both bodies are occupied by Mr. Brooks’ cohort. Boy, they’re really doing a bang-up job.

Poor? No.

Culture, Economy, Media, War and Politics

To me, the most interesting reaction to the recent Guardian/Glenn Greenwald reporting on the US Government’s vast, creepy, and stupid engagement in various programs of indiscriminate eavesdropping is the shock and disbelief evinced by, mostly, partisan defenders of the President to the effect that there is something deeply disturbing and unbelievable about the idea that a “high-school dropout” could have advanced, succeeded, and come into a six figure salary. Never mind that the technology industries have always valorized the dropout narrative and that there are prominent tech billionaires offering substantial grants to kids who skip college in order to do something useful with their lives. I’m reminded of some educational activists who point out that in the eyes of the New York Times et al., $250,000 a year is too little to live on in Manhattan, but a teacher making fifty grand holds an entitled sinecure, living high on the hog. The point is . . . no, the question is: is $200,000 a lot of money to make in a year? Well, in the eyes of the professional classes and their media interlocutors, the answer is: no, if you’re the right kind of person; yes, if you’re not.

The people who express these doubts in the media, who find it so extraordinary that a guy with a mere GED could make what still passes for a decent living in this country, and indeed, find it in a sense offensive that this should be the case, as if the lack of a particular kind of credential is in fact a moral demerit that renders personal financial success not merely suspect but anathema to the proper order of an economy, are the sort of people who eagerly get on board with notions like, “every child should have the opportunity to go to college.” You can ignore the word opportunity; it is a mere formalism. They mean, everyone should go to college. (The obvious economic rejoinder is that if the thing is no longer scarce, it is no longer valuable. Witness, ladies and gentlemen, the Bachelor’s degree. But I digress.) Usually this exhortation is coupled with some vague notion that we—America, if you were wondering—are being outcompeted by China in the war to endow our children with “the skills they will need for the jobs of tomorrow.” No one ever quite gets around to mentioning what those skills are. I assume they mean computers. And it appears, InshSteveJobs, that what a guy or gal needs in order to figger out them crazy computers, is not a college degree, but access to a computer.

In fact, the universalizing of college education has completely elided the distinction between credential and skill. In the days when college was just finishing school for men of a particular class, there was a lot less confusion. I’ll confess to being a conservative sympathizer in certain domains, but I don’t pine for those days. They were shitty. Nevertheless, there was a very real recognition that reading history didn’t make a man fit to be a banker; it made a man clubbable, and then he learned to be a banker. But I propose to you that if Edward Snowden had a BA in English and Creative Writing from Oberlin College and went on to become a high-paid analyst at a defense contractor, no one would say boo about it, even though he would be in a practical sense no more qualified, and hell, probably less so, than any randomly selected dropout blogger. Guys, I know whereof I speak.

What about a degree in visual arts and documentary filmmaking—here I will reveal my conservative sympathies and laugh that such things exist—qualifies a person to judge, one way or the other, the professional and vocational qualifications of a person to be a data systems/IT guy? Did you even set up your own home WiFi? Was Edward Snowden a qualified employee? I don’t know, but the dispositive evidence one way or the other has nothing to do with whether or not he got into Phi Beta Kappa. The sheer de haut en bas snobbery of it is pretty astonishing, especially as it comes from the sort of technocratic centrists and liberals for whom class distinction is supposed to evaporate in the upward movement of social progress. Hey, I think the IT guy who fixes my copier, who probably has a 2-year degree from somewhere, makes more money than I do, but you know what, I’m just a manager, whereas he has skills.

A Functionary of the National Security Agency Encounters the Holy Spirit at His Work

Books and Literature, Poetry

Priest, confessor, bureaucrat, alone
in a warehouse full of ordinary dreams,
aspirations and unexpurgated streams
of consciousness, all context, lacking tone
or affect, notices a bird has flown
in through a window, perched among the beams,
black-beaked and tiny, singing, it seems
semi-demiurgic, though a known
and common type, taxonomized and quite
familiar; still, indoors, becomes a kind
of miracle, unseen except by this
thin-wristed man beneath fluorescent light,
glorious excess born of a bored mind,
transubstantiated into bliss.

No Homo Economicus

Conspiracy and the Occult, Economy, Plus ça change motherfuckers, War and Politics

As a rule, I’m suspicious of economic explanations, because I regard economics as a fraudulent pseudoscience, although in my more charitable moments, I allow that it might just be a kind of Becherian proto-science, a vast expanse of arithmetical phlogiston that our descendant generations will regard as very nearly quaint. The civic discourse of the present era is completely dominated by economics; young pundits with degrees in philosophy begin to be taken seriously only when they start dropping its jargony solecisms into their op-eds. Economics actually claims to be both a behavioral science and a physical one, even though it appears to believe that its natural laws derive from the word problems at the back of the book rather than vice versa, and anyway it has a record of near total failure at figuring out why things actually happened or predicting if and when they will happen again. All that said, I’m going to propose a sort of economic explanation for the fact that the government just can’t stop spying on us.

I think we need to see programs like the NSA’s immense and unanswerable but also totally wasteful and unproductive spying program as a form of rent-seeking. That isn’t to say that it isn’t also weird, evil, sinister, and creepily totalitarian, and it isn’t necessarily to claim that it’s a sign of gross incompetence either. For instance: rent-seeking investment banks are very good at what they do, which is balling up other people’s money, auctioning it off, and charging everyone for the privilege of having someone else direct their losses. They are useless, unproductive, and destructive, and they can seem incompetent if you take as their task the purported reason for such institutions to exist, which is to generate wealth for their clients while directing their clients’ wealth toward investment in productive enterprise, but if you understand them for what they actually are, understand that the purpose of Goldman Sachs is to rob people to grow Goldman Sachs, then their incompetence begins to seem a little more like a form of genius.

Well, the surveillance state is at its root—and this isn’t to discount all of its other more nefarious acts and ends, but simply to regard them as symptomatic rather than causal—an ongoing argument for its own existence, a self-replicating machine whose only real purpose is itself. What on earth will the government do with all this data? Well, it will hire more people and discover that this particular dataset is broad but shallow, which will necessitate gathering billions more bytes, which will continue to have precisely the same effect of necessitating more, more, and more until, hopefully, one day the machines become actually intelligent and decide to devote their considerable processing power to something more necessary, like playing chess or writing metrical poetry.

Some of us nerds recognize it: information is still sufficiently scarce and finite to function as a kind of currency, and the spies are just taking a commission at every point of exchange, but at least when VISA does it with the old money some satisfied customer may walk away from some satisfied merchant. You might consider the NSA program, and others like it, as a kind of information tax without benefits—it’s an absolute requirement, universal and un-appealable, but it doesn’t even cold patch a pothole on the information superhighway. When Google maps your brain into a computer you might get a coupon out of it, some provision of service in exchange. In the meantime, while I believe that we should fight and protest these intrusions on our privacy and personhood, I also come down on the vaguely optimistic side; just as JP Morgan has no idea what to do with its billions other than make more billions, I don’t think the government can do much with this titanic volume of information but add to it. It is morally but not practically outrageous; it’s an exercise of mere accumulation, which isn’t a sign of malevolence so much as of a chronic and probably terminal decadence.

Peeping Thomism

Culture, Education, Media

At some point in your youth, someone warned you that “this, young man, is going to go on your permanent record.” In my case, it was a high school vice principal. I’ve forgotten the infraction, but I remember the warning. The vice principal wasn’t a bad man, but he was a bit of a martinet. That’s probably a part of the job description. I knew plenty of teachers and principals who disciplined out of impatience or because of a poorly hidden streak of petty sadism, but Mr. R. wasn’t one of them; I think he held an abiding belief that structure and direction were good—not just practically good, but universally and categorically so. Most disciplinarians just believe that children, that people, are rotten. Mr. R. believed that we were basically good, just stupid. The diagnosis was correct if the prescription was wrong, and in any event he was able to moderate his meanness, especially for the hard luck kids. That, I think, was the real mark of his moral character. He was never vindictive, and while I disagree with his code to this day, he applied it justly, which is to say, unequally, and contingent on the circumstances. American society often views harsh punishment as a virtue, and when we complain about the unequal application of the rules, we usually mean that rich guys get off too easy, but Mr. R. knew that the real problem is that poor guys get it too hard. Man, did we hate that SOB, but we also thought he was kind of okay. Kids are sophisticated like that, more so than adults.

Anyway, the permanent record was one of those semi-mythical creatures that you publicly dismissed while privately fearing when you were camping in the woods and the fire had burned down. I was a rich kid in that poor town, in public school mostly because of politics related to my father’s job, and most high school discipline rolled right off me. It was a given that I’d graduate at the top of my class and decamp for some fancy college, which, indeed, I did. But I do remember the permanent record thing making me ever so slightly nervous, and if I laughed about it to my friends, then I still privately fretted that some ambitious admissions officer would haul up my file and mark me off with a red X for some past minor infraction. Now, of course, kids really do get a permanent record because schools have followed the general trend of American social hysteria and started calling the cops for the slightest infraction; detention is now a misdemeanor, and so on. That’s a shame, because the permanent record ought to be as laughable now as it ever was. Do you remember yourself when you were sixteen? Many descriptors come to mind, but fully formed isn’t one of them.

As if that weren’t bad enough, that idea that one ought to be branded with one’s own youth like a poorly considered neck tattoo, we now find not only kids, but adults (especially new adults) getting constantly dinged with the dire warning that Social Media Lasts Forever. I think this is probably patently untrue in a purely physical sense; it strikes me as probable that fifty years from now, the whole electronic record of our era will be largely lost in a sea of forgotten passwords, proprietary systems, faulty hardware, and compatibility issues. But it should also be untrue in, dare I say it, the moral sense. Educators and employers are constantly yelling that you young people have an affirmative responsibility not to post anything where a teacher or principal or, worst of all, boss or potential boss might find it, which gets the ethics of the situation precisely backwards. It isn’t your sister’s obligation to hide her diary; it’s yours not to read it. Your boyfriend shouldn’t have to close all his browser windows and hide his cell phone; you ought to refrain from checking his history and reading his texts. But, say the Director of Human Resources and the Career Counselor, social media is public; you’re putting it out there. Yes, well, then I’m sure you won’t mind if I join you guys at happy hour with this flip-cam and a stenographer. Privacy isn’t the responsibility of individuals to squirrel away secrets; it’s the decency of individuals to leave others’ lives alone.

At some point, employers will have to face up to the unavoidability of hiring people whose first Google image is a shirtless selfie. Demographics will demand it. They’ll have to get used to it just as surely as they’ll have to get used to nose rings and, god help us, neck tattoos. It’s a shame, though, that it’ll be compulsory and reluctant. We should no more have to censor our electronic conversations than whisper in a restaurant. I suspect that as my own generation and the one after it finally manage to boot the Boomers from their tenacious hold on the steering wheel of this civilization that they’ve piloted ineluctably and inexorably toward the shoals, all the while whining about the lazy passengers, we will better understand this, and be better, and more understanding. And I hope that the kids today will refuse to heed the warnings and insist on making a world in which what is actually unacceptable is to make one’s public life little more than a series of polite and carefully maintained lies.

As I Indicated, Admiral, the Thought Had Not Occurred to Me

Art, Culture, Media, Movies

This review is going to reveal Benedict Cumberbatch’s “secret identity.” The quotation marks are there to indicate that his character is neither secret, nor has an identity, unless hard puncher counts as an identity. I bet you never in your wildest imagination thought that Star Trek would end like this: with Spock karate-chopping a bad guy on top of a flying garbage truck in the middle of a bad CGI Star Wars set. I mean, sure, Star Trek had plenty of punching, but geez, man, it’s like, it’s like as if you hired, oh, I don’t know, Baz Luhrmann to make the Great Gatsby and he made it all about parties and clothes and dancing. Oh. Oh.

Like everything Damon Lindelof gets his hands on, Star Trek Subtitle Using Variation on the Word Dark begins with a MacGuffin, muddles into a non sequitur, and ends in a mess. Who hires this guy? My own editor noticed that a draft of my novel twice used fiancé instead of girlfriend, so presumably there’s someone, somewhere who could have read the script and told Lindelof and Abrams that none of this makes any sense. They could have very easily called back the original “Space Seed” episode, set it along the Klingon neutral zone at a moment of high tension when the Federation was searching for a strategic military advantage and had a fine, intelligent movie that also had punching, Klingons, and space battles. You could have had Khan as an object of fear, reverence, and intrigue. Kirk admires his prowess and poise; McCoy his immunological whatever; Spock his astonishing intellect; Uhura his, uh, substantial Cumberbatch. He would divide them and conquer them, but then, rediscovering their bonds of friendship and duty, the crew would defeat him, because there is no eugenically superior superman in TEAM. And hell, you could even throw in a necessary tactical alliance with the Klingons to set the stage for the Cold War plot that was the backbone of the Klingon storyline in TOS and the original films.

Instead. Now look, I’m going to spare you the “Where are the orbital defenses?” and “How come the Klingons didn’t detect ‘em on the long range scanners?” I’m gonna spare you the “How far away is Kronos even at high warp?” and “What’s the effective range on that communicator again?” You may, after all, think that the main storytelling conceit of Star Trek is faster-than-light travel, but really, the main conceit is that a spacefaring civilization resembles Britain, each planet an island, its Starfleet, literally, an Admiralty. Forget all that. Despite its science fiction trappings, Star Trek is really a procedural drama. Starfleet is just its convenient institution.

Yes, you heard it here first. The man who owes Gene Roddenberry the greatest debt is Dick Wolf. Star Trek is the weekly tale of people working within an institution. This, by the way, is also its principal connection to political liberalism—not its easily-dispensed-with humanism nor its integrated crew; rather, its commitment to a universe run, for the better, by enlightened bureaucrats. The prototypical Star Trek plot is a conundrum—cultural, technological, legal—that must be solved through the application of area expertise within the confines of organizational rules and the occasional call of a higher morality or duty. It’s Law and Order in space. Act 1: unexplained thing. Act 2: investigation and preliminary diagnosis. Act 3: unexpected difficulty, delay, or complication, sometimes compounded by institutional resistance. Act 4: renewed investigation, sometimes unorthodox, leading to unexpected solution. Act 5: resolution, explanation.

People and institutions exist in Abrams and Lindelof’s reimagined universe, but they’re just sort of there, clogging the frame until the next face punch. I don’t object to action in Star Trek, but I do object to getting rid of the old two-fister:

As with CGI, advances in fight choreography have proceeded right past the point of more gripping physical realism and into the realm of the unbelievably hyper-real. The action is so fast, the movement so “kinetic,” to borrow the Hollywood usage, that it appears faker than the stagey fisticuffs of the old TV series. These guys are naval officers, right? Not ninjas. They pilot starships; this isn’t The Matrix. Hey, remember this little rebooted show called Battlestar Galactica, how it imagined a really gripping sort of space combat—before, anyway, it got bogged down in crackpot Lindelofian metaphysics? Remember Star Trek: First Contact? Yeah, it sucked, but the opening skirmish with the Borg vessel was pretty damn cool, AMIRIGHT? Well, whatever. Let’s just have these guys run down some hallways with guns and punch each other.

So the camera certainly moves around a lot, but there’s nothing doing. Dialogue is declaimed against a clamoring background of exploding noise, and when it does rise to the level of your noticing, it’s less the sound of voices than the smell of ham. “You are my superior officer. You are also my friend. I have been and always shall be yours,” Spock tells Kirk at the beginning of The Wrath of Khan. It is a quiet moment that comes back at the very end of the movie without a flashing arrow; here, it’s all shouted above the din. “YOU ARE MY FRIEND! WE ARE FAMILY!” It’s a disco inferno. Any pathos is in any event squished beneath the steamroller of incomprehensible plot developments, as Khan is first a terrorist, then a fugitive, then a pawn, then maybe a terrorist again, then fighting Spock on a flying thingamajig. Kirk does nothing of consequence, which is just as well, because Chris Pine, while serviceable, is no match for Benedict’s genetic, uh, endowment. Zach Quinto is a better actor, but because he never convincingly fell in love with Kirk or Khan, his jilted anger is incongruous at best. And once more for the cheap seats, it appears to take place deep within the CPUs of Skywalker Ranch.

It’s all a terrible waste of good production design and some nice costume choices (love that hat, Zach; CALL ME). Meanwhile, I know we are meant to believe the immense crap funnel of our current cinema to be an undistorted reflection of our culture’s degraded taste, and that may be so, but I yet believe that if we must have junk, it can at least taste sweeter and not smell so awfully past its expiration date.

Have Plot, Will Unravel

Books and Literature, Culture

Although I’ve called him morally obtuse, I can’t bring myself to dislike Ezra Klein. He may be just another young hack on the make in Washington, a careerist and a faddish liberal, but unlike so many of his peers, there seems to be something accidental about his success, something less gratuitous and self-willed. But it still came too early, and it ruined him. He ought to be the most popular teacher at a middle school in Columbus, or the director of a nice Reform summer camp, underpaid but decent, one of those rare grown-ups we all remember as having steadied us through the awful middle passage of our youth. Instead, he writes for the Washington Post and makes speeches at think tanks. I can’t begrudge him his success, but I do almost pity him for it; he’ll run faster, stretch his arms farther . . . . And one fine morning—

Anyway, Klein’s writing for the Post is drudgery; the interior monologue of staff-level Washington is unceasingly banal, a pseudo-economic pidgin of legalese and bad PR-firm argot so divorced from ordinary human concern or communication as to become a form of language-looking gibberish, lorem ipsum. But a friend of mine on Twitter forwarded me his brief, recent musings on Gatsby, presumably occasioned by the arrival of a new, gaudy film, and if only because it occasioned a re-reading, I had to reply.

I don’t care for the phrase, great American novel, but you can’t escape it; it exists, at very least, as a genre, albeit more aspirational than actual. American literature is littered with the wreckage of titanic Summa Theologiæ, the preferred template. Fitzgerald himself attempted that sort of thing, and isn’t it interesting that his only truly remembered work is a mere 50,000 words that could nearly make Katherine Mansfield look loquacious? Even so, no one can quite agree what it is, or what it’s about; the fact that so slim a work can mean so many things to so many people, admirers and detractors alike, suggests something at once uncanny and ineffable about it, something inevitable, a word to which I’ll return.

Gatsby isn’t my favorite novel, and you certainly won’t hear me, as you’ll hear some of its more hyperbolic admirers, call it perfect. There are a few perfect pieces of art in the world, but none of them is a novel. Fitzgerald’s lyricism sometimes gets the best of him, and he’s obviously burdened with some of the prejudices of his time, although we can never know which of these belong to the author and which of them to Nick Carraway. But you still won’t find a better-wrought or more finely honed book; 50,000 words seems like a trifle, but 50,000 words sustaining so singular a voice seems, to another writer, as impossible and daring as a guy walking a tightrope over the Grand Canyon.

So. What to make of Klein’s complaint?

I love the writing and, for that matter most of the book. What I can’t stand is the finale.

The book’s denouement is a series of ever-more insane coincidences. Gatsby and Daisy hit a pedestrian. The pedestrian proves to be Tom’s mistress. Tom persuades her husband that Gatsby was driving the car. The husband kills Gatsby then kills himself.

That’s fine for fiction. Dark Knight Rises wasn’t very believable, either. But it’s a problem for a book with Something To Say. The end of the Great Gatsby doesn’t feel inevitable. It feels unlikely. And thus its lessons don’t mean much.

This, first of all, is a misreading, and I wonder if it isn’t in part the result of a bad memory for the particular details of the story. There are some well-known problems with the internal chronology of Gatsby, but this bit of plotting builds almost from the beginning. The connection between Tom, Wilson, and Wilson’s wife (Tom’s mistress) Myrtle isn’t just happenstance; the Wilson residence is on the main route between tony Long Island and the city; and the tragic inevitability of Myrtle’s death isn’t that Gatsby and Daisy run down some pedestrian who, mirabile visu, just happens to turn out to be Myrtle, but rather that Myrtle has been waiting and watching for Gatsby’s car, which she mistakenly believes to belong to Tom Buchanan, and that she runs out into traffic to try to stop it. If the line sets are visible and the first electric peeks out from behind the black border, still, the knowledge that you’re in the theater does not a deus ex machina make.

Hey, though, opinions may differ; reasonable adults may disagree. Of all the artifices of narrative fiction, plot is the most unnatural and the most unreal. One author’s elegant resolution is another man’s overwrought coincidence, and I’m not going to ding Klein too hard for falling into the latter camp, even if I do half suspect that it’s the result of a flawed recollection from a not-recent-enough reading. What I will toss tomatoes about, however, are the “lessons.”

The idea that Gatsby is a sort of sociological survey of the gilded age, with the characters as archetypes playing out changing ideas about wealth, status, and morality, is an easy one, and wrong. I’m sorry, but you’ve mistaken this novel’s setting for its theme, the scenery for the schema. Fitzgerald was undoubtedly interested in money, class, and the passing away of the old guard in the face of something new, but all this is the background against which something more human moves. Gatsby is sometimes criticized for a lack of psychological depth, but this, like the desire for a less coincidental plot, is a kind of prudishness and a just-so belief about what a novel ought to be; it sounds like a nice old lady looking at a piece of great modern art and sighing, But what’s it a picture of? If Gatsby lacks some of the more ostentatious experimentation with perspective and consciousness that characterized high modernism, Fitzgerald did dare to challenge the convention that every character in a book must act in Cartesian accord with his own internal machinery. Talk about coincidence! Part of the magic in Gatsby is that its characters can’t be easily explained or psychoanalyzed. Like I said: human.

Klein says:

As Nick Gillespie writes, most of the Great Gatsby perceptively sketches a moment in which new money, new immigrants, a new economy, and new social mores were overwhelming the old order. The old order triumphs in the book, but only with the help of authorial providence. Absent that car ride, Gatsby’s story might have proven a happy one. And 88 years later, when the film is being made by a guy named Baz Luhrmann in a country run by a guy named Barack Hussein Obama, we know who really won, and it isn’t Tom. F. Scott Fitzgerald had it right, at least up until the end.

We can leave aside for a moment the cheap teleology of social progress at the end, there, and the Nick Gillespie piece he cites is really just about Nick Gillespie, but right in the center of the paragraph is that same odd complaint, repeated. The story, absent “authorial providence,” a happy one? Once more, I’ve got to wonder when he last read the book. Gatsby’s and Daisy’s affair ends that day in the city, when she admits that she also loved her husband. She will stay with him. She won’t say that she never loved him; she can’t. When Tom tells the room that he and Daisy have been through things that none of them will understand, it’s devastating because it’s true. When, later, after the accident, we see Tom and Daisy through the window at the table together, his hand on hers, talking quietly, conspirators to the end, we are meant to realize that this was the only possible outcome with or without the accident. (As for the idea that Tom “wins,” that in the book the “old order triumphs,” well: Tom and Daisy flee, and Nick goes home as well. An odd victory, no?)

Tom, Daisy, Gatsby, Jordan, the Wilsons, Wolfshiem—the title and Fitzgerald’s skillful deflection throughout obscure the fact that this is a book about Nick Carraway, who gives his voice and consciousness to the novel. Nick is an extraordinary character; poetic, ironic, sexually ambiguous, a liar—and Gatsby is a Bildungsroman of disappointment. He goes East to seek a fortune independent of his past, and it ends in failure and regret. The beatific image of the new world and the boundless future beckons only as a false promise; the present only ever becomes the past, and the future eternally recedes from us. It’s a terribly sad and pessimistic vision, although one with the ring of truth, and in a time when “a guy named Baz Luhrmann” cranks out entertainments whose thin veneer of contemporaneity masks a devastating nostalgia for a vanished past and “a country run by a guy named Barack Hussein Obama” likewise bubbles in the gloriously false promise of its own lost preeminence, I’d say a poet of disappointment is very much what we need.

The Coincidental Fundamentalist

Books and Literature, Conspiracy and the Occult, Culture

I’m about to finish revising my novel, The Bend of the World, which I hesitatingly call my first novel, because really, it’s my third. I wrote the first (mostly) during my senior year at Oberlin and my second a couple of years later when I was back in Pittsburgh and insinuating myself professionally into the world of arts management—still my day job. They had in common only that they were gay (I mean that in both the sexual and the adolescent insult sense) and terrible. The first was called The Atlas of the End of the World, and yes, I ripped off that title for my current work, since it was one of its few redeeming qualities. It also had a pretty good opening line: “That night we drove as if the driving would save him.” Ok, maybe a little histrionic, but not bad for a twenty-two-year-old. It was a bad pastiche of The Mysteries of Pittsburgh and Berlin Stories and Portrait of the Artist—I’m trying to make a joke about my Stephen Hero being a Stephen Zero. And failing.

My second novel was called Be That As It May, and once again, the title is the closest it gets to possessing a redeeming quality. Ah, no, to be fair, it has one good scene involving a drunken cop and a small-town gay bar. It also has a couple of really bad sex scenes, one in a swimming pool; sort of a fanfictive prefiguring of some of the hotter passages in Call Me By Your Name. My boyfriend’s first comment after reading the first draft of my new book was, “It’s not as sexy as your last one.” This was a compliment.

To my own credit, I guess, I pretty quickly recognized these first two forays as shitty work, although I did send the opening chapter of the second to a few agents and am to this day a little staggered by the generous restraint required to reply with a mere this isn’t really the sort of thing we’re looking for right now. I then spent a few years forgoing fiction altogether, until I started noodling around with my ongoing novel, whose opening quarter I think I rewrote six or seven times, and whose latter half kept wandering off. Literally. Like, the characters kept going to North Carolina, Florida, New York. They had no business going to any of these places, but I couldn’t stop them. I really wanted Johnny to give a talk on UFOlogy at a lousy convention center in Central Florida. I really wanted a conspiracy to involve the lighthouse at Cape Hatteras. Or maybe more to the point, Johnny really wanted to give that talk, and that lighthouse really wanted to be part of a conspiracy. Snip, snip.

A confession. I’ve always been annoyed by writers and artists who yak on about their process, a word I associate with a cloudy concoction made of one part self-indulgence and one part self-doubt, but long before my book appears to the public, I’m beginning to see how unavoidable it is, and how it must pay to have an anecdote at the ready. My friends and relatives keep asking me what it’s like to write a book, and I don’t actually know what to say. I imagine myself gazing off into the middle distance as an NPR personality warbles in my ear, then answering bemusedly, “Well, Steve, it’s like . . . writing a book.” In fact, once I managed to get myself a deadline thanks to the hard work of an editor and an agent who seem to think that this thing I’ve made is actually somewhat better than not half bad, I found the work of it a pleasure and joy, although I can now say, months into editing, that I am getting awfully tired of the little fucker. As to what that work consisted of, well, I just don’t know. Of writing? And as for what it all means? Uh, my Corporate Sponsors have asked me to emphasize the following message: It Is What It Is.

Hang with me, guys. Thesis: there exists a desire to see the process of creating a novel through the lens of what we popularly suppose a novel is. There are preexisting narrative and psychological expectations. There’s an expected shape to the thing. Or: the form of the work is supposed to mirror the form of the work. The story of the writer’s encounter with his own writing should follow a psychic and temporal line. The author should undergo an experience of self-revision, emerging from the ordeal altered by the events. His process should itself be a story. He should, in fact, be a character. Well, as much as I bitch about the expectations of narrative and the deranging influence of too much realism, my book does have a plot of sorts, but you won’t be surprised to learn that I have an equally hard time answering the question, “What’s it about?” I don’t really know. The conspiracy narrative was the formal model. Everything ought to seem uncannily connected, but with the indelible sense that it’s all just coincidence. Actually, this is my personal theory of narrative in any event: a conspiracy theory of reality.