Dese bones gonna rise again

31 05 2011

This was not the best season of Bones.

Which is to say: this was the worst season of Bones. Not a single episode was as good as the best of previous seasons, and while there were no truly terrible episodes, the best it got was only “all right”.

Alyssa Rosenberg argued on Matt Yglesias’s blog (now her own, at ThinkProgress) that the problem was with the overarching theme (the sniper), namely, that it was weak and centered on a boring character. I think she has a point: Although the first season didn’t have an overarching theme, two were set up for the following seasons, one regarding Brennan’s family and another with the serial killer Howard Epps.

Now, I kinda think the whole sexual-sadist-serial killer is played out (yeah, I’m looking at you, CSI, with the truly boring Nate Haskell), but they undercut the superman-superevil bad guy schtick deliberately: Howard Epps thought he was a genius but, as Zack pointed out, he really wasn’t as smart as he claimed to be.

Season two was backboned by Brennan’s backstory, with her plastic-surgeried criminal father dipping into and out of a number of episodes. (“Judas on a Pole”, which introduces him, also includes a great cover of Kate Bush’s “Running up that hill”.) It also introduced the Gravedigger, a nasty piece of work who appeared again in single episodes in seasons 3, 4, 5, and 6.

The Gormogon thing (season 3) was weird, and the Zack angle on that was weird, but it was also satisfying: so over-the-top nuts (ritualistic cannibalism of secret society members) that there was a certain brio to the writing. Everybody seemed to be having a good time—well, you know what I mean.

Season 4 didn’t have any major arcs, save, perhaps, the Angela-Hodgins fallout, as well as a somewhat underdeveloped bit about Booth’s brain. (It didn’t really cohere, but that it didn’t really cohere didn’t really matter.) Oh, and the introduction of a rotating cast of interns/assistants. Anyway, it had a fine, fine season ender.

Too much about the Booth/Brennan relationship interfered with season 5, but there were still some very good stand-alone episodes, as there were in each of the preceding seasons. I’m one of those who did NOT want Booth and Brennan to get together—yes, adults who have chemistry may nonetheless desist from dating—but I was even more annoyed at how forced those episodes were. Stephen Fry returned as his utterly charming character Gordon Wyatt, then ruined the moment by pushing (against character) for a romantic relationship. Brennan’s father talked about it, Angela talked about it, Booth and Brennan separately brooded about it—blech, it was all too much.

Yeah, we get it: they have chemistry, but enough already! Anyway, the Angela-Hodgins arc was more interesting.

Still, there was an energy and wit running through these seasons, a humor and affection commingled with the murder and mayhem, such that even amidst the utter unreality of the television crime procedural, you got the sense that these were real people doing real work.

The people mattered, the work mattered: a fine balance.

This year, however, that balance was thrown off. Again, I think Rosenberg may be onto something about the boring sniper arc, but I think the greater problem was the loss of that balance. The crimes were almost beside the point, or existed only to drive the personal plot-lines; thus the play of earlier seasons was missing, as the writers sought to reduce the looseness and otherwise force every damned storyline into a pre-existing cutout. This not only took away much of the wit of the dialogue, it also signaled a certain impatience with the characters.

So, for example, Sweets entertains doubts about his expertise and has those doubts resolved all in a single episode; both the doubts and the resolution were, erm, doubtful. (And Miss Julian opened up to Sweets, which, frankly, was not believable.) Brennan’s father was brought in for a couple of completely superfluous scenes, and there were bits about Cam and about Angela’s pregnancy (guess, no, really, just guess how the season ended), but it was all rather listless.

And having a number of the interns each undergo a character change? Please. Give me back my uptight Clark. (If only they could give back my favorite intern, whose death scene was devastating.)

It was never truly awful and, really, only a few episodes were bad, but it was such a letdown. I’ll watch again next year (even though I was not particularly happy with the last episode set-up for the new season), but I hope this season was a lapse rather than a harbinger.

I did, however, watch a truly awful show this year, even after swearing off it. Yes, I moaned my way through yet another season of CSI-New York. Ye gads. They brought in Sela Ward to replace Melina Kanakaredes, and I thought, Oh, well, I like Sela Ward.

I might still like Sela Ward, but her character, Jo? Do. Not. Like.

This show just got sappier and more moralistic as it went along. God, I can’t even be bothered to go through everything that was wrong with this show because everything was wrong.

The only good thing: it may finally have gotten so bad that even I will look away.





Brave companion of the road

28 05 2011

Is it better to be consistent than inconsistent? What about contradiction and hypocrisy: what is the merit or demerit of such concepts?

Ta-Nehisi Coates has been carrying on a long conversation with himself and the rest of us regarding the interpretation and understanding of the American Civil War; to that end, he tries to leave judgment behind and move into the experience—as much as is possible—of those living at the time. He reads historical accounts and letters and novels and requests that we “Talk to me like I’m stupid” regarding weaponry, battle tactics, wardrobe, John Locke, and hermeneutics.

He wants to understand.

I follow his wonderings in part because he often writes beautifully about these topics, in part because I learn something about the Civil War, and in part because his attempt to shed enough of himself to enter into the mind of, say, a Confederate soldier, seems simultaneously brave, foolish, and in vain.

Brave: You do have to shed your armor, your clothes, sometimes even your skin to make yourself open to another.

Foolish: You have to shed your armor, your clothes, and sometimes even your skin to make yourself open to another.

In vain: As long as you can choose to come and go into another’s experience, you reinforce the separation between yourself and the other.

I am ambivalent about the limits and risks and possibilities and purposes of understanding, an ambivalence which tips sometimes more toward openness, sometimes more toward skepticism, but I am fascinated by the quest. This is not just philosophy; this is art.

And that’s where I return to the questions regarding consistency and contradiction. In a recent post on George Fitzhugh’s Cannibals All!, TNC noted that he appreciated not only Fitzhugh’s straightforward defense of slavery, but his willingness to extend it as far as it could logically take him—in Fitzhugh’s case, into the enslavement of the majority of humankind:

There’s something attractive about his willingness to game out all of his maniacal theories. He has moral courage that his double-talking, bullshitting, slaveholding friends lack. It’s the opposite of that Jeffersonian view of slavery which cowers from the awful implications of one’s beliefs.

It’s Howell Cobb’s, “If slaves make soldiers, then our whole theory of slavery is wrong,” versus Jefferson Davis’s legalistic bullshit about black Confederates. There’s something about the sheer clarity of these guys, even though they speak evil, that’s a breath of fresh air. Half the problem is cutting through the deliberate lying about one’s own theories.

At which point I (metaphorically) raised my hand and said, Um, wait a minute: why is straight talk better, here? Is this really courageous as opposed to, say, crackers? I drilled down further to argue that there is no necessary moral content either to consistency or to contradiction.

Consider, as well, “double-talking”, “bullshitting”, “deliberate lying”: these are all moral judgments on those who, unlike Fitzhugh, do not make their arguments in one smooth logical piece, but who cramp and crinkle and perhaps tear at the fabric of their own arguments regarding the justness of slavery or the conditions of those enslaved.

These moral judgments, in other words, are, if not at root, then at least also, aesthetic judgments: better to make the argument straight than kinked, better to untie all knots and iron the whole cloth of the argument, better there be no seams.

But why is this so? Why let the aesthetic stand in for the moral? Can the aesthetic stand in for the moral? (This is a very old argument, by the way.)

No, no, I’m really not demanding a thesis from TNC; he’s doing quite enough already. But his musings in this particular piece have thrown into sharp relief how tenacious are our unexamined judgments, how much of one’s own world—one’s own ontology, as it were—one brings to that quest for understanding.

There’s no easy way out of this: judgments are our bearings, and to leave them behind in an attempt to make sense of another risks losing them altogether, to the point where we can’t make sense to ourselves.

I don’t know where I’m going with this; perhaps I’m losing my own bearings. But this whole understanding gig, tch, it’s a real kick in the head.





This war can’t end soon enough

26 05 2011

I’m not quite halfway through Evans’s The Third Reich at War, and by now all of the theoretical contradictions of Nazism come crashing into one another in practice:

1. Hitler sets out a goal of a racially pure Germany, but the need for labor means that hundreds of thousands of Poles, Russians, Ukrainians, and other racial inferiors are imported into Germany.

2. The belief that the conquest of the east would provide sufficient resources for Germans to wage war in both east and west—that war was necessary for Germany’s very survival—is turned upside down as the need to hold, and the difficulty of holding, these areas becomes a drain on the Old Reich’s (Germany proper) own resources.

3. The National Socialists’ disdain for, well, socialism, means that the rationalization and coordination of the war effort is fatally delayed. When Albert Speer does finally take over armaments production, he succeeds only insofar as he’s able to shutter small producers in favor of efficient larger producers; in doing so, he reneges on the 1930s promise to protect the petit bourgeoisie.

4. Hitler’s preference for his deputies to fight amongst themselves for position means he never imposes the discipline necessary to march them all in the same direction.

5. The Nazis are offended when the people in countries they overrun fight back; they consider such resistance to be so out of bounds as to provide justification for the initial invasion.

6. The Nazis claim to be acting according to the highest ideals in exterminating Jews, Gypsies, and Slavs, but go to great lengths to hide evidence of such extermination. (SS and Police Leader Odilo Globocnik did protest such subterfuge: he argued that instead of digging up the bodies of the dead to be burned, they should “bury bronze tablets stating that it was we who had the courage to carry out this gigantic task.”)

7. The most glaring contradiction, of course, is the contention that the German is the pinnacle of human being, a superbeing who is nonetheless threatened by the very existence of the weak and parasitic Jew.

Sure, you could spin this last point with reference to Jews as vermin or viruses or whatever, but it is nonetheless striking how much power Hitler, Himmler, Goebbels, Göring, et al., give to Jews, so much power that they in effect put the onus for the war on Jews themselves. The great and noble German will always be vulnerable as long as Jews exist.

(It is this last point, of course, which makes the Nazis nothing like Nietzsche’s Overman: the Overman is not only not threatened by the weak, he pays them no mind. And, of course, Nietzsche thought anti-Semitism was stupid.)

I should also mention one last, well, not contradiction, exactly, but avoidable tragedy: that in so many cases Communist and Zionist prisoners could not overcome their conflicts to coordinate resistance to camp guards and administrators. Even amidst the great gnashing of teeth of the Nazi death maw their antipathy was more important than death itself.

Anyway, by this point I’m reading less out of intrinsic interest than in a kind of savage anticipation of the end.

I cannot wait for the Nazis to end.





We don’t need no education

24 05 2011

Pretty much says it all:

Tests for Pupils, But the Grades Go to Teachers

By SHARON OTTERMAN
Published: May 23, 2011

New York City education officials are developing more than a dozen new standardized tests, but in a sign of the times, their main purpose will be to grade teachers, not the students who take them.

New York Times





This is not a painting

23 05 2011

Camel thorn trees, Namibia.

Photograph by Frans Lanting, National Geographic





If I had a rocket launcher

22 05 2011

The invasion of Poland was almost unbearable.

I knew it was awful, but awful only in a general way; the opening didn’t linger on the atrocities, but the details—the killing of 55 Polish prisoners here, the burning of village after village there, the many smug justifications for murder—knit death into the whole cloth of invasion and mass murder.

If I didn’t know how it all ended, I told a friend, I don’t think I could read it.

I’m on the last book of Richard Evans’s trilogy on the Third Reich, finally cracking it open after it sat on my desk for a few weeks.

I raced through The Coming of the Third Reich (useful for its doleful portrayal of the Weimar Republic) and read with fascination The Third Reich in Power, but The Third Reich at War, well, the premonitions of the first two books are borne out in the last. It will get worse, much worse, before it ends; it cannot be said to get better.

Reading about genocide and slaughter has never been fun, but I used to be able to do so without flinching. I remember reading in high school Anne Nelson’s dispatches in Mother Jones about the Salvadoran death squads; I close my eyes, and I can still conjure up the accompanying photo of bloody heads on a bench. College was apartheid and nuclear war, and grad school, human rights abuses generally.

The University of Minnesota maintained an archive of human rights material in its law school library. I’d trudge over there from my West Bank (yes, that’s what it was called) office and read reports of the massacre at the finca San Francisco, of soldiers smashing babies’ heads and slicing up their mothers. Reports of torture in Nicaragua and disappearances in Argentina and killing after killing after killing in Guatemala.

It was awful, but I could take it, and since I could take it, I felt a kind of duty to do so. There was nothing I could do, hunched over these documents in the back corner of the library, but to read them, to read as many of them as I could.

I no longer have the compulsion, or the arrogance, or, frankly, even the stomach to do so. I still think the reading matters, the knowledge matters, even if I can’t precisely say why, but it is so hard, almost too hard, to keep reading. To read is to conjure these lives, these men and women and children, and watch them murdered all over again.

It was like that with the footage of the airplanes hitting the World Trade Center, and of the two towers collapsing into themselves. It seemed important to watch, to see, to know what I could, but after that, it just seemed obscene, as if the replays were killing people all over again.

I know that’s not how it works—I am aware of at least a few laws of physics—but the necessity of witness is found precisely in the knowledge of what is witnessed, that is, in the knowledge of the killing of over 2500 people. I don’t want that knowledge dulled or forgotten.

Maybe that’s why it’s so difficult now to read of atrocity: the outrage has been so stretched and worn that in too many places the bare horror is all that remains. The outrage is still there—reading (again) of the T4 extermination program, I raged against the ideology of Rassenhygiene and “lives not worth living”—but it no longer protects as it once did. Its use as a buffer is gone; the horror gets close.

Still, the knowledge matters, so I read what I can when I can. It is the least, the very least, I can do.





Break like the wind

19 05 2011

Not a fan of Lars von Trier.

I should say up front that I haven’t actually seen a von Trier film in its entirety: I’ve seen chunks of Dancer in the Dark and bits of Breaking the Waves but, for the most part, I have been quite content to let his Dogma pass me by.

I’m not quite sure why, oh, hell, I know exactly why—because I don’t care to spend 90 or 120 or 150 minutes watching women get the shit beaten out of them physically, sexually, emotionally, and/or intellectually. I know, he’s supposed to be very artistic in his assaults, and perhaps he’s even making some kind of point about the status of women, but point or not, I don’t want to watch it.

(I consider this a bit of a failing on my part, actually, that I am unwilling to sit through movies which make me uncomfortable or set me off, but, well, let me hold off on why I think so.)

Still, as a non-connoisseur of his works, I admit that I may be missing something wonderful and sly, and that people who love his work might have terrific reasons for doing so. I even have a bit of admiration for that whole Dogma thing—not because I sign on to the worth of its strictures, but because the attempt to place limits on oneself in service to art is a worthy practice.

Calling oneself a Nazi in service to art is, however, puzzling.

I’m with The New Yorker’s Richard Brody when he argues that

it should not be troubling to anyone that he claims to understand Hitler; it’s the job of artists to attempt to understand and enter into imaginative sympathy even with monsters; what makes artists artists is their ability to illuminate the darkest regions of the soul.

I don’t think you have to be a Nietzschean (although it might help) to see that art has its own morality, one which does not and perhaps even should not have much to do with ethical or political norms.

Still, it is perhaps unsurprising that when a man-of-the-movies opines at a film festival press conference on sympathies which, um, heavily intersect with history and politics, there might be some complications:

But, anyway, I really wanted to be a Jew, and then I found out I’m really a Nazi, because my family was German, Hartmann, which also gave me some kind of pleasure. What can I say? I understand Hitler. But I think he did some wrong things, yes, absolutely, but I can see him sitting in his bunker in the end.

He continues the ramble (you can read it at the link, above) with asides about Israel (“a pain in the ass”) and Danish filmmaker Susanne Bier and a thumbs-up for Albert Speer, only to have it all end (more or less) with him saying “Okay, I’m a Nazi.”

The Cannes Film Festival booted von Trier, although his film Melancholia remains. That seems about right.

Yes, I have my the-artists-must-be-free schtick (and even as I accept that von Trier might be less artist than huckster—but that’s another conversation): artists ought to have the freedom to create even the most outrageous art. But that doesn’t mean they get a free (ahem) pass to say whatever they want wherever they want without consequence. Slap, and be slapped in turn.

And given the Cannes Film Festival’s own history—it was created as an explicit counterpoint to the fascist-overrun Venice Film Festival—it is unsurprising that organizers would take a dim view of anyone claiming sympathy with Nazis, even if done so (half?)-jokingly and without any apparent forethought.

Maybe he thought he was being clever and provocative, maybe he panicked as a stray thought managed to find its way into words and he had no way of reining it back in. Maybe he did mean it. Maybe he’s just a prick.

I tend to go with a combination of clever/provocative and panicked. He did apologize, which suggests cravenness, abashedness, or both; again, I go with the combo option.

I also think the fest organizers’ actions ought to be the end of it. Certainly, some moviegoers might want to avoid his films as a result or some actors might not take a call from him—if you can’t get past the man to experience the work—but there’s no ipso facto reason to avoid his films.

None of this is to excuse von Trier, bumbling offender though he may be, nor is it an excuse for Woody Allen or Mel Gibson or Roman Polanski. Again, if you can’t get past the man—I can’t, really, with Gibson—then it makes sense to avoid the work, but I don’t know that this is so much a moral position as an aesthetic one.

And that you like the work of von Trier, Gibson, Allen, or Polanski doesn’t make you a Nazi, a violent and anti-Semitic misogynist, a schmuck, or a rapist, nor does appreciation for their work signal acceptance of their behavior. And please, if you do love the work of people who’ve done or said wretched things, don’t feel like you have to minimize said wretchedness (“it wasn’t ‘rape’ rape”) in order to justify that love.

Have the courage of your artistic convictions.





19 05 2011

JUDGMENT DAY

THE END OF THE WORLD IS ALMOST HERE!
HOLY GOD WILL BRING JUDGMENT DAY ON
MAY 21, 2011

      Thus Holy God is showing us by the words of 2 Peter 3:8 that He wants us to know that exactly 7,000 years after He destroyed the world with water in Noah’s day, He plans to destroy the entire world forever. Because the year 2011 A.D. is exactly 7,000 years after 4990 B.C. when the flood began, the Bible has given us absolute proof that the year 2011 is the end of the world during the Day of Judgment, which will come on the last day of the Day of Judgment.

      Amazingly, May 21, 2011 is the 17th day of the 2nd month of the Biblical calendar of our day. Remember, the flood waters also began on the 17th day of the 2nd month, in the year 4990 B.C.

~~~~~~~

Huh. Guess I can stop worrying about my student loan debt.

(h/t: Jaweed Kaleem, HuffPo)





Negation—wha. . .what?

18 05 2011

Perhaps I should not have used the term “negation”.

It carries a philosophical load—which is fine, and not unrelated to my use of it—but I wanted (also) to emphasize the more prosaic, i.e., practical, aspects of negation, as in: to negate, to eliminate as an option or consideration.

The germ theory of disease negated theories of miasma, Lavoisier’s experiments with oxygen negated phlogiston, industrial production of beakers and test tubes negated the need for scientists to blow their own glassware (which further negated the need for the knowledge of blowing glassware), fuel injection will likely negate carburetors, etc.

So negation could mean “overturn” (as with germs > miasmas or oxygen > phlogiston) or “leave behind” (as with glass-blowing and carburetors), that is, to negate may be to disprove or it could mean to render irrelevant or trivial.

Now, these practical effects may reverberate ontologically, such that the negation of the practical may serve to negate an entire way of thinking or being, or simply to serve as a signal of the instability of that way of thinking/being. Thomas Kuhn’s The Structure of Scientific Revolutions, with its discussion of paradigm shifts rendering previous modes of scientific practice inert, lays out a version of global negation, while current questions of the role of cyber-technologies signal uncertainty over what counts as “real”.

John Donne’s “An Anatomy of the World” (1611) is often quoted—hell, I quoted it a while back—to exemplify the agonized confusion over the discoveries of the natural philosophers:

And new philosophy calls all in doubt,
The element of fire is quite put out;
The sun is lost, and the earth, and no man’s wit
Can well direct him where to look for it.
And freely men confess that this world’s spent,
When in the planets and the firmament
They seek so many new; they see that this
Is crumbled out again to his atomies.
‘Tis all in pieces, all coherence gone;
All just supply, and all relation:

Natural philosophy took for itself the name science, and modernity marched on. The laments for the old world died with those who once lived in it.

William Butler Yeats’s “The Second Coming” clearly echoes this lament, with the opening

Turning and turning in the widening gyre
The falcon cannot hear the falconer;
Things fall apart; the centre cannot hold;

The times they are a-changin’, indeed.

History is not a line, or rather, history only holds the line, such that events may loosen or smash that hold and the contents of that history scatter.

Some of those pieces are lost and even of those which are found, the meaning of the piece, precisely because it has been scattered, can only be guessed at. It is a shard of pottery uncovered in the desert, hinting at something which once was, now gone.

But not everything is lost: it could be hiding in that proverbial plain sight. I’m much taken with the notion of the palimpsest—that is, of a kind of tablet which has been inscribed then scrubbed clean to be reinscribed—largely because I think that the previous inscriptions are still there, that, like words which have been erased from a page, the impression lingers.

Heidegger in The Question Concerning Technology decries the transformation of the Rhine from a river in a landscape into a “water power supplier”, that is, it is no longer itself but a source of reserve power for a hydroelectric plant. Perhaps it could be understood as that river in a landscape, he muses, but “In no other way than as an object on call for inspection by a tour group ordered there by the vacation industry.”

Those who complain that Manhattan has turned into a theme park and that Times Square has lost all its gritty reality have not a little bit in common with Herr Heidegger.

I have a great deal of sympathy for this feeling, but even more skepticism for such sympathy; as I’ve mentioned more times than you probably care to read, we’re never who we’ve been.

So, again, I’m not taking the side of the past against the present, not least because I have no basis for such a taking of sides. Again, I simply want to trace the history of modern history.

I can’t raise all the inscriptions on the palimpsest, but maybe I can see some of what has been left behind.





Vas ist dis “thoughtlessness”?

17 05 2011

Have I been thoughtless?

Perhaps, but mostly busy, lazy, and sick; actually, it would be more accurate to state that “busy, lazy, and sick” are the proximate causes for my thoughtlessness.

Anyway.

What do I mean by thoughtlessness (anyway)?

Let’s start with what I don’t mean: I don’t mean stupid (as in lacking analytic and intellectual ability) or ignorant (as in lacking knowledge) or even the general not-bothering-to-think (although there is something to this). Nor do I mean this to be the result of (c)overt propagandistic attempts to alter interpretations of events or peoples’ own experiences of those events.

Nope, I mean something more structural, as in a way of being (and thus also thinking—or not thinking, as it were) which encompasses and conditions all of us. There is rarely any sort of intent behind this version of thoughtlessness (although there are at times (c)overt attempts to justify intentional thoughtlessness) and thus it is rarely malicious, and while its effects may nonetheless be pernicious, it may, at some levels, even be beneficial.

Finally, thoughtlessness is not restricted to modern thought. I think it’s a feature of consciously totalizing systems of thought, by which I mean systems of thought which actively seek to rewrite, suppress, or surpass any preexisting narratives and to corral any innovations or questions into forms recognized by that system. I’m not sure how much I’ll be considering those other systems—I’m thinking at this point specifically of medieval Christianity—but as I have an inkling of modern thought as a way to overcome the upheavals of said Christianity, there’s likely to be some engagement.

Regardless, I’m interested in the thoughtlessness of modernity, so that’s what I’ll be lookin’ at.

Okay, you say, but you haven’t yet said what it is.

The one word answer is: negation. Other brief definitions: a plowing-under, erasure, diminution, trivialization, limitation, . . . you get the gist. The slightly longer answer is that in modern thought there are some matters worth thinking about and others not, that there are appropriate and inappropriate ways to think about those matters worth thinking about, and that if you think about worthless things in inappropriate ways you will have a hard time getting along in life.

Again, no conspiracy; just a sense of “this is how things are”.

None of this is particularly new. Critics of modernity from both the pre- and (alleged) post- positions have long pointed out what is lost in the movement from one way of being to another. The Catholic Church, Nietzsche, Heidegger, and Strauss are among the more prominent critics, and some versions of anthropology are given over to a recovery from/protection against the predations of modernity.

Although I, too, am a critic—not so much prominent as obscure—I’m not terribly interested in trying to return to some sort of pre-modern ontology or in continuing my lament of How Shitty Everything Is. No, I am actively trying to move beyond the lament and it seems to me that such movement requires trying to make sense of where we are now.

There is so much which makes sense and does not make sense at the same time, so much which is simultaneously thought-ful and thought-less—how can this be?

I am curious.