Blown backwards into the future

14 05 2014

Benjamin conjured history as an angel.

Let’s sit with that for a bit, as it’s a lovely sad conjuring.

There is no repair, not for the angel, not for us. Sad, perhaps, but not unbearably so.

There is also no going back, as that angel learned. If the past is an ocean, then history is diving in and bringing the bits and debris and life to the surface, to the present, to see what we’ve got. We can bring what’s down below to the surface and we can make sense of it, but it is our sense, a present sense. And the things themselves, especially the lives themselves, are changed for having been dragged from the deep.

Diving, digging, spelunking: all this bringing to the surface the bits and debris in an attempt to recreate life. History as simulacrum.

And the epochs and eras and moments? Those are the bits highlighted or strung together: the Renaissance or Scientific Revolution or Modernity or the Enlightenment. It gives us a way to see.

Usually, when I speak of seeing, I speak metaphorically. But I wanted literally to see where these different moments were in relation to one another, so I ran parallel timelines of European history—scientific, cultural, religious, political, trade—down sheets of paper taped in my hallway, then plotted out those moments.
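(For the similarly curious: those taped-up sheets could be sketched digitally, too — parallel labeled tracks, with moments plotted proportionally along a shared axis. A minimal sketch, with placeholder tracks and dates rather than the ones from my hallway:)

```python
# Parallel-timeline sketch: one ASCII line per track, markers placed
# proportionally between START and END. Tracks and dates below are
# illustrative placeholders, not a claim about what belongs where.

START, END, WIDTH = 1300, 1800, 60  # axis range (years) and width in characters

def position(year):
    """Map a year to a character column along the shared axis."""
    return round((year - START) / (END - START) * (WIDTH - 1))

def render(tracks):
    """Return one ASCII line per track, with a '*' at each moment."""
    lines = []
    for name, moments in tracks.items():
        line = ["-"] * WIDTH
        for year in moments:
            line[position(year)] = "*"
        lines.append(f"{name:>10} |{''.join(line)}|")
    return lines

tracks = {
    "scientific": {1543, 1610, 1687},  # e.g. Copernicus, Galileo, Newton
    "cultural":   {1450, 1503, 1605},
    "religious":  {1517, 1545, 1648},
    "political":  {1453, 1648, 1789},
}

for row in render(tracks):
    print(row)
```

Crude next to paper in a hallway, but the same idea: the moments line up against one another, and the eye does the rest.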


This is an incomplete draft—I clearly need to allow more room on the final version—but it’s not hard to see how this moment was understood as the Italian Renaissance at its ripest.

Or here, as what we now call the Scientific Revolution gets underway:


These give me that bird’s eye view of the middle centuries of the last millennium; they also make me wonder what isn’t there, isn’t recorded in any of the texts I’m using.

What moments are still underground? And what stories will we tell if we ever unearth them?


And I know things now

7 05 2014

Modernity is ~~dead~~ in a coma.

Okay, not modernity—modernity is still kickin’—but my medieval/modern project to suss out the beginnings of modernity, yeah, that’s on life support. I’ll probably never pull the plug, but the chances of recovery at this point are slim.

The main problem was that I never had a thesis. As a former post-modernist I was interested in the pre-mod: learning about the last great (Euro) transition might help me to make sense of what may or may not be another transitional moment.

And I learned a lot! I knew pitifully little about European history—couldn’t have told you the difference between the Renaissance and the Enlightenment, that’s how bad I was—and now I know something more. I’d now be comfortable positioning the Renaissance as the final flowering of the medieval era, arguing that the 16th and 17th centuries were the double-hinge between the medieval and the modern, that the Enlightenment was about the new moderns getting chesty, that Nietzsche crowbarred open the crack first noticed by the sophists, and that the medieval era in Europe did not truly end until the end of World War I.

None of these is a particularly novel observation. I make no pretense of expertise nor even much beyond a rudimentary working knowledge: there are still large gaps in my knowledge and large books to be read. And I will continue reading for a very long time.

But I don’t have a point to that reading beyond the knowledge itself. It’s possible that something at some point will present itself as a specific route to be followed, but right now, the past is an ocean, not a river.

That’s all right. I’m a fan of useless knowledge and wandering thoughts.

She blinded me with science

17 02 2014

When to let go and when to hang on?

This is one of the ways I’ve come to interpret various situations in life, big and small. I don’t know that there is ever a correct decision (tho’ I’ll probably make the wrong one), but one chooses, nonetheless.

Which is to say: I choose to hang on to the “science” in political science.

I didn’t always feel this way, and years ago used to emphasize that I was a political theorist, not a political scientist. This was partly due to honesty—I am trained in political theory—and partly to snobbery: I thought political theorists were somehow better than political scientists, what with their grubbing after data and trying to hide their “brute empiricism” behind incomprehensible statistical models.

Physics envy, I sniffed.

After a while the sniffiness faded, and as I drifted into bioethics, the intradisciplinary disputes faded as well. And as I drifted away from academia, it didn’t much matter anymore.

So why does it matter now?

Dmf dropped this comment after a recent post—

well “science” without repeatable results, falsifiability, and some ability to predict is what, social? lot’s of other good way to experiment/interact with the world other than science…

—and my first reaction was NO!

As I’ve previously mentioned, I don’t trust my first reactions precisely because they are so reactive, but in this case, with second thought, I’ma stick with it.

What dmf offers is the basic Popperian understanding of science, rooted in falsifiability and prediction, and requiring some sort of nomological deductivism. It is widespread in physics, and hewed to more or less in the other natural and biological sciences.

It’s a great model, powerful for understanding the regularities of non-quantum physics and, properly adjusted, for the biosciences, as well.

But do you see the problem?

What dmf describes is a method, one of a set of interpretations within the overall practice of science. It is not science itself.

There is a bit of risk in stating this, insofar as young-earth creationists, intelligent designers, and sundry other woo-sters like to claim the mantle of science as well. If I loose science from its most powerful method, aren’t I setting it up to be overrun by cranks and supernaturalists?


The key to dealing with them is to point out what they’re doing is bad science, which deserves neither respect in general nor class-time in particular. Let them aspire to be scientists; until they actually produce a knowledge which is recognizable as such by those in the field, let them be called failures.

Doing so allows one to get past the no-true-Scotsman problem (as, say, with the Utah chemists who insisted they produced cold fusion in a test tube: not not-scientists, but bad scientists), as well as to recognize that there is a history to science, and that what was good science in one time and place is not good in another.

That might create too much wriggle room for those who hold to Platonic notions of science, and, again, to those who worry that this could be used to argue for an “alternative” physics or chemistry or whatever. But arguing that x science is a practice with a history allows the practitioners of that science to state that those alternatives are bunk.

But back to me (always back to me. . . ).

I hold to the old notion of science as a particular kind of search for knowledge, and as knowledge itself. Because of that, I’m not willing to give up “science” to the natural scientists because those of us in the social sciences are also engaged in a particular kind of search for knowledge. That it is not the same kind of search for the same kind of knowledge does not make it not-knowledge, or not-science.

I can’t remember if it was Peter Winch or Roger Trigg who pointed out that the key to good science was to match the method to the subject: what works best in physics won’t necessarily work best in politics. The problem we in the social sciences have had is that our methods are neither as unified nor as powerful as those in the natural sciences, and that, yes, physics envy has meant that we’ve tried to import methods and ends which can be unsuitable for learning about our subjects.

So, yes, dmf, there are more ways of interacting with the world than with science. But there are also more ways of practicing science itself.

We just have to figure that out.

Of flesh and blood I’m made

16 01 2014

What is human?

I got into it with commenter The Wet One at TNC’s joint, who chided me not to, in effect, complicate straightforward matters. I responded that straightforward matters often are quite complicated.

In any case, he issued a specific challenge to claims I made regarding the variability of the human across time and space. This request was in response to this statement:

At one level, there is the matter of what counts as “reasonably concrete realities”; I think this varies across time and place.

Related to this is my disagreement with the contention that those outside of the norm have fallen “within the realm of the ‘human’ for all intents and purposes”. They most assuredly have not, and to the extent they do today is due to explicit efforts to change our understanding of the human.

Examples, he asked?

As one of the mods was getting ready to close the thread, I could only offer up the easiest one: questions over the status of embryos and fetuses.

Still, while I think that a reasonable response, it is also incomplete, insofar as it doesn’t get at what and who I was thinking of in writing that comment: people with disabilities.

“People with disabilities”: even that phrase isn’t enough, because “disability” itself isn’t necessarily the apt word.  I had referred in an earlier comment to those whose morphology varied from the statistical norm; not all variations are disabilities in even the strictest sense.

In any case, when I went to my bookshelf to try to pull out specific, referenced, examples, I was stopped by that basic question which set off the whole debate: what is human?

Now, in asking that here I mean: how maximal an understanding of the human? Is to be human to be accorded a certain status and protection (“human rights”)? or is it more minimal, in the sense that one sees the other as kin of some sort, tho’ not necessarily of an equal sort?

Arendt argued for a minimalist sense when she noted there was nothing sacred in the “naked” [of the protections of the law] human, meaning that such status granted no particular privilege. That I both do and do not agree with this is the source of my estoppel.

Kuper in Genocide notes that dehumanization often precedes assault—which suggests that before the one goes after the other, that a kinship is recognized which must then be erased. But maybe not. I don’t know.

Is the human in the recognition? If you are akin to us (and we know that we are human), then we will grant such status (for whatever it’s worth) to you. We might still make distinctions amongst us as to who is superior/inferior, but still grant that an inferior human is still human. There’s something to that—something which I perhaps should have emphasized a bit more than I did in my initial go-’round with TWO.

But I also think there are cases in which the kinship might repulse rather than draw in: that disgust or horror (or some kind of uncanny valley) gets in the way of seeing the disgusting/horrid/uncanny one as human. I’m thinking of the work of William Ian Miller and Martha Nussbaum on disgust, and, perhaps, of various histories of medicine, especially regarding the mentally ill. Perhaps I should dig out that old paper on lobotomy. . . .

Oh, and yet another wrinkle: Insofar as I consider the meaning of the human to vary, I don’t know that one can elide differences between the words used to refer to said humans. “Savage” means one thing, “human” another, and the relationship between the two, well, contestable.

I’m rambling, and still without specific, referenced examples for TWO. I can go the easy route, show the 19th century charts comparing Africans to the great apes, the discussion of so-called “primitive peoples” (with the unveiled implication that such peoples weren’t, perhaps, human people). Could I mention that “orangutan” means “person of the forest”, or is that too glib? Too glib, I think. Not glib is the recent decision to limit greatly the use of chimpanzees in federally-funded research—the extension of protections to our kin, because a kinship is recognized.

And back around again. I don’t know that one can meaningfully separate the identity of a being from the treatment of the identified being; identification and treatment somersault over and over one another.

So if protections are offered to one member of H. sapiens and withdrawn from another, then it seems to say something about the status of that other: that we don’t recognize you as being one of us. We don’t recognize you as human.

If things can be done to someone with schizophrenia (old term: dementia praecox) or psychosis—various sorts of water or electric shocks, say—that would not be done to someone without these afflictions, then one might wonder whether the schizophrenic or psychotic is, in fact, recognized as human, that as long as the affliction is seen to define the being, then that being is not-quite-human.

Ah, so yet another turn. I allowed for the possibility of superior/inferior humans [which might render moot my examples from eugenics and racism]; what of lesser or more human? Is someone who is less human still human? What does that even mean?

Back to biology. Those born with what we now recognize as chromosomal abnormalities have not always been, and are not always, taken in, recognized as being “one of us”. A child with cri-du-chat syndrome does not act like a child without; what are the chances such children have always been recognized as human?

Oh, and I’m not even getting into religion and folklore and demons and fairies and whatnot. Is this not already too long?

I can’t re-read this for sense; no, this has all already flown apart.

We might as well try: Here comes the future and you can’t run from it

24 07 2012

It is terrible not to know all that I want to know, a terribleness only counterbalanced by the pleasure of soaking up what others know.

This is as good a précis for this series as any:

If men have always been concerned with only one task—how to create a society fit to live in—the forces which inspired our distant ancestors are also present in us. Nothing is settled; everything can still be altered. What was done but turned out wrong, can be done again. The Golden Age, which blind superstition had placed behind [or ahead of] us, is in us.

—Claude Levi-Strauss, from Tristes Tropiques

Yes, I know Levi-Strauss, but no, I haven’t read him, don’t know if I’ll ever make the time to read him.

But this bit, this bit was worth the time.

h/t John Nichols’s obit for Alexander Cockburn, The Nation

Onward, Christian soldiers

27 06 2012

Done with Calvin and on to the Thirty Years War.

Yes, the project on modernity rumbles on, as I dart back and forth between the 16th and 20th centuries (with occasional forays into the 15th and 14th centuries), jumbling up the wars of religion and emperors and kings and popes and princes and reformers and Reformers and . . . everything everything everything.

May I pause just to note what pleasure, what pure pleasure it gives me to see shapes and movement arise from what had once been a white, blank field of the past?

Consider this line from C. V. Wedgwood: “Pursuing the shadow of a universal power the German rulers forfeited the chance of a national one.”

Ta-Nehisi Coates has remarked on the beauty of her Wedgwood—and yes, she has a way with words—but her facility with the language reveals a nimbleness of thought, and this one, elegantly expressed, conveys the tragic risk of greatness: Go big and you lose the small, and in losing the small, you lose it all.

Only “Pursuing the shadow of a universal power the German rulers forfeited the chance of a national one” in its specificity is far more breathtaking and heartbreaking than my pallid generalization.

And it is the specificity itself which provides that pleasure: there was nothing, and now there is something.

Now, before I repeat that last line to end the post, I do want to interject with one observation about Calvin’s Reformed thought, specifically, his doctrine of double predestination (God elects both who goes to heaven and who goes to hell): why would anyone believe this?

Calvin argued that only a few of the professing Christians would be saved and most lost, that there was absolutely nothing the individual (an utterly depraved being) could do to save herself—so why would anyone cleave to a belief system which gave you rotten odds and no way to change them?

One possibility is that most Reformers didn’t believe in predestination, double or otherwise; another is that Reformers did believe in double predestination, but also believed that they were the elect. So, yeah, sucks to be you, o depraved man, but I am so filled with the spirit that there is no way God hasn’t picked me for His team.

There is no rational reason* to believe this; since people believed nonetheless, then it is clear that something other than reason is required to explain the spread of the Reformed faith.

(*Reason in terms of: why pick this religion over that one, not: why pick any religion at all. Context, people, context.)

Anyway, Calvin was much more impressed with himself than I was with him—although it must be noted he had a few more followers than the 19 who follow me (in this blog, anyway).

Oh, man, it’s getting late and I’m getting frantic for sleep so yes, let’s return to pleasure and knowledge and movement where before there was stillness and lines where before there was blankness and etchings across the smooth surface and something, something rather than nothing.

Wait just a darned minute!

23 06 2011

Matt Yglesias:

In my experience as a professional political pundit, the study of political philosophy doesn’t get you very far in terms of illuminate real controversies even relative to other branches of philosophy.

I was going to go all umbragy, but then I remembered: he’s right—professional political pundits rarely bother to go very far into the study of political theory in ways which would help to illuminate real controversies.

If I had a rocket launcher

22 05 2011

The invasion of Poland was almost unbearable.

I knew it was awful, but awful only in a general way; the opening didn’t linger on the atrocities, but the details—the killing of 55 Polish prisoners here, the burning of village after village there, the many smug justifications for murder—knit the details of death into the whole cloth of invasion and mass murder.

If I didn’t know how it all ended, I told a friend, I don’t think I could read it.

I’m on the last book of Richard Evans’s Third Reich trilogy, finally cracking it open after it sat on my desk for a few weeks.

I raced through The Coming of the Third Reich (useful for its doleful portrayal of the Weimar Republic) and read with fascination The Third Reich in Power, but The Third Reich at War, well, the premonitions of the first two books are borne out in the last. It will get worse, much worse, before it ends; it cannot be said to get better.

Reading about genocide and slaughter has never been fun, but I used to be able to do so without flinching. I remember reading in high school Anne Nelson’s dispatches in Mother Jones about the Salvadoran death squads; I close my eyes, and I can still conjure up the accompanying photo of bloody heads on a bench. College was apartheid and nuclear war, and grad school, human rights abuses generally.

The University of Minnesota maintained an archive of human rights material in its law school library. I’d trudge over there from my West Bank (yes, that’s what it was called) office and read reports of the massacre at the finca San Francisco, of soldiers smashing babies’ heads and slicing up their mothers. Reports of torture in Nicaragua and disappearances in Argentina and killing after killing after killing in Guatemala.

It was awful, but I could take it, and since I could take it, I felt a kind of duty to do so. There was nothing I could do, hunched over these documents in the back corner of the library, but to read them, to read as many of them as I could.

I no longer have the compulsion, or the arrogance, or, frankly, even the stomach to do so. I still think the reading matters, the knowledge matters, even if I can’t precisely say why, but it is so hard, almost too hard, to keep reading. To read is to conjure these lives, these men and women and children, and watch them murdered all over again.

It was like that with the footage of the airplanes hitting the World Trade Center, and of the two towers collapsing into themselves. It seemed important to watch, to see, to know what I could, but after that, it just seemed obscene, as if the replays were killing people all over again.

I know that’s not how it works—I am aware of at least a few laws of physics—but the necessity of witness is found precisely in the knowledge of what is witnessed, that is, in the knowledge of the killing of over 2500 people. I don’t want that knowledge dulled or forgotten.

Maybe that’s why it’s so difficult now to read of atrocity: the outrage has been so stretched and worn that in too many places the bare horror is all that remains. The outrage is still there—reading (again) of the T4 extermination program, I raged against the ideology of Rassenhygiene and “lives not worth living”—but it no longer protects as it once did. Its use as a buffer is gone; the horror gets close.

Still, the knowledge matters, so I read what I can when I can. It is the least, the very least, I can do.

I’m a rocket man

14 12 2010

I try to be good, get off the computer for a few hours, and what happens? I miss an entire conversation on science.

Well, goddammit.

(Actually, given that a large portion of the thread was given over to name-calling and trollist cavils, I guess I didn’t miss that much. Still.)

So, science. I am for it.

I am an epistemological nihilist, it’s true, so this support is caveated with the usual cracks and abyssals, but I’m also quite willing to hop right over those chasms to walk among the ruins that compose our human life—and one of our more spectacular ruins is science.

Yes, ‘our’. ‘Our’ because science truly is a human endeavor (even as its dogmatists assert that science can take us outside of ourselves), and as such, there to be claimed by all of us. And it is important to claim it, both against the dogmatists and against those who find nothing of worth in curiosity and rigor, or in experimentation, skepticism, and discovery.

I can only respond to those opposed to discovery with questions and fiction—as we do not inhabit the same world, argument is stillborn—but to the dogmatists and, it must be said, to those who favor curiosity and thus oppose science because they believe science poisons curiosity, I can offer history and reason and ruin.

To offer the whole of that argument is to offer a book; instead, here is the abstract:

We humans have sought to know, and in seeking, have sought to make sense of what we have found. How we make sense has varied—through recourse to myths, common sense, measurement, extrapolation, generalization, systematization, reflection, etc.—and what we make of the sense we make has varied as well. Sometimes we call it truth or religion or wickedness or allegory or interpretation; sometimes we call it science. Sometimes this science is the means, sometimes it is the end, sometimes it is both. In early modern times [in Europe], in the period now known as the Scientific Revolution, science was thought to reveal truths about God, as it also was by those scientists working under the Abbasids; that it also brought technological advance and political and economic gain helped to preserve it against those who argued that a thirst for knowledge was itself corrosive of the faith.

Yet even throughout much of the modern period science was understood, if no longer as an appendage of natural philosophy, as nonetheless a part of a constellation of knowledge which included the arts, literature, and humanities; its practitioners were all a part of the learned class.

This collegiality faded, and now science is understood primarily as comprising the natural sciences and their methods; to the extent some social sciences adopt those methods, they may or may not be admitted to the realm as sciences, albeit as its lesser, ‘softer’, version. That science has a history is barely acknowledged, and it is unclear whether scientists (or their learned critics) would consider them ‘intellectuals’ rather than (mere) technicians, experimentalists, and lab directors.

This separation (and, often, contempt) is lamentable all around. [Natural] science is more than its tools and methods, involves more [hermeneutic] interpretation than the experimentalists may admit of, and requires greater curiosity than its skeptics may allow. But if we want to know, if we humans truly seek a human science (and, again, I would argue there is no other), then we have to prevent science from sliding all the way into scientism. Some think it’s already so technics-shriveled that it is mere methodological fetishism; I disagree.

This saving gesture doesn’t require that artists now refer to themselves as scientists or that neurobiologists become novelists. No, this reclamation project (another ruin) would gather the curious back together, to see if we exiles from one another would have anything to say to one another, to see what we could see.

I don’t believe this every day—yesterday, for example, I had no patience for this.

But some days, some days I think we humans could do this. Some days, this is my something more.

Too goddamned irritated to blog

13 12 2010

You call your ‘movement’ No Labels, give yourself the motto Not Left. Not Right. Forward., and yet on the top of your web page insist

We are Democrats, Republicans, and Independents who are united in the belief that we do not have to give up our labels, merely put them aside to do what’s best for America.

Kentucky fucking chicken, what’s the point of calling yourself No Labels if what you really mean is Every Label (on the inside pocket)?

And it’s a stupid idea, anyway.

And then this, from a blog whose insistence and crankiness I like and respect: Removing science from anthropology.

What anthropologists do is up to them; that said, I generally think we social scientists should hang on to the word ‘science’ with all our grubby little might. ‘Science’ in its most general terms as a search for knowledge has a long and honorable history and, as I always like to point out, one of the earliest known seekers was Aristotle—who considered political science the highest of all sciences. So there.

What chaps me about this piece is not that non-anthropologists have opinions about this move, but that, after the requisite words of respect about the so-called softer sciences, Orac also has to toss in the requisite bullshit references to ‘post-modernism’ and ‘political correctness’.

Yeah, I get it, he sees invidious parallels between claims about ‘other ways of knowing’ and his white whale, complementary and alternative medicine.

This is an intriguing claim. Truly.

But again with the KFC: Do you need to haul out straw-ass versions of an interpretive method whose definition you draw from Sokal in order to light the whole goddamned discussion on fire?

Kentucky Jesus Fried Christ.

