She blinded me with science

17 02 2014

When to let go and when to hang on?

This conundrum is one of the ways I’ve come to interpret various situations in life, big and small. I don’t know that there is ever a correct decision (tho’ I’ll probably make the wrong one), but one chooses, nonetheless.

Which is to say: I choose to hang on to the “science” in political science.

I didn’t always feel this way, and years ago used to emphasize that I was a political theorist, not a political scientist. This was partly due to honesty—I am trained in political theory—and partly to snobbery: I thought political theorists were somehow better than political scientists, what with their grubbing after data and trying to hide their “brute empiricism” behind incomprehensible statistical models.

Physics envy, I sniffed.

After a while the sniffiness faded, and as I drifted into bioethics, the intradisciplinary disputes faded as well. And as I drifted away from academia, it didn’t much matter anymore.

So why does it matter now?

Dmf dropped this comment after a recent post—

well “science” without repeatable results, falsifiability, and some ability to predict is what, social? lot’s of other good way to experiment/interact with the world other than science…

—and my first reaction was NO!

As I’ve previously mentioned, I don’t trust my first reactions precisely because they are so reactive, but in this case, on second thought, I’ma stick with it.

What dmf offers is the basic Popperian understanding of science, rooted in falsifiability and prediction, and requiring some sort of deductive-nomological explanation. It is widespread in physics, and hewed to more or less in the other natural and biological sciences.

It’s a great model, powerful for understanding the regularities of non-quantum physics and, properly adjusted, for the biosciences, as well.

But do you see the problem?

What dmf describes is a method, one of a set of interpretations within the overall practice of science. It is not science itself.

There is a bit of risk in stating this, insofar as young-earth creationists, intelligent designers, and sundry other woo-sters like to claim the mantle of science as well. If I loose science from its most powerful method, aren’t I setting it up to be overrun by cranks and supernaturalists?

No.

The key to dealing with them is to point out that what they’re doing is bad science, which deserves neither respect in general nor class-time in particular. Let them aspire to be scientists; until they actually produce knowledge which is recognizable as such by those in the field, let them be called failures.

Doing so allows one to get past the no-true-Scotsman problem (as, say, with the Utah chemists who insisted they produced cold fusion in a test tube: not not-scientists, but bad scientists), as well as to recognize that there is a history to science, and that what was good science in one time and place is not good in another.

That might create too much wriggle room for those who hold to Platonic notions of science, and, again, for those who worry that this could be used to argue for an “alternative” physics or chemistry or whatever. But arguing that science x is a practice with a history allows the practitioners of that science to state that those alternatives are bunk.

But back to me (always back to me. . . ).

I hold to the old notion of science as a particular kind of search for knowledge, and as knowledge itself. Because of that, I’m not willing to give up “science” to the natural scientists because those of us in the social sciences are also engaged in a particular kind of search for knowledge. That it is not the same kind of search for the same kind of knowledge does not make it not-knowledge, or not-science.

I can’t remember if it was Peter Winch or Roger Trigg who pointed out that the key to good science was to match the method to the subject: what works best in physics won’t necessarily work best in politics. The problem we in the social sciences have had is that our methods are neither as unified nor as powerful as those in the natural sciences, and that, yes, physics envy has meant that we’ve tried to import methods and ends which can be unsuitable for learning about our subjects.

So, yes, dmf, there are more ways of interacting with the world than with science. But there are also more ways of practicing science itself.

We just have to figure that out.





Of flesh and blood I’m made

16 01 2014

What is human?

I got into it with commenter The Wet One at TNC’s joint, who chided me, in effect, for complicating straightforward matters. I responded that straightforward matters often are quite complicated.

In any case, he issued a specific challenge to claims I made regarding the variability of the human across time and space. The challenge came in response to this statement:

At one level, there is the matter of what counts as “reasonably concrete realities”; I think this varies across time and place.

Related to this is my disagreement with the contention that those outside of the norm have fallen “within the realm of the ‘human’ for all intents and purposes”. They most assuredly have not and to the extent they do today is due to explicit efforts to change our understanding of the human.

Examples, he asked?

As one of the mods was getting ready to close the thread, I could only offer up the easiest one: questions over the status of embryos and fetuses.

Still, while I think that a reasonable response, it is also incomplete, insofar as it doesn’t get at what and who I was thinking of in writing that comment: people with disabilities.

“People with disabilities”: even that phrase isn’t enough, because “disability” itself isn’t necessarily the apt word. I had referred in an earlier comment to those whose morphology varied from the statistical norm; not all variations are disabilities in even the strictest sense.

In any case, when I went to my bookshelf to try to pull out specific, referenced, examples, I was stopped by that basic question which set off the whole debate: what is human?

Now, in asking that here I mean: how maximal an understanding of the human? Is to be human to be accorded a certain status and protection (“human rights”)? or is it more minimal, in the sense that one sees the other as kin of some sort, tho’ not necessarily of an equal sort?

Arendt argued for a minimalist sense when she noted there was nothing sacred in the “naked” [stripped of the protections of the law] human, meaning that such status granted no particular privilege. That I both do and do not agree with this is the source of my estoppel.

Kuper in Genocide notes that dehumanization often precedes assault—which suggests that before the one goes after the other, a kinship is recognized which must then be erased. But maybe not. I don’t know.

Is the human in the recognition? If you are akin to us (and we know that we are human), then we will grant such status (for whatever it’s worth) to you. We might still make distinctions amongst us as to who is superior/inferior, but still grant that an inferior human is still human. There’s something to that—something which I perhaps should have emphasized a bit more than I did in my initial go-’round with TWO.

But I also think there are cases in which the kinship might repulse rather than draw in: that disgust or horror (or some kind of uncanny valley) gets in the way of seeing the disgusting/horrid/uncanny one as human. I’m thinking of the work of William Ian Miller and Martha Nussbaum on disgust, and, perhaps, of various histories of medicine, especially regarding the mentally ill. Perhaps I should dig out that old paper on lobotomy. . . .

Oh, and yet another wrinkle: Insofar as I consider the meaning of the human to vary, I don’t know that one can elide differences between the words used to refer to said humans. “Savage” means one thing, “human” another, and the relationship between the two, well, contestable.

I’m rambling, and still without specific, referenced examples for TWO. I can go the easy route, show the 19th century charts comparing Africans to the great apes, the discussion of so-called “primitive peoples” (with the unveiled implication that such peoples weren’t, perhaps, human people). Could I mention that “orangutan” means “person of the forest”, or is that too glib? Too glib, I think. Not glib is the recent decision to limit greatly the use of chimpanzees in federally-funded research—the extension of protections to our kin, because a kinship is recognized.

And back around again. I don’t know that one can meaningfully separate the identity of a being from the treatment of the identified being; identification and treatment somersault over and over one another.

So if protections are offered to one member of H. sapiens and withdrawn from another, then that seems to say something about the status of that other: that we don’t recognize you as being one of us. We don’t recognize you as human.

If things can be done to someone with schizophrenia (old term: dementia praecox) or psychosis—various sorts of water or electric shocks, say—that would not be done to someone without these afflictions, then one might wonder whether the schizophrenic or psychotic is, in fact, recognized as human—whether, as long as the affliction is seen to define the being, that being is not-quite-human.

Ah, so yet another turn. I allowed for the possibility of superior/inferior humans [which might render moot my examples from eugenics and racism]; what of lesser or more human? Is someone who is less human still human? What does that even mean?

Back to biology. Those born with what we now recognize as chromosomal abnormalities have not always been, and are not always, taken in, recognized as being “one of us”. A child with cri-du-chat syndrome does not act like a child without; what are the chances such children have always been recognized as human?

Oh, and I’m not even getting into religion and folklore and demons and fairies and whatnot. Is this not already too long?

I can’t re-read this for sense; no, this has all already flown apart.





We might as well try: Here comes the future and you can’t run from it

24 07 2012

It is terrible not to know all that I want to know, a terribleness only counterbalanced by the pleasure of soaking up what others know.

This is as good a precis for this series as any:

If men have always been concerned with only one task—how to create a society fit to live in—the forces which inspired our distant ancestors are also present in us. Nothing is settled; everything can still be altered. What was done but turned out wrong, can be done again. The Golden Age, which blind superstition had placed behind [or ahead of] us, is in us.

—Claude Levi-Strauss, from Tristes Tropiques

Yes, I know Levi-Strauss, but no, I haven’t read him, don’t know if I’ll ever make the time to read him.

But this bit, this bit was worth the time.

h/t John Nichols’s obit for Alexander Cockburn, The Nation





Onward, Christian soldiers

27 06 2012

Done with Calvin and on to the Thirty Years War.

Yes, the project on modernity rumbles on, as I dart back and forth between the 16th and 20th centuries (with occasional forays into the 15th and 14th centuries), jumbling up the wars of religion and emperors and kings and popes and princes and reformers and Reformers and . . . everything everything everything.

May I pause just to note what pleasure, what pure pleasure it gives me to see shapes and movement arise from what had once been a white, blank field of the past?

Consider this line from C.V. Wedgwood: “Pursuing the shadow of a universal power the German rulers forfeited the chance of a national one.”

Ta-Nehisi Coates has remarked on the beauty of Wedgwood’s prose—and yes, she has a way with words—but her facility with the language reveals a nimbleness of thought, and this one, elegantly expressed, conveys the tragic risk of greatness: Go big and you lose the small, and in losing the small, you lose it all.

Only “Pursuing the shadow of a universal power the German rulers forfeited the chance of a national one”, in its specificity, is far more breathtaking and heartbreaking than my pallid generalization.

And it is the specificity itself which provides that pleasure: there was nothing, and now there is something.

Now, before I repeat that last line to end the post, I do want to interject with one observation about Calvin’s Reformed thought, specifically, his doctrine of double predestination (God elects both who goes to heaven and who goes to hell): why would anyone believe this?

Calvin argued that only a few professing Christians would be saved and most lost, and that there was absolutely nothing the individual (an utterly depraved being) could do to save herself—so why would anyone cleave to a belief system which gave such rotten odds and no way to change them?

One possibility is that most Reformers didn’t believe in predestination, double or otherwise; another is that Reformers did believe in double predestination, but also believed that they were the elect. So, yeah, sucks to be you, o depraved man, but I am so filled with the spirit that there is no way God hasn’t picked me for His team.

There is no rational reason* to believe this; since people believed nonetheless, it is clear that something other than reason is required to explain the spread of the Reformed faith.

(*Reason in terms of: why pick this religion over that one, not: why pick any religion at all. Context, people, context.)

Anyway, Calvin was much more impressed with himself than I was with him—although it must be noted he had a few more followers than the 19 who follow me (in this blog, anyway).

Oh, man, it’s getting late and I’m getting frantic for sleep so yes, let’s return to pleasure and knowledge and movement where before there was stillness and lines where before there was blankness and etchings across the smooth surface and something, something rather than nothing.





Wait just a darned minute!

23 06 2011

Matt Yglesias:

In my experience as a professional political pundit, the study of political philosophy doesn’t get you very far in terms of illuminate real controversies even relative to other branches of philosophy.

I was going to go all umbragy, but then I remembered: he’s right—professional political pundits rarely bother to go very far into the study of political theory in ways which would help to illuminate real controversies.





If I had a rocket launcher

22 05 2011

The invasion of Poland was almost unbearable.

I knew it was awful, but awful only in a general way; the opening didn’t linger on the atrocities, but the details—the killing of 55 Polish prisoners here, the burning of village after village there, the many smug justifications for murder—knit death into the whole cloth of invasion and mass murder.

If I didn’t know how it all ended, I told a friend, I don’t think I could read it.

I’m on the last book of Richard Evans’s trilogy on the Third Reich, finally cracking it open after it sat on my desk for a few weeks.

I raced through The Coming of the Third Reich (useful for its doleful portrayal of the Weimar Republic) and read with fascination The Third Reich in Power, but The Third Reich at War, well, the premonitions of the first two books are borne out in the last. It will get worse, much worse, before it ends; it cannot be said to get better.

Reading about genocide and slaughter has never been fun, but I used to be able to do so without flinching. I remember reading in high school Anne Nelson’s dispatches in Mother Jones about the Salvadoran death squads; I close my eyes, and I can still conjure up the accompanying photo of bloody heads on a bench. College was apartheid and nuclear war, and grad school, human rights abuses generally.

The University of Minnesota maintained an archive of human rights material in its law school library. I’d trudge over there from my West Bank (yes, that’s what it was called) office and read reports of the massacre at the finca San Francisco, of soldiers smashing babies’ heads and slicing up their mothers. Reports of torture in Nicaragua and disappearances in Argentina and killing after killing after killing in Guatemala.

It was awful, but I could take it, and since I could take it, I felt a kind of duty to do so. There was nothing I could do, hunched over these documents in the back corner of the library, but to read them, to read as many of them as I could.

I no longer have the compulsion, or the arrogance, or, frankly, even the stomach to do so. I still think the reading matters, the knowledge matters, even if I can’t precisely say why, but it is so hard, almost too hard, to keep reading. To read is to conjure these lives, these men and women and children, and watch them murdered all over again.

It was like that with the footage of the airplanes hitting the World Trade Center, and of the two towers collapsing into themselves. It seemed important to watch, to see, to know what I could, but after that, it just seemed obscene, as if the replays were killing people all over again.

I know that’s not how it works—I am aware of at least a few laws of physics—but the necessity of witness is found precisely in the knowledge of what is witnessed, that is, in the knowledge of the killing of over 2500 people. I don’t want that knowledge dulled or forgotten.

Maybe that’s why it’s so difficult now to read of atrocity: the outrage has been so stretched and worn that in too many places the bare horror is all that remains. The outrage is still there—reading (again) of the T4 extermination program, I raged against the ideology of Rassenhygiene and “lives not worth living”—but it no longer protects as it once did. Its use as a buffer is gone; the horror gets close.

Still, the knowledge matters, so I read what I can when I can. It is the least, the very least, I can do.





I’m a rocket man

14 12 2010

I try to be good, get off the computer for a few hours, and what happens? I miss an entire conversation on science.

Well, goddammit.

(Actually, given that a large portion of the thread was given over to name-calling and trollist cavils, I guess I didn’t miss that much. Still.)

So, science. I am for it.

I am an epistemological nihilist, it’s true, so this support is caveated with the usual cracks and abysses, but I’m also quite willing to hop right over those chasms to walk among the ruins that compose our human life—and one of our more spectacular ruins is science.

Yes, ‘our’. ‘Our’ because science truly is a human endeavor (even as its dogmatists assert that science can take us outside of ourselves), and as such, there to be claimed by all of us. And it is important to claim it, both against the dogmatists and against those who find nothing of worth in curiosity and rigor, or in experimentation, skepticism, and discovery.

I can only respond to those opposed to discovery with questions and fiction—as we do not inhabit the same world, argument is stillborn—but to the dogmatists and, it must be said, to those who favor curiosity and thus oppose science because they believe science poisons curiosity, I can offer history and reason and ruin.

To offer the whole of that argument is to offer a book; instead, here is the abstract:

We humans have sought to know, and in seeking, have sought to make sense of what we have found. How we make sense has varied—through recourse to myths, common sense, measurement, extrapolation, generalization, systematization, reflection, etc.—and what we make of the sense we make has varied as well. Sometimes we call it truth or religion or wickedness or allegory or interpretation; sometimes we call it science. Sometimes this science is the means, sometimes it is the end, sometimes it is both. In early modern times [in Europe], in the period now known as the Scientific Revolution, science was thought to reveal truths about God, as it also was by those scientists working under the Abbasids; that it also brought technological advance and political and economic gain helped to preserve it against those who argued that a thirst for knowledge was itself corrosive of the faith.

Yet even throughout much of the modern period science was understood, if no longer as an appendage of natural philosophy, as nonetheless a part of a constellation of knowledge which included the arts, literature, and humanities; its practitioners were all a part of the learned class.

This collegiality faded, and now science is understood primarily as comprising the natural sciences and their methods; to the extent some social sciences adopt those methods, they may or may not be admitted to the realm of science, albeit as its lesser, ‘softer’, version. That science has a history is barely acknowledged, and it is unclear whether scientists would consider themselves (or their learned critics would consider them) ‘intellectuals’ rather than (mere) technicians, experimentalists, and lab directors.

This separation (and, often, contempt) is lamentable all around. [Natural] science is more than its tools and methods, involves more [hermeneutic] interpretation than the experimentalists may admit, and requires greater curiosity than its skeptics may allow. But if we want to know, if we humans truly seek a human science (and, again, I would argue there is no other), then we have to prevent science from sliding all the way into scientism. Some think it has already so shriveled into technics that it is mere methodological fetishism; I disagree.

This saving gesture doesn’t require that artists now refer to themselves as scientists or that neurobiologists become novelists. No, this reclamation project (another ruin) would gather the curious back together, to see if we exiles from one another would have anything to say to one another, to see what we could see.

I don’t believe this every day—yesterday, for example, I had no patience for this.

But some days, some days I think we humans could do this. Some days, this is my something more.





Too goddamned irritated to blog

13 12 2010

You call your ‘movement’ No Labels, give yourself the motto “Not Left. Not Right. Forward.”, and yet at the top of your web page insist

We are Democrats, Republicans, and Independents who are united in the belief that we do not have to give up our labels, merely put them aside to do what’s best for America.

Kentucky fucking chicken, what’s the point of calling yourself No Label if what you really mean is Every Label (on the inside pocket)?

And it’s a stupid idea, anyway.

And then this, from a blog whose insistence and crankiness I like and respect: Removing science from anthropology.

What anthropologists do is up to them; that said, I generally think we social scientists should hang on to the word ‘science’ with all our grubby little might. ‘Science’ in its most general terms as a search for knowledge has a long and honorable history and, as I always like to point out, one of the earliest known seekers was Aristotle—who considered political science the highest of all sciences. So there.

What chaps me about this piece is not that non-anthropologists have opinions about this move, but that, after the requisite words of respect about the so-called softer sciences, Orac also has to toss in the requisite bullshit references to ‘post-modernism’ and ‘political correctness’.

Yeah, I get it, he sees invidious parallels between claims about ‘other ways of knowing’ and his white whale, complementary and alternative medicine.

This is an intriguing claim. Truly.

But again with the KFC: Do you need to haul out straw-ass versions of an interpretive method whose definition you draw from Sokal in order to light the whole goddamned discussion on fire?

Kentucky Jesus Fried Christ.





Q&A: Caputo

26 08 2010

how did you come to his works? —dmf

dmf—who clearly knows more about John Caputo’s works than I do—asked me the above question. Given that Caputo is not widely read by political scientists nor, almost certainly, by the general public, it’s the kind of particular query which opens up to the more general: how’dja find this [relatively unknown] cat?

For Caputo and me, the answer is twofold:

1. I read a long review of his works in the online version of Christianity Today; given the length of the essay, I think it was in the Books & Culture section. I was intrigued.

2. I worked in the philosophy section of the Astor Place Barnes & Noble and noticed we had a copy of Caputo and Gianni Vattimo’s After the Death of God. Employees were allowed to borrow hardcover books from their store, so I plucked this one out.

That’s the twinned short answer; here’s the bifurcated longer answer:

Early in my grad school career I became interested in the question of knowledge. It didn’t initially cohere into an inquiry into epistemology, but I did note that many of the questions I had about x, y, or z phenomena would lead me to questions about the approaches to x, y, or z phenomena, which led, ultimately, to questions about any approach to any phenomenon—in other words, not only how do we know what we know, but how do we determine something is a ‘what’ worthy (or at least capable) of being known, and what does it mean that something has been plucked out of the everything to become a ‘what’ in this particular way.

(These kinds of questions, it should be said, can go on for a very long time. You get the drift. . . .)

Epistemological issues were all the rage (really!) in some parts of the academy in the 1990s, which is when I did the bulk of my graduate work. Early on I was a dogmatic post-modernist and quite glib in my denunciations of Liberalism, the concept of the unitary individual, and the notion that we could ever truly know anything. Ah, the joys of the supercharged nihilist!

Then time did its thing, I mellowed, and while I didn’t surrender my skepticism, I no longer held it in such esteem. I don’t know that we can know, but we seem to make do, in the meantime. I toss a lot of knowledge into the category of the ‘provisional’ and go on from there.

There’s much more behind this, of course, but this is a reasonable gloss on where I am now.

So I’m much less dogmatic than I used to be, more curious, and more willing to retrieve from my own personal ash-heap notions that had seemed dead, naive, or hopelessly problematic. (Note: that something was ‘hopelessly problematic’ was reason both for my know-it-all (!) nihilist self to toss it and for my curious self to retrieve it.) One of those things I had tossed was hermeneutics.

My department was very strong in political theory, but most of the theorists were suspicious of the turn theory seemed to be making away from the history of thought and toward considerations of method. Still, there were courses on method, and in one of those courses we mucked around a bit in hermeneutics. This, however, was a hermeneutics of the Gadamer sort, that is, an explicitly backwards-looking interpretation of tradition and meaning.

I have my disagreements with Habermas, but I think he nails it with regard to this type of interpretation: it is the method of the museum.

So to have come across Caputo and Vattimo and their arguments about ‘weak theology’ and nihilism and radical hermeneutics, well, I was intrigued: This was not your father’s interpretive method.

Couple this with an ongoing interest in questions of existence and hop-skip-jump I am led down another rabbit hole.

The second element at play concerns curiosity and cowardice among the credentialed. You see, once you get a degree, you [are able to] assume a level of expertise about your particular field. This expertise requires you both to know the Big Names and Big Debates and to have more answers than questions; it also requires you to shun certain topics and authors as unworthy of Serious Consideration.

In short, you know whose name to drop and whose to dismiss.

Now, I had never heard of either Caputo or Vattimo when I was in grad school, and I have no reason to believe that either had any kind of reputation, good or bad, among political theorists. Still, they were (are?) outliers among my kind, which makes them risky: If others haven’t heard of them, how are you to talk about them? Perhaps there’s a good reason no one else has heard of them; perhaps there’s something wrong with you for thinking so highly of them. . . .

Please note that no one has ever actually said any of these things to me; no, the responsibility for carrying this particular set of neuroses lies with me. But having been acculturated into academia, and by remaining even tangentially involved (as an adjunct) in my field, I remain caught in those cross-currents of ‘credentiality’; perhaps as an adjunct I am even more vulnerable to questions about my legitimacy as a political theorist.

Yet I have also, because I am an adjunct who is not looking for a tenure-track position, had the space to turn around and look at what it is I am doing, and why, on the margins, in the academy. What is the purpose of my presence in the classroom?

And that is where Caputo and Vattimo have led me, in their forward-looking or radical hermeneutics: What is your purpose? What is the point? What is the meaning? What are the possibilities?

Answers are fine and necessary things, and in certain contexts require their own kind of courage. But the questions! Those can always get you into real trouble.





It’s getting better all the time

4 04 2010

I blame Rod Dreher.

No, he didn’t start it—well, maybe he did—but he certainly propelled my thinking back a thousand years or so.

Mr. Dreher, you see, is an American old-school conservative: He’s skeptical of modernity even as he admittedly eats of its fruits; skeptical of government (that’s the American part) even as he decries a culture which, in his view, corrodes human dignity; and a believer in community and roots even as he’s repeatedly moved his family around the country.

I say this not to damn him, not least because he is honest about his contradictions, but to locate, if not the source, then at least a source, of my current trajectory.

You see, I became interested in one of his contradictions, and took off from there.

Dreher has written (not terribly thoughtfully, for the most part) on Islam and the violence currently associated with it. He then contrasts this to contemporary Christianity, and to the relative lack of similar violence. There are all kinds of commentary one could offer on his views and contrasts, but what squiggled into my brain was his unquestioning acceptance of a main tenet of modernity—why would this professed anti-modern base his critique on a pillar of modern thought?

Time: The notion that there is a forward and a backward, and that forward is better than back.

This notion of the forward movement of time, the accretion of knowledge, the betterment of the status of the world, has explicitly informed progressive thought within modernity, but it runs underneath almost all of modern Anglo-American and European thought.

(Disclaimer: I’m not talking about the whole world in my discussion of modernity, or of all forms of modernity—there are forms of modern art and architecture, for example, which are distinct from those of political theory—but of the set of ideas which emerged out of Europe and which greatly informed European philosophy and political institutions. These ideas have of course also found a home across the globe (not least in the United States), but in attempting to trace the ideas back to their source, I’m confining myself to the United Kingdom and the continent. Finally, I make no claim that these ideas in and of themselves are unique to Europe, but that their particular shape and constellation is historically specific. That is all.)

Okay. So, what got to me about Dreher’s contentions regarding Islam was the claim that Christianity today is ‘better’ in some objective (or at least, intersubjective) way than Islam, that is, that even those who are not Christian would see that Christianity is better for the world than Islam.

I’m neither Christian nor Muslim, so theoretically I could simply dismiss such claims about the relative merits of these religions as a kind of fan jockeying of a sport I don’t follow—except that, contrary to Franklin Foer, religion has been a far greater force in the world than soccer.

In any case, even if it is the case that currently there is less violence associated with Christianity than with Islam, it wasn’t always so: The history of Christian Europe was until very recently a history of warring Europe.

I’ll leave that for another day. What is key is the general formula: that at time t, x was strongly associated with y, and that if at time t+1 x is no longer strongly associated with y, that is not to say that x will never again be associated with y.
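Or, to render the same formula in bare notation (just a gloss on the sentence above, nothing fancier): write A(x, y, t) for “x is strongly associated with y at time t”. Then

\[
A(x,y,t)\ \wedge\ \neg A(x,y,t{+}1)\ \not\Rightarrow\ \forall s > t{+}1:\ \neg A(x,y,s).
\]

The association’s absence now rules nothing out about any later time s.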

To put it more colloquially, just because it ain’t now doesn’t mean it won’t ever be. That Christianity is no longer warring doesn’t mean it won’t ever war.

To believe otherwise is to believe that the past, being the past, has been overcome, never to return; the future is all—a thoroughly modern notion.

Again, as I’m not a fan of either team, I’m not about to engage in Christian-Muslim chest-bumping. More to the point, shit’s too complex for that.

Besides, that’s not what I’m interested in. In thinking about time, I got to thinking about what else characterizes modernity, and thus what might be post-modern, and oh, are we really post-modern? no I don’t think so even though I once took it for granted (which goes to show the risks of taking things for granted) and maybe where we are is at the edges of modernity and who knows if there’s more modernity beyond this or whether these are the fraying edges and hm how would one know maybe it would make sense to look at that last transition into modernity and what came before that?  the Renaissance but was that the beginning of modernity or the end of what came before that? hmm oh yeah the medieval period and Aquinas and . . .  uh. . .  shit: I don’t know anything about the medieval period.

So that’s why I’m mucking about the past, trying to make sense of those currents within the old regime which led, eventually (although certainly not ineluctably) to the new.

It’s a tricky business, not least because I’m looking at the old through the lens of the new; even talking about ‘looking back’ is a modern sensibility.

So be it: Here is where I stand; I can do no other.

Well, okay, I can crouch, and turn around, and try not to take my stance for granted or to think that my peering into the past will in fact bring me into the past.

But I can still look.

~~~

My starter reading list, on either side and in the midst of.

  • A Splendid Exchange, William J. Bernstein
  • God’s Crucible, David Levering Lewis
  • Eunuchs for the Kingdom of Heaven, Uta Ranke-Heinemann
  • Aristotle’s Children, Richard E. Rubenstein
  • A World Lit Only By Fire, William Manchester
  • Sea of Faith, Stephen O’Shea
  • The Science of Liberty, Timothy Ferris
  • Betraying Spinoza, Rebecca Newberger Goldstein
  • The Scientific Revolution, Steven Shapin
  • Leviathan and the Air-Pump, Steven Shapin and Simon Schaffer
  • Coming of Age in the Milky Way, Timothy Ferris

Suggestions welcome.