Boom boom boom

18 09 2014

I am a mine-layer.

Some days—most days?—the most I can accomplish is to fling out enough mines that at least some will burrow into rather than merely roll off of my students.

I’m a pretty good teacher—I’d put myself in the B, B+ range—but I think even the best teachers fail to impart whole systems of thought or history or formulae to their students. One might be able to lay out the sets and subsets, the permutations and exemplars and exceptions, in as straightforward a manner as possible, noting what syncs up with which and where it all falls apart, but beyond the assignment or the test or the essay, the knowledge dissipates.

This isn’t their fault—the students’, I mean—nor is it the teacher’s. Most of the material covered in a college course can only be fully taken in through repetition, and for many students in many classes, it’s one-and-done: the ticking off of requirements on their way to a degree. What they remember may be courses in their major, and that’s because they run into the same concepts and theories and studies over and over again.

If students are able to see the connections amongst ideas laid out in a 300- or 400-level course in their field, it likely has less to do with that particular professor than with the accumulation of bits from the 100- and 200-level courses.

So what to do when teaching a 100- or 200-level class, or even an advanced class supported by no major?

Lay mines. Try to expose the students to concepts they are likely to encounter again, so that the next time they run across “Aristotle” or “Arendt” or “deontological ethics”, that little bomb will go off and they’ll say to themselves, Hey, I recognize this! and maybe not feel so estranged from what had seemed strange.

So many metaphors could be used here: taking a student down a path and pointing out enough landmarks so that when they traipse down it again, they’ll say Hey! . . . , and feel more confident in their surroundings, more willing to push further on. Tossing out enough seeds in the hopes that a few take root, sprout. Or maybe repeated vaccinations, priming the immune system to respond when next encountering the invasive idea (tho’ there are clear limits to this last analogy insofar as the knowledge isn’t to be annihilated).

Maybe it’s different for professors at elite schools, with students who’ve already been exposed to and are comfortable with these ideas. Or maybe even at my CUNY school I’d find less mine-laying if I were to teach more advanced-level courses in my field.

But maybe not, or, at least, not the way I teach. Yes, I want them to perform well on tests and papers, but more than that, much more than that, I am greedy enough of their attention that I want them to remember this stuff for the rest of their lives.

I’d rather they get a B and be bothered for decades than get an A and let it all go.

So this might explain why I’m partial to the mine idea: because it allows for the possibility of little bits of insight to explode whenever the student strays over forgotten territory. And if those mines are powerful enough and buried deep enough, there’s a chance those explosions might rearrange her intellectual landscape, might change how she looks at the world.

And yeah, I like the idea of blowing their minds, too.





Blown backwards into the future

14 05 2014

Benjamin conjured history as an angel.

Let’s sit with that for a bit, as it’s a lovely sad conjuring.

There is no repair, not for the angel, not for us. Sad, perhaps, but not unbearably so.

There is also no going back, as that angel learned. If the past is an ocean, then history is diving in and bringing the bits and debris and life to the surface, to the present, to see what we’ve got. We can bring what’s down below to the surface and we can make sense of it, but it is our sense, a present sense. And the things themselves, especially the lives themselves, are changed for having been dragged from the deep.

Diving, digging, spelunking: all this bringing to the surface the bits and debris in an attempt to recreate life. History as simulacrum.

And the epochs and eras and moments? Those are the bits highlighted or strung together: the Renaissance or Scientific Revolution or Modernity or the Enlightenment. It gives us a way to see.

Usually, when I speak of seeing, I speak metaphorically. But I wanted literally to see where these different moments were in relation to one another, so I ran parallel timelines of European history—scientific, cultural, religious, political, trade—down sheets of paper taped in my hallway, then plotted out those moments.

[image: draft timeline taped in the hallway]

This is an incomplete draft—I clearly need to allow more room on the final version—but it’s not hard to see how this moment was understood as the Italian Renaissance at its ripest.

Or here, as what we now call the Scientific Revolution gets underway:

[image: timeline detail]

These give me that bird’s eye view of the middle centuries of the last millennium; they also make me wonder what isn’t there, isn’t recorded in any of the texts I’m using.

What moments are still underground? And what stories will we tell if we ever unearth them?

 





And I know things now

7 05 2014

Modernity is dead—er, is in a coma.

Okay, not modernity—modernity is still kickin’—but my medieval/modern project to suss out the beginnings of modernity, yeah, that’s on life support. I’ll probably never pull the plug, but the chances of recovery at this point are slim.

The main problem was that I never had a thesis. As a former post-modernist I was interested in the pre-mod: learning about the last great (Euro) transition might help me to make sense of what may or may not be another transitional moment.

And I learned a lot! I knew pitifully little about European history—couldn’t have told you the difference between the Renaissance and the Enlightenment, that’s how bad I was—and now I know something more. I’d now be comfortable positioning the Renaissance as the final flowering of the medieval era, arguing that the 16th and 17th centuries were the double-hinge between the medieval and the modern, that the Enlightenment was about the new moderns getting chesty, that Nietzsche crowbarred open the crack first noticed by the sophists, and that the medieval era in Europe did not truly end until the end of World War I.

None of these is a particularly novel observation. I make no pretense of expertise, nor even of much beyond a rudimentary working knowledge: there are still large gaps in my knowledge and large books to be read. And I will continue reading for a very long time.

But I don’t have a point to that reading beyond the knowledge itself. It’s possible that something at some point will present itself as a specific route to be followed, but right now, the past is an ocean, not a river.

That’s all right. I’m a fan of useless knowledge and wandering thoughts.





She blinded me with science

17 02 2014

When to let go and when to hang on?

This is one of the ways I’ve come to interpret various situations in life, big and small. I don’t know that there is ever a correct decision (tho’ I’ll probably make the wrong one), but one chooses, nonetheless.

Which is to say: I choose to hang on to the “science” in political science.

I didn’t always feel this way, and years ago used to emphasize that I was a political theorist, not a political scientist. This was partly due to honesty—I am trained in political theory—and partly to snobbery: I thought political theorists were somehow better than political scientists, what with their grubbing after data and trying to hide their “brute empiricism” behind incomprehensible statistical models.

Physics envy, I sniffed.

After a while the sniffiness faded, and as I drifted into bioethics, the intradisciplinary disputes faded as well. And as I drifted away from academia, it didn’t much matter anymore.

So why does it matter now?

Dmf dropped this comment after a recent post—

well “science” without repeatable results, falsifiability, and some ability to predict is what, social? lot’s of other good way to experiment/interact with the world other than science…

—and my first reaction was NO!

As I’ve previously mentioned, I don’t trust my first reactions precisely because they are so reactive, but in this case, with second thought, I’ma stick with it.

What dmf offers is the basic Popperian understanding of science, rooted in falsifiability and prediction, and requiring some sort of nomological deductivism. It is widespread in physics, and hewed to more or less in the other natural and biological sciences.

It’s a great model, powerful for understanding the regularities of non-quantum physics and, properly adjusted, for the biosciences, as well.

But do you see the problem?

What dmf describes is a method, one of a set of interpretations within the overall practice of science. It is not science itself.

There is a bit of risk in stating this, insofar as young-earth creationists, intelligent designers, and sundry other woo-sters like to claim the mantle of science as well. If I loose science from its most powerful method, aren’t I setting it up to be overrun by cranks and supernaturalists?

No.

The key to dealing with them is to point out what they’re doing is bad science, which deserves neither respect in general nor class-time in particular. Let them aspire to be scientists; until they actually produce a knowledge which is recognizable as such by those in the field, let them be called failures.

Doing so allows one to get past the no-true-Scotsman problem (as, say, with the Utah chemists who insisted they produced cold fusion in a test tube: not not-scientists, but bad scientists), as well as to recognize that there is a history to science, and that what was good science in one time and place is not good in another.

That might create too much wriggle room for those who hold to Platonic notions of science, and, again, to those who worry that this could be used to argue for an “alternative” physics or chemistry or whatever. But arguing that x science is a practice with a history allows the practitioners of that science to state that those alternatives are bunk.

But back to me (always back to me. . . ).

I hold to the old notion of science as a particular kind of search for knowledge, and as knowledge itself. Because of that, I’m not willing to give up “science” to the natural scientists because those of us in the social sciences are also engaged in a particular kind of search for knowledge. That it is not the same kind of search for the same kind of knowledge does not make it not-knowledge, or not-science.

I can’t remember if it was Peter Winch or Roger Trigg who pointed out that the key to good science was to match the method to the subject: what works best in physics won’t necessarily work best in politics. The problem we in the social sciences have had is that our methods are neither as unified nor as powerful as those in the natural sciences, and that, yes, physics envy has meant that we’ve tried to import methods and ends which can be unsuitable for learning about our subjects.

So, yes, dmf, there are more ways of interacting with the world than with science. But there are also more ways of practicing science itself.

We just have to figure that out.





Of flesh and blood I’m made

16 01 2014

What is human?

I got into it with commenter The Wet One at TNC’s joint, who chided me not to, in effect, complicate straightforward matters. I responded that straightforward matters often are quite complicated.

In any case, he issued a specific challenge to claims I made regarding the variability of the human across time and space. This request was in response to this statement:

At one level, there is the matter of what counts as “reasonably concrete realities”; I think this varies across time and place.

Related to this is my disagreement with the contention that those outside of the norm have fallen “within the realm of the ‘human’ for all intents and purposes”. They most assuredly have not, and to the extent they do today is due to explicit efforts to change our understanding of the human.

Examples, he asked?

As one of the mods was getting ready to close the thread, I could only offer up the easiest one: questions over the status of embryos and fetuses.

Still, while I think that a reasonable response, it is also incomplete, insofar as it doesn’t get at what and who I was thinking of in writing that comment: people with disabilities.

“People with disabilities”: even that phrase isn’t enough, because “disability” itself isn’t necessarily the apt word. I had referred in an earlier comment to those whose morphology varied from the statistical norm; not all variations are disabilities in even the strictest sense.

In any case, when I went to my bookshelf to try to pull out specific, referenced, examples, I was stopped by that basic question which set off the whole debate: what is human?

Now, in asking that here I mean: how maximal an understanding of the human? Is to be human to be accorded a certain status and protection (“human rights”)? or is it more minimal, in the sense that one sees the other as kin of some sort, tho’ not necessarily of an equal sort?

Arendt argued for a minimalist sense when she noted there was nothing sacred in the “naked” [of the protections of the law] human, meaning that such status granted no particular privilege. That I both do and do not agree with this is the source of my estoppel.

Kuper in Genocide notes that dehumanization often precedes assault—which suggests that before the one goes after the other, a kinship is recognized which must then be erased. But maybe not. I don’t know.

Is the human in the recognition? If you are akin to us (and we know that we are human), then we will grant such status (for whatever it’s worth) to you. We might still make distinctions amongst us as to who is superior/inferior, but still grant that an inferior human is still human. There’s something to that—something which I perhaps should have emphasized a bit more than I did in my initial go-’round with TWO.

But I also think there are cases in which the kinship might repulse rather than draw in: that disgust or horror (or some kind of uncanny valley) gets in the way of seeing the disgusting/horrid/uncanny one as human. I’m thinking of the work of William Ian Miller and Martha Nussbaum, on disgust, and, perhaps, of various histories of medicine, especially regarding the mentally ill. Perhaps I should dig out that old paper on lobotomy. . . .

Oh, and yet another wrinkle: Insofar as I consider the meaning of the human to vary, I don’t know that one can elide differences between the words used to refer to said humans. “Savage” means one thing, “human” another, and the relationship between the two, well, contestable.

I’m rambling, and still without specific, referenced examples for TWO. I can go the easy route, show the 19th century charts comparing Africans to the great apes, the discussion of so-called “primitive peoples” (with the unveiled implication that such peoples weren’t, perhaps, human people). Could I mention that “orangutan” means “person of the forest”, or is that too glib? Too glib, I think. Not glib is the recent decision to limit greatly the use of chimpanzees in federally-funded research—the extension of protections to our kin, because a kinship is recognized.

And back around again. I don’t know that one can meaningfully separate the identity of a being from the treatment of the identified being; identification and treatment somersault over and over one another.

So if protections are offered to one member of H. sapiens and withdrawn from another, then it seems to say something about the status of that other: that we don’t recognize you as being one of us. We don’t recognize you as human.

If things can be done to someone with schizophrenia (old term: dementia praecox) or psychosis—various sorts of water or electric shocks, say—that would not be done to someone without these afflictions, then one might wonder whether the schizophrenic or psychotic is, in fact, recognized as human, that as long as the affliction is seen to define the being, then that being is not-quite-human.

Ah, so yet another turn. I allowed for the possibility of superior/inferior humans [which might render moot my examples from eugenics and racism]; what of lesser or more human? Is someone who is less human still human? What does that even mean?

Back to biology. Those born with what we now recognize as chromosomal abnormalities have not always been, and are not always, taken in, recognized as being “one of us”. A child with cri-du-chat syndrome does not act like a child without; what are the chances such children have always been recognized as human?

Oh, and I’m not even getting into religion and folklore and demons and fairies and whatnot. Is this not already too long?

I can’t re-read this for sense; no, this has all already flown apart.





We might as well try: Here comes the future and you can’t run from it

24 07 2012

It is terrible not to know all that I want to know, a terribleness only counterbalanced by the pleasure of soaking up what others know.

This is as good a precis for this series as any:

If men have always been concerned with only one task—how to create a society fit to live in—the forces which inspired our distant ancestors are also present in us. Nothing is settled; everything can still be altered. What was done but turned out wrong, can be done again. The Golden Age, which blind superstition had placed behind [or ahead of] us, is in us.

—Claude Levi-Strauss, from Tristes Tropiques

Yes, I know Levi-Strauss, but no, I haven’t read him, don’t know if I’ll ever make the time to read him.

But this bit, this bit was worth the time.

h/t John Nichols’s obit for Alexander Cockburn, The Nation





Onward, Christian soldiers

27 06 2012

Done with Calvin and on to the Thirty Years War.

Yes, the project on modernity rumbles on, as I dart back and forth between the 16th and 20th centuries (with occasional forays into the 15th and 14th centuries), jumbling up the wars of religion and emperors and kings and popes and princes and reformers and Reformers and . . . everything everything everything.

May I pause just to note what pleasure, what pure pleasure it gives me to see shapes and movement arise from what had once been a white, blank field of the past?

Consider this line from C. V. Wedgwood: “Pursuing the shadow of a universal power the German rulers forfeited the chance of a national one.”

Ta-Nehisi Coates has remarked on the beauty of Wedgwood’s prose—and yes, she has a way with words—but her facility with the language reveals a nimbleness of thought, and this one, elegantly expressed, conveys the tragic risk of greatness: Go big and you lose the small, and in losing the small, you lose it all.

Only “Pursuing the shadow of a universal power the German rulers forfeited the chance of a national one” in its specificity is far more breathtaking and heartbreaking than my pallid generalization.

And it is the specificity itself which provides that pleasure: there was nothing, and now there is something.

Now, before I repeat that last line to end the post, I do want to interject with one observation about Calvin’s Reformed thought, specifically, his doctrine of double predestination (God elects both who goes to heaven and who goes to hell): why would anyone believe this?

Calvin argued that only a few of the professing Christians would be saved and most lost, that there was absolutely nothing the individual (an utterly depraved being) could do to save herself—so why would anyone cleave to a belief system which gave you rotten odds and no way to change them?

One possibility is that most Reformers didn’t believe in predestination, double or otherwise; another is that Reformers did believe in double predestination, but also believed that they were the elect. So, yeah, sucks to be you, o depraved man, but I am so filled with the spirit that there is no way God hasn’t picked me for His team.

There is no rational reason* to believe this; since people believed nonetheless, it is clear that something other than reason is required to explain the spread of the Reformed faith.

(*Reason in terms of: why pick this religion over that one, not: why pick any religion at all. Context, people, context.)

Anyway, Calvin was much more impressed with himself than I was with him—although it must be noted he had a few more followers than the 19 who follow me (in this blog, anyway).

Oh, man, it’s getting late and I’m getting frantic for sleep so yes, let’s return to pleasure and knowledge and movement where before there was stillness and lines where before there was blankness and etchings across the smooth surface and something, something rather than nothing.







