She blinded me with science

17 02 2014

When to let go and when to hang on?

This conundrum is one of the ways I’ve come to interpret various situations in life, big and small. I don’t know that there is ever a correct decision (tho’ I’ll probably make the wrong one), but one chooses, nonetheless.

Which is to say: I choose to hang on to the “science” in political science.

I didn’t always feel this way, and years ago used to emphasize that I was a political theorist, not a political scientist. This was partly due to honesty—I am trained in political theory—and partly to snobbery: I thought political theorists were somehow better than political scientists, what with their grubbing after data and trying to hide their “brute empiricism” behind incomprehensible statistical models.

Physics envy, I sniffed.

After a while the sniffiness faded, and as I drifted into bioethics, the intradisciplinary disputes faded as well. And as I drifted away from academia, it didn’t much matter anymore.

So why does it matter now?

Dmf dropped this comment after a recent post—

well “science” without repeatable results, falsifiability, and some ability to predict is what, social? lot’s of other good way to experiment/interact with the world other than science…

—and my first reaction was NO!

As I’ve previously mentioned, I don’t trust my first reactions precisely because they are so reactive, but in this case, with second thought, I’ma stick with it.

What dmf offers is the basic Popperian understanding of science, rooted in falsifiability and prediction, and requiring some sort of nomological deductivism. It is widespread in physics, and hewed to more or less in the other natural and biological sciences.

It’s a great model, powerful for understanding the regularities of non-quantum physics and, properly adjusted, for the biosciences, as well.

But do you see the problem?

What dmf describes is a method, one of a set of interpretations within the overall practice of science. It is not science itself.

There is a bit of risk in stating this, insofar as young-earth creationists, intelligent designers, and sundry other woo-sters like to claim the mantle of science as well. If I loose science from its most powerful method, aren’t I setting it up to be overrun by cranks and supernaturalists?

No.

The key to dealing with them is to point out what they’re doing is bad science, which deserves neither respect in general nor class-time in particular. Let them aspire to be scientists; until they actually produce a knowledge which is recognizable as such by those in the field, let them be called failures.

Doing so allows one to get past the no-true-Scotsman problem (as, say, with the Utah chemists who insisted they produced cold fusion in a test tube: not not-scientists, but bad scientists), as well as to recognize that there is a history to science, and that what was good science in one time and place is not good in another.

That might create too much wriggle room for those who hold to Platonic notions of science, and, again, for those who worry that this could be used to argue for an “alternative” physics or chemistry or whatever. But arguing that x science is a practice with a history allows the practitioners of that science to state that those alternatives are bunk.

But back to me (always back to me. . . ).

I hold to the old notion of science as a particular kind of search for knowledge, and as knowledge itself. Because of that, I’m not willing to give up “science” to the natural scientists because those of us in the social sciences are also engaged in a particular kind of search for knowledge. That it is not the same kind of search for the same kind of knowledge does not make it not-knowledge, or not-science.

I can’t remember if it was Peter Winch or Roger Trigg who pointed out that the key to good science was to match the method to the subject: what works best in physics won’t necessarily work best in politics. The problem we in the social sciences have had is that our methods are neither as unified nor as powerful as those in the natural sciences, and that, yes, physics envy has meant that we’ve tried to import methods and ends which can be unsuitable for learning about our subjects.

So, yes, dmf, there are more ways of interacting with the world than with science. But there are also more ways of practicing science itself.

We just have to figure that out.





Angels in the architecture

16 07 2013

This is not a “why I am not a creationist” piece. Oh no. Even though I’m not.

This is a hit on a “why I am a creationist” piece.

Virginia Heffernan, who can be an engaging writer, has apparently decided to disengage from thinking. In a widely commented-upon piece for Yahoo, the tech and culture writer outed herself as a creationist. It is a spectacularly bad piece of . . . well, I guess it’s a species of argumentation, but as she kind of flits and floats from the pretty to the happy and fleetly flees from sweet reason, it might be best to consider this a kind of (bad) performance art.

My brief against her is less about the God-ish conclusion than that flitting and floating: she rejects science because it’s boring and sad and aren’t stories about God sooooo much better?

You think I’m exaggerating? I am not. To wit:

I assume that other people love science and technology, since the fields are often lumped together, but I rarely meet people like that. Technology people are trippy; our minds are blown by the romance of telecom. At the same time, the people I know who consider themselves scientists by nature seem to be super-skeptical types who can be counted on to denigrate religion, fear climate change and think most people—most Americans—are dopey sheep who believe in angels and know nothing about all the gross carbon they trail, like “Pig-Pen.”

I like most people. I don’t fear environmental apocalypse. And I don’t hate religion. Those scientists no doubt see me as a dopey sheep who believes in angels and is carbon-ignorant. I have to say that they may be right.

Uh-huh.

Later she mentions that she’s just not moved by the Big Bang or evolution, and that because evo-psych is sketchy science (which it is), all of science must be sketchy (which it is not).

And then this stirring conclusion:

All the while, the first books of the Bible are still hanging around. I guess I don’t “believe” that the world was created in a few days, but what do I know? Seems as plausible (to me) as theoretical astrophysics, and it’s certainly a livelier tale. As “Life of Pi” author Yann Martel once put it, summarizing his page-turner novel: “1) Life is a story. 2) You can choose your story. 3) A story with God is the better story.”

(Would it be fair to mention at this point that I hated Life of Pi? Too beside-the-point?)

To summarize, she likes technology—because it’s trippy—but she doesn’t like knowing how and why technology actually works, i.e., the science.

This would be fine—after all, there are all kinds of things I like without necessarily being interested in how and why they came to be—were it not for the fact that she’s a technology writer.

Perhaps she’s a closet Juggalo, or maybe she thought Bill O’Reilly waxed profound on the movement of tides, or maybe she just ate a shitload of shrooms and floated down to her keyboard, but I’d be very—excuse me, super-skeptical of the views of a tech writer who apparently thinks angels make iPhones.

~~~

I have to admit, I was more amused by her piece than anything, and her Twitter exchange with Carl Zimmer left me gasping; to the extent I can make out any kind of coherent line at all, it seems to be “I like stories more than theories—so there!”

As someone who likes both stories and theories—yes, Virginia, we can have both—I hate her feeding into the Two Cultures divide, not least because dopey angel-mongering tends to diminish the humanities even further.

I am a science enthusiast, but I am also a critic of some of the more imperial epistemological claims made by some scientists (what often gets branded as “scientism”). To note that the methods of science (methodological naturalism, nomological-deductivism—take yer pick) and the knowledge produced from those methods are bounded is often taken as an attack on science itself.

And, to be fair, sometimes—as in the Storified Twitter spat, when Heffernan (big fat honking sigh) pulls Foucault out her nose to fling at Zimmer—it is.

But it ain’t necessarily so. It is simply the observation that science is one kind of practice, that it hasn’t escaped the conditionality and history of practice into some kind of absolute beyond.

Now, there’s a lot more behind that observation than I’m willing to go into at this late hour, so allow me to skip ahead to my ire at Heffernan: her dipshit argument makes it harder for those of us who’d prefer our critiques both dip- and shit-free.

So, thanks Virginia, thanks for stuffing your face with shrooms or replacing your neurons with helium or whatever the hell it was that led you to declare the moon is made of cheese.

But next time, if there is a next time, Just Say No.





Here’s a man who lives a life

23 01 2013

I’m a big fan of science, and an increasingly big fan of science fiction.

I do, however, prefer that, on a practical level, we note the difference between the two.

There’s a lot to be said for speculation—one of the roots of political science is an extended speculation on the construction of a just society—but while I am not opposed to speculation informing practice, the substitution of what-if thinking for practical thought (phronēsis) in politics results in farce, disaster, or farcical disaster.

So too in science.

Wondering about a clean and inexhaustible source of energy can lead to experiments which point the way to cleaner and longer-lasting energy sources; it can also lead to non-replicable claims about desktop cold fusion. The difference between the two is the work.

You have to do the work, work which includes observation, experimentation, and rigorous theorizing. You don’t have to know everything at the outset—that’s one of the uses of experimentation—but to go from brain-storm to science you have to test your ideas.

This is all a very roundabout way of saying that cloning to make Neandertals is a bad idea.

Biologist George Church thinks synthesizing a Neandertal would be a good idea, mainly because it would diversify the “monoculture” of Homo sapiens.

My first response is: this is just dumb. The genome of H. sapiens is syncretic, containing DNA from, yes, Neandertals, Denisovans, and possibly other archaic species, as well as microbial species. Given all of the varieties of life on this planet, I guess you could make the case for a lack of variety among humans, but calling us a “monoculture” seems rather to stretch the meaning of the term.

My second response is: this is just dumb. Church assumes a greater efficiency for cloning complex species than currently exists. Yes, cows and dogs and cats and frogs have all been cloned, but over 90 percent of all cloning attempts fail. Human pregnancy is notably inefficient—only 20-40% of all fertilized eggs result in a live birth—so it is tough to see why one would trumpet a lab process which is even more scattershot than what happens in nature.
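For what it’s worth, here is a minimal back-of-the-envelope sketch of what “even more scattershot” means in practice, using only the figures cited above and treating attempts as independent (a simplification on my part):

```python
# Per-attempt success rates from the paragraph above: cloning succeeds less
# than 10% of the time, while 20-40% of fertilized eggs end in a live birth.
# Assuming independent attempts, the expected number of attempts per live
# birth is 1 / success_rate.

def expected_attempts(success_rate: float) -> float:
    """Mean number of attempts needed for one live birth."""
    return 1.0 / success_rate

for label, rate in [
    ("cloning (optimistic, 10% success)", 0.10),
    ("natural conception (low end, 20%)", 0.20),
    ("natural conception (high end, 40%)", 0.40),
]:
    print(f"{label}: ~{expected_attempts(rate):.1f} attempts per live birth")
```

Roughly ten attempts per clone against two to five for natural conception, on the post’s own numbers.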

Furthermore, those clones which are successfully produced nonetheless tend to be less healthy than the results of sexual reproduction.

Finally, all cloned animals require a surrogate mother in which to gestate. Given the low success rates of clones birthed by members of their own species, what are the chances that an H. sapiens woman would be able to bring a Neandertal clone to term—and without harming herself in the process?

I’m not against cloning, for the record. The replication of DNA segments and microbial life forms is a standard part of lab practice, and replicated tissues and organs could conceivably have a role in regenerative medicine.

But—and this is my third response—advocating human and near-human cloning is at this point scientifically irresponsible. The furthest cloning has advanced in primates is the cloning of monkey embryos; that is, there has been no successful reproductive cloning of a primate.

To repeat: there has been no successful reproductive cloning of our closest genetic relatives. And Church thinks we could clone a Neandertal, easy-peasy?

No.

There are all kinds of ethical questions about cloning, of course, but in the form of bio-ethics I practice, one undergirded by the necessity of phronēsis, the first question I ask is: Is this already happening? Is this close to happening?

If the answer is No, then I turn my attention to those practices for which the answer is Yes.

Cloning is in-between: It is already happening in some species, but the process is so fraught that the inefficiencies themselves should warn scientists off of any attempts on humans. Still, as an in-between practice, it is worth considering the ethics of human cloning.

But Neandertal cloning? Not even close.

None of this means that Church can’t speculate away on the possibilities. He just shouldn’t kid himself that he’s engaging in science rather than science fiction.

(h/t: Tyler Cowen)





All things weird and wonderful, 28

11 01 2013

Galaxy’s centre tastes of raspberries and smells of rum, say astronomers

How is this not among the best news in, um, the galaxy?

Astronomers searching for the building blocks of life in a giant dust cloud at the heart of the Milky Way have concluded that it would taste vaguely of raspberries.

Ian Sample of the Guardian reports that after years of pointing their telescope into the nether regions of the ‘verse,

astronomers sifted through thousands of signals from Sagittarius B2, a vast dust cloud at the centre of our galaxy. While they failed to find evidence for amino acids, they did find a substance called ethyl formate, the chemical responsible for the flavour of raspberries.

“It does happen to give raspberries their flavour, but there are many other molecules that are needed to make space raspberries,” Arnaud Belloche, an astronomer at the Max Planck Institute for Radio Astronomy in Bonn, told the Guardian.

Curiously, ethyl formate has another distinguishing characteristic: it also smells of rum.

I’m a gin and whisky woman, myself, but still. . . .

(Whoops, forgot to note: h/t Charlie Pierce)





Modern thought(less): time isn’t holding us, time isn’t after us

10 10 2012

Been awhile, hasn’t it?

No, I haven’t given up on my attempt to make sense of the outer reaches of modernity by looking at the [European] origins of modernity, but I haven’t made much headway, either.

Oh, I been readin’, oh yeah, but have I done anything with all that reading? Not really. Beyond the most basic fact that modernity and secularism two-stepped across the centuries, as well as the sense that medievalism lasted into the 20th century, I have information, history, ideas—but no theory.

Peter Gay’s two-volume essay on the Enlightenment (called, handily enough, The Enlightenment) has been helpful in understanding how the ideas of the early modern period were cemented in intellectual thought, but precisely because these men were already modern, they are of less help in understanding those who became modern, or who were medieval-moderns.

Newton, for example, was a kind of medieval-modern. His work in physics, optics, and calculus encompasses a large portion of the foundation of modern science, but he also conducted experiments in alchemy; the founding of a new kind of knowledge had not yet erased the old.

Other, scattered thoughts: The Crusades were crucial in re-introducing the ideas of the ancient Greeks into Europe. . . although, even here, al-Andalus also provided an entrée into the Christian worldview for Muslim knowledge of, and elaboration on, Levantine thought. Also, I haven’t read much on the impact of westward exploration and colonization on European thought. Hm.

Evolution in war strategy and armaments—I’m thinking here of the recruitment and consolidation of armies—undoubtedly played a role, as did consequences of those wars, especially the Thirty Years War. (The Treaty of Westphalia is commonly considered an origin in the development of the concept of state sovereignty. Which reminds me: Foucault.)

What else. I haven’t read much in terms of everyday life during this period, although I do have Braudel and Natalie Zemon Davis on my reading lists. I’m still not sure where to put the on-the-ground stuff, interested as I am in intellectual history. Still, a concentration on thoughts untethered from practice yields shallow history.

I have developed an abiding hatred for the Spanish Empire. This may be unfair to the Spaniards, but they turn up again and again as the bad guys. (How’s that for subtle interpretation?) I’ve got a big-ass book on the history of the Dutch Republic that I’m looking forward to, not least because of the central role of the Dutch in the development of capitalism.

Capitalism, yeah, haven’t talked much about that, either. Can’t talk about modernity without talkin’ bout capitalism.

Still, I really want to try to nail down the emergence of the individual as a political subject: there is no modernity without this emergence. The Reformation and the wars of religion are crucial, of course, but I want to understand precisely how the individual’s relationship to God was connected to the emergence of the concept of the individual citizen’s relationship to the state. (I say concept because it took a while for the walk to catch up to the talk.)

I have no plans to abandon this project, but if I can’t get it together, I may have to abandon my hopes for this project.

Maybe I should do that sooner rather than later: I’m always better after I’ve left hope behind.





It’s all too much

3 08 2012

The point is that evidence can be unreliable, and therefore you should use as little of it as possible. . . . I mean, people don’t realize that not only can data be wrong in science, it can be misleading. There isn’t such a thing as a hard fact when you’re trying to discover something. It’s only afterwards that the facts become hard.*

~Francis Crick

It’s no surprise that Crick is a theorist, is it?

I quite like this quote, and (I think) used it in my dissertation, but it also makes me nervous.

First, why I like it: It puts theory first, forces you to think of the evidence in terms of a theory in which it makes sense. If you let the evidence go first, you may end up hiking into a dead end, both because you’re misinterpreting the evidence as evidence (i.e., taking as fact something which is not, yet) and because you miss other bits, since you don’t have a way of seeing those bits as something which matters.

But this is where the unease kicks in: Theory can mislead, as well. Thomas Kuhn noted this in The Structure of Scientific Revolutions and his arguments on paradigm shift, although Max Planck had the pithiest observation on this phenomenon: “Science progresses one funeral at a time.”

So, theory leads, and theory misleads.

Richard Rhodes, in his magisterial The Making of the Atomic Bomb, ticks off any number of discoveries which were missed by those with the most data because they weren’t able to see the data correctly. The most well-known story is that of Frederick Smith, who didn’t discover X rays:

. . . not so unlucky in legend as the Oxford physicist Frederick Smith, who found that photographic plates kept near a cathode-ray tube were liable to be fogged and merely told his assistant to move them to another place. . . . Röntgen isolated the effect by covering his cathode-ray tube with black paper. When a nearby screen of fluorescent material still glowed he realized that whatever was causing the screen to glow was passing through the paper and the intervening air. If he held his hand between the covered tube and the screen, his hand slightly reduced the glow on the screen but in the dark shadow he could see its bones.

So is this a case of theory leading, or misleading? Or is this a third case, where a willingness to follow the evidence led to a hitherto overlooked phenomenon?

My guess: all three. Physics at the turn of the century was at the start of a creative tumult, a half-century active quake zone of discovery: old theories cracked under the pressure of irreconcilable data, new theories upended the known world and brought forth phenomena which had previously hidden among the unknown unknowns, and all of this piled up and into the felt urgency to explain not just this new world, but a whole new universe.

There was too much of everything, a glorious and disorienting plenty on which one of the finest collections of minds in human history feasted; is it any surprise that pursuit of this course meant that dish was neglected?

All of this is a long way of saying I’m having a bitch of a time trying to make sense of my foray into medieval history. I don’t have a theory, don’t have a direction, and while I’m unbothered by—hell, actively defend—a certain amount of dilettantism, I’ve wandered enough to have become frustrated by my wanderings.

I’m not too worried, though. As frustrating as frustration is, it works for me, (eventually) crowbarring me out of my “it’ll come” complacency and into a “go get it” activity—which is to say, I’ll come up with a theory which will guide me to look at this, not at that.

I’m not doing the [kind of] science Crick did, so his observations on the process of discovery don’t necessarily translate across the fields, but he is right that if you’re going to find anything, it helps to know what you’re looking for.

(*As quoted in Horace Freeland Judson’s The Eighth Day of Creation)





Perspective

3 01 2012

 

Coudal Partners, “History of the Earth in 24 Hours”, via The Daily Dish





Negation—wha. . .what?

18 05 2011

Perhaps I should not have used the term “negation”.

It carries a philosophical load—which is fine, and not unrelated to my use of it—but I wanted (also) to emphasize the more prosaic, i.e., practical, aspects of negation, as in: to negate, to eliminate as an option or consideration.

The germ theory of disease negated theories of miasma, Lavoisier’s experiments with oxygen negated phlogiston, industrial production of beakers and test tubes negated the need for scientists to blow their own glassware (which further negated the need for the knowledge of blowing glassware), fuel injection will likely negate carburetors, etc.

So negation could mean “overturn” (as with germs > miasmas or oxygen > phlogiston) or “leave behind” (as with glass-blowing and carburetors), that is, to negate may be to disprove or it could mean to render irrelevant or trivial.

Now, these practical effects may reverberate ontologically, such that the negation of the practical may serve to negate an entire way of thinking or being, or simply to serve as a signal of the instability of that way of thinking/being. Thomas Kuhn’s The Structure of Scientific Revolutions, with its discussion of paradigm shifts rendering previous modes of scientific practice inert, lays out a version of global negation, while current questions of the role of cyber-technologies signal uncertainty over what counts as “real”.

John Donne’s “An Anatomy of the World” (1611) is often quoted—hell, I quoted it a while back—to exemplify the agonized confusion over the discoveries of the natural philosophers:

And new philosophy calls all in doubt,
The element of fire is quite put out;
The sun is lost, and the earth, and no man’s wit
Can well direct him where to look for it.
And freely men confess that this world’s spent,
When in the planets and the firmament
They seek so many new; they see that this
Is crumbled out again to his atomies.
‘Tis all in pieces, all coherence gone;
All just supply, and all relation:

Natural philosophy took for itself the name science, and modernity marched on. The laments for the old world died with those who once lived in it.

William Butler Yeats’s “The Second Coming” clearly echoes this lament, with the opening

Turning and turning in the widening gyre
The falcon cannot hear the falconer;
Things fall apart; the centre cannot hold;

The times they are a-changin’, indeed.

History is not a line, or rather, history only holds the line, such that events may loosen or smash that hold and the contents of that history scatter.

Some of those pieces are lost, and even for those which are found, the meaning of the piece, precisely because it has been scattered, can only be guessed at. It is a shard of pottery uncovered in the desert, hinting at something which once was, now gone.

But not everything is lost: it could be hiding in that proverbial plain sight. I’m much taken with the notion of the palimpsest—that is, of a kind of tablet which has been inscribed then scrubbed clean to be reinscribed—largely because I think that the previous inscriptions are still there, that, like words which have been erased from a page, the impression lingers.

Heidegger in The Question Concerning Technology decries the transformation of the Rhine from a river in a landscape into a “water power supplier”, that is, it is no longer itself but a source of reserve power for a hydroelectric plant. Perhaps it could be understood as that river in a landscape, he muses, but “In no other way than as an object on call for inspection by a tour group ordered there by the vacation industry.”

Those who complain that Manhattan has turned into a theme park and that Times Square has lost all its gritty reality have not a little bit in common with Herr Heidegger.

I have a great deal of sympathy for this feeling, but even more skepticism for such sympathy; as I’ve mentioned more times than you probably care to read, we’re never who we’ve been.

So, again, I’m not taking the side of the past against the present, not least because I have no basis for such a taking of sides. Again, I simply want to trace the history of modern history.

I can’t raise all the inscriptions on the palimpsest, but maybe I can see some of what has been left behind.





And those magic wristbands don’t work, either

6 01 2011

Andrew Wakefield is a fraud—and the British Medical Journal has the evidence to prove it.

I tend to stay away from anti-vaxers, not because they don’t deserve the derision, but because there are many who are much better situated than me (see, for example, this post by Orac at Respectful Insolence) to take ‘em on.

It’s not that there are no risks associated with vaccines or that no one has ever been adversely affected by vaccines—every year, for a quick example, there are people who are adversely affected by the flu vaccine who likely would have been fine without it—but one has to be clear what those risks are.

Stating that the measles, mumps, and rubella (MMR) vaccine causes autism is not clarifying those risks.

In fact, Wakefield was not only wrong when he made that connection in a 1998 Lancet article (an article which was retracted in 2010), he was deliberately wrong, that is, he fucked with the data. As the editors of BMJ note:

The Office of Research Integrity in the United States defines fraud as fabrication, falsification, or plagiarism. Deer unearthed clear evidence of falsification. He found that not one of the 12 cases reported in the 1998 Lancet paper was free of misrepresentation or undisclosed alteration, and that in no single case could the medical records be fully reconciled with the descriptions, diagnoses, or histories published in the journal.

Who perpetrated this fraud? There is no doubt that it was Wakefield. Is it possible that he was wrong, but not dishonest: that he was so incompetent that he was unable to fairly describe the project, or to report even one of the 12 children’s cases accurately? No. A great deal of thought and effort must have gone into drafting the paper to achieve the results he wanted: the discrepancies all led in one direction; misreporting was gross. Moreover, although the scale of the GMC’s 217 day hearing precluded additional charges focused directly on the fraud, the panel found him guilty of dishonesty concerning the study’s admissions criteria, its funding by the Legal Aid Board, and his statements about it afterwards.

Furthermore, Wakefield has been given ample opportunity either to replicate the paper’s findings, or to say he was mistaken. He has declined to do either. He refused to join 10 of his coauthors in retracting the paper’s interpretation in 2004, and has repeatedly denied doing anything wrong at all. Instead, although now disgraced and stripped of his clinical and academic credentials, he continues to push his views. [emphasis added]

Again, I leave it to the medical and scientific folk to tear into Wakefield’s manipulations; I want to address the public health implications of his fraud.

BMJ’s editors note that it is difficult to trace declining vaccination rates in the UK and elsewhere directly to Wakefield’s work, but it is clear that rates had fallen after 1998, and are still below the World Health Organization’s recommended coverage of 95 percent of a population. In 2008, measles was “declared endemic in England and Wales”, and an outbreak of mumps in Essen, Germany revealed that of the 71 children affected, 68 hadn’t been vaccinated. Finally, according to a June 2009 Pediatrics article (as discussed in the Wired article linked to, above), pertussis cases jumped from 1,000 in 1976 to 26,000 in 2004.

So what? So some kids get sick for awhile. Sucks for them, but that’s what they get for having anti-vax parents.

Except that it’s not fair to those kids, and it puts others at risk of morbidity and mortality. Measles can kill. Meningitis can kill. Pertussis can kill, and on and on. Furthermore, many of the diseases which can be prevented by vaccines are held in check by herd immunity—vaccination works at the population level by preventing a disease from settling into a reservoir—which means that if enough people in any given group are unvaccinated, the disease can spread.
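To put a rough number on why coverage targets sit where they do: the standard back-of-the-envelope for herd immunity is the threshold 1 - 1/R0, where R0 is the basic reproduction number. A minimal sketch, with R0 values that are common textbook ranges I’m supplying for illustration (they are not from the BMJ editorial):

```python
# Herd-immunity threshold: the fraction of a population that must be immune
# for an infection to be unable to sustain itself. HIT = 1 - 1/R0.
# The R0 figures below are rough textbook estimates, used only to illustrate.

def herd_immunity_threshold(r0: float) -> float:
    """Return the immune fraction needed to keep an outbreak from spreading."""
    return 1.0 - 1.0 / r0

for disease, r0 in [("measles", 15.0), ("mumps", 5.0), ("pertussis", 14.0)]:
    hit = herd_immunity_threshold(r0)
    print(f"{disease}: R0 ~ {r0:.0f}, so roughly {hit:.0%} must be immune")
```

For measles, with an R0 somewhere in the teens, the threshold lands in the low-to-mid 90s, which is why the WHO coverage figure cited above sits at 95 percent.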

Again, what’s the problem? If folks don’t get themselves immunized, that’s on them.

But it’s not. There are some people—infants, transplant patients, people with compromised immune systems, those who may be allergic to (as I am to the egg in flu vaccines) or otherwise intolerant of ingredients in the vaccine, among others—who are vulnerable to outbreaks. And even those who have been vaccinated may be at risk if, say, an especially virulent form of a disease is allowed to spread.

So back to the beginning(ish): There are risks to vaccination, but there are greater risks to not vaccinating, not only to yourself or your kid, but to everyone around you.

Some parents feel quite comfortable withholding vaccines from their kids, but the only reason they can safely do so is because every other parent is vaccinating her kids. And hey, guess what, if everyone else has to take the risk to keep the disease at bay, then so should the anti-vaxers. Unless they are willing to keep themselves and their kids away from everyone else for as long as they all remain unvaccinated, they are free-riding on the rest of us. They are, in a sense, ripping us off.

If you want the benefits, you have to bear the burdens.

Finally, it is worth noting, along with the editors of BMJ, that

[P]erhaps as important as the scare’s effect on infectious disease is the energy, emotion, and money that have been diverted away from efforts to understand the real causes of autism and how to help children and families who live with it.

Wakefield and Age of Autism and Generation Rescue are doing no favors for those who do have autism or their families. Jenny McCarthy and JB Handley, parents of kids with autism, may sincerely believe their bullshit, but the sincerity of those beliefs does not make that bs any less malodorous.

Who knows, maybe there is a specific cause for autism, one which, if rooted out, could lead to the end of this syndrome.

But that cause ain’t the MMR vaccine.





I’m a rocket man

14 12 2010

I try to be good, get off the computer for a few hours, and what happens? I miss an entire conversation on science.

Well, goddammit.

(Actually, given that a large portion of the thread was given over to name-calling and trollist cavils, I guess I didn’t miss that much. Still.)

So, science. I am for it.

I am an epistemological nihilist, it’s true, so this support is caveated with the usual cracks and abysses, but I’m also quite willing to hop right over those chasms to walk among the ruins that compose our human life—and one of our more spectacular ruins is science.

Yes, ‘our’. ‘Our’ because science truly is a human endeavor (even as its dogmatists assert that science can take us outside of ourselves), and as such, there to be claimed by all of us. And it is important to claim it, both against the dogmatists and against those who find nothing of worth in curiosity and rigor, or in experimentation, skepticism, and discovery.

I can only respond to those opposed to discovery with questions and fiction—as we do not inhabit the same world, argument is stillborn—but to the dogmatists and, it must be said, to those who favor curiosity and thus oppose science because they believe science poisons curiosity, I can offer history and reason and ruin.

To offer the whole of that argument is to offer a book; instead, here is the abstract:

We humans have sought to know, and in seeking, have sought to make sense of what we have found. How we make sense has varied—through recourse to myths, common sense, measurement, extrapolation, generalization, systematization, reflection, etc.—and what we make of the sense we make has varied as well. Sometimes we call it truth or religion or wickedness or allegory or interpretation; sometimes we call it science. Sometimes this science is the means, sometimes it is the end, sometimes it is both. In early modern times [in Europe], in the period now known as the Scientific Revolution, science was thought to reveal truths about God, as it also was by those scientists working under the Abbasids; that it also brought technological advance and political and economic gain helped to preserve it against those who argued that a thirst for knowledge was itself corrosive of the faith.

Yet even throughout much of the modern period science was understood, if no longer as an appendage of natural philosophy, then as part of a constellation of knowledge which included the arts, literature, and the humanities; its practitioners were all part of the learned class.

This collegiality faded, and now science is understood primarily as comprising the natural sciences and their methods; to the extent some social sciences adopt those methods, they may or may not be admitted to the realm as sciences, albeit as its lesser, ‘softer’, version. That science has a history is barely acknowledged, and it is unclear whether scientists would be considered, by themselves or their learned critics, ‘intellectuals’ rather than (mere) technicians, experimentalists, and lab directors.

This separation (and, often, contempt) is lamentable all around. [Natural] science is more than its tools and methods, involves more [hermeneutic] interpretation than the experimentalists may admit, and requires greater curiosity than its skeptics may allow. But if we want to know, if we humans truly seek a human science (and, again, I would argue there is no other), then we have to prevent science from sliding all the way into scientism. Some think it’s already so technics-shriveled that it is mere methodological fetishism; I disagree.

This saving gesture doesn’t require that artists now refer to themselves as scientists or that neurobiologists become novelists. No, this reclamation project (another ruin) would gather the curious back together, to see if we exiles from one another would have anything to say to one another, to see what we could see.

I don’t believe this every day—yesterday, for example, I had no patience for this.

But some days, some days I think we humans could do this. Some days, this is my something more.







