It’s all too much

3 08 2012

The point is that evidence can be unreliable, and therefore you should use as little of it as possible. . . . I mean, people don’t realize that not only can data be wrong in science, it can be misleading. There isn’t such a thing as a hard fact when you’re trying to discover something. It’s only afterwards that the facts become hard.*

~Francis Crick

It’s no surprise that Crick is a theorist, is it?

I quite like this quote, and (I think) used it in my dissertation, but it also makes me nervous.

First, why I like it: It puts theory first, forces you to think of the evidence in terms of a theory in which it makes sense. If you let the evidence go first, you may end up hiking into a dead end, both because you’re misinterpreting the evidence as evidence (i.e., taking as fact something which is not, yet) and because you miss other bits because you don’t have a way of seeing those bits as something which matters.

But this is where the unease kicks in: Theory can mislead, as well. Thomas Kuhn noted this in The Structure of Scientific Revolutions and his arguments on paradigm shift, although Max Planck had the pithiest observation on this phenomenon: “Science progresses one funeral at a time.”

So, theory leads, and theory misleads.

Richard Rhodes, in his magisterial The Making of the Atomic Bomb, ticks off any number of discoveries which were missed by those with the most data because they weren’t able to see the data correctly.
The best-known story is that of Frederick Smith, who didn’t discover X rays:

. . . not so unlucky in legend as the Oxford physicist Frederick Smith, who found that photographic plates kept near a cathode-ray tube were liable to be fogged and merely told his assistant to move them to another place. . . . Röntgen isolated the effect by covering his cathode-ray tube with black paper. When a nearby screen of fluorescent material still glowed he realized that whatever was causing the screen to glow was passing through the paper and the intervening air. If he held his hand between the covered tube and the screen, his hand slightly reduced the glow on the screen but in the dark shadow he could see its bones.

So is this a case of theory leading, or misleading? Or is this a third case, where a willingness to follow the evidence led to a hitherto overlooked phenomenon?

My guess: all three. Physics at the turn of the century was at the start of a creative tumult, a half-century active quake zone of discovery: old theories cracked under the pressure of irreconcilable data, new theories upended the known world and brought forth phenomena which had previously hidden among the unknown unknowns, and all of this piled up and into the felt urgency to explain not just this new world, but a whole new universe.

There was too much of everything, a glorious and disorienting plenty on which one of the finest collections of minds in human history feasted; is it any surprise that pursuing this course meant that dish was neglected?

All of this is a long way of saying I’m having a bitch of a time trying to make sense of my foray into medieval history. I don’t have a theory, don’t have a direction, and while I’m unbothered by—hell, actively defend—a certain amount of dilettantism, I’ve wandered enough to have become frustrated by my wanderings.

I’m not too worried, though. As frustrating as frustration is, it works for me, (eventually) crowbarring me out of my “it’ll come” complacency and into a “go get it” activity—which is to say, I’ll come up with a theory which will guide me to look at this, not at that.

I’m not doing the [kind of] science Crick did, so his observations on the process of discovery don’t necessarily translate across the fields, but he is right that if you’re going to find anything, it helps to know what you’re looking for.

(*As quoted in Horace Freeland Judson’s The Eighth Day of Creation)





Perspective

3 01 2012

 

Coudal Partners, “History of the Earth in 24 Hours”, via The Daily Dish





Negation—wha. . .what?

18 05 2011

Perhaps I should not have used the term “negation”.

It carries a philosophical load—which is fine, and not unrelated to my use of it—but I wanted (also) to emphasize the more prosaic, i.e., practical, aspects of negation, as in: to negate, to eliminate as an option or consideration.

The germ theory of disease negated theories of miasma, Lavoisier’s experiments with oxygen negated phlogiston, industrial production of beakers and test tubes negated the need for scientists to blow their own glassware (which further negated the need for the knowledge of blowing glassware), fuel injection will likely negate carburetors, etc.

So negation could mean “overturn” (as with germs > miasmas or oxygen > phlogiston) or “leave behind” (as with glass-blowing and carburetors), that is, to negate may be to disprove or it could mean to render irrelevant or trivial.

Now, these practical effects may reverberate ontologically, such that the negation of the practical may serve to negate an entire way of thinking or being, or simply to serve as a signal of the instability of that way of thinking/being. Thomas Kuhn’s The Structure of Scientific Revolutions, with its discussion of paradigm shifts rendering previous modes of scientific practice inert, lays out a version of global negation, while current questions of the role of cyber-technologies signal uncertainty over what counts as “real”.

John Donne’s “An Anatomy of the World” (1611) is often quoted—hell, I quoted it a while back—to exemplify the agonized confusion over the discoveries of the natural philosophers:

And new philosophy calls all in doubt,
The element of fire is quite put out;
The sun is lost, and the earth, and no man’s wit
Can well direct him where to look for it.
And freely men confess that this world’s spent,
When in the planets and the firmament
They seek so many new; they see that this
Is crumbled out again to his atomies.
‘Tis all in pieces, all coherence gone;
All just supply, and all relation:

Natural philosophy took for itself the name science, and modernity marched on. The laments for the old world died with those who once lived in it.

William Butler Yeats’s “The Second Coming” clearly echoes this lament, with the opening

Turning and turning in the widening gyre
The falcon cannot hear the falconer;
Things fall apart; the centre cannot hold;

The times they are a-changin’, indeed.

History is not a line, or rather, history only holds the line, such that events may loosen or smash that hold and the contents of that history scatter.

Some of those pieces are lost, and even of those which are found, the meaning of the piece, precisely because it has been scattered, can only be guessed at. It is a shard of pottery uncovered in the desert, hinting at something which once was, now gone.

But not everything is lost: it could be hiding in that proverbial plain sight. I’m much taken with the notion of the palimpsest—that is, of a kind of tablet which has been inscribed then scrubbed clean to be reinscribed—largely because I think that the previous inscriptions are still there, that, like words which have been erased from a page, the impression lingers.

Heidegger in The Question Concerning Technology decries the transformation of the Rhine from a river in a landscape into a “water power supplier”, that is, it is no longer itself but a source of reserve power for a hydroelectric plant. Perhaps it could be understood as that river in a landscape, he muses, but “In no other way than as an object on call for inspection by a tour group ordered there by the vacation industry.”

Those who complain that Manhattan has turned into a theme park and that Times Square has lost all its gritty reality have not a little bit in common with Herr Heidegger.

I have a great deal of sympathy for this feeling, but even more skepticism for such sympathy; as I’ve mentioned more times than you probably care to read, we’re never who we’ve been.

So, again, I’m not taking the side of the past against the present, not least because I have no basis for such a taking of sides. Again, I simply want to trace the history of modern history.

I can’t raise all the inscriptions on the palimpsest, but maybe I can see some of what has been left behind.





And those magic wristbands don’t work, either

6 01 2011

Andrew Wakefield is a fraud—and the British Medical Journal has the evidence to prove it.

I tend to stay away from anti-vaxers, not because they don’t deserve the derision, but because there are many who are much better situated than me (see, for example, this post by Orac at Respectful Insolence) to take ’em on.

It’s not that there are no risks associated with vaccines or that no one has ever been adversely affected by vaccines—every year, for a quick example, there are people who are adversely affected by the flu vaccine who likely would have been fine without it—but one has to be clear what those risks are.

Stating that the measles, mumps, and rubella (MMR) vaccine causes autism is not clarifying those risks.

In fact, Wakefield was not only wrong when he made that connection in a 1998 Lancet article (an article which was retracted in 2010), he was deliberately wrong, that is, he fucked with the data. As the editors of BMJ note:

The Office of Research Integrity in the United States defines fraud as fabrication, falsification, or plagiarism. Deer unearthed clear evidence of falsification. He found that not one of the 12 cases reported in the 1998 Lancet paper was free of misrepresentation or undisclosed alteration, and that in no single case could the medical records be fully reconciled with the descriptions, diagnoses, or histories published in the journal.

Who perpetrated this fraud? There is no doubt that it was Wakefield. Is it possible that he was wrong, but not dishonest: that he was so incompetent that he was unable to fairly describe the project, or to report even one of the 12 children’s cases accurately? No. A great deal of thought and effort must have gone into drafting the paper to achieve the results he wanted: the discrepancies all led in one direction; misreporting was gross. Moreover, although the scale of the GMC’s 217 day hearing precluded additional charges focused directly on the fraud, the panel found him guilty of dishonesty concerning the study’s admissions criteria, its funding by the Legal Aid Board, and his statements about it afterwards.

Furthermore, Wakefield has been given ample opportunity either to replicate the paper’s findings, or to say he was mistaken. He has declined to do either. He refused to join 10 of his coauthors in retracting the paper’s interpretation in 2004, and has repeatedly denied doing anything wrong at all. Instead, although now disgraced and stripped of his clinical and academic credentials, he continues to push his views. [emphasis added]

Again, I leave it to the medical and scientific folk to tear into Wakefield’s manipulations; I want to address the public health implications of his fraud.

BMJ’s editors note that it is difficult to trace declining vaccination rates in the UK and elsewhere directly to Wakefield’s work, but it is clear that rates fell after 1998, and are still below the World Health Organization’s recommended coverage of 95 percent of a population. In 2008, measles was “declared endemic in England and Wales”, and an outbreak of mumps in Essen, Germany revealed that of the 71 children affected, 68 hadn’t been vaccinated. Finally, according to a June 2009 Pediatrics article (as discussed in the Wired article linked above), reported pertussis cases jumped from about 1,000 in 1976 to 26,000 in 2004.

So what? So some kids get sick for a while. Sucks for them, but that’s what they get for having anti-vax parents.

Except that it’s not fair to those kids, and it puts others at risk of morbidity and mortality. Measles can kill. Meningitis can kill. Pertussis can kill, and on and on. Furthermore, protection against many vaccine-preventable diseases depends on herd immunity—vaccination works mainly by preventing a disease from settling into a reservoir in a population—which means that if enough people in any given group are unvaccinated, the disease can spread.
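(A back-of-the-envelope aside of my own, not the BMJ’s: the standard epidemiological rule of thumb is that if a disease has a basic reproduction number $R_0$, the average number of people one case infects in a fully susceptible population, then the share of a population that must be immune to keep the disease from spreading is

$$q_c = 1 - \frac{1}{R_0}$$

Measles is usually credited with an $R_0$ of somewhere between 12 and 18, which puts that threshold at roughly $1 - 1/15 \approx 93$ percent. Hence the WHO’s recommendation of 95 percent coverage, the margin allowing for vaccine failures and the unevenness of real populations.)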

Again, what’s the problem? If folks don’t get themselves immunized, that’s on them.

But it’s not. There are some people—infants, transplant patients, people with compromised immune systems, those who are allergic (as I am to the egg in flu vaccines) or otherwise intolerant to ingredients in a vaccine, among others—who are vulnerable to outbreaks. And even those who have been vaccinated may be at risk if, say, an especially virulent form of a disease is allowed to spread.

So back to the beginning(ish): There are risks to vaccination, but there are greater risks to not vaccinating, not only to yourself or your kid, but to everyone around you.

Some parents feel quite comfortable withholding vaccines from their kids, but the only reason they can safely do so is because every other parent is vaccinating her kids. And hey, guess what, if everyone else has to take the risk to keep the disease at bay, then so should the anti-vaxers. Unless they are willing to keep themselves and their kids away from everyone else for as long as they all remain unvaccinated, they are free-riding on the rest of us. They are, in a sense, ripping us off.

If you want the benefits, you have to bear the burdens.

Finally, it is worth noting, along with the editors of BMJ, that

[P]erhaps as important as the scare’s effect on infectious disease is the energy, emotion, and money that have been diverted away from efforts to understand the real causes of autism and how to help children and families who live with it.

Wakefield and Age of Autism and Generation Rescue are doing no favors for those who do have autism or their families. Jenny McCarthy and JB Handley, parents of kids with autism, may sincerely believe their bullshit, but the sincerity of those beliefs does not make that bs any less malodorous.

Who knows, maybe there is a specific cause for autism, one which, if rooted out, could lead to the end of this syndrome.

But that cause ain’t the MMR vaccine.





I’m a rocket man

14 12 2010

I try to be good, get off the computer for a few hours, and what happens? I miss an entire conversation on science.

Well, goddammit.

(Actually, given that a large portion of the thread was given over to name-calling and trollist cavils, I guess I didn’t miss that much. Still.)

So, science. I am for it.

I am an epistemological nihilist, it’s true, so this support is caveated with the usual cracks and abysses, but I’m also quite willing to hop right over those chasms to walk among the ruins that compose our human life—and one of our more spectacular ruins is science.

Yes, ‘our’. ‘Our’ because science truly is a human endeavor (even as its dogmatists assert that science can take us outside of ourselves), and as such, there to be claimed by all of us. And it is important to claim it, both against the dogmatists and against those who find nothing of worth in curiosity and rigor, or in experimentation, skepticism, and discovery.

I can only respond to those opposed to discovery with questions and fiction—as we do not inhabit the same world, argument is stillborn—but to the dogmatists and, it must be said, to those who favor curiosity and thus oppose science because they believe science poisons curiosity, I can offer history and reason and ruin.

To offer the whole of that argument is to offer a book; instead, here is the abstract:

We humans have sought to know, and in seeking, have sought to make sense of what we have found. How we make sense has varied—through recourse to myths, common sense, measurement, extrapolation, generalization, systematization, reflection, etc.—and what we make of the sense we make has varied as well. Sometimes we call it truth or religion or wickedness or allegory or interpretation; sometimes we call it science. Sometimes this science is the means, sometimes it is the end, sometimes it is both. In early modern times [in Europe], in the period now known as the Scientific Revolution, science was thought to reveal truths about God, as it also was by those scientists working under the Abbasids; that it also brought technological advance and political and economic gain helped to preserve it against those who argued that a thirst for knowledge was itself corrosive of the faith.

Yet even throughout much of the modern period science was understood, if no longer as an appendage of natural philosophy, as nonetheless a part of a constellation of knowledge which included the arts, literature, and humanities; its practitioners were all a part of the learned class.

This collegiality faded, and now science is understood primarily as comprising the natural sciences and their methods; to the extent some social sciences adopt those methods, they may or may not be admitted to the realm as sciences, albeit as its lesser, ‘softer’, versions. That science has a history is barely acknowledged, and it is unclear whether scientists (or their learned critics) would consider them ‘intellectuals’ rather than (mere) technicians, experimentalists, and lab directors.

This separation (and, often, contempt) is lamentable all around. [Natural] science is more than its tools and methods, involves more [hermeneutic] interpretation than the experimentalists may admit of, and requires greater curiosity than its skeptics may allow. But if we want to know, if we humans truly seek a human science (and, again, I would argue there is no other), then we have to prevent science from sliding all the way into scientism. Some think it is already so shriveled into technics that it is mere methodological fetishism; I disagree.

This saving gesture doesn’t require that artists now refer to themselves as scientists or that neurobiologists become novelists. No, this reclamation project (another ruin) would gather the curious back together, to see if we exiles from one another would have anything to say to one another, to see what we could see.

I don’t believe this every day—yesterday, for example, I had no patience for this.

But some days, some days I think we humans could do this. Some days, this is my something more.





Too goddamned irritated to blog

13 12 2010

You call your ‘movement’ No Labels, give yourself the motto Not Left. Not Right. Forward., and yet at the top of your web page insist

We are Democrats, Republicans, and Independents who are united in the belief that we do not have to give up our labels, merely put them aside to do what’s best for America.

Kentucky fucking chicken, what’s the point of calling yourself No Label if what you really mean is Every Label (on the inside pocket)?

And it’s a stupid idea, anyway.

And then this, from a blog whose insistence and crankiness I like and respect: Removing science from anthropology.

What anthropologists do is up to them; that said, I generally think we social scientists should hang on to the word ‘science’ with all our grubby little might. ‘Science’ in its most general terms as a search for knowledge has a long and honorable history and, as I always like to point out, one of the earliest known seekers was Aristotle—who considered political science the highest of all sciences. So there.

What chaps me about this piece is not that non-anthropologists have opinions about this move, but after the requisite words of respect about the so-called softer sciences, Orac also has to toss in the requisite bullshit references to ‘post-modernism’ and ‘political correctness’.

Yeah, I get it, he sees invidious parallels between claims about ‘other ways of knowing’ and his white whale, complementary and alternative medicine.

This is an intriguing claim. Truly.

But again with the KFC: Do you need to haul out straw-ass versions of an interpretive method whose definition you draw from Sokal in order to light the whole goddamned discussion on fire?

Kentucky Jesus Fried Christ.





Are spirits in the material world

8 08 2010

I don’t believe in life after death.

There is life, here, in this world, and death both is and signals the end of life.

Now, is there something else, after life? That, I don’t know.

If there is something else, it doesn’t seem that it would conform to notions of Christian or Muslim heaven; those seem so earth-bound, so reflective of what we already have here, only someone’s version of better. (A multitude of virgins or streets paved with gold? Really?) If there would be something else, wouldn’t it be. . . something else?

Backing up: I think of life as bounded by this earth, but I’m fudging on the whole existence thing, that is, we exist in life, here, and if our existence continues, then it would be in some other way.

Furthermore, that there could be something else doesn’t mean it’s supernatural. I don’t believe in the supernatural; I think everything—everything—is natural, and that that which is called ‘supernatural’ is simply something for which we lack understanding.

(And woo? Woo is a cover, a con: obfuscation masquerading as understanding.)

This isn’t rank materialism. I also don’t believe the (natural or social) sciences are sufficient to make sense of all worldly—universal—phenomena; I’m not arguing that understanding necessitates a reduction of all things to the latest brand of physics. It’s simply that, if there is nothing beyond nature, then we’ll need new ways of understanding—new sciences—to make sense of that which current scientific methods cannot.

Does this tend toward a Theory of Everything? Perhaps, but since TOE is conceptualized in contemporary terms, it may be inadequate to describe all that there is.

And ‘is’ itself may be—hell, already is—called into question, along with ‘all’ and ‘that’.

*Sigh* It’s late and I’m not making sense.

I’m wondering about death because a little over a week ago Bean died and a little over a year ago Chelsea died. I don’t think they’re in pet heaven or regular heaven or whatever. I don’t know if they’ve gone some place after death, if their existence continues, or what relationship that existence has to any worldly one. Maybe there’s nothing, maybe there’s something. I know they’re not with me.

But I would like to think that, if there is something, they neither forget nor are constrained by life. This existence on earth, this life, is powerful, and if there is something else, I’d like to think it offers us more without taking away what we already were. Perhaps there is no full understanding on this earth, no way for us to comprehend all there is; perhaps life is to get us started, but it’s not enough, not enough for us to know.

I don’t know this, of course. And maybe this is it, and this life which is not enough is it. Perhaps this life is enough.

My methods are insufficient to determine one way or the other.