We are the champions of the world

8 03 2015

Might makes right—that is the basis of morality.

That’s not all there is to morality, but lack the might, and you lack the ability to determine the right.

(And as for God(s) as the basis of morality? What is he/she/they/it but Mighty? the Mightiest of the Mighty?)

Might makes right is also the basis of knowledge. Of course, what counts as “might” varies considerably across time and space: might could mean “ability to summon spirits” or “to discern the secrets of nature” or, of course, to point a sword or an axe or a gun at a person’s head and say “believe” or “recant”; it could also refer to people or resources or the production of results.

Thomas Kuhn referred, famously, to paradigms: scientists operate within a particular paradigm or set of theories of how the world works, and new scientists are inculcated with that paradigm and succeed according to their ability to produce new knowledge based on elaboration of those theories. Over time, however, those elaborations may run into trouble: the theory leads to x result, but y is what is witnessed. There may be some way to accommodate these anomalies, but eventually the anomalies will overwhelm the paradigm; upon the presentation of a new theory which can account not only for the old knowledge but also for the anomalies, the paradigm will shift.

(Imre Lakatos attempted to meliorate the harshness of this shift (and to mediate between Kuhn and Karl Popper’s strict falsificationism) with a notion of “research programmes” and whether they are “progressive” or “degenerative”, but he, too, allowed that new research programmes may emerge.)

Older or established members of a field may not accept a new paradigm or research program, but, as Max Planck famously observed, “science advances one funeral at a time”. Einstein, one of the most intelligent men of the 20th century, perhaps ever, just as famously never accepted quantum theory (“God does not play dice with the universe”), but he couldn’t foil it; he is dead, and the theory lives.

What, then, is the paradigm or research program but a form of might? It declares what counts as true and false, what is considered evidence and how to make sense of that evidence, what counts as science—and thus knowledge—at all.

None of this is meant to be argumentative, but axiomatic. This doesn’t mean there is no knowledge or no true knowledge, but that what counts as knowledge and truth is bound up in the conditions of the production of said knowledge and truth. Knowledge depends upon what we say knowledge is (“intersubjective agreement”), and there are a lot of ways to say it.

I’m a fan of science, and consider its methods to be powerful in eliciting knowledge about the natural world. I don’t think it can tell me much about poetry, but if I want to understand how a fertilized egg can turn into a person, then I’ll turn to a biology textbook rather than, say, a book of poetry.

Even the most potent forms of knowledge—the mightiest of the mighty—have their limits (see: embryology won’t teach you much about rhyme and meter), and potency itself is no guarantee against the loss or overthrow of a particular form of knowledge, an insight long known by tyrants, torturers, and con men alike.

Knowledge, for all of its power (Bacon), is also fragile: because there is nothing necessary or autonomous about any one form of knowledge, it can be lost or shattered or tossed away—which means it must be tended, and, when conditions dictate, defended.

All of which is a very long way of saying that the notion of “Let the public decide what’s the truth” with regard to the existence of climate change is a terrible, terrible idea, and, as an attack on science itself, deserves to be driven back to the gaseous bog from whence it came.





All things weird and wonderful, 50

29 01 2015

Calceolaria uniflora, photo by Thomas Mathis

The website from which I got this image, Strange and Wonderful Things (a title after me own little heart), compares these funky little flowers to “little orange penguins marching over the rocks”—and yeah, I can see that.

But I see a bunch of old aunties in wide hats toting their bins back from the fields, or maybe the market.

Clouds are masses of tiny water droplets and ice crystals suspended in the atmosphere, and one can use SCIENCE to determine how they form and what their shapes say about conditions in the atmosphere, and that’s all for the good. Similarly, one can use the tools of SCIENCE to discover that C. uniflora is “distantly related to Foxglove and Gesneriads”, and that the flower is pollinated by birds who eat the white bits of the bloom.

But sometimes clouds are castles or armies or profiles of Abe Lincoln, and sometimes flowers are little orange penguins or bin-toting old aunties in wide hats.

~~~

h/t PZ Myers, Pharyngula





She blinded me with science

17 02 2014

When to let go and when to hang on?

This conundrum is one of the ways I’ve come to interpret various situations in life, big and small. I don’t know that there is ever a correct decision (tho’ I’ll probably make the wrong one), but one chooses, nonetheless.

Which is to say: I choose to hang on to the “science” in political science.

I didn’t always feel this way, and years ago used to emphasize that I was a political theorist, not a political scientist. This was partly due to honesty—I am trained in political theory—and partly to snobbery: I thought political theorists were somehow better than political scientists, what with their grubbing after data and trying to hide their “brute empiricism” behind incomprehensible statistical models.

Physics envy, I sniffed.

After awhile the sniffiness faded, and as I drifted into bioethics, the intradisciplinary disputes faded as well. And as I drifted away from academia, it didn’t much matter anymore.

So why does it matter now?

Dmf dropped this comment after a recent post—

well “science” without repeatable results, falsifiability, and some ability to predict is what, social? lot’s of other good way to experiment/interact with the world other than science…

—and my first reaction was NO!

As I’ve previously mentioned, I don’t trust my first reactions precisely because they are so reactive, but in this case, with second thought, I’ma stick with it.

What dmf offers is the basic Popperian understanding of science, rooted in falsifiability and prediction, and requiring some sort of nomological deductivism. It is widespread in physics, and hewed to more or less in the other natural and biological sciences.

It’s a great model, powerful for understanding the regularities of non-quantum physics and, properly adjusted, for the biosciences, as well.

But do you see the problem?

What dmf describes is a method, one of a set of interpretations within the overall practice of science. It is not science itself.

There is a bit of risk in stating this, insofar as young-earth creationists, intelligent designers, and sundry other woo-sters like to claim the mantle of science as well. If I loose science from its most powerful method, aren’t I setting it up to be overrun by cranks and supernaturalists?

No.

The key to dealing with them is to point out what they’re doing is bad science, which deserves neither respect in general nor class-time in particular. Let them aspire to be scientists; until they actually produce a knowledge which is recognizable as such by those in the field, let them be called failures.

Doing so allows one to get past the no-true-Scotsman problem (as, say, with the Utah chemists who insisted they produced cold fusion in a test tube: not not-scientists, but bad scientists), as well as to recognize that there is a history to science, and that what was good science in one time and place is not good in another.

That might create too much wriggle room for those who hold to Platonic notions of science, and, again, to those who worry that this could be used to argue for an “alternative” physics or chemistry or whatever. But arguing that x science is a practice with a history allows the practitioners of that science to state that those alternatives are bunk.

But back to me (always back to me. . . ).

I hold to the old notion of science as a particular kind of search for knowledge, and as knowledge itself. Because of that, I’m not willing to give up “science” to the natural scientists because those of us in the social sciences are also engaged in a particular kind of search for knowledge. That it is not the same kind of search for the same kind of knowledge does not make it not-knowledge, or not-science.

I can’t remember if it was Peter Winch or Roger Trigg who pointed out that the key to good science was to match the method to the subject: what works best in physics won’t necessarily work best in politics. The problem we in the social sciences have had is that our methods are neither as unified nor as powerful as those in the natural sciences, and that, yes, physics envy has meant that we’ve tried to import methods and ends which can be unsuitable for learning about our subjects.

So, yes, dmf, there are more ways of interacting with the world than with science. But there are also more ways of practicing science itself.

We just have to figure that out.





Angels in the architecture

16 07 2013

This is not a “why I am not a creationist” piece. Oh no. Even though I’m not.

This is a hit on a “why I am a creationist” piece.

Virginia Heffernan, who can be an engaging writer, has apparently decided to disengage from thinking. In a widely commented-upon piece for Yahoo, the tech and culture writer outed herself as a creationist. It is a spectacularly bad piece of . . . well, I guess it’s a species of argumentation, but as she kind of flits and floats from the pretty to the happy and fleetly flees from sweet reason, it might be best to consider this a kind of (bad) performance art.

My brief with her is less about the God-ish conclusion than that flitting and floating: she rejects science because it’s boring and sad and aren’t stories about God sooooo much better?

You think I’m exaggerating? I am not. To wit:

I assume that other people love science and technology, since the fields are often lumped together, but I rarely meet people like that. Technology people are trippy; our minds are blown by the romance of telecom. At the same time, the people I know who consider themselves scientists by nature seem to be super-skeptical types who can be counted on to denigrate religion, fear climate change and think most people—most Americans—are dopey sheep who believe in angels and know nothing about all the gross carbon they trail, like “Pig-Pen.”

I like most people. I don’t fear environmental apocalypse. And I don’t hate religion. Those scientists no doubt see me as a dopey sheep who believes in angels and is carbon-ignorant. I have to say that they may be right.

Uh-huh.

Later she mentions that she’s just not moved by the Big Bang or evolution, and that since evo-psych is sketchy science (which it is), all of science must be sketchy (which it is not).

And then this stirring conclusion:

All the while, the first books of the Bible are still hanging around. I guess I don’t “believe” that the world was created in a few days, but what do I know? Seems as plausible (to me) as theoretical astrophysics, and it’s certainly a livelier tale. As “Life of Pi” author Yann Martel once put it, summarizing his page-turner novel: “1) Life is a story. 2) You can choose your story. 3) A story with God is the better story.”

(Would it be fair to mention at this point that I hated Life of Pi? Too beside-the-point?)

To summarize, she likes technology—because it’s trippy—but she doesn’t like knowing how and why technology actually works, i.e., the science.

This would be fine—after all, there are all kinds of things I like without necessarily being interested in how and why they came to be—were it not for the fact that she’s a technology writer.

Perhaps she’s a closet Juggalo, or maybe she thought Bill O’Reilly waxed profound on the movement of tides, or maybe she just ate a shitload of shrooms and floated down to her keyboard, but I’d be very—excuse me, super-skeptical of the views of a tech writer who apparently thinks angels make iPhones.

~~~

I have to admit, I was more amused by her piece than anything, and her Twitter exchange with Carl Zimmer left me gasping; to the extent I can make out any kind of coherent line at all, it seems to be “I like stories more than theories—so there!”

As someone who likes both stories and theories—yes, Virginia, we can have both—I hate her feeding into the Two Cultures divide, not least because dopey angel-mongering tends to diminish the humanities even further.

I am a science enthusiast, but I am also a critic of some of the more imperial epistemological claims by some scientists (what often gets branded as “scientism”). To note that the methods of science (methodological naturalism, nomological-deductivism—take yer pick) and the knowledge produced from those methods are bounded is often taken as an attack on science itself.

And, to be fair, sometimes—as in the Storified Twitter spat, when Heffernan (big fat honking sigh) pulls Foucault out her nose to fling at Zimmer—it is.

But it ain’t necessarily so. It is simply the observation that science is one kind of practice, that it hasn’t escaped the conditionality and history of practice into some kind of absolute beyond.

Now, there’s a lot more behind that observation than I’m willing to go into at this late hour, so allow me to skip ahead to my ire at Heffernan: her dipshit argument makes it harder for those of us who’d prefer our critiques both dip- and shit-free.

So, thanks Virginia, thanks for stuffing your face with shrooms or replacing your neurons with helium or whatever the hell it was that led you to declare the moon is made of cheese.

But next time, if there is a next time, Just Say No.





Here’s a man who lives a life

23 01 2013

I’m a big fan of science, and an increasingly big fan of science fiction.

I do, however, prefer that, on a practical level, we note the difference between the two.

There’s a lot to be said for speculation—one of the roots of political science is an extended speculation on the construction of a just society—but while I am not opposed to speculation informing practice, the substitution of what-if thinking for practical thought (phronēsis) in politics results in farce, disaster, or farcical disaster.

So too in science.

Wondering about a clean and inexhaustible source of energy can lead to experiments which point the way to cleaner and longer-lasting energy sources; it can also lead to non-replicable claims about desktop cold fusion. The difference between the two is the work.

You have to do the work, work which includes observation, experimentation, and rigorous theorizing. You don’t have to know everything at the outset—that’s one of the uses of experimentation—but to go from brain-storm to science you have to test your ideas.

This is all a very roundabout way of saying that cloning to make Neandertals is a bad idea.

Biologist George Church thinks synthesizing a Neandertal would be a good idea, mainly because it would diversify the “monoculture” of Homo sapiens.

My first response is: this is just dumb. The genome of H. sapiens is syncretic, containing DNA from, yes, Neandertals, Denisovans, and possibly other archaic species, as well as microbial species. Given all of the varieties of life on this planet, I guess you could make the case for a lack of variety among humans, but calling us a “monoculture” seems rather to stretch the meaning of the term.

My second response is: this is just dumb. Church assumes a greater efficiency for cloning complex species than currently exists. Yes, cows and dogs and cats and frogs have all been cloned, but over 90 percent of all cloning attempts fail. Human pregnancy is notably inefficient—only 20-40% of all fertilized eggs result in a live birth—so it is tough to see why one would trumpet a lab process which is even more scattershot than what happens in nature.

Furthermore, those clones which are successfully produced nonetheless tend to be less healthy than the results of sexual reproduction.

Finally, all cloned animals require a surrogate mother in which to gestate. Given the low success rates of clones birthed by members of their own species, what are the chances that an H. sapiens woman would be able to bring a Neandertal clone to term—and without harming herself in the process?

I’m not against cloning, for the record. The replication of DNA segments and microbial life forms is a standard part of lab practice, and replicated tissues and organs could conceivably have a role in regenerative medicine.

But—and this is my third response—advocating human and near-human cloning is at this point scientifically irresponsible. The furthest cloning has advanced in primates is the cloning of monkey embryos; that is, there has been no successful reproductive cloning of a primate.

To repeat: there has been no successful reproductive cloning of our closest genetic relatives. And Church thinks we could clone a Neandertal, easy-peasy?

No.

There are all kinds of ethical questions about cloning, of course, but in the form of bio-ethics I practice, one undergirded by the necessity of phronēsis, the first question I ask is: Is this already happening? Is this close to happening?

If the answer is No, then I turn my attention to those practices for which the answer is Yes.

Cloning is in-between: It is already happening in some species, but the process is so fraught that the inefficiencies themselves should warn scientists off of any attempts on humans. Still, as an in-between practice, it is worth considering the ethics of human cloning.

But Neandertal cloning? Not even close.

None of this means that Church can’t speculate away on the possibilities. He just shouldn’t kid himself that he’s engaging in science rather than science fiction.

(h/t: Tyler Cowen)





All things weird and wonderful, 28

11 01 2013

Galaxy’s centre tastes of raspberries and smells of rum, say astronomers

How is this not among the best news in, um, the galaxy?

Astronomers searching for the building blocks of life in a giant dust cloud at the heart of the Milky Way have concluded that it would taste vaguely of raspberries.

Ian Sample of the Guardian reports that after years of pointing their telescope into the nether regions of the ‘verse,

astronomers sifted through thousands of signals from Sagittarius B2, a vast dust cloud at the centre of our galaxy. While they failed to find evidence for amino acids, they did find a substance called ethyl formate, the chemical responsible for the flavour of raspberries.

“It does happen to give raspberries their flavour, but there are many other molecules that are needed to make space raspberries,” Arnaud Belloche, an astronomer at the Max Planck Institute for Radio Astronomy in Bonn, told the Guardian.

Curiously, ethyl formate has another distinguishing characteristic: it also smells of rum.

I’m a gin and whisky woman, myself, but still. . . .

(Whoops, forgot to note: h/t Charlie Pierce)





Modern thought(less): time isn’t holding us, time isn’t after us

10 10 2012

Been awhile, hasn’t it?

No, I haven’t given up on my attempt to make sense of the outer reaches of modernity by looking at the [European] origins of modernity, but I haven’t made much headway, either.

Oh, I been readin’, oh yeah, but have I done anything with all that reading? Not really. Beyond the most basic fact that modernity and secularism two-stepped across the centuries, as well as the sense that medievalism lasted into the 20th century, I have information, history, ideas—but no theory.

Peter Gay’s two-volume essay on the Enlightenment (called, handily enough, The Enlightenment) has been helpful in understanding how the ideas of the early modern period were cemented in intellectual thought, but precisely because these men were already modern, they are of less help in understanding those who became modern, or who were medieval-moderns.

Newton, for example, was a kind of medieval-modern. His work in physics, optics, and calculus encompasses a large portion of the foundation of modern science, but he also conducted experiments in alchemy; the founding of a new kind of knowledge had not yet erased the old.

Other, scattered thoughts: The Crusades were crucial in re-introducing into Europe the ideas of the ancient Greeks. . . although, even here, al-Andalus also provided an entree for Muslim knowledge of and elaboration on Levantine thought into a Christian worldview. Also, I haven’t read much on the impact of westward exploration and colonization on European thought. Hm.

Evolution in war strategy and armaments—I’m thinking here of the recruitment and consolidation of armies—undoubtedly played a role, as did consequences of those wars, especially the Thirty Years War. (The Treaty of Westphalia is commonly considered an origin in the development of the concept of state sovereignty. Which reminds me: Foucault.)

What else. I haven’t read much in terms of everyday life during this period, although I do have Braudel and Natalie Zemon Davis on my reading lists. I’m still not sure where to put the on-the-ground stuff, interested as I am in intellectual history. Still, a concentration on thoughts untethered from practice yields shallow history.

I have developed an abiding hatred for the Spanish Empire. This may be unfair to the Spaniards, but they turn up again and again as the bad guys. (How’s that for subtle interpretation?) I’ve got a big-ass book on the history of the Dutch Republic that I’m looking forward to, not least because of the central role of the Dutch in the development of capitalism.

Capitalism, yeah, haven’t talked much about that, either. Can’t talk about modernity without talkin’ bout capitalism.

Still, I really want to try to nail down the emergence of the individual as a political subject: there is no modernity without this emergence. The Reformation and the wars of religion are crucial, of course, but I want to understand precisely how the connection was made between the individual and his relationship to God and the emergence of the concept of the individual citizen’s relationship to the state. (I say concept because it took awhile for the walk to catch up to the talk.)

I have no plans to abandon this project, but if I can’t get it together, I may have to abandon my hopes for this project.

Maybe I should do that sooner rather than later: I’m always better after I’ve left hope behind.





It’s all too much

3 08 2012

The point is that evidence can be unreliable, and therefore you should use as little of it as possible. . . . I mean, people don’t realize that not only can data be wrong in science, it can be misleading. There isn’t such a thing as a hard fact when you’re trying to discover something. It’s only afterwards that the facts become hard.*

~Francis Crick

It’s no surprise that Crick is a theorist, is it?

I quite like this quote, and (I think) used it in my dissertation, but it also makes me nervous.

First, why I like it: It puts theory first, forces you to think of the evidence in terms of a theory in which it makes sense. If you let the evidence go first, you may end up hiking into a dead end, both because you’re misinterpreting the evidence as evidence (i.e., taking as fact something which is not, yet) and because you miss other bits because you don’t have a way of seeing those bits as something which matters.

But this is where the unease kicks in: Theory can mislead, as well. Thomas Kuhn noted this in The Structure of Scientific Revolutions and his arguments on paradigm shift, although Max Planck had the pithiest observation on this phenomenon: “Science progresses one funeral at a time.”

So, theory leads, and theory misleads.

Richard Rhodes, in his magisterial The Making of the Atomic Bomb, ticks off any number of discoveries which were missed by those with the most data because they weren’t able to see the data correctly. The most well-known story is that of Frederick Smith, who didn’t discover X rays:

. . . not so unlucky in legend as the Oxford physicist Frederick Smith, who found that photographic plates kept near a cathode-ray tube were liable to be fogged and merely told his assistant to move them to another place. . . . Röntgen isolated the effect by covering his cathode-ray tube with black paper. When a nearby screen of fluorescent material still glowed he realized that whatever was causing the screen to glow was passing through the paper and the intervening air. If he held his hand between the covered tube and the screen, his hand slightly reduced the glow on the screen but in the dark shadow he could see its bones.

So is this a case of theory leading, or misleading? Or is this a third case, where a willingness to follow the evidence led to a hitherto overlooked phenomenon?

My guess: all three. Physics at the turn of the century was in the start of a creative tumult, a half-century active quake zone of discovery: old theories cracked under the pressure of irreconcilable data, new theories upended the known world and brought forth phenomena which had previously been hidden among the unknown unknowns, and all of this piled up and into the felt urgency to explain not just this new world, but a whole new universe.

There was too much of everything, a glorious and disorienting plenty on which one of the finest collections of minds in human history feasted; is it any surprise that pursuit of this course meant that dish was neglected?

All of this is a long way of saying I’m having a bitch of a time trying to make sense of my foray into medieval history. I don’t have a theory, don’t have a direction, and while I’m unbothered by—hell, actively defend—a certain amount of dilettantism, I’ve wandered enough to have become frustrated by my wanderings.

I’m not too worried, though. As frustrating as frustration is, it works for me, (eventually) crowbarring me out of my “it’ll come” complacency and into a “go get it” activity—which is to say, I’ll come up with a theory which will guide me to look at this, not at that.

I’m not doing the [kind of] science Crick did, so his observations on the process of discovery don’t necessarily translate across the fields, but he is right that if you’re going to find anything, it helps to know what you’re looking for.

(*As quoted in Horace Freeland Judson’s The Eighth Day of Creation)





Perspective

3 01 2012

 

Coudal Partners, “History of the Earth in 24 Hours”, via The Daily Dish





Negation—wha. . .what?

18 05 2011

Perhaps I should not have used the term “negation”.

It carries a philosophical load—which is fine, and not unrelated to my use of it—but I wanted (also) to emphasize the more prosaic, i.e., practical, aspects of negation, as in: to negate, to eliminate as an option or consideration.

The germ theory of disease negated theories of miasma, Lavoisier’s experiments with oxygen negated phlogiston, industrial production of beakers and test tubes negated the need for scientists to blow their own glassware (which further negated the need for the knowledge of blowing glassware), fuel injection will likely negate carburetors, etc.

So negation could mean “overturn” (as with germs > miasmas or oxygen > phlogiston) or “leave behind” (as with glass-blowing and carburetors), that is, to negate may be to disprove or it could mean to render irrelevant or trivial.

Now, these practical effects may reverberate ontologically, such that the negation of the practical may serve to negate an entire way of thinking or being, or simply to serve as a signal of the instability of that way of thinking/being. Thomas Kuhn’s The Structure of Scientific Revolutions, with its discussion of paradigm shifts rendering previous modes of scientific practice inert, lays out a version of global negation, while current questions of the role of cyber-technologies signal uncertainty over what counts as “real”.

John Donne’s “An Anatomy of the World” (1611) is often quoted—hell, I quoted it a while back—to exemplify the agonized confusion over the discoveries of the natural philosophers:

And new philosophy calls all in doubt,
The element of fire is quite put out;
The sun is lost, and the earth, and no man’s wit
Can well direct him where to look for it.
And freely men confess that this world’s spent,
When in the planets and the firmament
They seek so many new; they see that this
Is crumbled out again to his atomies.
‘Tis all in pieces, all coherence gone;
All just supply, and all relation:

Natural philosophy took for itself the name science, and modernity marched on. The laments for the old world died with those who once lived in it.

William Butler Yeats’s “The Second Coming” clearly echoes this lament, with the opening

Turning and turning in the widening gyre
The falcon cannot hear the falconer;
Things fall apart; the center cannot hold;

The times they are a-changin’, indeed.

History is not a line, or rather, history only holds the line, such that events may loosen or smash that hold and the contents of that history scatter.

Some of those pieces are lost, and even of those which are found, the meaning of the piece, precisely because it has been scattered, can only be guessed at. It is a shard of pottery uncovered in the desert, hinting at something which once was, now gone.

But not everything is lost: it could be hiding in that proverbial plain sight. I’m much taken with the notion of the palimpsest—that is, of a kind of tablet which has been inscribed then scrubbed clean to be reinscribed—largely because I think that the previous inscriptions are still there, that, like words which have been erased from a page, the impression lingers.

Heidegger in The Question Concerning Technology decries the transformation of the Rhine from a river in a landscape into a “water power supplier”, that is, it is no longer itself but a source of reserve power for a hydroelectric plant. Perhaps it could be understood as that river in a landscape, he muses, but “In no other way than as an object on call for inspection by a tour group ordered there by the vacation industry.”

Those who complain that Manhattan has turned into a theme park and that Times Square has lost all its gritty reality have not a little bit in common with Herr Heidegger.

I have a great deal of sympathy for this feeling, but even more skepticism for such sympathy; as I’ve mentioned more times than you probably care to read, we’re never who we’ve been.

So, again, I’m not taking the side of the past against the present, not least because I have no basis for such a taking of sides. Again, I simply want to trace the history of modern history.

I can’t raise all the inscriptions on the palimpsest, but maybe I can see some of what has been left behind.