On the Internet, anyone can be a historian

The Washington Post has a flattering profile of a young Wikipedian, Adam Lewis, who worked on the article for Washington, D.C. The punchline comes a few paragraphs in:

Lewis joined thousands of other amateurs toiling in obscurity on Wikipedia, where facts are more important than the star historians who tend to dominate the popular view of history. On Wikipedia, anyone can be a historian.

I think this is suspect in a couple of ways (do “star historians” really dominate the popular view of history? what does “historian” mean in the Wikipedia context, where the policy is “no original research”?) but the spirit of the remark is right on, and relevant beyond just Wikipedia.

The history profession hasn’t yet been much affected by the “pro-am revolution”, but it’s increasingly possible for amateur historians to do original work of professional quality (even if that work is unlikely to closely resemble academic history writing). Some academic fields–astronomy is the most dramatic example–have already started benefiting greatly from the contributions of amateurs. But history seems slow on the uptake, with frustratingly little appetite for collaborative projects and little interest in taking the work of amateur historians seriously (the exciting projects of George Mason’s Center for History and New Media notwithstanding).

Will that change dramatically? Will a pro-am revolution come to the history profession? The case of history of science may be instructive here. History of science has had a vibrant “pro-am” community (of scientists who write science history) since well before the Internet made relevant sources and publishing venues easily accessible to other interested groups of amateur historians. Nevertheless, historians of science have not drawn closer to pro-am scientist-historians in recent decades–just the opposite: they’ve withdrawn from scientist-historians and often dismiss their work as hopelessly naive or self-interested. If history of science is any guide, I fear that history as a whole may view the coming rise of “pro-am” history as more of a threat than an opportunity.

[cross-posted at Cliopatria]

The Two Cultures, 50 years later

7 May was the 50th anniversary of C. P. Snow’s famous lecture The Two Cultures. Snow, a novelist who had studied science and held technology-related government positions, decried the cultural rift between scientists and literary intellectuals. Snow’s argument, and his sociopolitical agenda, were complex (read the published version if you want the sense of it; educational reform was the biggie), but, especially post-“Science Wars”, the idea of two cultures resonates beyond its original context. The current version of the Wikipedia article says:

The term two cultures has entered the general lexicon as a shorthand for differences between two attitudes. These are

  • The increasingly constructivist world view suffusing the humanities, in which the scientific method is seen as embedded within language and culture; and
  • The scientific viewpoint, in which the observer can still objectively make unbiased and non-culturally embedded observations about nature.

That’s a distinctly 1990s and 2000s perspective.

Snow’s original idea bears only scant resemblance to the scientism vs. constructivism meaning. As he explained, literary intellectuals (not entirely the same thing as humanities scholars) didn’t understand basic science or the technology-based socioeconomic foundations of modern life, and they didn’t care to. Novelists, poets and playwrights, he complained, didn’t know the second law of thermodynamics or what a machine-tool was, and the few who did certainly couldn’t expect their readers to know.

Humanistic studies of science (constructivist worldview and all) would have undermined Snow’s argument, but humanists were only just beginning to turn to science as a subject for analysis. (Kuhn’s Structure of Scientific Revolutions did not appear until 1962. Structure did mark the acceleration of sociological and humanistic studies of science, but it was actually taken up more enthusiastically by scientists than humanists. Constructivism only became widespread in the humanities by the 1980s, I’d guess, and the main thrust of constructivism, when described without jargon, is actually broadly consistent with the way most scientists today understand the nature of science. It’s not nearly so radical as the popular caricature presented in Higher Superstition and similar polemics.) Rather than humanists understanding the scientific method or scientists viewing their work through a sociological or anthropological lens, Snow’s main complaint was that scientific progress had left the slow-changing world of literature and its Luddite inhabitants behind (and hence, scientists found little use for modern literature).

Snow wrote that “naturally [scientists] had the future in their bones.” That was the core of the scientific culture, and the great failing of literary culture.

Looking back from 2009, I think history–and the moment in it when Snow was making his argument–looks very different from the way it did to Snow. Who, besides scientists, had the future in their bones in 1959? In the 1950s academic world, literature was the pinnacle of ivory tower high culture. Not film, not television, certainly not paraliterary genres like science fiction or comic books. History of science was a minor field that had closer connections to science than to mainstream history.

Today, in addition to scientists, a whole range of others are seen as having “the future in their bones”: purveyors of speculative fiction in every medium; web entrepreneurs and social media gurus; geeks of all sorts; venture capitalists; kids who increasingly demand a role in constructing their (our) own cultural world. The modern humanities are turning their attention to these groups and their historical predecessors. As Shakespeare (we are now quick to note) was the popular entertainment of his day, we now look beyond traditional “literary fiction” to find the important cultural works of more recent decades. And in the popular culture of the 1950s through to today, we can see, perhaps, that science was already seeping out much further from the social world of scientists themselves than Snow and other promoters of the two cultures thesis could recognize–blinded, as they were, by their strict focus on what passed for high literature.

Science permeated much of anglophone culture, but rather than spreading from high culture outward (as Snow hoped it might), it first took hold in culturally marginal niches and only gradually filtered into the insulated spheres of high culture. Science fiction historians point to the 1950s as the height of the so-called “Golden Age of [hard] Science Fiction”, and SF authors could count on their audience to understand basic science. Modern geek culture–and its significance across modern entertainment–draws, we now recognize, in part from the hacker culture of 1960s computer research. Feminists and the development of the pill, environmentalists–the list of possible examples of science-related futuremaking goes on and on, but Snow had us looking in the wrong places.

Certainly, cultural gaps remain between the sciences and the humanities (although, in terms of scholarly literature, there is a remarkable degree of overlap and interchange, forming one big network with a number of fuzzy divisions). But C. P. Snow’s The Two Cultures seems less and less relevant for modern society; looking back, it even seems less relevant to its original context.

The scientist in TV dramas

This is a widely acknowledged Golden Age of American television drama (led, of course, by cable shows, but with network fare that also has its high points). (I’m two discs into Deadwood right now, the one show usually mentioned in the same sentence as The Wire when really great shows come up.) One remarkable thing that’s happened recently, especially this season, is the flood of scientists as main characters. Several established shows have main characters who derive much of their identity, and personality, from being scientists: House, Bones, and (to some extent) Mohinder Suresh from Heroes. More than earlier shows in the same genres (medical dramas, forensic science dramas, superhero dramas), these shows and their characters explore what it means to be a scientist in modern society.

But two new shows this season, Fringe and Eleventh Hour, are about science to an unprecedented extent (even including The X-Files and Star Trek: The Next Generation, but excluding CBC’s ReGenesis and the four-episode British version of Eleventh Hour, neither of which I’ve yet seen).

Fringe, and its main scientist character, showcase science-as-threat; Walter Bishop is Dr. Frankenstein for the era of Big Science. In his previous scientific life, Bishop had worked for the government and others on an endless array of “fringe science” research projects, mostly aimed in various ways at controlling the minds and bodies of people living and dead. Institutionalized for years, Bishop is now out and, working out of his old lab at Harvard, is helping the FBI investigate “The Pattern”, a big-business-linked series of weird and deadly happenings that are often the scientific monsters Bishop had helped to create. In Fringe, science is not just a threat to society; it is (inherently?) a threat to the moral fiber and mental health of the scientist. Bishop is an otherwise kindly old man whose broken personality centers on a self-centeredness that is presented, at least in part, as a mental health issue, and on an alternation between child-like naivety and (in the course of performing science) shocking callousness. Fringe is by no means a serious show, but it does articulate an interesting, and I think significant, interpretation of what it means in American culture to be a scientist.

If Fringe is in part inspired by the works of Michael Crichton, as creator J. J. Abrams claims, then Eleventh Hour is inspired by the other part of Michael Crichton’s works–that is, the part that deals with the moral and ethical dimensions of science as it is actually practiced, rather than the outlandish threats of science gone wild. The compelling main character, biophysicist Jacob Hood, also works for the FBI investigating science-related crimes and mysteries. But where Walter Bishop is pulled, out of dire necessity, from an asylum, Hood was recruited because (in addition to his brilliance) he was friends with someone who ended up in a position of power in the FBI. Most of the crimes involve acute threats to one or a few people, but there is no overarching conspiracy, no Pattern of misused science. Rather, the criminals are usually scientists doing realistic but scientifically, ethically, or morally questionable research (often in commercial contexts), or the people who oppose what they do. Hood treads the line between genuine scientific enthusiasm (often accompanied by patronizing bemusement at his female FBI handler’s scientific ignorance) and ironic detachment, with quiet disapproval of less-than-pure but not egregiously bad ways of doing science.

What does the recent prominence of science and scientists tell us about American culture and the place of science in society? I don’t know, but I feel that it’s my scholarly responsibility to keep watching until I figure it out.

Craig Venter is making history

…or at least trying to.

Venter’s J. Craig Venter Institute, the successor of TIGR and TCAG, has been working on what they characterize as the first man-made organism: Mycoplasma laboratorium. The ongoing project centers on “Synthia”, a slimmed-down synthetic chromosome that they are calling (and patenting as) a “minimal bacterial genome”. It consists of 381 of the ca. 470 genes of the tiny parasitic bacterium Mycoplasma genitalium. (The name “Synthia” comes not from Venter et al., but from the critical ETC Group; it seems to have stuck.) Add Synthia to an empty cell, and voilà! Life!

The project builds on earlier work in which Venter’s team (led by restriction enzyme pioneer and Nobel laureate Hamilton O. Smith, Clyde A. Hutchison III, and Cynthia Pfannkoch) recreated the genome of the bacteriophage phi X-174 from scratch and stuck it into an empty coat to create a viable phage; they generated the 5,386 base pair sequence in 14 days. In the 2003 PNAS report, they described a plan to use similarly-sized chunks of synthetic DNA to assemble whole genomes. Since the phi X-174 project, they have been developing and improving DNA cloning methods that can deal with ever larger target sequences without high levels of error–a boon for DNA sequencing as well as chromosome synthesis. (Synthetic phi X-174 could be selected for infectivity to weed out high-error sequences, but that’s not an option with arbitrary 5,000 bp “gene cassettes”.)

Since 2003, they’ve gotten to the point of putting together a whole genome (if a very small one). They quietly started filing patents for “Synthia” in 2006, and in June 2007 announced that man-made life in the form of M. laboratorium was right around the corner. Proving that the synthetic genome is viable by sticking it into a genome-less cell and making it live will be a powerful proof-of-concept for new and more drastic kinds of genetic engineering.

“Man-made life” makes a great headline, but it’s worth picking apart. At the fundamental level, even Venter’s team is quick to note that M. laboratorium won’t be a wholesale synthetic organism, as it will rely on the molecular machinery and cellular environment taken from natural cells. (At least, as natural as a laboratory organism with its genome carefully removed can be.) The conflation of genes with life has been the constant complaint of all the biologists except the molecular ones since the rise of molecular biology. It was one of the chief complaints of those who thought the Human Genome Project was (all funding levels being equal) a bad idea. In a recent article in I forget which history of science journal, (atheist) Emile Zuckerkandl accuses HGP leader (Christian-turned-atheist-turned-Christian) Francis Collins of exploiting the genes=life fallacy in his best-selling quasi-intelligent design book The Language of God. (The language of God, of course, is the genome.) The all-powerful gene is a potent political and rhetorical force (and has been a great basis for securing grants, at least since the 1940s), even if biological reality is considerably more complex.

But even looking past the conflation of a genome with life itself, M. laboratorium has a dubious claim as synthetic life. As the ETC Group points out, “Synthia” only distinguishes itself from a natural chromosome by what is missing (i.e., a fifth of its genes). This organism would have an even shakier claim to being man-made life than the 1972 oil-eating bacterium of Diamond v. Chakrabarty (the landmark patent case that established the legitimacy of patenting genetically-engineered lifeforms); at least Chakrabarty’s bug had a combination of characteristics that no natural organism had. Does putting most of the DNA of an organism (which happens to be synthesized artificially) together with everything but the DNA of that organism mean scientists have created artificial life? It’s hard not to invoke Frankenstein.

Venter has been very successful at framing his science in ways that grab headlines, generate public interest, and seem self-evidently of central historical importance (whatever the later historical verdict). I haven’t decided whether that’s a good thing or a bad thing. He’s certainly earning his place in history, one headline and Discovery channel documentary at a time.

Randy Olson’s science communication suggestions

Recently, the film Flock of Dodos was screened at Yale, followed by a discussion with the director Randy Olson, science writer Carl Zimmer, and others. As you might infer from the title, a poke at March of the Penguins, the movie is a humorous take on the public controversy over intelligent design. Unfortunately, I didn’t even find out about this until after it had come and gone. But the discussion afterwards focused on how evolutionists should deal with ID, and from the report of one of the students in Lloyd’s class, Olson recommended an excellent approach (with Zimmer providing an opposing perspective more in line with the approach of Panda’s Thumb, NCSE, etc.). Fortunately, Zimmer has posted some of the material–Olson’s 10 suggestions for “improving communication”–and I think Olson pretty much nails it.

Especially notable is number 3: “The most effective means of communication is through storytelling. The shorter, more concise, and punchier the story you can tell, the greater the interest you will hold with an audience.”

Effective storytelling is something that the scientific community as a whole simply fails at. But, ironically, the humanist disciplines (e.g., history) are nearly as bad. Scholars of all persuasions can continue, as PZ Myers suggests and John Lynch seems to support as well, to emphasize their strengths of “depth, intelligence, evidence, history, the whole damn natural world, and just plain having the best and most powerful explanation for its existence.” But that just serves to further insulate an already insular group; putting more priority on effective mass communication does not mean abandoning good explanations, it just means making them available to non-scholars.

While the more practical branches of science might be able to justify their work in terms of the tangible technical payoffs that society gets from it, evolutionary biologists, historians, and most types of scholars simply don’t have any other reason to exist except for the general enrichment of society. Technical monographs and detailed case histories are a proximate goal to enhance the collective knowledge of a specialist community and body of literature, but we ought always to keep an eye on the larger goal of distilling the broad and deep scope of that literature into stories for the rest of society.

This is, of course, the same broad issue that draws me to Wikipedia, one of the easiest and most effective means of actually applying specialist knowledge to reshape public understanding. The way things are now, the extent of historical outreach is the occasional late-career book that aims at a (very limited) popular audience along with the requisite scholarly one. The only historical work that really makes it into public consciousness is what the media industries ask for from historians; Wikipedia (perhaps among other venues) is a chance for disciplines to shape their own public destinies and forge their own places in mass culture.

Stephen Colbert on Science and President Bush

Check out this Colbert Report segment as a followup to my previous science funding post.

Edit: Those of you searching for “Colbert Bush” or some variation are probably looking for something on the White House press correspondents’ dinner at which Colbert recently lampooned the President. Try www.thankyoustephencolbert.org for links to the video.