How can so many people be so wrong?

Is this a representative sample?

I came across an interesting and disturbing poll yesterday: a post on The Daily Beast reports the results of a new Harris poll that finds staggering levels of disconnection from reality among Republicans: 67% think Obama is a socialist, 57% think he’s Muslim, and 24% think he might be the Antichrist.

I dented (like tweeting, but with a free network service) my initial reaction:

57% of Republicans think Obama is Muslim, 24% think he may be the Antichrist? Really? I just can’t wrap my head around it.

American public political discourse has gotten bad, but is it really that bad?  As a friend reminded me, you should always be skeptical of “scientific” data presented in unscientific ways.  I’ve been to a Tea Party; there really are a lot of people who believe that stuff, even in Connecticut.  But 57% of Republicans, and 32% of all Americans, think Obama is Muslim?  It just doesn’t compute that in an age of such powerful and ubiquitous media, so many people could be so wrong.

Fortunately, it looks like unreason and willful ignorance aren’t so widespread as this poll indicates.  In short, the poll is surely crap.  Pew polls in October 2008 and March 2009 found that a stable 17% of Republicans and 11% of all Americans thought Obama was Muslim.  It’s still depressing that there was no decline in misinformation between the campaign and the early months of Obama’s presidency, but 17% is a far cry from 57%.  And it’s just not believable that it could have gone from 17% to 57% in the last year.

The ABC polling blog has a helpful analysis of the methodological flaws in the Harris poll:

The purpose seems to have been to see how many people the pollsters could get to agree to pejorative statements about Obama.

It’s still astounding (and depressing) that poll wording and plausible-sounding but biased methodology can so distort public opinion.  Not to mention that so many people are happy to run with it.

Demand Media vs. Wikimedia: the battle for the soul of the Internet

There’s one company I’ve been talking about more than any other lately: (the demonic) Demand Media and

Jay Rosen on Twitter, 27 November 2009

When journalism professor and media critic Jay Rosen discusses Demand Media and its business model, he always includes the parenthetical adjective demonic.  Demand Media is the answer to the question, what would Internet content look like if it was entirely and solely driven by advertising revenue?  Content is commissioned based on an algorithm that calculates the lifetime value of the ads that could be run against it.
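The commissioning logic can be sketched in a few lines. This is my own guess at the shape of such an algorithm, not Demand Media's actual formula; the search volumes, click rates, and costs below are invented for illustration:

```python
# A minimal sketch (an assumption, not Demand Media's real model) of
# commissioning content by the projected lifetime value of its ads.

def lifetime_ad_value(monthly_searches: int, cpc: float,
                      click_share: float, lifetime_months: int = 60) -> float:
    """Projected ad revenue over the content's assumed lifetime."""
    return monthly_searches * click_share * cpc * lifetime_months

def should_commission(title: str, monthly_searches: int, cpc: float,
                      production_cost: float, click_share: float = 0.01) -> bool:
    """Commission a piece only if projected ad revenue exceeds its cost."""
    return lifetime_ad_value(monthly_searches, cpc, click_share) > production_cost

# e.g. a how-to article: 5,000 searches/month, $0.40 per click, $15 to produce
print(should_commission("How to fold a fitted sheet", 5000, 0.40, 15.0))  # True
```

Under this kind of model, a $15 article that projects $1,200 in lifetime ad clicks gets written; anything editors might value but advertisers won't never does.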

Demand Media takes the routinization of knowledge work to its logical extreme.  (For those with a Marxist bent, is there any clearer example of the knowledge worker alienated from the products of his labor than Christian Muñoz-Donoso, from Rosen’s first link?)  And Demand Media expects to be producing “the equivalent of four English-language Wikipedias a year” by next summer.

Wikipedia and other free culture projects, sometimes pejoratively described as “crowdsourcing” projects, have been criticized for undermining the economic viability of traditional, professionally produced media.   But what if the real choice for the future is not between the Wikimedia model and the traditional media model, but between the Wikimedia model and the Demand Media model?  Media driven by love versus media driven by money.  Editor-driven media where everyone is an editor versus demand-driven media where no one is an editor.  Media built from soul versus media with no soul.

The history of the future of journalism?

In the wake of the Iranian election, lots of the people who focus on the changing journalism landscape have been talking about the significant role Twitter and other social media are playing in organizing and spreading news about the protests. Two of the leaders of the broad journalism discussion are Dave Winer and Jay Rosen, who have a weekly podcast called Rebooting the News. In the latest edition, Winer looks back to September 11, 2001 as the first time when the online social web foreshadowed the kinds of citizen journalism that Winer and Rosen see as a major part of the future of news. As he explains, he had no TV at the time, but strictly through the Internet he was just as informed and up-to-date as he would have been following the events of the day through traditional media.

Around 2001 is also a horizon for historians: for events after that, the internet's archival record of the experience of ordinary people in major historical events and trends grows ever richer, accelerating from then until now.

In that vein, here’s a paper I wrote in 2005 for a course on narrative history with John Demos, about the usenet traces of the kind of thing Dave Winer reflects on from 9/11. (I tried to weave in the pop psychology framework of the five stages of grief, to mixed results.)


We historians like to think that things develop gradually. Yet, in the microcosm, the events of the following months and years were foreshadowed there in the cyberspace of New York City on September 11. All the questions of “why?” and “what now?” were hashed out in the hours following the attacks by net denizens as they struggled to come to grips with the grief of the nation.

After one hour thirty-six minutes of denial, the messages on the NYC General internet discussion group started with a comment calculated to jump-start conversation, going straight to bargaining:

Tues, Sep 11 2001 10:20 am

WTC: Bound to happen

I wonder if this will change our assistance plans to Israel?[1]

Circumspection was the word for the first few replies; questions not answers. Who is sending us this message? Was it “internal” like the Oklahoma City bombing, or was it the Palestinians or someone else? Is it just a coincidence that this is the 25th anniversary of the Camp David Accords?[2] Whoever it was, they were clearly well-organized; they knew they had to use large planes with full fuel tanks to take out the World Trade Center Towers.

Just after noon, they were on to bin Laden as the likely culprit; it seemed like “his style.” Rumors that he had foretold an “unprecedented attack” two weeks earlier, including information from one woman’s unnamed friend from the intelligence community, provided one focus for the rising anger of the discussants. Israel and the celebrating Palestinians on TV were also popular targets of ire. Anger got the better of more than one:

Tues, Sep 11 2001 2:05 pm

Anyone cheering at thousands of Americans being murdered is a declaration of war as far as I’m concerned.

Tues, Sep 11 2001 6:02 pm

Did you all see the Palestinians dancing for joy today?

SCUM. Burn them all.

Calmer voices prevailed quickly, defusing talk of an indiscriminate crusade. But few seemed to doubt that war was on the horizon, even if not everyone had a clear idea of whom (or who) to fight:

Tues, Sep 11 2001 1:51 pm

>>>This must mean war.

>>With who?


Any particular reason, or are you just starting [with] the A’s?

The possible complicity of Iraq was mentioned as well, and the failure to capture Saddam Hussein in the Gulf War illustrated how hard it might be to get bin Laden (if he was even the right target) in an Afghanistan war. But waging war on the Taliban, at least, might yield some human rights dividends, considering the way they treated their women.

The depressing, fatalistic seeds of the prolific conspiracy theories that developed in the months and years after the attack were there in the first hours too:

Tues, Sep 11 2001 11:39 am

I would not think (but I’m NOT an expert) that such impact would so weaken the structure as to cause both to collapse, without further destruction at a lower level.


Tues, Sep 11 2001 1:22 pm

I am just pointing out that I don’t think we can take out Bin Laden because if we could we would have done it long ago.

It would be months before online groups like the 9/11 Truth Movement would spin such speculation into elaborate tapestries of lies and manipulation, in which the strings are pulled by the man behind the man behind the man (with three U.S. Presidents, at least, in on it), with bin Laden as the fall guy who was working for the CIA all along. But common sense prevailed quickly in this particular cyber niche; the combination of fire and impact would be able to take down the towers, with all that weight above the impact points, they reasoned.

Ultimately, the tension on the internet that day was between anger and acceptance, and with the bombers apparently dead and the looming possibility that there might not be anyone left to blame, the discussants turned on each other:

Tues, Sep 11 2001 8:10 pm

[On the subject of celebrating Palestinians and possible PLO involvement in the attacks]

>>Gosh, you don’t suppose the Isreali blockade has anything to do with it, do you?

>And what does this have to do with just buying food???

Don’t know how the blockade works, do you?

>>>They aren’t feeding their people, giving them housing or water – no

>>As a matter of fact they are, as much as they can. But when Isreal takes their land

> Of, forget it. You’re brainwashed.

This is coming from someone who can’t tell the difference between the PLO and other arab organizations.


Tues, Sep 11 2001 8:31 pm

> I’m not the one advocating bombing anyone.

Ha. So you just want to let them do this and get away with it, eh?

This was the worst of that first 111-message thread—tame compared with many of the other virtual shouting matches that developed that afternoon. And ultimately, the feelings of anger won out on NYC General, coming into line with the zeitgeist of the rest of the nation as President Bush announced plans to hunt down the terrorists and those who harbor them. But elsewhere on the internet, then and now, every possible response from denial to acceptance has a place. And the stories will still be there waiting for us, for when we are ready to move on.


[1] This and all following quotes come from the USENET archive of nyc.general, as archived by Google Groups. This discussion thread was started simultaneously on nyc.general, nyc.announce, alt.conspiracy (where it superseded such hot topics as “Moon Landings: Fact or Fiction?,” but did not change that group’s absurdist conspiratorial tone), talk.politics.misc (which was rapidly inundated with separate posts, preventing any sustained discussion), and soc.culture.jewish (where the endemic Zionist/anti-Zionist rhetoric drowned out this relatively moderate thread), and soon spread to other groups, fragmenting and spawning new discussions. There are probably hundreds of preserved usenet discussions documenting the immediate response of thousands of people on September 11.

[2] Actually the Camp David Accords were reached on September 17, 1978, making 9/11 just shy of the 23rd anniversary.

Reply to a tweeted link

Clay Shirky tweeted a link to this essay on the future of journalism, from Dan of Xark!. The site isn’t accepting my comment, so I’m posting it here:

This is an interesting vision of the future, but I don’t see how it could possibly be the future of journalism.

For the sake of argument, I’ll assume that collecting news data and maintaining a usefully-organized database of it is a viable business model. I agree that it would not be newspapers who led this, but more likely a web-only company.

But newspapers (and to a much lesser extent, television) are the organizations that have an institutional commitment to investigative journalism (the kind that isn’t database-friendly and that is the main thing people fret about losing). Why would a news informatics company, which would lack that institutional commitment, use its profit to subsidize investigative journalism that isn’t itself profitable?

For newspapers, there have been two jobs that only meet economically at the broadest levels: to sell ads, and to create compelling content for readers. Economics didn’t figure directly in the choice of whether to send a reporter to the courthouse or the fire; rather, that choice was made within the editorial sphere. For news informatics, every choice of coverage has economic implications: which kind of data will people be paying to access? In that environment, in what is sure to be a tough market to establish, would news informatics companies fund investigative journalism out of sheer civic responsibility?

Rethinking Wikinews

Digital opinion-makers across the blogosphere and the twitterscape have been increasingly preoccupied with the rapid decline of the print news industry. Revenues from print circulation and print advertising have both shrunk dramatically, and internet advertising revenues have so far been able to replace only a fraction of that. Newspapers throughout the U.S. are downsizing, some are switching to online-only, and some are simply being shuttered. The question is: what, if anything, will pick up the journalistic slack? (Clay Shirky’s essay, “Newspapers and Thinking the Unthinkable“, is the best thing I’ve seen in this vein, although I would be remiss if I didn’t mention some contrasting viewpoints, such as Dave Winer’s “If you don’t like the news…” and Jason Pontin’s response to Shirky and Winer, “How to Save Media“.)

On its face, Wikinews seems an ideal project to pick up some of that slack. Collaborative software + citizen journalism + brand and community links to Wikipedia…it seems like a formula for success, and yet Wikinews remains a minor project. There are typically only 10–20 stories per day, most of which are simply summaries of newspaper journalism. Stories with first-hand reporting are published about once every other day, and even many of these rely primarily on the work of professional journalists and have only minor original elements.

Why doesn’t Wikinews have a large, active community? What might a successful Wikinews look like? I have a few ideas.

One reason I write and report for Wikipedia regularly, but only every once in a while for Wikinews, is that writing Wikipedia articles (and writing for the Wikipedia Signpost) feels like being part of something bigger. Everything connects to work that others are doing. I know I’m part of a community working for common goals (more or less). Even if I’m the only contributor to an article, I know there are incoming links to it, that it fits into a broader network. On Wikinews, I can write a story, but it is likely to be one of maybe 20 stories for the day, none of which have much of anything to do with each other.

I went to the Tax Day Tea Party in Hartford, Connecticut with my camera and a notepad. (I put a set of 108 photos on Commons and on Flickr.) Similar protests reportedly took place in about 750 other cities. If there was ever an opportunity for collaborative citizen journalism, this seemed like it. But there was nothing happening on Wikinews, and I didn’t see the point of writing a story about one out of hundreds of protests, which wouldn’t even be a legitimate target for a Wikinews callout in the related Wikipedia article.

What I take from this is the importance of organization. Wikinews needs a system for identifying events worth covering before (or while) they happen and recruiting users for specific tasks (e.g., “find out the official police estimate of attendance, photograph and/or record the messages of as many protest signs as possible, and gather some quotes from attendees about why they are protesting”).

My most rewarding experience with Wikinews was a story on the photographic origins of the Obama HOPE poster. It grew out of a comment on the talk page of the poster’s Wikipedia article; the comment appeared while it was on the Main Page as a “Did you know” hook. The lesson here is, in the (alleged) words of Clay Shirky, “go where people are convening online, rather than starting a new place to convene”. (I think it was unfortunate that Wikinews started as a separate project rather than a “News:” namespace on Wikipedia, but what’s done is done.) There are many places online where people gather to discuss and produce news, in addition to Wikipedia; one path to success might be to extend the social boundaries of Wikinews to reach out to existing communities. Although other citizen journalism and special interest communities don’t share the institutional agenda of Wikinews (namely, NPOV as a core principle), some members of other communities will be willing to create or adapt their work to be compatible with Wikinews’ requirements. And certain communities actually do share a commitment to neutrality, which raises the possibility of syndication arrangements (in which, e.g., original news reports from a library news site automatically get added to the Wikinews database as well).

Shirky and others have argued that some kinds of journalism (in particular, investigative journalism) are not possible without assigning or permitting reporters to develop a story in depth over a long period of time–and these may be the most important kinds of journalism for maintaining a healthy democracy. To some extent, alternative finance models (with public donations like National Public Radio or with endowments like The Huffington Post) may be filling some of the void left by shrinking newspaper staffs, but it seems unlikely that these models will support anything close to the number of journalists that newspapers do/did.

Wikinews could contribute to investigative journalism in a couple of ways. The simplest is something similar to what Talking Points Memo does–crowdsourcing the analysis of voluminous public documents to identify interesting potential stories. However, as Aaron Swartz recently argued, there are serious limits to what can be gleaned from public documents; as he says, “Transparency is Bunk“.

Another way would be to either fund a core of professionals or collaborate with investigative journalists who work for other non-profits. These professional journalists would–to the extent that it is possible–recruit and manage volunteer Wikinewsies to pursue big stories where the investigative work required is modular enough that part-time amateurs can fruitfully contribute.

In the same vein, a professional editor working for Wikinews could be in charge of identifying self-contained reporting opportunities based on geography (e.g., significant political and cultural events) and running an alert system (maybe integrated with the Wikipedia Geonotice system for users who opt in) to let users know what’s happening near them that they could report on. One of the hardest things for a would-be Wikinewsie original reporter is just figuring out what needs covering.
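The core of such an alert system would just be matching opted-in users to events near them. A minimal sketch, with invented user and event records (this is an illustration of the idea, not Geonotice's actual implementation):

```python
# Hypothetical sketch: alert opted-in users to reporting opportunities nearby.
from math import radians, sin, cos, asin, sqrt

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # Earth radius ~6371 km

def nearby_reporters(event, users, radius_km=50):
    """Return the names of opted-in users within radius_km of the event."""
    return [u["name"] for u in users
            if u["opted_in"]
            and distance_km(event["lat"], event["lon"], u["lat"], u["lon"]) <= radius_km]

users = [
    {"name": "A", "lat": 41.76, "lon": -72.67, "opted_in": True},  # Hartford
    {"name": "B", "lat": 40.71, "lon": -74.01, "opted_in": True},  # New York
]
rally = {"lat": 41.77, "lon": -72.68}  # a hypothetical event in Hartford
print(nearby_reporters(rally, users))  # -> ['A']
```

Paired with an editor-curated event calendar, this is all the machinery needed to tell a Hartford Wikinewsie about a Hartford rally without bothering anyone in New York.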

I’m sure there are a lot of different models for Wikinews that could make it into a successful project. But it’s clear that the current one isn’t working very well.

Public weighs in on Flagged Revisions

Andrew Lih’s blog post “English Wikipedia ready for Flagged Revisions?” is a nice overview of the big news this week: it seems likely that some form of the Flagged Revisions extension is finally going to be used. For more details on the on-wiki discussion, this soon-to-be published Signpost article is a good place to start.

The comments on the NYT blog story on this development give a nice cross-section of public perceptions of Wikipedia among the Times’ audience, and their reactions to the possible change in the way the site works. Some choice quotes:

  • It’s a cesspool of misinformation and bias. Now that the Wikipedians are in charge, it will become even more useless as a reliable resource.

    Someone needs to be monitoring the Wikipedians. They are not to be trusted with the interpretation of things. -Wango

  • It’s a living, multidimensional document and I’m of the mind that it should be left the frak alone […] WIKIPEDIA NEEDS MISTAKES if it is to remain the vital document that it is today. Living things change, static dead things are perfect and immutable. -jengod
  • It’s not arrogant for wikipedia – or any source of authoritative information – to want to be right […] Grafitti on the wall may be instructive, but it does not make the wall more valuable or more purposeful. -Frank
  • Any edit beyond spelling, grammar and syntax, must be considered suspect, if done by a minor, an artist or any individual that does not have any expertise on the subject. -CGC
  • The real bad blunders are almost always corrected within hours (if the article is of no great interest) or minutes (if it is). So why bother? The true capital of Wikipedia is ALL of its contributors – and not just the “trustworthy” elite. Such measures will discourage new, fresh, motivated contributors, and in the long run dry out the project. -Oystein
  • It’s a standard fascist procedure to declare an outrage and then restrict freedoms under the guise of making things better for all. I’m not saying that’s what Wales is doing. Just saying that it sounds like a jack-booted tactic. -Kacoo
  • Is it possible that [the anons who ‘killed’ Kennedy and Byrd] weren’t vandals at all, but just people trying to be “that guy” who made the change to such an important entry. Who knows? -Light of Silver

These are the kind of stories Wikinews should be doing

The election numerology blog has been publishing a series of fascinating “On the road” posts by Sean Quinn and photographer Brett Marty. Quinn and Marty have been traveling through battleground states investigating the “ground game” of the McCain and Obama campaigns, reporting on the voter registration and get-out-the-vote operations managed by volunteers and paid staffers in the regional and local campaign offices.

See the latest few:

Individually, these might seem minor, but the series as a whole makes for an important story that has been largely neglected by traditional news sources. It’s also the type of thing Wikinews could excel at, with a little more organization. Wikimedians all over the U.S. could go out the same weekend and do stories on the local dimensions of these national campaigns, and the result could be something very special.

Bonus link:

  • The Wikipedian Candidate – an interesting analysis of the (it seems increasingly clear) ill-advised selection of Sarah Palin as McCain’s VP and the important things that don’t come across in a Wikipedia article, from Nate Silver

How are your Wikimedia Commons photos being used elsewhere?

I don’t know about yours, but I do have some idea of how mine are being used.

Google searches for my name and my username reveal a lot more instances than I was aware of, especially for news article illustrations.

In the “license, schmicense” category, I found this article from The Jerusalem Post, which takes a recent photo of mine (either from Flickr or Wikipedia, but more likely Wikipedia) and simply says “Photo: Courtesy:Ragesoss”.

Marginal cases include the hundreds of Google hits for “ragesoss” that come from World News Network websites. This organization runs thousands of online pseudo-newspapers, such as the West Virginia Star and Media Vietnam, that aggregate content from real news organizations. Stories at all of their portals link to World News pages that have teasers for the actual articles at the original sources. And I’ve found a bunch of my photographs as illustrations on these pages. See these:

Of course, my photographs are not the ones used by the original articles. World News seems to have used almost every photo I uploaded from the February 4 Barack Obama rally in Hartford, to illustrate campaign news unrelated to the Hartford rally. In terms of photo credits (see the links), most of them say “photo: Creative Commons / Ragesoss” or “photo: GNU / Ragesoss”. Nearly all of my photos on Wikimedia Commons are copyleft under GFDL and/or CC-by-sa, so non-specific credits like that do not constitute legitimate use under the terms of either license. The GFDL requires a link to the license (GFDL, not “GNU”), and CC-by-sa at least requires notice that the image is free to reuse as long as derivatives are issued under the same license (simply “Creative Commons” is not a license). It is also implicit with CC licenses that credits for my photos should include a link to my Commons userpage, since the author field on the image pages is typically a link titled “Ragesoss”, not just the text. (The third link above, among others I found, does link to the GFDL, although the photo has nothing to do with the article.)
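For reusers who want to get it right, a compliant credit needs three things: the author, the source, and a link to the actual license text. A sketch of assembling one (the example URLs are illustrative, not prescriptive):

```python
# Sketch of a reuse credit that satisfies the requirements described above:
# name the author, link the source, and link the actual license text
# (not just "GNU" or "Creative Commons").

def credit_line(author: str, author_url: str, license_name: str,
                license_url: str, source_url: str) -> str:
    """Assemble an HTML photo credit with author, source, and license links."""
    return (f'photo: <a href="{author_url}">{author}</a> '
            f'(<a href="{source_url}">source</a>), licensed under '
            f'<a href="{license_url}">{license_name}</a>')

print(credit_line(
    "Ragesoss",
    "",
    "CC BY-SA 3.0",
    "",
    ""))  # File name is a placeholder
```

That's the whole burden: two links and a license name, which is apparently still too much for most of the reusers above.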

Another major user of my photos is Associated Content, a commercial user-generated content site that pays contributors. AC is a mixed bag in terms of legitimate uses of photos, since individual contributors are responsible for selecting and crediting the illustrations for their articles. This one, which uses a photo of Ralph Nader, credits my shot as “credit: ragesoss/wikipedia copyright: ragesoss/GNU FDL 1.2”. It almost meets the basic requirements of the license (all it needs is a link to the text of the license), although a link to the source would be preferable to simply mentioning Wikipedia. This one, on the other hand, just says “credit: Ragesoss copyright: Wikimedia Commons”.

Popular Science, in this article, lists the GFDL, but links it to the Wikipedia article on the license rather than the actual text.

The Bottle Bill Resource Guide links to my Commons userpage, but does not list the license or link to the image source.

Another partly-legit use is by LibraryThing, a book-related site that uses several of my photos for authors (e.g., Dava Sobel). They include links back to the original image pages, but the site behaves erratically and sometimes insists that I sign in or create an account to view the image details.

Unexpectedly, I also found several of my photos illustrating Encyclopædia Britannica. See:

In each case, they provide a link to one of the licenses (GFDL 1.2 and CC-by-sa 3.0 unported, in these cases), although they don’t provide a userpage link. At least they seem to take the licenses seriously.

Of course, it’s much tougher to find out where my photos are being used without mentioning me at all. I suspect that the majority of uses don’t even attempt to assign credit or respect copyright. Most of the publications that are serious about copyright aren’t even willing to use copyleft licenses, preferring to get direct permission from the photographer (even if it means paying, often).

Craig Venter is making history

…or at least trying to.

Venter’s J. Craig Venter Institute, the successor of TIGR and TCAG, has been working on what they characterize as the first man-made organism: Mycoplasma laboratorium. The ongoing project centers on “Synthia”, a slimmed-down synthetic chromosome that they are calling (and patenting as) a “minimal bacterial genome”. It consists of 381 of the ca. 470 genes of the tiny parasitic bacterium Mycoplasma genitalium. (The name “Synthia” comes not from Venter et al., but from the critical ETC Group; it seems to have stuck.) Add Synthia to an empty cell, and voilà! Life!

The project builds on earlier work in which Venter’s team (led by restriction enzyme pioneer and Nobel laureate Hamilton O. Smith, Clyde A. Hutchison III and Cynthia Pfannkoch) recreated the genome of the bacteriophage phi X-174 from scratch and stuck it into an empty coat to create a viable phage; they generated the 5,386 base pair sequence in 14 days. In the 2003 PNAS report, they described a plan to use similarly-sized chunks of synthetic DNA to assemble whole genomes. Since the phi X-174 project, they have been developing and improving DNA cloning methods that can deal with ever larger target sequences without high levels of error–a boon for DNA sequencing as well as chromosome synthesis. (Synthetic phi X-174 could be selected for infectivity to weed out high-error sequences, but that’s not an option with arbitrary 5,000 bp “gene cassettes”.)
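The cassette strategy can be pictured as a toy overlap join: each synthetic chunk shares identical end sequences with its neighbor, so ordered fragments can be stitched into one long sequence. This is a cartoon of the general idea, not the lab's actual assembly protocol, and the six-base fragments are obviously invented:

```python
# Toy illustration of assembling a long sequence from ordered, overlapping
# synthetic fragments ("cassettes"). Real cassettes are ~5,000 bp each.

def join_cassettes(cassettes, overlap):
    """Merge ordered DNA fragments whose adjoining ends share `overlap` bases."""
    genome = cassettes[0]
    for frag in cassettes[1:]:
        assert genome[-overlap:] == frag[:overlap], "adjoining ends do not match"
        genome += frag[overlap:]  # append only the non-overlapping tail
    return genome

print(join_cassettes(["ATGCGT", "CGTTAA", "TAAGGC"], 3))  # -> ATGCGTTAAGGC
```

The hard part in the wet lab, as the paragraph above notes, is not the joining but keeping each 5,000 bp chunk error-free when there is no selection trick like phage infectivity to weed out the bad copies.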

Since 2003, they’ve gotten to the point of putting together a whole genome (if a very small one). They quietly started filing patents for “Synthia” in 2006, and in June 2007 announced that man-made life in the form of M. laboratorium was right around the corner. Proving that the synthetic genome is viable by sticking it into a genome-less cell and making it live will be a powerful proof-of-concept for new and more drastic kinds of genetic engineering.

“Man-made life” makes a great headline, but it’s worth picking apart. At the fundamental level, even Venter’s team is quick to note that M. laboratorium won’t be a wholesale synthetic organism, as it will rely on the molecular machinery and cellular environment taken from natural cells. (At least, as natural as a laboratory organism with its genome carefully removed can be.) The conflation of genes with life has been the constant complaint of all the biologists except the molecular ones since the rise of molecular biology. It was one of the chief complaints of those who thought the Human Genome Project was (all funding levels being equal) a bad idea. In a recent article in I forget which history of science journal, (atheist) Emile Zuckerkandl accuses HGP leader (Christian-turned-atheist-turned-Christian) Francis Collins of exploiting the genes=life fallacy in his best-selling quasi-intelligent design book The Language of God. (The language of God, of course, is the genome.) The all-powerful gene is a potent political and rhetorical force (and has been a great basis for securing grants, at least since the 1940s), even if biological reality is considerably more complex.

But even looking past the conflation of a genome with life itself, M. laboratorium has a dubious claim as synthetic life. As the ETC Group points out, “Synthia” only distinguishes itself from a natural chromosome by what is missing (i.e., a fifth of its genes). This organism would have a shakier claim at being man-made life even than the 1972 oil-eating bacterium of Diamond v. Chakrabarty (the landmark patent case that established the legitimacy of patenting genetically-engineered lifeforms); at least Chakrabarty’s bug had a combination of characteristics that no natural organism had. Does putting most of the DNA of an organism (DNA that happens to have been synthesized artificially) together with everything but the DNA of that organism mean scientists have created artificial life? It’s hard not to invoke Frankenstein.

Venter has been very successful at framing his science in ways that grab headlines, generate public interest, and seem self-evidently of central historical importance (whatever the later historical verdict). I haven’t decided whether that’s a good thing or a bad thing. He’s certainly earning his place in history, one headline and Discovery channel documentary at a time.

Democratic Debate on CNN: No word yet on “without restriction”

The first CNN debate came and went. It’s been almost a full day, and still no word about CNN’s promise to release the debate videos “without restriction”.

In early May, CNN announced, with all due bloviation, they would release the video for their 2008 US Presidential debates for free use “without restriction” at the conclusion of each live debate. As a Wikipedian, I naturally wondered whether “without restriction” would mean “with too many restrictions for use on Wikimedia projects”.

LostRemote reports that an “industry source” email clarified that “As previously announced, CNN’s debate coverage will be made available upon the conclusion of the live telecast and may be used without restriction throughout the 2008 election cycle (emphasis added).” It’s an open question whether that qualification is a time limit on the freedom to reuse or merely describes the period during which the debates will take place; if it’s a time limit, then CNN deserves a storm of angry emails.

Naturally, there are already spliced videos and snippets appearing in the political blogosphere, and bloggers and YouTubers can confidently rely on CNN’s vague promise of “without restriction” to know that they won’t be facing copyright lawsuits any time soon. But they would be making and uploading the same videos whether or not CNN allowed it, and they would still be safe under fair use in almost every case.

Unless “without restriction” means we can do something that we couldn’t otherwise do (e.g., store and distribute it perpetually, even commercially), CNN’s celebrated nod toward free culture is just a cheap and meaningless publicity stunt.


My own take on the debates: My favorite part was the responses to the lame question “Gas prices are at record high levels…What would you do to reduce gas prices?” Dodd started off by going through a number of positive things that should be done with energy policy; he didn’t come out and say the straight answer to the question, but it was there between the lines. Then Gravel came right out and said it: the solutions to our energy problems are not going to involve lower gas prices, period. A couple others (Edwards and Richardson in particular) tried to pussyfoot around the issue, implying that investigations of energy company profit-taking would stop rising gas prices, but after Gravel, the tenor of the conversation definitely shifted. The sooner the public discourse moves away from thinking that the solution to energy problems is to lower gas prices, the better.

I really wish someone had taken Kucinich‘s bait on trade issues, for example, to debate the merits of NAFTA. I understand the union argument against NAFTA, but not the fair trade argument; Mexico and Canada aren’t problematic in terms of human rights abuses, so ceteris paribus, it seems like a clear case for the free market. On the related issue of immigration, there was some real conversation and some attempt to parse it in ways that move the public discourse closer to where it ought to be, treating immigration as primarily a moral issue. No one is yet willing to concede the rhetoric of “amnesty” or start from the premise that being born in America shouldn’t entitle one to a better life than someone born in Mexico. But with a path to citizenship in place, we’ll be moving in that direction.

I also liked the roll call votes (and the way Clinton and Obama handled the lame ambiguous ones). The roll call was, for example, an effective way to wrap things up after Biden said all that needed to be said about Don’t Ask, Don’t Tell (“Peter Pace is flat wrong. Lemmee tell ya something. Nobody asked anybody else if they were gay. Brits, French, all our allies have gays serving. Our policy is not a rational policy.”). Biden had a number of blusterous moments that will probably help his poll numbers, but he’s driving the whole Democratic field to the right on military issues (except Gravel, and maybe not Kucinich, but their only purpose in the race is to pull the center of gravity leftward).