Rails migrations and Capistrano don’t mix

Last night I learned the hard way what happens when Rails migrations break.

My main project, the Wiki Ed Dashboard, is set up for automatic deployment — via Capistrano and travis-ci — whenever we push new commits to the staging or production branch. It’s mostly nice.

But I ran some migrations yesterday that I shouldn’t have. In particular, one of them added three new columns to a table. When I pushed it to staging, the migration took about 5 minutes and then finished. Since the staging app was unresponsive during that time, I waited until the evening to deploy it to production. But things went much worse on production, which has a somewhat larger database. The migration took more than 10 minutes — at which point travis-ci decided that the build had become unresponsive, and killed it. The migration never completed.

No problem, I thought, I’ll just run the migration again. Nope! It turns out that the first column from that migration actually made it into the MySQL database. Running it again triggered a duplicate column error. Hmmm… okay. Maybe all the columns got added, but the migration didn’t get recorded in the database? So I manually added the migration id to the schema_migrations table. Alas, no. Things were still broken, because the other two columns didn’t actually get added.
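Marking a migration as applied by hand just means inserting its version into the schema_migrations table, something like this from a rails console on the server (the version string below is a placeholder, not the real migration timestamp):

```ruby
# Record a migration as "already run" without actually executing it.
# '20151231000000' is a placeholder; use the real migration's timestamp.
sql = "INSERT INTO schema_migrations (version) VALUES ('20151231000000')"

# Inside a `rails console` session on the server:
ActiveRecord::Base.connection.execute(sql) if defined?(ActiveRecord)
```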

That’s why Rails migrations have an up and a down version, right? I’ll just migrate that one down and back up. But with only one of three columns present, neither the up nor the down migration would run. I ended up writing an ad-hoc migration to add just the second and third columns, deploying it from my own machine, and then deleting the migration afterwards. I fixed production, but it wasn’t pretty.
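The ad-hoc fix looked roughly like this (the table and column names here are hypothetical, not the real ones): it adds only the two missing columns, since the first one already exists.

```ruby
# db/migrate/20151231000001_add_remaining_columns.rb (hypothetical names)
# Adds only the two columns that never made it into MySQL; the first
# column from the broken migration is already present, so it's skipped.
class AddRemainingColumns < ActiveRecord::Migration
  def up
    add_column :revisions, :second_new_column, :integer
    add_column :revisions, :third_new_column, :integer
  end

  def down
    remove_column :revisions, :second_new_column
    remove_column :revisions, :third_new_column
  end
end
```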

My takeaway from this: if you deploy via Capistrano — and especially if you deploy straight from your continuous integration server — then write a separate migration for every little thing. When things go bad in the middle of deployment, you don’t want to be stuck with a half-completed migration.
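Concretely, that means splitting something like this (names hypothetical) into one file per column:

```ruby
# Instead of one migration that does three things...
class AddThreeColumnsToRevisions < ActiveRecord::Migration
  def change
    add_column :revisions, :first_column,  :integer
    add_column :revisions, :second_column, :integer
    add_column :revisions, :third_column,  :integer
  end
end

# ...write three one-line migrations, each in its own file. If a deploy
# dies partway through, `rake db:migrate` can pick up cleanly from
# whichever migration was the last one to finish.
class AddFirstColumnToRevisions < ActiveRecord::Migration
  def change
    add_column :revisions, :first_column, :integer
  end
end
# (...and likewise for the second and third columns.)
```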

(among the) best programming podcasts

Since I started both running semi-regularly and biking 30+ minutes on the Burke a couple times per week, I’ve started listening to a lot of podcasts — mainly focusing on technology (especially free software, web development, and Ruby) and product management. I’ve listened to enough good ones, and enough bad ones, that I want to share some of the podcasts I’ve found most interesting and helpful.

Weekly podcasts

These are the most consistently good, consistently released ones I listen to. Not every episode is great, but they are worthwhile enough of the time that I usually at least sample a bit of each new episode.

  • Ruby Rogues – a great panel, featuring the awesome Coraline Ada Ehmke, among others. It’s excellent both for Rubyists specifically and as a general software discussion venue.
  • Talk Python To Me – a superb interview-based podcast. The focus is Python, but it’s very accessible even without a deep knowledge of the specific language, and I often get ideas from it that are relevant for my work. Episodes usually start with a lot of personal narrative about how the interviewee got to where they are, which is often really interesting.
  • CodePen Radio – this one is focused on the startup codepen.io, and is usually a fun listen. The range of topics — all drawn from running a web app-based startup — maps pretty nicely onto the things that are relevant for me, running the technology side of a small nonprofit. (I’ve still never used CodePen, and don’t feel like I need to in order to get value from the podcast.)
  • The Changelog – the best of the handful of free / open source software podcasts, this one is interview based, usually goes deeply into the background of each guest, and has consistently interesting guests.
  • Javascript Jabber – the JavaScript companion to Ruby Rogues, this one is a little more scattered and less consistently insightful, but still has a pretty high ratio of solid episodes.
  • Ruby5 – this short podcast comes out twice per week, and basically runs down interesting news and new releases in the Ruby and Rails worlds. It’s a little cheesy, but it’s worth your time if you work with Ruby or Rails.

Individual episodes

These are some of the podcast episodes that I recommend. Some come from the podcasts above, and others are individual episodes from podcasts that I otherwise don’t listen to regularly or wouldn’t recommend highly.

  • The Changelog: The Future of WordPress and Calypso with Matt Mullenweg – I wish I could just hang out all the time with Matt Mullenweg.
  • Data Skeptic: Wikipedia Revision Scoring as a Service – This interview with Aaron Halfaker is the best overview of Wikipedia’s editor trends that I’ve seen/heard.
  • Javascript Jabber: The Evolution of Flux Libraries – This late-2015 overview of React, Flux and Redux is the best of many React-related podcasts I’ve listened to. It helped clarify my thinking a lot.
  • Ruby Rogues: Neo4j – a nice Ruby-centered introduction to the concept of graph databases
  • Javascript Jabber: npm 3 – an interesting overview of the npm roadmap, which helped me understand a lot more about what npm does and what it’s trying to do
  • Ruby Rogues: Feature Toggles – a discussion of feature toggles as a key enabler of a trunk-based git development strategy
  • Ruby Rogues: The Crystal Programming Language – with the creator of Crystal, made me eager to start using Crystal
  • Ruby Rogues: The Leprechauns of Software Engineering – with the author of the book of the same title, super interesting


Hello from 2016

It’s been more than a year since I’ve blogged. Sigh.

It’s been a good year. I learned Ruby, and I’ve become fluent enough that it’s really fun. I even wrote my first Twitter bot.

github streak

I started running, and have done enough of it that it isn’t pure torture.

It looks like we’re staying in Seattle for good — we bought a house, and there’s a guest room for folks passing through — and I love it here. Work is fun, the kids are fun (if exasperating), and more and more family is nearby. The Wikipedia and free culture communities around here are great.

Lots more I’d like to write about — stories to tell, pictures to post — but Brighton keeps asking me to play, and I can’t put him off any longer. I think my modest goal will be for this not to be the only post in 2016.

Screencasting on Debian: Kazam is good!

I’ve periodically done screencasting and screen recording over the last few years — mostly while running Ubuntu or Debian — and it’s been an evolving pain to find a piece of GNU/Linux screen recording software that actually works. The one I’ve had the most success with is gtk-RecordMyDesktop, but it’s confusing to configure, and can be quite picky with audio sources… sometimes making it impossible to capture audio at all. There are other alternatives — byzanz, istanbul — that tend to be just as buggy or worse.

My current use case is slightly complicated: I’m doing Google Hangouts sessions with people using the web app I’ve been working on, and I want to record the video of them using it, their audio, and my audio. Basically, I want to record my user testing sessions — so far, without success, at least for audio.

The one promising project the last time I tried was Kazam, but it was still too buggy for me to use successfully. It looks like it’s in pretty good shape now… it lets me choose the window to record, it can capture audio from both the speakers and the microphone (with human-readable pulldowns for choosing the speaker and microphone devices), and it successfully recorded a Hangout. And it has nice file format options (including VP8/webm, which is the best option for uploading to Wikimedia Commons).

Nice work, Kazam developers!

Remembering Adrianne Wadewitz

Adrianne, skepchickal


I remember, for a long time before I met her, wondering what “a wade wit” meant.

I remember a Skype conversation, years ago. Adrianne, Phoebe, SJ and I talked for probably three hours about the gender gap on Wikipedia, late into the night. Then and always, she was relentlessly thoughtful and incredibly sharp. As superb as she was in writing, she was even better in live conversation and debate.

I remember laughing and talking and laughing and talking at Wikimania 2012. I took this picture of her that she used for a long while as a profile pic. Someone on Facebook said it looked “skepchickal”, which she loved.

I remember her unfailing kindness and generosity, indomitable work ethic, and voracious appetite for knowledge. She made me proud to call myself a fellow Wikipedian.

Scholarly societies, subscription fees, and open access

Strategic planning with historians. :-)


This last weekend I flew to Chicago for a two-day strategic planning meeting for the History of Science Society (see my photos). The task, for me and about 40 other historians of science, was to figure out who the society should be trying to serve and what its goals should be. One of the key issues the society is dealing with is our membership model: joining the History of Science Society (HSS) currently consists of becoming a subscriber to the society’s main publications, “Isis” (a quarterly journal) and “Osiris” (an annual thematic journal), which are published by University of Chicago Press. The lion’s share of the society’s budget comes from subscription fees for these journals, but individual subscriptions (from about 2200 members, and falling) make up only about a third of that revenue; institutional subscriptions, mainly from libraries that subscribe to large bundles of content from academic publishers, make up the rest. This institutional subscription revenue has actually been increasing recently for HSS. But library budgets are being increasingly squeezed, and can only absorb so much of the cost of traditional journal publishing before many start cancelling the bundles they cannot afford.

Michael Magoulias of University of Chicago Press was part of this meeting, and he submitted a report on university press publishing as part of the ‘environmental scan’ document that was sent out before the meeting. In it, he frames the option of going open access for journals outside the sciences (like those of HSS, and probably many other scholarly societies as well) as shifting the costs from libraries to individual authors. Author-pays OA options (or large grants to cover traditional journal costs) are the only ones Magoulias mentioned, but that doesn’t reflect the reality of how OA publishing in the humanities is trending. In fact, there are huge numbers of journals (in the humanities, as well as the social sciences and mathematics) that are run entirely outside of the traditional publishing industry. Several open source journal management platforms are available and developing rapidly. (Open Journal Systems seems to be the most widely adopted.) These are essentially DIY, digital-only options, but they can be run with *very* low infrastructure costs (perhaps a few hundred dollars per year for cloud hosting), with the usual sorts of unpaid labor of editing the journal and managing peer review. This approach may mean losing some of the fringe benefits of a high-quality traditional journal, like professional typesetting and copyediting, but it doesn’t have to mean a fundamental difference in the quality of the scholarship.

But in the case of HSS (and probably other scholarly societies as well), shifting away from traditional publishing to a low-cost OA model on a free and open source platform would actually mean losing revenue as well. I’d never considered this before. The real issue, then, is not about shifting costs from libraries to individual authors. It is about libraries — through their bundled subscription fees to academic publishers — subsidizing the activities of scholarly societies (after the publishers have taken their cut). Is that how scholarly societies want to be funding themselves? I know that’s not how I want HSS to be funding itself.

OmniROM: solid Android rom, nice place for newcomers

When my last phone died in December, I decided to steer clear of contracts (so that my family could maybe get off of AT&T once all the contracts on the plan expire) and get a Nexus 5. I’ve usually used Cyanogenmod in the past, but I decided to try out the newer OmniROM this time. The Omni project started last year as a response to Cyanogenmod shifting from a completely volunteer project to a for-profit company — sort of the Canonical of the Android ecosystem. I like that the philosophy of Omni is about respecting users and adding value to the open source Android ecosystem.

One concrete difference from Cyanogenmod is that Omni encourages bug reports from avid users. (Cyanogenmod does not take bug reports for nightly builds, even though that’s what the users who care most about new features and recent changes tend to use.) When I started using Omni, I noticed a few little things that annoyed me: inconsistent icons, and non-standard capitalization in the menu. So I filed some bugs in their bug tracker. These were minor issues, but the developers were quite responsive. The icons I complained about got fixed after a few days. So I decided to try to scratch my own itch for another bug. I followed their guide for getting set up as a developer, and then I submitted patches to fix the capitalization problems I had noticed. (All I did was change a few strings.) All my patches got merged within a few days of submission. 🙂

OmniROM is still a small project, but so far I think it’s a great place for newcomers who want to try out open source Android development.