Crowdsourcing experiment with translation students

In his 2008 book Crowdsourcing: Why the Power of the Crowd is Driving the Future of Business, which I reviewed here, Jeff Howe describes TopCoder Inc., a company that develops software for industry partners by administering programming competitions. Twice a week, competitions are posted, and any of the 200,000+ members of the TopCoder community can compete to write small components of a piece of software that will eventually be delivered to a client. As Howe describes the process, each project is broken down into manageable chunks, which are then posted on the TopCoder website so that programmers can compete against one another to write the most efficient, bug-free solution to the problem. After the entries are submitted, TopCoder members try to find bugs in the submissions, and only bug-free versions pass on to be vetted by TopCoder reviewers. Competitions are also posted to see who can compile the components into a single piece of software that runs bug-free. Members are ranked according to how often they win competitions, and they also receive monetary rewards ranging from $25 to $300 for submitting winning entries.

I decided to try out this crowdsourcing model in the classroom by organizing a similar translation competition. Before we started, I spoke a little about translation and crowdsourcing, describing some of the pros and cons, and showing examples of some of the organizations that have relied on crowdsourcing for their translations (e.g. TED, Facebook, Twitter). Then we moved on to translating a text together, with the TopCoder competition as the model for the translation process.

A few days before the class, I had broken a short text up into one- or two-sentence chunks and posted these chunks on an online form, with an empty text box under each one for the translations. Here’s a screen capture of the form, which I created with Google Docs:
[Screen capture: crowdsourcing activity form]
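If you'd rather not chop the text up by hand, the segmentation step is easy to script. Here's a minimal Python sketch of the idea, not the script I actually used: the sentence splitting is deliberately naive, and `source_text` and `sentences_per_segment` are just placeholder values.

```python
import re

def split_into_segments(text, sentences_per_segment=2):
    """Split a source text into short segments of one or two sentences.

    The splitting is deliberately naive: it breaks on '.', '!' or '?'
    followed by whitespace, which is good enough for a short classroom
    text but will trip over abbreviations, decimals, etc.
    """
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    return [
        " ".join(sentences[i:i + sentences_per_segment])
        for i in range(0, len(sentences), sentences_per_segment)
    ]

# Placeholder text, just to show the output format.
source_text = "First sentence of the source text. Second sentence. Third one. A fourth."
for number, segment in enumerate(split_into_segments(source_text), start=1):
    print(f"Segment {number}: {segment}")
```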

All the students were given two minutes to translate the first segment and then click the “Submit” button at the bottom of the page, which automatically uploaded their translations to a Google Docs spreadsheet so that we could view all the submissions together. Here’s a screen capture of the spreadsheet, to give you an idea of what we were working with in class:
[Screen capture: crowdsourcing activity spreadsheet]
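For anyone who wants to work with the submissions outside of Google Docs, the response sheet can also be downloaded as a CSV file and processed with a few lines of Python. This is only a sketch of that step: the file name "responses.csv" and the column labels ("Timestamp", "Segment 1", and so on) are assumptions about how such a form might be set up, not the headers from my actual form.

```python
import csv

def load_submissions(path="responses.csv"):
    """Group the students' translations by segment.

    Assumes the response sheet was downloaded as a CSV file whose columns
    are "Timestamp" plus one column per form question ("Segment 1",
    "Segment 2", ...). Those labels are hypothetical.
    """
    segments = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            for column, translation in row.items():
                # Skip the timestamp column and any empty answers.
                if not translation or column.strip().lower() == "timestamp":
                    continue
                segments.setdefault(column, []).append(translation.strip())
    return segments

if __name__ == "__main__":
    for segment, translations in load_submissions().items():
        print(segment)
        for version in translations:
            print("  -", version)
```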

Once the translations of the first segment had been submitted, students were able to vote on which version they preferred. To help speed up the process, each student was allowed to vote only once for their favourite version, and then, after one version was declared the “winner”, students were able to make any revisions they wanted, provided a majority of the class agreed with the change. The revised sentence was then added to a Word document so that a final translation could be pieced together from the winning sentences. Students were then given two minutes to translate the second segment, and, once they had done so, they were again invited to vote on a winner and make any corrections to it. After we had translated three or four segments (and were almost out of time), I asked the students to comment on the final version and on the translation process.
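For the curious, here is roughly what our voting rules amount to in code. It's a toy sketch with made-up data (the submissions and ballots below are hypothetical), but it captures the two rules we followed: one vote per student to pick a winner, and a simple majority to accept any revision to that winner.

```python
from collections import Counter

def pick_winner(translations, ballots):
    """Return the most popular translation for one segment.

    `translations` holds the submitted versions and `ballots` holds one
    index per student (each student votes exactly once). Ties are broken
    arbitrarily, much like our quick show of hands in class.
    """
    winning_index, _ = Counter(ballots).most_common(1)[0]
    return translations[winning_index]

def revision_accepted(yes_votes, class_size):
    """A proposed change to the winner passes only with a majority."""
    return yes_votes > class_size / 2

# Hypothetical data for a single segment: three submissions, eight voters.
submissions = ["Version A", "Version B", "Version C"]
ballots = [0, 1, 1, 2, 1, 0, 1, 2]
print("Winner:", pick_winner(submissions, ballots))              # Version B
print("Revision accepted?", revision_accepted(5, class_size=8))  # True
```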

So what did the students think? Most noted that although our translation process (basically a popularity contest, with the possibility of adding a few touch-ups to the winner) worked well enough in our small group, it might not be as successful outside the classroom, since the most popular answer isn’t necessarily the best. Some raised very valid concerns about how such a process would work on a larger scale or in another context. For instance, they wondered how well the final text would hold together if it had been translated by multiple people, and how various linguistic groups (e.g. Brazilian and Iberian Portuguese speakers) would settle on an acceptable version.

It seemed, though, that many students enjoyed the exercise, regardless of whether they felt this method of translating would work outside the classroom. One student liked being able to compare the various versions laid out on the screen, because that way, when revisions/corrections were made to the winning translation, students could incorporate ideas from the versions that did not win. Another noted that one person translating alone might get stuck or lack inspiration at certain points, but that this problem would not arise if many people were working on the same text.

Overall, I think this experiment worked well. Using a Google Docs form really simplified the setup on my end, since I needed no programming skills and was able (in less than 15 minutes) to create an interface we could work with in class. Next year, I’d do this exercise in a week when we had three hours together, rather than one where I had scheduled a test for the second half of the class. I think this exercise lends itself well to a two- or three-hour class, with 20 to 30 minutes for a talk about crowdsourcing, 1 to 1.5 hours to translate, 15 to 20 minutes to go over the final translation (since I didn’t give any input while students were voting on and revising the submissions, the final version did have some minor language and translation errors), and then 10 to 15 minutes for students to reflect on the process and the result.

References:
Howe, Jeff. (2008). Crowdsourcing: Why the Power of the Crowd is Driving the Future of Business. New York: Crown Publishing.
