Experimenting with Wikipedia in the classroom

Late last year, I came across a very insightful podcast series called BiblioTech on the University Affairs website. Each episode focuses on technology and higher education–Twitter in the classroom, for instance, or storage in the cloud–so of course I was immediately hooked. I had missed the first thirteen episodes, but they’re all quite short–usually between ten and fifteen minutes long–so I managed to catch up after two jogs and a commute to work.

Episodes 12 (Wikipedia) and 13 (Plagiarism) in particular piqued my interest and actually inspired me to change the format of the courses I’m teaching this term: an MA-level Theory of Translation and a BA-level Introduction to Translation into English course.

First, I listened to the Plagiarism episode, which mainly discussed how to design tests and assignments that discourage students from cheating. As host Rochelle Mazar, an emerging technologies librarian at the University of Toronto’s Mississauga campus, argued:

We need to create assignments that have students produce something meaningful to them, but opaque to everyone else.

Her suggestions included having students use material from the classroom lectures and discussions in their assignments (e.g. by blogging about each week’s lectures, and then using these blog posts to write their final paper), having students build on peer interactions via Twitter, Facebook or the course website to develop their assignments, and having students contribute to open-access textbooks through initiatives like Wikibooks.

I then listened to the Wikipedia episode, where Mazar made the following argument about why instructors should integrate Wikipedia into classroom assignments:

When people tell me that they saw something inaccurate on Wikipedia, and scoff at how poor a source it is, I have to ask them: why didn’t you fix it? Isn’t that our role in society, those of us with access to good information and the time to consider it, isn’t it our role to help improve the level of knowledge and understanding of our communities? Making sure Wikipedia is accurate when we have the chance to is one small, easy way to contribute. If you see an error, you can fix it. That’s how Wikipedia works.

Together, these two episodes got me thinking about the assignments I would be designing for my courses, and it didn’t take me long to decide that I would incorporate Wikipedia and blogging into my courses: translation of Wikipedia articles for the undergraduate translation course, and blogging as the medium for submitting, producing and collaborating on written work in the graduate theory course. Next month, I’ll write a post about how I decided to integrate blogs into my graduate theory class, but right now, I want to focus on Wikipedia and its potential as a teaching tool in translation classrooms.

But first, a short digression: A couple of years ago, I had students in my undergraduate translation classes work in groups or pairs to translate texts for non-profit organizations as a final course assignment. The students seemed to really like translating texts that would actually be used by an organization instead of texts that were nothing more than an exercise to be filed away at the end of term. And I enjoyed being able to submit a large project to a non-profit at the end of the term. But it was a lot of work on my part, mainly because I acted as a project manager: finding a non-profit with a text of just the right length and just the right difficulty, splitting up the text for the class, correcting the final submissions, and finally translating the rest of the text, since the documents we were given were inevitably too long for me to assign entirely to the students. So after two years, I went back to having students translate less taxing texts, like newspaper or magazine articles, since it’s easier to correct twenty translations of the same text than twenty excerpts from a longer project. But I did miss the authentic assignments.

So, when I listened to the BiblioTech podcasts, I realized Wikipedia might be a good solution to the problem. Students can choose their own articles to translate (freeing me from the project-management aspect), and the wide variety of subjects needing translation–Wikipedians have tagged over 9000 articles as possible candidates for French-to-English translation–means we should be able to find something to interest everyone, and something just the right length for the assignment (around 300 words per student). I still expect to have to spend more time correcting the translations, but I think this will be less work overall than the previous projects.
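For anyone who wants to browse those candidate articles without clicking through the category pages by hand, here is a minimal sketch (Python 3, using the requests library) of how you might list them through the MediaWiki API. The category name is my assumption about which maintenance category on the English Wikipedia holds the tagged articles, so treat it (and the whole script) as an illustration rather than a recipe:

```python
import requests

API = "https://en.wikipedia.org/w/api.php"
# Assumed category name; substitute the maintenance category you actually use.
CATEGORY = "Category:Articles needing translation from French Wikipedia"

def candidate_articles(limit=50):
    """Yield up to `limit` article titles tagged as translation candidates."""
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": CATEGORY,
        "cmnamespace": 0,            # main (article) namespace only
        "cmlimit": min(limit, 500),  # the API caps each request at 500 titles
        "format": "json",
    }
    fetched = 0
    while fetched < limit:
        data = requests.get(API, params=params, timeout=30).json()
        for member in data["query"]["categorymembers"]:
            yield member["title"]
            fetched += 1
            if fetched >= limit:
                return
        if "continue" not in data:
            return                       # no more pages in the category
        params.update(data["continue"])  # follow the API's pagination token

if __name__ == "__main__":
    for title in candidate_articles(limit=20):
        print(title)
```

Running it with a small limit just prints the first few titles the API returns, which would be enough to give students a starting list to browse.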

As I was planning out the project, I was pleasantly surprised to discover that the Wikimedia Foundation has established an education program in Canada, the United States, Brazil and Egypt. The Canada Education Program is intended to help university professors integrate Wikipedia projects into their courses, and it offers advantages like an online ambassador for every class to help students navigate the technical challenges of editing in the Wikipedia environment. In addition, there’s an adviser who works closely with professors who join the program. Fortunately for me, he’s based in Toronto, which means I was able to chat with him earlier this month about the program. His recent article in the Huffington Post offers some good arguments for why Wikipedia is a useful classroom tool. He suggests, for instance, that since organizations like the CIA use wikis in their work environments, students are likely to need to be familiar with wiki technology and culture after they graduate. In addition, students gain exposure by contributing to articles that are visible online, and they learn to engage in debates with classmates and Wikipedians as their contributions are reviewed and edited by others.

I’m still in the early stages of this experiment… I don’t yet know, for instance, whether students will have a lot of trouble editing their articles, or whether the technical challenges can all be solved by the online ambassador who will be working with us. I’ve asked students to use Google Documents to do most of the translating work, but I’m expecting them to add the final versions to Wikipedia before the end of the term, so many of these problems may crop up only in March or April. I also expect a lot of in-class discussion about Wikipedia’s Translation Guidelines, which encourage omission of irrelevant information and adaptation or explanation of cultural references:

Translation between Wikipedias need not transfer all content from any given article. If certain portions of an article appear to be low-quality or unverifiable, use your judgment and do not translate this content. Once you have finished translating, you may ask a proofreader to check the translation.
[…]
A useful translation may require more than just a faithful rendering of the original. Thus it may be necessary to explain the meaning of terms not commonly known throughout the English-speaking world. For example, a typical reader of English needs no explanation of The Wizard of Oz, but has no idea who Zwarte Piet might be. By contrast, for a typical reader of Dutch, it might be the other way around.

Because students may find they have more freedom to make their own judgements about the relevance of information, I’ve asked them to give an in-class presentation at the end of the term about their translation decisions and the experience of working in Wikipedia. I’ll be sure to post some of my own thoughts on this experiment after the term is over, the marking is complete and the translations are posted online. I’ll even post links to some of our work.

Has anyone else used (or thought about using) Wikipedia articles as translation assignments? If so, I’d certainly appreciate your comments.

Wikipedia survey IV (Motivations)

While I’ve still got the survey open in my browser, I thought I’d finish writing about the results. This last post will look at the motivations the 76 respondents gave for translating, editing or otherwise participating in a crowdsourced translation initiative. (I should point out that although the question asked about the “last crowdsourced translation initiative in which [respondents] participated”, 63 of the 76 respondents (83%) indicated that Wikipedia was the last initiative in which they had participated, so their motivations mainly relate to Wikipedia, with a few for Pirate Parties International, nozebe.com, open-source software, iFixit, Forvo, and Facebook.)

The survey asked two questions about motivations. Respondents were first asked to select up to four motivations for participating.[*] They were then given the same list and asked to choose just one motivation. In both cases, they were offered motivations that can be described as either intrinsic (done not for a reward but rather for enjoyment or due to a sense of obligation to the community) or extrinsic (done for a direct or indirect reward). They were also allowed to select “Other” and add their own motivations to the list, as 11 respondents chose to do.

When I looked at the results, it became clear that most respondents had multiple reasons for participating: only 4 people chose a single motivation when they were allowed to list several (and one person skipped this question). All four wanted to make information available to others. Here’s a chart that shows which motivations were most commonly cited. (Click on the thumbnail to see a full-size image):
[Chart: motivations for participating in crowdsourced translation]

As the chart shows, intrinsic motivations (making information available to others, finding intellectual stimulation in the project, and supporting the organization that launched the initiative) were the motivations most often chosen by respondents. However, a significant number also had extrinsic reasons for participating: they wanted to gain more experience translating or practice their language skills. In the article I wrote about this survey, I broke these motivations down by type of respondent (those who had worked as professional translators vs. those who had not), so I won’t go into details here, except to say that there are some differences between the two groups.

Respondents who chose “Other” described various motivations: one was bored at work, one wanted “to be part of this network movement”, one wanted to improve his students’ translation skills by having them translate for Wikipedia, two thought it was fun, one wanted to quote a Wikipedia article in an academic work but needed the information to be in English, and three noted that they wanted to help or gain recognition within the Wikipedia community. Some more detailed motivations (often with a political/social emphasis) were also cited, either with this question or in the final comments section:

I am not a developer of software, but I am using it for free. To translate and localise the software for developers is a way to say thank you – Only translated software has a chance to spread and prosper – I get to know new features and/or new software as soon as it is available

As a former university teacher I believe that fighting ignorance is an important way of making world a better place. Translating local knowledge into trans-national English is my personal gift for the humanity 🙂

I’m not sure how you found me because I’m pretty sure I only translated one Wikipedia page… I did it mainly because the subject of the article is almost unknown in the Jewish world, and I wanted more people to know about her and one of the few ways in which I can help make her story more widely known is by translating it into French. That being said I think I’ll try to do more!

The main reason I became involved in crowdsourced translation is that, in my opinion, the translation of science involves more than linguistic problems. It also requires an awareness of context; of why the scientific activities were undertaken, as well as how they fit into the “world” to which they belong. Many crowdsourced translation projects do not take this into account, treating the translation of science as a linguistic problem. This is fallacious. So I participate to fix the errors that creep in.

My translations are generally to make information freely available, especially to make Guatemalan cultural subjects available in Spanish to Guatemalan nationals.

I taught myself German, by looking up every single word in a couple of books I wanted to read about my passionate hobby. I have translated a couple of books in that hobby for the German association regarding that hobby (gratis). Aside from practice, practice, practice, I have had no training in translation. I began the Wiki translations when I was unemployed for a considerable amount of time and there was an article in the German Wiki on my hobby that had a tiny article in English. The rest is history. It’s been a few years since I’ve contributed to Wikipedia, but it was a great deal of fun at the time. Translation is a great deal of work for me (I have several HEAVY German/English dictionaries), but I love the outcome. Can I help English speakers understand the information and the beauty of the original text?

There were very few Sri Lankans editing on English Wikipedia at that time and I manage to bring more in and translate and put content to Wikipedia so other language speakers can get to know that information. I was enjoying my effort and eventually I got the administrator-ship of Sinhala Wikipedia. From then onwards I was working there till I had to quit as I was started to engage more with my work and studies. Well that’s my story and I’m not a full time translator and I have no training or whatsoever regarding that translating.

As these comments show, the respondents often had complex reasons for helping with Wikipedia translations. Some saw it as an opportunity to disseminate information about certain language, cultural or religious groups (e.g. Guatemalans, Sri Lankans) to people within or outside these communities; others wanted to give back to communities or organizations they believed in (for instance, by helping other Wikipedians, or by giving free/open-source software a wider audience). But intrinsic reasons seem to have been the most prominent. This is undoubtedly why, when respondents were asked to select just one reason for participating in a crowdsourced translation initiative, 47% chose “To make information available to language speakers”, 21% said they found the project intellectually stimulating, and 16% wanted to support the organization that launched the initiative. No one said that all of their previous responses were equally important, which suggests that while many motivations were factors, some played a more significant role than others in respondents’ decisions to volunteer for Wikipedia (and other crowdsourced translation initiatives).

That’s apparent, too, in the responses I received for the question “Have you ever consciously decided NOT to participate in a crowdsourced translation initiative?” The responses were split almost evenly between Yes (49%) and No (51%). The 36 respondents who said Yes were then asked why they had decided not to participate, and what initiative they hadn’t wanted to participate in. Here’s a chart that shows why respondents did not want to participate:
[Chart: reasons for not participating in a crowdsourced translation initiative]

Unlike last time, when only a few respondents chose one or two motivations for participating, 15 of the 36 respondents chose only one reason to explain why they decided not to participate, and 11 chose only two (although they could have chosen up to four motivations). This means that almost 75% of respondents did not feel that their motives for not participating were as complex as their motives for participating. (Of course, it’s also possible that because this was one of the last questions on the survey, respondents were answering more quickly than they had earlier.) I had expected that ideological reasons would play a significant role in why someone would not want to participate in a crowdsourced translation initiative (i.e. that most respondents, being involved in a not-for-profit initiative like Wikipedia, would have reservations about volunteering for for-profit companies like Facebook), but the most common reason respondents offered was “I didn’t have time” (20 respondents, or 56%), followed by “I wasn’t interested” (12 respondents, or 33%). Only 7 didn’t want to work for free (in four cases it was Facebook, while the 3 other respondents didn’t mention which initiative they were thinking of), and only 9 said they didn’t want to support the organization that launched the initiative (Facebook in four cases, a local question-and-answer service in another, and Wikia and Wikipedia in two other cases). There was some overlap between these last two responses: only 12 respondents in all indicated that they didn’t want to work for free and/or support a particular organization.

I think these responses show how attitudes toward crowdsourced translation initiatives are divided, even among those who have participated in the same ones. Although 16 respondents had translated for Facebook (as I discussed in this post), and therefore did not seem ideologically opposed to volunteering for a for-profit company, 12 others had consciously decided not to do so. And even though respondents most commonly said they didn’t participate because they didn’t have time, we have seen that many respondents participated in Wikipedia translation projects because they found it satisfying, fun, challenging, and because they wanted to help disseminate information to people who could not speak the language in which the information was already available. So factors like these must also play a role in why respondents might not participate in other crowdsourced translation initiatives.

On that note, I think I’ll end this series of blog posts. If you want to read more about the survey results, you’ll have to wait until next year, when my article appears in The Translator. However, I did write another article about the ethics of crowdsourcing, and that’s coming out in Linguistica Antverpiensia in December, so you can always check that one out in the meantime. Although I was hoping to conduct additional surveys with participants in other crowdsourced translation initiatives like the TED Open Translation Project, I don’t think I’ll have time to do so in the near future, unless someone wants to collaborate with me. If you’re interested, you can always email me to let me know.

[*] The online software I used for the survey didn’t allow me to prevent respondents from selecting more than four reasons. However, only 14 people did so: of the 76 respondents, 4 chose 5 reasons, 7 chose 6 reasons, and 3 chose 7 reasons. I didn’t exclude these 14 responses because the next question limited respondents to just 1 reason.

Wikipedia survey III (Recognition, Effects)

It’s been quite some time now since my last post about the Wikipedia survey results, and for that I must apologize. I was side-tracked by some unrelated projects and found it hard to get back to the survey. But I’ve just finished revising my article on this topic (which will be published in the November 2012 issue of The Translator), and that made me sit down to finish blogging about the survey results. This is the third of four posts. I had planned to look at motivations, effects and recognition all in one post, but it became too long, so I’ve split it into two. This one looks at the ways respondents were recognized for participating in crowdsourced projects and what impact (if any) their participation has had on their lives. The next one (which I will post later this week) looks at respondents’ motivations for participating in crowdsourced initiatives.

For anyone who comes across these posts after the article is published, I should mention that the discrepancy between the number of survey respondents in the article and on this blog (75 vs. 76) arose because I received another response to the survey after I’d submitted the article for peer review. It was easier to include all 76 responses here, since I’m creating new graphs and looking at survey responses I didn’t have space to explore in the Translator article, but I didn’t update the data in the article because the new response didn’t change much on its own (+/-0.5% here and there) and would have required several hours’ work to recalculate the figures I cited throughout the 20+ pages.

I also want to thank Brian Harris for discussing these survey results on his blog. You can read his entry here or visit his blog, Unprofessional Translation, to read a number of very interesting articles about translation by non-professionals, including those working in situations of conflict, war, and natural disasters.

And on to the survey results:

Recognition
The survey asked respondents what (if any) recognition they had received for participating in a crowdsourced translation initiative. Although the question asked about the last initiative in which respondents had participated (rather than Wikipedia in particular), 63 of the 76 respondents indicated that Wikipedia was the last initiative in which they had been involved, so the responses are mainly representative of the recognition they received as Wikipedia translators. Here’s a chart summarizing the responses (click on it for a full-sized image):
[Chart: recognition received for participating]
As the chart illustrates, no respondents received financial compensation, either directly, by being paid for their work, or indirectly, by being offered a discount on their membership fees or other services. This really isn’t surprising, though, because most respondents were Wikipedia translators, and contributors to Wikipedia (whether translators or writers) are not paid for their work. In addition, since Wikipedia does not charge membership fees, there is nothing to discount. Unexpectedly, though, 20 respondents reported receiving no recognition at all–even though 17 of them listed Wikipedia as the last initiative in which they had been involved. Because Wikipedia records the changes made to its pages, anyone who had translated an article would have been credited on the history page. These 20 respondents may not have been aware of the history feature, or–more likely–they didn’t consider it a form of recognition.[*]

Receiving credit for the translation work (either directly beside the translation or via a profile page) was the most common type of recognition. Of the 18 respondents who selected “Other”, 10 reported being credited on the Wikipedia article’s history page, 1 said their name appeared in the translated software’s source code, 1 noted they had received some feedback on the Wiki Talk page, 1 mentioned receiving badges from Facebook, and the others mentioned their motivations (e.g. just wanted to help, translation became better, could refer to the translation in other academic work) or the effect their involvement had on their careers (e.g. a higher rate of pay for translating/interpreting duties at work). I discuss the advantages and disadvantages of this enhanced visibility for translators and translation in an article that will appear in Linguistica Antverpiensia later this year, so I won’t elaborate here, except to say that crediting translators and keeping a record of the changes made to translations make translation a more visible activity and provide researchers with a large corpus of texts that can be compared and analyzed. In fact, I think Wikipedia’s translations are an untapped wealth of material that can help us better understand how translations are produced and revised by both professional and non-professional translators.
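As an aside for anyone who wants to see how that credit actually looks, here is a minimal sketch (Python 3, requests) of pulling a page’s revision history through the MediaWiki API, the same history that records who translated or edited what and when. The article title in the last lines is just an illustrative placeholder, not one of the pages my respondents worked on:

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

def revision_history(title, limit=20):
    """Return (user, timestamp, edit summary) tuples for a page's most recent revisions."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "user|timestamp|comment",  # who edited, when, and the edit summary
        "rvlimit": limit,
        "format": "json",
    }
    data = requests.get(API, params=params, timeout=30).json()
    page = next(iter(data["query"]["pages"].values()))
    return [(rev.get("user", ""), rev["timestamp"], rev.get("comment", ""))
            for rev in page.get("revisions", [])]

# Placeholder title, for illustration only.
for user, timestamp, summary in revision_history("Translation", limit=5):
    print(f"{timestamp}  {user}  {summary}")
```

A researcher could loop something like this over a list of translated articles to build exactly the kind of revision corpus I have in mind.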

Effects
Finally, I asked respondents whether/how their participation in a crowdsourced translation initiative had impacted their lives. Here’s another chart that summarizes the results (again, click on the image to see it in full size):
[Chart: impact of participation]
I was surprised to see that 38 respondents (or 51%) didn’t feel their participation had had any sort of impact: after all, why would they continue volunteering if they were not getting something (even personal satisfaction) out of the experience? However, this may be a problem with the question itself, as I hadn’t listed “personal satisfaction” as an option. If I had (and I would definitely make this change in the next survey), the responses might have been different. As it is, of the 16 respondents who selected “Other”, 8 indicated that participating gave them personal satisfaction, a sense of pride in their accomplishments, a feeling of gratification, etc. Here are a few of their comments:

Pride in my accomplishments, although I am an amateur translator. I did some cool stuff!

I have the immense satisfaction of knowing that my efforts are building a free information project and hope that my professionalism raises the quality bar for other contributors who see my work (e.g. edit summaries, citations of sources, etc.)

I was spending my spare time on Wikipedia and sharing my knowledge. Moreover I was enjoying what I was doing. That’s it.

As for the rest of the responses in the “Other” category: One person noted that they had been thanked by other Wikipedia users for the translation, another remarked that they had been thanked by colleagues for contributing to “open-source intellectual work”, two said they had learned new things, one had met new Facebook friends, one said they had been asked to do further translation work for the project, two noted they had been invited to participate in this survey, and one (a part-time translation professor) said “My students consider my classes as a useful and positive learning experience” because they help translate for Wikipedia together.

Nearly a third of respondents (22 of the 76) felt they had received feedback that helped improve their translation skills, and I think this point is important: the open nature of Wikipedia (and many other crowdsourced projects) provides an excellent forum for exchanging ideas and commenting on the work of others. But this is also a point that deserves further study, since so few of the respondents reported having training or professional experience in translation.

Interestingly, some of the more tangible effects of participating in a crowdsourced initiative, such as receiving job offers and meeting new clients or colleagues, were not often experienced by the survey respondents. I wonder whether the results would be the same if this survey were given to participants in other types of initiatives (translation of for-profit websites such as Facebook, or localization of Free/open-source software such as OmegaT). The results do show, however, that volunteering for crowdsourced translation initiatives has had some positive (and a few negative) effects on the careers and personal lives of the participants, and that personal satisfaction is also an important motivator.

[*] An interesting aside: of the 20 respondents who reported receiving no recognition, 5 also indicated they had received other forms of recognition, such as their names appearing beside the translation, an updated profile page, or feedback on their work. Respondents may have been thinking of all the projects in which they had been involved, instead of the last one, which the question asked about. These 5 respondents all indicated that Wikipedia was the last initiative in which they had been involved.

Wikipedia survey II (Types of Participation)

This is a follow-up to last month’s post describing preliminary results from a survey of Wikipedia translators. To find out about the survey methodology and the respondent profiles, please read this post first.

I initially planned for this survey to be one of several with translators from various crowdsourced projects, so I wrote the participation-related questions hoping to compare the types of crowdsourced translation initiatives people decide to participate in and what roles they play in each one. I haven’t yet had time to survey participants in other initiatives (and, truth be told, I probably won’t have time to do so in the near future), so the responses to the next few questions can offer only a partial glimpse of the kinds of initiatives crowdsourcing participants get involved in. Here’s a table illustrating the responses to the question about which crowdsourced translation initiatives respondents had participated in. As expected, virtually all respondents had helped translate for Wikipedia. The one respondent who did not report translating for Wikipedia participated in Translatewiki.net, with a focus on MediaWiki, the wiki platform originally designed for Wikipedia.

Initiative | No. of respondents | Percentage
Wikipedia | 75 | 98.7%
Facebook | 16 | 21.3%
Free/Open-source software projects (software localization and/or documentation translation for F/OSS projects such as OmegaT, Concrete5, Adium, Flock, Framasoft) | 7 | 9.2%
Translatewiki.net | 2 | 2.7%
TEDTalks | 2 | 2.7%
The Kamusi Project | 1 | 1.3%
iFixit | 1 | 1.3%
Forvo | 1 | 1.3%
Translated.by | 1 | 1.3%
Anobii | 1 | 1.3%
Science-fiction fandom websites | 1 | 1.3%
Traduwiki | 1 | 1.3%
Orkut | 1 | 1.3%
Der Mundo (Worldwide Lexicon) | 1 | 1.3%
The Lied, Art Song, and Choral Texts Page | 1 | 1.3%

A few points struck me as interesting. First, I was surprised to see that respondents had participated in such a diverse range of projects. I had expected that because Wikipedia was a not-for-profit initiative, participants would be less likely to have helped translate for for-profit companies like Facebook and Twitter; however, after Wikipedia, Facebook was the initiative that had attracted the most participants. Second, I was intrigued by the fact that almost 10% of respondents were involved in open-source software translation/localization projects. I hypothesized that the respondents who had reported working in the IT sector or studying computer science would be the ones involved in the F/OSS projects, but that was not always the case: when I broke down the data, I found that people from a variety of fields (a high school student, an economics student, two medical students, a translator, a software developer, a fundraiser, etc.) had helped translate/localize F/OSS projects. I think these results really indicate a need to study F/OSS translation projects specifically, to see whether the Wikipedia respondents are representative of the participants in those projects.

Next, I asked respondents how they had participated in crowdsourced translation projects (as translators, revisers, project managers, etc.) and how much time per week, on average, they had spent participating in their last crowdsourced translation initiative.

Here’s a graph illustrating how respondents had participated in various crowdsourced translation projects. They were asked to select all ways they had been involved, even if it varied from one project to another. This means that the responses are not indicative of participation in Wikipedia alone:
[Chart: roles played in crowdsourced translation projects]

As the graph shows, translation was the most common means of participation, but that wasn’t surprising, because I had invited respondents based on whether they had translated for Wikipedia. However, a significant number of respondents had also acted as revisers/editors, and some had participated in other ways, such as providing links to web resources and participating in the forums. I think this graph shows how crowdsourced translation initiatives allow people with various backgrounds and experiences to participate in ways that match their skills: for instance, someone with weaker second-language skills can help edit the target text in his or her mother tongue, catching typos and factual errors. And someone with a background in a particular field can share links to resources or answer questions about concepts from that field, without necessarily having to do any translating. So when we speak of crowdsourced translation initiatives, it’s important to consider that these initiatives allow for more types of involvement than translating in the narrow sense of providing a target-language (TL) equivalent for a source text (ST).

Finally, I asked participants how many hours they spent on average, per week, participating in the last crowdsourced translation initiative in which they were involved. Here’s a graph that illustrates the answers I received:
[Chart: hours per week spent participating]

As you can see, most respondents spent no more than five hours per week participating in a crowdsourced translation initiative. On the surface, this may seem to provide some comfort to the professional translators who object to crowdsourcing as a platform for translation, since these Wikipedia respondents did not spend enough time per week on translation to equal a full-time job; however, hundreds of people volunteering four or five hours per week can still produce enough work to replace several full-time professionals. Not-for-profit initiatives like Wikipedia, where article authors, illustrators and translators all volunteer their time, are probably not as problematic for the profession, since professional translators would probably never have been hired to translate the content anyway, but for-profit initiatives such as Facebook are more ethically ambiguous. I’ve discussed some of these ethical problems in an article that will be published in Linguistica Antverpiensia later this year, in an issue focusing on community translation.

In a few weeks, I’ll post the results of the last few survey questions, the ones focusing on motivations for participating, the rewards/incentives participants have received and the effect(s) their participation has had on their lives and careers.

Wikipedia survey I (Respondent profiles)

This is the first in a series of posts about the results of my survey of Wikipedians who have translated content for the Wikimedia projects (e.g. Wikipedia). Because I’ve already submitted an article analyzing the survey, these posts will be less analytical and more descriptive, although I will be able to discuss some of the survey questions I didn’t have space for in the paper. This post will look at the profiles of the 76 Wikipedians who responded to the survey (and whom I’d like to thank once again for their time).

Survey Methodology
I wanted to randomly invite Wikipedia translators to complete the survey, so I first consulted various lists of English translators (e.g. the Translators Available page and the Translation/French/Translators page) and added these usernames to a master list. Then, for each of the 279 language versions on the List of Wikipedias page*, I searched for a Category: Translators page for translations from that language into English (e.g. Category: Translators DE-EN, Category: Translators FR-EN, etc.). I added the usernames from the Category: Translators pages to the names on the master list, and removed duplicate users. This process led to a master list with the names of 1866 users who had volunteered to translate Wikipedia content into English. I then sent out invitations to 204 randomly selected users from the master list, and 76 (or 37%) of them responded. A few caveats: additional Wikipedians have probably translated content for the encyclopedia without listing themselves on any of the pages I just mentioned. Moreover, anyone can generally edit (and translate) Wikipedia pages without creating an account, so the results of the survey probably can’t be generalized to all English Wikipedia translators, let alone Wikipedia translators into the other 280 languages, who are not necessarily listed on the English Wikipedia pages I consulted. Finally, although 76 Wikipedians may not seem like many respondents, it is important to note that many of the users on the master list did not seem to have ever translated anything for Wikipedia: when I consulted their user contribution histories, I found that some Wikipedians had added userboxes to their profile pages to indicate their desire to translate but had not actually done anything else. I was interested only in the views of people who had actually translated, so the 76 respondents represent a much larger share of actual Wikipedia translators than the raw number suggests.
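For the curious, here is a minimal sketch of the de-duplication and random-sampling step I’ve just described, assuming the usernames have already been copied from the Translators Available page and the various Category: Translators pages into plain Python lists. The usernames and the seed in the example are purely hypothetical:

```python
import random

def build_sample(username_lists, sample_size=204, seed=None):
    """Merge per-page username lists, drop duplicates, and draw a random sample of invitees."""
    master = sorted(set().union(*username_lists))  # deduplicated master list
    rng = random.Random(seed)                      # seeding makes the draw reproducible
    return rng.sample(master, min(sample_size, len(master)))

# Hypothetical usage with two tiny scraped lists that share one user.
translators_fr_en = ["ExampleUserA", "ExampleUserB", "ExampleUserC"]
translators_de_en = ["ExampleUserB", "ExampleUserD"]
invitees = build_sample([translators_fr_en, translators_de_en], sample_size=3, seed=42)
print(invitees)
```

Seeding the random generator is simply a convenience so the same sample can be drawn again later if the invitations ever need to be re-checked.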

Profiles
The vast majority of the respondents (64 respondents, or 84%) were male, and most were 35 years of age or younger (57 respondents, or 75%, were under 36). This result is not surprising, given the findings of a 2008 general survey of more than 176,000 Wikipedia users, where 50% of the respondents were 21 years of age or under (in all, 76% were under 30) and 75% were male.

When respondents were asked about translation-related training, most (51 respondents, or 68%) responded that they had no formal training in translation. Here’s a graph with a breakdown for each response:
[Chart: translation-related training]

Given that respondents were generally young and usually did not have formal training in translation, it’s not surprising that 52 of the 76 respondents (68.4%) had never worked as translators (i.e. they had never been paid to produce translations). Only 11 respondents (or about 14%) were currently working as translators on a full- or part-time basis, while 13 (or about 17%) had worked as translators in the past but were not doing so now. So it’s not surprising either that only two respondents were members of a professional association of translators.

Finally, when asked about their current occupations, respondents reported working in a range of fields. I’ve grouped them as best I could, using the occupational structure proposed by Human Resources and Skills Development Canada. Two respondents did not answer this question, but here’s an overview of the other 74 responses:

Occupation | No. of respondents | Percentage
Students (6 high school; 4 college/university, languages; 17 college/university, other fields) | 27 | 36%
Works in IT sector | 11 | 15%
Works in language industry | 9 | 12%
Works in another sector (e.g. graphic design, law, education) | 8 | 11%
Works in business, finance or administration | 7 | 9%
Unemployed/stay-at-home parent/retired | 5 | 7%
Academic | 3 | 4%
Engineer | 2 | 3%
Works in sales and service industry | 2 | 3%
Total | 74 | 100%

Later this week (or early next week), I’ll look at the types of crowdsourced translation initiatives the respondents were involved in (other than Wikipedia, of course), and the roles they played in these initiatives. After that, I’ll discuss respondent motivations for volunteering and the impact their participation has had on their lives.


* There are now 281 Wikipedia versions.