A Tale of Two Online Courses, Part I

Now that the fall term is over and I’ve (finally!) finished marking the tests, assignments and essays that were submitted during the last week of classes, I’m ready to sit down and write a few blog posts about my experiences in the (virtual) classroom over the past 13 weeks. Among the courses I taught this term were two that were offered online: one, a practical specialized translation course for undergraduate students, and the other, a theoretical Translation Studies course for graduate students. Although they were designed and delivered in a similar way, I thought the undergraduate course was much more successful. In this series of posts, I’ll be discussing why the two courses had such different results.

In this post, I’ll focus on two aspects of the courses: content delivery and deadlines.

1. Content delivery: Video vs. audio
As a platform for the two courses, WordPress worked out better than WebCT or Moodle, both from my point of view (creating and uploading the content, managing discussions, organizing information, etc.) and from the students’ (finding information, accessing videos, leaving comments, etc.). In the undergraduate practical translation class, I mainly uploaded a series of two to five 5-minute videos every week to go over the homework and/or briefly lecture on the week’s topic. In the theoretical master’s course, I mainly uploaded an .mp3 file each week with a 10-15 minute recorded lecture.

Videos
In my last post about tools for the classroom, I mentioned that I was using Screenr to record the 5-minute videos, and now that the term is over, I can say that I’m happy with the results. The videos were easy to record and upload, and the WordTube plugin for WordPress let me embed a video player on the relevant webpage and organize the videos into playlists, so that each video focused on one short segment of the week’s lesson (e.g. one part of the translation homework, one or two slides from a PowerPoint presentation) and the clips played in a logical order. Here’s an example of a video I posted in the first week, describing the requirements for submitting assignments and tests:

[Embedded video: requirements for submitting assignments and tests]

Feedback from students about these videos was generally positive, though they did point out some shortcomings as well. Here’s what some of them had to say:

I wasn’t a big fan of the fact that the videos were short and that there were many of them. However, as the course went on, I found it less bothersome because if you are trying to go back to a video to reference material within it, it is much easier to find what you are looking for in multiple […] 5-minute clips as opposed to one long clip. I was also glad that the videos were not removed from the course website, allowing me to go back and watch them again at my convenience or if/when I needed them.

I am glad you posted videos because I prefer listening to lessons instead of reading them on a screen. However, I feel like if we had in class time I would have learned more and feel like I was improving more as the semester went on.

What I disliked about the class was the weekly videos. I’m sure they work well for some people but I like having notes I can reference as opposed to go back and listen to the video every time I forget something.

I liked watching the videos because you could always go back and watch something over again if you didn’t understand something too well or if the discussion question was relevant to something in the lectures, it was always a good option to watch it again.

I liked the videos, however, I would have loved to have a podcast option I could take with me anywhere. The videos required both an Internet connection and Flash, which limited their portability. I would have loved to have listened to the audio while following the PowerPoint on my iPad on the train.

If I taught this course again in an online format, I would definitely integrate a podcast option–probably a downloadable .mp3 file. It was something I had thought about but just didn’t have time to implement. But the videos were a good fit for the course, allowing me to verbally and visually illustrate points much more effectively than I could have with just written notes.

Audio recordings
In the theoretical master’s course, I didn’t use videos because the 5-minute limitation was too restrictive. In addition, I didn’t really need the visuals in this class, since I wasn’t going over homework or pointing out relevant websites. The disadvantage of the audio recordings was that I wasn’t able to see how often they were accessed, unlike the videos, for which I could check viewing statistics. Moreover, I found it difficult to sync the audio with a PowerPoint presentation, so I ended up just providing the audio recordings each week. I don’t know whether students found these hard to follow, although some did tell me they found the recordings helpful. In the future, if this course is offered online again, I would probably make a greater effort to match the lecture with slides, so that students could download the recording if they wanted to listen to it on the go, or listen to it while flipping through the accompanying slides.

2. Deadlines
Teaching these two courses taught me the importance of firm, enforced deadlines in online courses. When preparing the syllabus for my undergraduate course, I decided to encourage weekly participation by setting strict deadlines on when work could be submitted: almost every week, students were expected to respond to a discussion question, submit a short translation, and comment on one other student’s translation. They were awarded one mark for completing each of these homework components, and together, these participation marks were worth 25% of their final grade. If they submitted everything, every week, they would earn the full 25%, which just over half of the 26 students did. However, they were not allowed to go back to previous weeks and make up missed participation: I wanted to make sure students were keeping up with the course on a weekly basis, so I didn’t award any retroactive marks. Even so, only a few students earned 15/25 or less on this aspect of the course. The mean participation mark was 21.62/25.

By comparison, in the master’s course, I set aside 15% of the final mark for participation, divided evenly among three tasks: responding to weekly discussion questions, providing feedback to other students when they submitted a critical summary of one of the theoretical texts, and responding to the feedback they received from other students. In this course, however, I did not specify that no marks would be awarded retroactively, as I had assumed master’s students would be more motivated to keep on top of the work. The result? On a weekly basis, participation could best be described as abysmal. Only 2-3 of the 7 students originally enrolled in the class regularly posted their responses to the discussion questions, and no student responded to all 10. In some cases, students answered none of the discussion questions until the final week, which of course prevented other students from engaging with these responses. Because I had offered too much flexibility around the deadlines, participation was lacking, despite my weekly emails to the students reminding them about the work to be completed. (The flexible deadlines may not have been the only reason, but they certainly played a part.)

As students in my undergraduate class noted, an online course requires much more self-motivation than one taught in the classroom:

You also have to be somewhat more self-motivated in an online class, because I find that while you’re aware of the submission dates, you might not set aside the same time for it each week, since you don’t have to be there. So it can feel disjointed, in terms of “did I pay enough attention to that material” before answering.

I understand that participation marks are required and I know it definitely motivated me to stay engaged in the course

I really enjoyed the discussion questions as well, which you don’t always get to in the classroom when you only have an hour and a half to take up a translation. I think I got more out of this delivery method than I would have from a classroom experience where you painstakingly go through a text line by line and everybody asks about all their word choices. For my learning style, I found this method more engaging and more stimulating. That being said, it did definitely require a lot more self-motivation, so I think the participation marks were essential.

I really didn’t like the fact that the course was online. I’m a lot more involved when I go to a classroom and discuss course material as a group. In fact, I missed many of the participation marks because since I didn’t have to go to a physical classroom, I would sometimes forget about this course for a few days. I’m usually a better student than that!

These comments, along with the differing participation in my two classes, have really clarified for me the importance of encouraging student participation in online courses by setting clear, enforced deadlines for any work that needs to be submitted. It may also be helpful to remind students early on about the importance of keeping up with the coursework. Encouraging them to meet in person, perhaps in small study groups, might also help them remember to complete the homework each week.

In my next post, I’ll look more specifically at feedback to/from students and how it differed in each class, and I’ll discuss some of the suggestions students offered for future courses.

Electronic tools for the classroom

In the lead-up to September, I spent quite a bit of time tweaking the course websites for the three classes I’m teaching this semester. And as I resolved last year, I’ve stopped using WebCT and moved to WordPress instead–not just for my in-the-classroom course (Introduction to Translation into English), but also for my two online courses (Specialized Translation and Translation Studies). Thus far, I’m happy with the platform, and I think it will work well, but I’ll be sure to post a follow-up article in April, once I’ve had a real semester to test out the websites.

For this post, I wanted to share some of the tools I’ve been using for my online specialized translation course, in case other instructors are looking for solutions they can apply to their own courses.

Screenr
I found this online tool when I was looking into options for posting audio or video recordings for my two online courses. PowerPoint, of course, allows you to create narrations to accompany a presentation, but what happens if you want to show students how to use the course website or an online tool? It’s a lot easier to do this if you can simply record what happens on your own computer screen as you demonstrate the process. Screenr allows you to record screen-capture videos, along with your audio commentary, and has several advantages: first, you don’t need to create an account to use it, since you can log in using your Google, Yahoo, Twitter or Facebook account. This means you can start recording screencasts almost instantly. Second, it allows you to link to the video hosted on Screenr, post the video to YouTube, or download the .mp4 file (and, if you want, to delete the video from the Screenr server).

There are, however, several restrictions: first, you can record a maximum of five minutes for each video; second, the videos need to be uploaded to the Screenr server before they can be used; and third, the videos are automatically shared with the community, whether you want them private or not. I’ve been able to work around these restrictions with very few headaches, though. In the first case, I just record a series of videos on a topic and then load them up in a playlist. The students may even like being able to commit to watching just five minutes at a time… I’ll have to ask later in the term.

The uploading problem is a little more annoying. Every time I record a five-minute video, I have to upload the 8-10MB file to the Screenr server just so that I can download it to my own computer and then re-upload it to the course website. Total prep time for a finished 5-minute, 8-12MB video posted onto the course website: about 15 minutes–or more if I need to re-record the video a few times to get it right. I just do some other prep work while I’m waiting for the uploading/downloading to finish. Of course, if I didn’t mind sharing the videos, I could just post a link to the video stored on the Screenr server; other people may not find this problem as much of a hurdle as I do.

As for the third problem (automatic sharing of videos), I just download the video as soon as it’s finished uploading to the Screenr server, and then I delete it from the Screenr site. It’s generally available to the public for only 30 seconds to a minute (though of course it may hang around on the Screenr servers for a little longer); I’m mainly interested in ensuring the videos aren’t available for public viewing. Other instructors may have different preferences, so maybe this particular Screenr feature isn’t seen as a drawback by everyone.

Wordle
A few months ago, I came across a very interesting blog post on ActiveHistory.ca showing how Wordle takes a text, removes common words like “the” and “a”, and creates an image that uses different font sizes according to the frequency with which each word appears in the text: the larger the font, the more frequent the word. While the original blog post discussed how to use Wordle to visually represent word frequencies in historical or political texts to help show changes in political vocabulary over time, the application can be used for translation classes too. For instance, here are two word clouds I generated using Jean Charest’s February 23, 2011 inaugural address in French and in English:
[Word cloud: Jean Charest, discours d’ouverture, 23 février 2011]
[Word cloud: Jean Charest, inaugural address, 23 Feb. 2011]

As you can see, the images (which can be generated in just a few seconds, with no need to create an account) help show differences in the most common words from the two versions of Charest’s address. Some of the largest words in the French image are Québec, développement, québécois and gouvernement, just as the largest English words are Québec, development, Quebecers and government; but words like new, future and better are different sizes in the two texts, though this is also due to the fact that Wordle does not distinguish between variations on a root word (e.g. “nouveau,” “nouvelle,” “nouveaux,” etc. are treated as separate words, but they could all be translated by “new”). Obviously a corpus-analysis tool like WordList would give more accurate results, but Wordle offers a quick visual representation of word frequency that could be used to help students look at a source text in a more analytical way, and it allows them to do this very quickly and without needing to purchase or learn how to use any specialized software. I’ll be using some Wordle-created word clouds of political manifestos that we’ll be translating later this term to have students think about the importance of lexical choices in the ST and to look for trends. But I can certainly think of other applications. For instance, students could use it to compare adapted texts to see what keywords have changed between the ST/TT versions. They could also use it to look at texts on the same topic to see how keywords change over time, from one organization to another, from one language to another, etc. For an undergraduate class, this tool could be a helpful way to get students to start reflecting on text function and lexical choices, which would then allow them to think about how to deal with these choices when translating.
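For readers who are curious about what a tool like Wordle is doing behind the scenes, here is a minimal Python sketch of the same idea: count how often each word appears in a text after stripping out common function words. The tiny stopword list and the file name are purely illustrative; Wordle’s own stopword handling and rendering are, of course, more sophisticated.

```python
from collections import Counter
import re

# A tiny illustrative stopword list; Wordle's own list is much longer
# and language-aware.
STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "is", "that",
             "le", "la", "les", "de", "des", "du", "et", "un", "une"}

def word_frequencies(text, top_n=20):
    """Return the top_n most frequent words, ignoring stopwords."""
    words = re.findall(r"[a-zàâçéèêëîïôöûùüÿœ'-]+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return counts.most_common(top_n)

# Hypothetical file name: a plain-text copy of one version of the speech.
with open("charest_inaugural_en.txt", encoding="utf-8") as f:
    for word, freq in word_frequencies(f.read()):
        print(f"{word}\t{freq}")
```

Running this on the French and English versions side by side gives roughly the same information as the word clouds, minus the visual impact, and it makes the root-word issue obvious: “nouveau,” “nouvelle” and “nouveaux” each get their own count.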

Has anyone used Wordle in a translation course before? I’ll update in a few weeks, once I’ve had a chance to try it out with a group of students. What other tools have you found useful in translation classes?

Wikipedia survey IV (Motivations)

While I’ve still got the survey open in my browser, I thought I’d finish writing about the results. This last post will look at the motivations the 76 respondents gave for translating, editing or otherwise participating in a crowdsourced translation initiative. (I should point out that although the question asked about the “last crowdsourced translation initiative in which [respondents] participated”, 63 of the 76 respondents (83%) indicated that Wikipedia was the last initiative in which they had participated, so their motivations are mainly for Wikipedia, with a few for Pirate Parties International, nozebe.com, open-source software, iFixit, Forvo, and Facebook.)

The survey asked two questions about motivations. Respondents were first asked to select up to four motivations for participating.[*] They were then given the same list and asked to choose just one motivation. In both cases, they were offered motivations that can be described as either intrinsic (done not for a reward but rather for enjoyment or due to a sense of obligation to the community) or extrinsic (done for a direct or indirect reward). They were also allowed to select “Other” and add their own motivations to the list, as 11 respondents chose to do.

When I looked at the results, it became clear that most respondents had various reasons for participating: only 4 people chose just one motivation when they were allowed to list multiple reasons (and one person skipped this question). All four wanted to make information available to others. Here’s a chart showing which motivations were most commonly cited:
[Chart: motivations for participating (respondents could select up to four)]

As the chart shows, intrinsic motivations (making information available to others, finding intellectual stimulation in the project, and supporting the organization that launched the initiative) were the motivations most often chosen by respondents. However, a significant number also had extrinsic reasons for participating: they wanted to gain more experience translating or practice their language skills. In the article I wrote about this survey, I broke these motivations down by type of respondent (those who had worked as professional translators vs. those who had not), so I won’t go into details here, except to say that there are some differences between the two groups.

Respondents who chose “Other” described various motivations: one was bored at work, one wanted “to be part of this network movement”, one wanted to improve his students’ translation skills by having them translate for Wikipedia, two thought it was fun, one wanted to quote a Wikipedia article in an academic work but needed the information to be in English, and three noted that they wanted to help or gain recognition within the Wikipedia community. Some more detailed motivations (often with a political/social emphasis) were also cited, either with this question, or in the final comments section:

I am not a developer of software, but I am using it for free. To translate and localise the software for developers is a way to say thank you – Only translated software has a chance to spread and prosper – I get to know new features and/or new software as soon as it is available

As a former university teacher I believe that fighting ignorance is an important way of making world a better place. Translating local knowledge into trans-national English is my personal gift for the humanity 🙂

I’m not sure how you found me because I’m pretty sure I only translated one Wikipedia page… I did it mainly because the subject of the article is almost unknown in the Jewish world, and I wanted more people to know about her and one of the few ways in which I can help make her story more widely known is by translating it into French. That being said I think I’ll try to do more!

The main reason I became involved in crowdsourced translation is that, in my opinion, the translation of science involves more than linguistic problems. It also requires an awareness of context; of why the scientific activities were undertaken, as well as how they fit into the “world” to which they belong. Many crowdsourced translation projects do not take this into account, treating the translation of science as a linguistic problem. This is fallacious. So I participate to fix the errors that creep in.

My translations are generally to make information freely available, especially to make Guatemalan cultural subjects available in Spanish to Guatemalan nationals.

I taught myself German, by looking up every single word in a couple of books I wanted to read about my passionate hobby. I have translated a couple of books in that hobby for the German association regarding that hobby (gratis). Aside from practice, practice, practice, I have had no training in translation. I began the Wiki translations when I was unemployed for a considerable amount of time and there was an article in the German Wiki on my hobby that had a tiny article in English. The rest is history. It’s been a few years since I’ve contributed to Wikipedia, but it was a great deal of fun at the time. Translation is a great deal of work for me (I have several HEAVY German/English dictionaries), but I love the outcome. Can I help English speakers understand the information and the beauty of the original text?

There were very few Sri Lankans editing on English Wikipedia at that time and I manage to bring more in and translate and put content to Wikipedia so other language speakers can get to know that information. I was enjoying my effort and eventually I got the administrator-ship of Sinhala Wikipedia. From then onwards I was working there till I had to quit as I was started to engage more with my work and studies. Well that’s my story and I’m not a full time translator and I have no training or whatsoever regarding that translating.

As these comments show, the respondents often had complex reasons for helping with Wikipedia translations. Some saw it as an opportunity to disseminate information about certain language, cultural or religious groups (e.g. Guatemalans, Sri Lankans) to people within or outside these communities; others wanted to give back to communities or organizations they believed in (for instance, by helping other Wikipedians or by giving free/open-source software a wider audience). But intrinsic reasons seem most prominent. This is undoubtedly why, when respondents were asked to select just one reason for participating in a crowdsourced translation initiative, 47% chose “To make information available to language speakers”, 21% said they found the project intellectually stimulating, and 16% wanted to support the organization that launched the initiative. No one said that all of their previous responses were equally important, which shows that while many motivations are at play, some played a more significant role than others in respondents’ decisions to volunteer for Wikipedia (and other crowdsourced translation initiatives).

That’s apparent, too, in the responses I received for the question “Have you ever consciously decided NOT to participate in a crowdsourced translation initiative?” The responses were split almost evenly between Yes (49%) and No (51%). The 36 respondents who said Yes were then asked why they had decided not to participate, and what initiative they hadn’t wanted to participate in. Here’s a chart that shows why respondents did not want to participate:
[Chart: motivations for not participating (respondents could select up to four)]

Unlike the previous question, where only a few respondents chose one or two motivations for participating, 15 of the 36 respondents chose only one reason, and 11 chose only two, to explain why they decided not to participate (although they could have chosen up to four motivations). This means that almost 75% of these respondents did not feel that their motives for not participating were as complex as their motives for participating. (Of course, it’s also possible that because this was one of the last questions on the survey, respondents were answering more quickly than they had earlier.) I had expected that ideological reasons would play a significant role in why someone would not want to participate in a crowdsourced translation initiative (i.e. that most respondents, being involved in a not-for-profit initiative like Wikipedia, would have reservations about volunteering for for-profit companies like Facebook), but the most common reason respondents offered was “I didn’t have time” (20 respondents, or 56%), followed by “I wasn’t interested” (12 respondents, or 33%). Only 7 didn’t want to work for free (in four cases, it was for Facebook, while the 3 other respondents didn’t mention what initiative they were thinking of), and only 9 said they didn’t want to support the organization that launched the initiative (Facebook in four cases, a local question-and-answer type service in another, Wikia and Wikipedia in two other cases). There was some overlap between these last two responses: only 12 respondents in all indicated that they didn’t want to work for free and/or support a particular organization.

I think these responses show how attitudes toward crowdsourced translation initiatives are divided, even among those who have participated in the same ones. Although 16 respondents had translated for Facebook (as I discussed in this post), and therefore did not seem ideologically opposed to volunteering for a for-profit company, 12 others had consciously decided not to do so. And even though respondents most commonly said they didn’t participate because they didn’t have time, we have seen that many respondents participated in Wikipedia translation projects because they found it satisfying, fun, challenging, and because they wanted to help disseminate information to people who could not speak the language in which the information was already available. So factors like these must also play a role in why respondents might not participate in other crowdsourced translation initiatives.

On that note, I think I’ll end this series of blog posts. If you want to read more about the survey results, you’ll have to wait until next year, when my article appears in The Translator. However, I did write another article about the ethics of crowdsourcing, and that’s coming out in Linguistica Antverpiensia in December, so you can always check that one out in the meantime. Although I was hoping to conduct additional surveys with participants in other crowdsourced translation initiatives like the TED Open Translation Project, I don’t think I’ll have time to do so in the near future, unless someone wanted to collaborate with me. If you’re interested, you can always email me to let me know.

[*] The online software I used for the survey didn’t allow me to prevent respondents from selecting more than four reasons. However, only 14 people did so: of the 76 respondents, 4 chose 5 reasons, 7 chose 6 reasons, and 3 chose 7 reasons. I didn’t exclude these 14 responses because the next question limited respondents to just 1 reason.

Wikipedia survey III (Recognition, Effects)

It’s been quite some time now since my last post about the Wikipedia survey results, and for that I must apologize. I was side-tracked by some unrelated projects and found it hard to get back to the survey. But I’ve just finished revising my article on this topic (which will be published in the November 2012 issue of The Translator), and that made me sit down to finish blogging about the survey results. This is the third of four posts. I had planned to look at motivations, effects and recognition all in one post, but it became too long, so I’ve split it into two. This one looks at the ways respondents were recognized for participating in crowdsourced projects and what impact (if any) their participation has had on their lives. The next one (which I will post later this week) looks at respondents’ motivations for participating in crowdsourced initiatives.

For anyone who comes across these posts after the article is published, I should mention that the discrepancy between the number of survey respondents in the article and on this blog (75 vs. 76) is because I received another response to the survey after I’d submitted the article for peer review. It was easier to include all 76 responses here, since I’m creating new graphs and looking at survey responses I didn’t have space to explore in the Translator article, but I didn’t update the data in the article because the new response didn’t change much on its own (+/-0.5% here and there) and would have required several hours’ work to recalculate the figures I cited throughout the 20+ pages.

I also want to thank Brian Harris for discussing these survey results on his blog. You can read his entry here or visit his blog, Unprofessional Translation, to read a number of very interesting articles about translation by non-professionals, including those working in situations of conflict, war, and natural disasters.

And on to the survey results:

Recognition
The survey asked respondents what (if any) recognition they had received for participating in a crowdsourced translation initiative. Although the question asked about the last initiative in which respondents had participated (rather than Wikipedia in particular), 63 of the 76 respondents indicated that Wikipedia was the last initiative in which they had been involved, so the responses are mainly representative of the recognition they received as Wikipedia translators. Here’s a chart summarizing the responses:
[Chart: recognition received for participating]
As the chart illustrates, no respondents received financial compensation, either directly, by being paid for their work, or indirectly, by being offered a discount on their membership fees or other services. This really isn’t surprising, though, because most respondents were Wikipedia translators, and contributors to Wikipedia (whether translators or writers) are not paid for their work. In addition, since Wikipedia does not charge membership fees, there is nothing to discount. Unexpectedly, though, 20 respondents reported receiving no recognition at all–even though 17 of them listed Wikipedia as the last initiative in which they had been involved. Because Wikipedia records the changes made to its pages, anyone who had translated an article would have been credited on the history page. These 20 respondents may not have been aware of the history feature, or–more likely–they didn’t consider it a form of recognition.[*]

Receiving credit for the translation work (either directly beside the translation or via a profile page) was the most common type of recognition. Of the 18 respondents who selected “Other”, 10 reported being credited on the Wikipedia article’s history page, 1 said their name appeared in the translated software’s source code, 1 noted they had received some feedback on the Wiki Talk page, 1 mentioned receiving badges from Facebook, and the others mentioned their motivations (e.g. just wanted to help, translation became better, could refer to the translation in other academic work) or the effect their involvement had on their careers (e.g. a higher rate of pay for translating/interpreting duties at work). I discuss the advantages and disadvantages of this enhanced visibility for translators and translation in an article that will appear in Linguistica Antverpiensia later this year, so I won’t elaborate here, except to say that crediting translators and providing a record of the changes made to translations make translation a more visible activity and provide researchers with a large corpus of texts that can be compared and analyzed. In fact, I think Wikipedia’s translations are an untapped wealth of material that can help us better understand how translations are produced and revised by both professional and non-professional translators.
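As an aside for anyone who wants to explore that revision history programmatically, here is a minimal Python sketch that queries the public MediaWiki API for the most recent revisions of an article, including the username credited for each change. The article title is just an example, and a real corpus study would page through the full history and retrieve the revision text as well.

```python
import requests

API_URL = "https://en.wikipedia.org/w/api.php"

def recent_revisions(title, limit=10):
    """Fetch metadata for the most recent revisions of a Wikipedia article."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "user|timestamp|comment",
        "rvlimit": limit,
        "format": "json",
    }
    data = requests.get(API_URL, params=params).json()
    page = next(iter(data["query"]["pages"].values()))
    return page.get("revisions", [])

# Example article title, chosen purely for illustration.
for rev in recent_revisions("Translation"):
    print(rev["timestamp"], rev["user"], rev.get("comment", ""))
```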

Effects
Finally, I asked respondents whether/how their participation in a crowdsourced translation initiative had impacted their lives. Here’s another chart that summarizes the results:
[Chart: impact of participation on respondents’ lives]
I was surprised to see that 38 respondents (or 51%) didn’t feel their participation had had any sort of impact: after all, why would they continue volunteering if they were not getting something (even personal satisfaction) out of the experience? However, this may be a problem with the question itself, as I hadn’t listed “personal satisfaction” as an option. If I had (and I would definitely make this change in the next survey), the responses might have been different. As it is, of the 16 respondents who selected “Other”, 8 indicated that participating gave them personal satisfaction, a sense of pride in their accomplishments, a feeling of gratification, etc. Here are a few of their comments:

Pride in my accomplishments, although I am an amateur translator. I did some cool stuff!

I have the immense satisfaction of knowing that my efforts are building a free information project and hope that my professionalism raises the quality bar for other contributors who see my work (e.g. edit summaries, citations of sources, etc.)

I was spending my spare time on Wikipedia and sharing my knowledge. Moreover I was enjoying what I was doing. That’s it.

As for the rest of the responses in the “Other” category: One person noted that they had been thanked by other Wikipedia users for the translation, another remarked that they had been thanked by colleagues for contributing to “open-source intellectual work”, two said they had learned new things, one had met new Facebook friends, one said they had been asked to do further translation work for the project, two noted they had been invited to participate in this survey, and one (a part-time translation professor) said “My students consider my classes as a useful and positive learning experience” because they help translate for Wikipedia together.

Nearly 1/3 of respondents (22 of the 76) felt they had received feedback that helped improve their translation skills, and I think this point is important: the open nature of Wikipedia (and many other crowdsourced projects) provides an excellent forum for exchanging ideas and commenting on the work of others. But this is also a point that deserves further study, since so few of the respondents reported having training or professional experience translating.

Interestingly, some of the more tangible effects of participating in a crowdsourced initiative, such as receiving job offers and meeting new clients or colleagues, were not often experienced by the survey respondents. I wonder whether the results would be the same if this survey were given to participants in other types of initiatives (translation of for-profit websites such as Facebook, or localization of Free/open-source software such as OmegaT). The results do show, however, that volunteering for crowdsourced translation initiatives has had some positive (and a few negative) effects on the careers and personal lives of the participants, and that personal satisfaction is also an important motivator.

[*] An interesting aside is that of the 20 respondents who reported receiving no recognition, 5 also indicated they had received other forms of recognition, such as their names appearing beside the translation, an updated profile page, or feedback on their work. Respondents may have been thinking of all projects in which they had been involved, instead of the last one, which the question asked about. These 5 respondents all indicated that Wikipedia was the last initiative in which they had been involved.

Wikipedia survey II (Types of Participation)

This is a follow-up to last month’s post describing preliminary results from a survey of Wikipedia translators. To find out about the survey methodology and the respondent profiles, please read this post first.

I initially planned for this survey to be one of several with translators from various crowdsourced projects, so I wrote the participation-related questions hoping to compare the types of crowdsourced translation initiatives people decide to participate in and what roles they play in each one. I haven’t yet had time to survey participants in other initiatives (and, truth be told, I probably won’t have time to do so in the near future), so the responses to the next few questions will have to be only a partial glimpse of the kinds of initiatives crowdsourcing participants get involved in. Here’s a table illustrating the responses to the question about which crowdsourced translation initiatives respondents had participated in. As expected, virtually all respondents had helped translate for Wikipedia. The one respondent who did not report translating for Wikipedia participated in Translatewiki.net, with a focus on MediaWiki, the wiki platform originally designed for Wikipedia.

Initiative No. of respondents Percentage
Wikipedia 75 98.7%
Facebook 16 21.3%
Free/Open-source software projects (software localization and/or documentation translation for F/OSS projects such as OmegaT, Concrete5, Adium, Flock, Framasoft) 7 9.2%
Translatewiki.net 2 2.7%
TEDTalks 2 2.7%
The Kamusi Project 1 1.3%
Ifixit 1 1.3%
Forvo 1 1.3%
Translated.by 1 1.3%
Anobii 1 1.3%
Science-fiction fandom websites 1 1.3%
Traduwiki 1 1.3%
Orkut 1 1.3%
Der Mundo (Wordwide Lexicon) 1 1.3%
The Lied, Art Song, and Choral Texts Page 1 1.3%

A few points I found interesting. First, I was surprised to see that respondents had participated in such a diverse range of projects. I had expected that because Wikipedia was a not-for-profit initiative, participants would be less likely to have helped translate for for-profit companies like Facebook and Twitter; however, after Wikipedia, Facebook was the initiative that had attracted the most participants. Second, I was intrigued by the fact that almost 10% of respondents were involved in open-source software translation/localization projects. I hypothesized that the respondents who had reported working in the IT sector or studying computer science would be the ones involved in the F/OSS projects, but that was not always the case: when I broke down the data, I found that people from a variety of fields (a high school student, an economics student, two medical students, a translator, a software developer, a fundraiser, etc.) had helped translate/localize F/OSS projects. I think these results really indicate a need to specifically study F/OSS translation projects to see whether the Wikipedia respondents are representative of the participants.

Next, I asked respondents how they had participated in crowdsourced translation projects (as translators, revisers, project managers, etc.) and how much time per week, on average, they had spent participating in their last crowdsourced translation initiative.

Here’s a graph illustrating how respondents had participated in various crowdsourced translation projects. They were asked to select all ways they had been involved, even if it varied from one project to another. This means that the responses are not indicative of participation in Wikipedia alone:
[Chart: roles played in crowdsourced translation projects]

As the graph shows, translation was the most common means of participation, but that wasn’t surprising, because I had invited respondents based on whether they had translated for Wikipedia. However, a significant number of respondents had also acted as revisers/editors, and some had participated in other ways, such as providing links to web resources and participating in the forums. I think this graph shows how crowdsourced translation initiatives allow people with various backgrounds and experiences to participate in ways that match their skills: for instance, someone with weaker second-language skills can help edit the target text in his or her mother tongue, catching typos and factual errors. And someone with a background in a particular field can share links to resources or answer questions about concepts from that field, without necessarily having to do any translating. So when we speak of crowdsourced translation initiatives, it’s important to consider that these initiatives allow for more types of involvement than translating in the narrow sense of providing a TL equivalent for a ST.

Finally, I asked participants how many hours per week, on average, they spent participating in the last crowdsourced translation initiative in which they were involved. Here’s a graph that illustrates the answers I received:
[Chart: hours per week spent participating]

As you can see, most respondents spent no more than five hours per week participating in a crowdsourced translation initiative. On the surface, this may seem to provide some comfort to the professional translators who object to crowdsourcing as a platform for translation, since these Wikipedia respondents did not spend enough time per week on translation to equal a full-time job; however, hundreds of people volunteering four or five hours per week can still produce enough work to replace several full-time professionals. Not-for-profit initiatives like Wikipedia, where article authors, illustrators and translators all volunteer their time, are probably not as problematic for the profession, since professional translators would probably never have been hired to translate the content anyway, but for-profit initiatives such as Facebook are more ethically ambiguous. I’ve discussed some of these ethical problems in an article that will be published in Linguistica Antverpiensia later this year, in an issue focusing on community translation.

In a few weeks, I’ll post the results of the last few survey questions, the ones focusing on motivations for participating, the rewards/incentives participants have received and the effect(s) their participation has had on their lives and careers.

Wikipedia survey I (Respondent profiles)

This is the first in a series of posts about the results of my survey of Wikipedians who have translated content for the Wikimedia projects (e.g. Wikipedia). Because I’ve already submitted an article analyzing the survey, these posts will be less analytical and more descriptive, although I will be able to discuss some of the survey questions I didn’t have space for in the paper. This post will look at the profiles of the 76 Wikipedians who responded to the survey (and whom I’d like to thank once again for their time).

Survey Methodology
I wanted to randomly invite Wikipedia translators to complete the survey, so I first consulted various lists of English translators (e.g. the Translators Available page and the Translation/French/Translators page) and added these usernames to a master list. Then, for each of the 279 language versions on the List of Wikipedias page*, I searched for a Category: Translators page for translations from that language into English (i.e. Category: Translators DE-EN, Category: Translators FR-EN, etc.). I added the usernames in the Category: Translators pages to the names on the master list, and removed duplicate users. This process led to a master list with the names of 1866 users who had volunteered to translate Wikipedia content into English. I then sent out invitations to 204 randomly selected users from the master list, and 76 (or 37%) of them responded.

A few caveats: additional Wikipedians have probably translated content for the encyclopedia without listing themselves on any of the pages I just mentioned. Moreover, anyone can generally edit (and translate) Wikipedia pages without creating an account, so the results of the survey probably can’t be generalized to all English Wikipedia translators, let alone Wikipedia translators into the other 280 languages, who are not necessarily listed on the English Wikipedia pages I consulted. Finally, although 76 Wikipedians may not seem like many respondents, it is important to note that many of the users on the master list did not seem to have ever translated anything for Wikipedia: when I consulted their user contribution histories, I found that some Wikipedians had added userboxes to their profile pages to indicate their desire to translate but had not actually done anything else. I was interested only in the views of people who had actually translated, so the 76 respondents represent a much larger share of actual Wikipedia translators than it might appear.
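For the curious, the sampling step itself can be expressed in a few lines of Python. This is only a sketch of the logic described above (combine the username lists, remove duplicates, draw a random sample of 204); the input file name is hypothetical, since the actual master list was compiled by hand from the Wikipedia pages mentioned.

```python
import random

# Hypothetical input: one username per line, combined from the
# Translators Available, Translation/French/Translators and
# Category: Translators XX-EN pages.
with open("wikipedia_translator_usernames.txt", encoding="utf-8") as f:
    usernames = [line.strip() for line in f if line.strip()]

master_list = sorted(set(usernames))   # remove duplicate users
print(len(master_list), "unique users on the master list")

random.seed(2011)                      # fixed seed, so the draw can be reproduced
invitees = random.sample(master_list, 204)
```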

Profiles
The vast majority of the respondents (64 respondents, or 84%) were male and most were 35 years of age or younger (57 respondents, or 75%, were under 36). This result is not surprising, given the findings of a 2008 general survey of more than 176,000 Wikipedia users, in which 50% of the respondents were 21 years of age or under (in all, 76% were under 30) and 75% were male.

When respondents were asked about translation-related training, most (51 respondents, or 68%) responded that they had no formal training in translation. Here’s a graph with a breakdown for each response:
[Chart: respondents’ translation-related training]

Given that respondents were generally young and usually did not have formal training in translation, it’s not surprising that 52 of the 76 respondents (68.4%) had never worked as translators (i.e. they had never been paid to produce translations). Only 11 respondents (or about 14%) were currently working as translators on a full- or part-time basis, while 13 (or about 17%) had worked as translators in the past but were not doing so now. So it’s not surprising either that only two respondents were members of a professional association of translators.

Finally, when asked about their current occupations, respondents reported working in a range of fields. I’ve grouped them as best I could, using the Occupational Structure proposed by Human Resources and Development Canada. Two respondents did not answer this question, but here’s an overview of the 74 other responses:

Occupation No. of respondents Percentage
Student 27 36%
    High school students: 6
    College/University students (languages): 4
    College/University students (other fields): 17
Works in IT sector 11 15%
Works in language industry 9 12%
Works in another sector (e.g. graphic design, law, education) 8 11%
Works in business, finance or administration 7 9%
Unemployed/stay-at-home parent/retired 5 7%
Academic 3 4%
Engineer 2 3%
Works in sales and service industry 2 3%
Total: 74 100%

Later this week (or early next week), I’ll look at the types of crowdsourced translation initiatives the respondents were involved in (other than Wikipedia, of course), and the roles they played in these initiatives. After that, I’ll discuss respondent motivations for volunteering and the impact their participation has had on their lives.


* There are now 281 Wikipedia versions.

Highlights of the Monterey Forum 2011

I’ve just returned from my trip to Monterey, where I attended (and presented at) the Monterey Forum on Innovations in Translator, Interpreter and Localizer Education. (See my last post for more details). Although many of the presentations focused on interpreter training, I did come back with some new ideas and tools to integrate into my translation classes. I thought I’d write about some of the points I found useful, in case those who weren’t able to attend are interested in some of the tools and teaching strategies the presenters discussed. For brevity’s sake, I’ll focus on just four presentations, although many others were also very helpful.

On Friday, Jost Zetzsche, one of the keynote speakers, briefly touched on crowdsourcing in response to a question from an audience member. Jost argued that translation programs need to train not just translators, but team leaders, because crowdsourcing has become such an integral part of the translation process in a few specific industries, such as social media. Although Jost believes crowdsourcing is unlikely to affect the translation industry as a whole, it will affect the way translation is performed for companies like Facebook or Twitter. If these companies don’t turn to professional translators to lead the crowd, Jost argues that translators should instead go to them, and that translator trainers should therefore prepare students to fill these kinds of leadership roles. While I could see Jost’s point, I haven’t quite decided where I stand on this issue. On the one hand, incorporating some kind of leadership training into the translation curriculum will likely benefit students, whether they go on to manage crowdsourced translation projects, small translation companies, or large translation agencies; on the other hand, training translators to work in the shadows of crowdsourced projects (as the invisible correctors of translations generated by bilinguals who usually have little or no formal training in translation) helps lower the professional status of translation, giving the impression that any bilingual can produce accurate, effective translations, when these translations are actually being reviewed and revised by professionals who work behind the scenes. I’m not sure there is an easy answer to this problem, but it’s something translator training programs will likely have to consider for at least the next few years.

On Saturday, I attended a number of interesting talks. Cécile Frérot’s presentation on bilingual corpora and corpus-analysis tools was particularly useful, since she teaches French/English translation at the Université Stendhal, and the corpora she discussed would therefore be well suited to most translation classes here in Canada. She discussed the Corpus of Contemporary American English (available for free at this website), a fantastic resource of more than 400 million words (spread over 175,000 texts published between 1990 and 2011). As the website notes, the corpus is evenly divided between five genres: spoken, fiction, popular magazines, newspapers, and academic journals. Users are able to restrict searches to any one of these genres, sort by frequency, search for collocates and more. In addition, the website interface allows users to search the British National Corpus, the TIME Corpus, which contains 100 million words published between the 1920s and the 2000s, and the Corpus of Historical American English, which contains 400 million words published between 1810 and 2009. I can’t believe I hadn’t discovered this website before! I will definitely be introducing it to my students next year. She also mentioned AntConc, a free monolingual concordancer for Windows, Mac OS and Linux, in case instructors want to develop their own corpora for use in the classroom (e.g. a corpus of scholarly/scientific articles published in a specific domain, to be used in a specialized translation course).
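For instructors who want to see how little machinery a basic corpus search involves, here is a minimal Python sketch of a keyword-in-context (KWIC) search, the core of what a concordancer like AntConc does. The corpus folder is hypothetical, and real concordancers add proper tokenization, sorting and collocate statistics on top of this.

```python
import re
from pathlib import Path

def kwic(corpus_dir, keyword, window=40):
    """Print every occurrence of keyword with `window` characters of context."""
    pattern = re.compile(rf"\b{re.escape(keyword)}\b", re.IGNORECASE)
    for path in sorted(Path(corpus_dir).glob("*.txt")):
        text = path.read_text(encoding="utf-8")
        for match in pattern.finditer(text):
            left = text[max(0, match.start() - window):match.start()]
            right = text[match.end():match.end() + window]
            print(f"{path.name}: ...{left}[{match.group(0)}]{right}...")

# Hypothetical folder of scientific articles collected for a specialized course.
kwic("corpus/scientific_articles", "sustainable")
```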

Barry Slaughter Olsen, from the Monterey Institute, talked about filming his students in class while they interpreted, and then uploading the videos to YouTube so students could watch themselves and their classmates and then leave feedback for one another. One of the more helpful points I took from his presentation was the need to give students guidance when you ask them to leave online feedback for their peers. One student wanted to be told which issues to focus on each week. Another thought students should be asked to comment on one or two other students each week, instead of being expected to provide feedback to everyone. Both these points are applicable to translation students as well. If the class had just spent some time exploring ways to reduce wordiness, for instance, students could be asked to comment on how well some of their peers incorporated these strategies into their homework. This would help reinforce what the students had learned, while giving more students direct feedback, since instructors can’t possibly comment in detail on everyone’s homework in a class of 15+ students. I’ll definitely be incorporating this tip into my translation classes next year, when I start using WordPress as a platform to complement our in-class sessions. I had already been thinking of having students comment on the submissions of their peers, so hearing what Barry’s students thought about the peer-to-peer feedback process was very helpful.

Finally, my Glendon colleagues María Constanza Guzmán and Lyse Hébert discussed translator engagement, ideologies and agency. They really made me realize the importance of encouraging students to recognize their own ideologies, so that they can see that their backgrounds, training, and (political, social….) beliefs are inevitably going to affect their translations. Until now, I had left questions of engagement, ideology, and agency for the translation theory course, but I see now that even practical translation classes can benefit from having students briefly consider a) what their own views, prejudices, assumptions, etc. are, b) whether and how these views might differ from those of the ST author and the intended SL readers, and c) whether students should (or even want to) compensate for these differences.

I’ll have lots to consider over the summer, as I revise the syllabi for my existing courses and prepare for my new classes. If you were at the Forum, feel free to add your thoughts on these or other presentations. And if you weren’t able to attend, feel free to comment on anything I’ve said or to discuss your own teaching strategies and tools. I’m particularly interested in hearing about whether you encourage peer-to-peer feedback in your classes and if so, how it works.

Monterey FORUM 2011

I’m writing from my hotel in Monterey, California, where I’ll be attending the Monterey Forum on Innovations in Translator, Interpreter and Localizer Education at the Monterey Institute of International Studies. Here’s a copy of the program, for those who are interested. I’m looking forward to today’s presentations on technology in the classroom; I’ll write a post or two about the parts I find most interesting or useful for teaching translation.

My presentation will look at how Google Docs can be integrated into translation classrooms to enhance collaboration. I’ll be talking about some of the cloud-based productivity suites currently available (e.g. Microsoft Web Apps; Adobe Acrobat’s Buzzword, Tables and Presentations; Zoho; ThinkFree Online; and, of course, Google Docs). I’ll spend some time covering privacy, confidentiality, security and reliability issues when using cloud-based services like Google Docs, and then I’ll talk about some of the ways I’ve used Google Docs in my own classes, which I’ve talked about in this blog already (here, here and here, for instance).

I used Google Docs to create the presentation, and I’ve embedded a link to it below, in case anyone is interested in taking a look at it:

I’ll write more about the conference over the next few days.

Crowdsourcing experiment with translation students

In his 2008 book Crowdsourcing: Why the Power of the Crowd is Driving the Future of Business, which I reviewed here, Jeff Howe describes TopCoder Inc., a company that develops software for industry partners by administering programming competitions. Twice weekly, competitions are posted, and any of the 200,000+ members of the TopCoder community can compete to complete small components of a piece of software that will eventually be delivered to a client. As Howe describes the process, each project is broken down into manageable chunks that are then posted on the TopCoder website so that programmers can compete against one another to write the most efficient, bug-free solution to the problem. After the entries are submitted, TopCoder members try to find bugs in the submissions, and only bug-free versions pass on to be vetted by TopCoder reviewers. Competitions are also posted to see who can compile the components into a single piece of software and run the program bug-free. Members are ranked according to how often they win competitions, and they also receive monetary rewards ranging from $25 to $300 for submitting winning entries.

I decided to try out this crowdsourcing model in the classroom by organizing a similar translation competition. Before we started, I spoke a little about translation and crowdsourcing, describing some of the pros and cons, and showing examples of some of the organizations that have relied on crowdsourcing for their translations (e.g. TED, Facebook, Twitter). Then we moved on to translating a text together, with the TopCoder competition as the model for the translation process.

A few days before the class, I had broken up a short text into one- or two-sentence chunks and posted these chunks on an online form, with an empty text box under each one for the translations. Here’s a screen capture of the form, which I created with Google Docs:
[Screen capture: the Google Docs form used for the crowdsourcing activity]

All students were given two minutes to translate the first segment and then click on the “Submit” button at the bottom of the page, which automatically uploaded the translated segments to a Google Docs spreadsheet so that we were able to view all the submissions together. Here’s a screen capture of the spreadsheet, to give you an idea of what we were working with in class:
[Screen capture: crowdsourcing activity spreadsheet]
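In class we simply looked at the response spreadsheet in the browser, but for anyone who wants to pull the submissions out programmatically (to anonymize them, say, or paste them into a handout), here’s a rough sketch using recent versions of the third-party gspread library. The spreadsheet name, credentials file and column header below are placeholders, not the ones from my form.

```python
# Sketch: pulling the form responses out of the linked Google spreadsheet
# with the gspread library (pip install gspread). The spreadsheet name and
# the column header "Segment 1" are placeholders; your form will have its own.
import gspread

gc = gspread.service_account(filename="credentials.json")  # Google API credentials
responses_sheet = gc.open("Crowdsourcing activity (Responses)").sheet1

# get_all_records() returns one dict per submission, keyed by the header row.
submissions = responses_sheet.get_all_records()
segment_1_versions = [row["Segment 1"] for row in submissions if row.get("Segment 1")]

for number, version in enumerate(segment_1_versions, start=1):
    print(f"{number}. {version}")
```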

Once the first sentence had been submitted, students voted on which translation they preferred. To help speed up the process, each student was allowed to vote only once, for their favourite version; after one version was declared the “winner”, students could make any revisions they wanted, provided a majority of the class agreed with the change. The revised sentence was then added to a Word document so that a final translation could be pieced together from the winning sentences. Students were then given two minutes to translate the second sentence and, once they had done so, they were invited to vote on a winner and make any corrections to the translation. After we had translated three or four sentences (and were almost out of time), I asked the students to comment on the final version and on the translation process.
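To make the sequence of steps a bit more explicit, here’s a minimal sketch of the voting-and-revision logic in Python. It’s an illustration only: in class the voting was done by a show of hands and the winning sentences went into a Word document, and all of the names and numbers below are invented.

```python
# Minimal sketch of the in-class process: students submit versions, vote for a
# winner, and revisions to the winner are kept only with majority approval.
from __future__ import annotations
from collections import Counter

def pick_winner(versions: list[str], votes: list[int]) -> str:
    """Each vote is the index of a student's favourite version; the most popular wins."""
    winning_index, _count = Counter(votes).most_common(1)[0]
    return versions[winning_index]

def apply_revisions(winner: str, proposals: list[tuple[str, int]], class_size: int) -> str:
    """Keep a proposed revision only if a majority of the class voted for it."""
    current = winner
    for revised_sentence, votes_in_favour in proposals:
        if votes_in_favour > class_size / 2:
            current = revised_sentence
    return current

# One round of the exercise with made-up data: five students, three versions.
versions = ["Version A ...", "Version B ...", "Version C ..."]
votes = [1, 1, 0, 2, 1]                     # index of each student's favourite
winner = pick_winner(versions, votes)        # "Version B ..."
winner = apply_revisions(winner, [("Version B, lightly revised ...", 4)], class_size=5)

final_translation = [winner]                 # grows by one sentence per round
print(" ".join(final_translation))
```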

So what did the students think? Most noted that although our translation process (basically a popularity contest, with the possibility of a few touch-ups to the winner) worked well enough in our small group, it might not be as successful outside the classroom, since the most popular answer isn’t necessarily the best. Some raised very valid concerns about how such a process would work on a larger scale or in another context: for instance, they wondered how well the final text would hold together if it had been translated by multiple people, and how various linguistic groups (e.g. Brazilian and Iberian Portuguese speakers) would settle on an acceptable version.

It seemed, though, that many students enjoyed the exercise, regardless of whether they felt this method of translating would work outside the classroom. One student liked being able to compare the various versions laid out on the screen, because that way, when revisions/corrections were made to the winning translation, students could incorporate ideas from the versions that did not win. Another noted that one person translating alone might get stuck or lack inspiration at certain points, but that this problem would not arise if many people were working on the same text.

Overall, I think this experiment worked well. Using a Google Docs Form really simplified the setup on my end: I needed no programming skills and was able to create an interface we could work with in class in less than 15 minutes. Next year, I’d schedule this exercise for a week when we have three hours together, rather than one where a test takes up the second half of the class. I think the exercise lends itself well to a two- or three-hour class: 20 to 30 minutes to talk about crowdsourcing, 1 to 1.5 hours to translate, 15 to 20 minutes to go over the final translation (since I didn’t give any input while students were voting on and revising the submissions, the final version did have some minor language and translation errors), and then 10 to 15 minutes for students to reflect on the process and the result.

References:
Howe, Jeff. (2008). Crowdsourcing: Why the Power of the Crowd is Driving the Future of Business. New York: Crown Publishing.

Goodbye WebCT, Hello WordPress

For several years now, I’ve been using WebCT as the online course environment for my translation classes, first at the University of Ottawa, and now at York University. While the University of Ottawa has discontinued WebCT in favour of Blackboard, York still offers only WebCT or Moodle as course environments.

When I first started using WebCT, I found it easy enough to organize my classroom material: I could post PowerPoint presentations and the documents we translated in class, along with the tests and assignments students needed to complete. I could also add links to glossaries, term banks, corpora and other tools we would use throughout the semester. As for the students, they could post their homework online via the discussion board I’d set up.

But then I started teaching more than one course per term, and I also started teaching the same courses over again. That’s when I began to get annoyed with the limitations of WebCT. I had a lot less time for fiddling around in the system and just wanted to get course material online as quickly and painlessly as possible; WebCT just wasn’t cutting it anymore. Let’s say, for instance, that I need to upload a significant number of files all at once. The WebCT interface forces me to select and upload them one at a time. Or maybe I want to quickly double-check whether I’ve uploaded a particular file or added a link to a new resource. I need to log in to WebCT, then click through four or five pages just to get to that information. And what happens every single semester when I want to create a course website, even if the site is supposed to look exactly the same as it did last year? I need to email a request for each course (and each section) to the IT department so that they can create a course for me in the system, either from scratch or from a backup copy. If I make a course request close to the beginning of term (as many other professors do, I’m sure), it can take a few days before one of the support staff is able to respond. I want to be able to create the site right away, when I’ve got the time and inclination to work on it. Which leads me to my first resolution of 2011: ditch WebCT altogether and migrate to WordPress by September.

WordPress offers enough plugins and customization options that I should be able to offer my students everything they were getting via WebCT:

  • Course material: I can post the course material each week as part of a blog post, which also gives me the chance to share a few details about my expectations for the upcoming week and my thoughts on the previous class, something I can’t do with WebCT.
  • Student submissions: Students will be able to use the commenting function to submit homework or share their thoughts about topics we’ve discussed in class. I think this will provide a better environment for interaction between me and the students, as well as among the students themselves.
  • Links: I plan to include links to resources in the sidebar, which means no one will have to click through three or four pages just to get to the list of links, and I’ll be able to group the various types of resources more effectively.
  • Privacy: With WordPress, I can choose whether to make the course site accessible to people who are not registered in the course, while keeping it out of search engine results. WebCT access is generally limited to York students, staff and faculty, so WordPress will give me more flexibility over who can read, download and respond to course material.

Has anyone else tried to use tools like Blogger or WordPress to post their course material? What were your experiences like? I do plan to post more about my switch to WordPress as I work on the new sites over the summer, but I’d appreciate feedback from anyone who has tried something similar in another course. Leave a comment or send me an email to let me know what you think.