The Crowdsourcing Process: Decisions about Tasks, Expertise, Communities and Platforms

Paper (long paper)
Authorship
  1. Lynne Siemens

    University of Victoria

Introduction
Businesses, governments, community groups and academic projects are turning to crowdsourcing, an internet-based process, to facilitate access to the outside expertise needed to complete various tasks (Brabham, 2008; Howe, 2006). This mechanism holds great potential for academic projects, particularly those involving large amounts of data that need to be identified, transcribed, analyzed and/or catalogued. It can often supplement meager project budgets by sourcing the work at relatively low cost (Corney et al., 2009; Holley, 2010). Further, crowdsourcing supports growing calls for public engagement in research projects (SSHRC, 2012). Several academic projects are at the forefront of this trend, with public involvement in data classification (Galaxy Zoo, 2010), error correction in texts (Holley, 2009), text transcription and/or annotation (Transcribe Bentham, 2012; Zou, 2011; Pynchonwiki, nd; Siemens, 2012), cataloguing, metadata and database creation (The Bodleian Library, 2012; Picture Australia, nd), and many others.

As crowdsourcing is introduced into more activities, it becomes important to understand “how to manage the crowd in a networked society” to ensure that a project achieves its objectives (DISH, 2011). Initial studies of participant motivation have found that individuals are interested in contributing to a larger cause, using their skills, earning recognition, being part of a community, and, potentially, earning money (Wexler, 2011; Organisciak, 2010; Raddick et al., 2010; Brabham, 2008, 2010). Other research has focused on the most appropriate ways to solicit and encourage participation among a potential community of contributors by making the activity in question fun, presenting a big challenge to be undertaken, and reporting on progress regularly (Digital Fishers, nd; Holley, 2012). However, most of this research has been conducted within the private sector and on short-term projects with little need to manage volunteers over a longer period of time (Wexler, 2011), which may limit the applicability of its results to academic projects.

While the use of crowdsourcing is increasing, little work has been done to understand ways to organize the work to ensure that the crowd’s contribution is delivered within an academic project’s schedule, budget and other resources and to the required quality standard (Geiger et al., 2011; Organisciak, 2011). In particular, projects need to understand the most appropriate ways to organize workflows, technical infrastructure, and staffing requirements in order to manage volunteers and confirm quality (Zou, 2011). An opportunity exists to build on the reflections of several projects to understand these issues (DISH, 2011; Holley, 2012; Zou, 2011; Corney et al., 2009; Holley, 2009).

This paper will contribute to this discussion by reviewing the literature to suggest a crowdsourcing process framework and to explore the range of decisions that must be addressed in advance to ensure quality and successful project outcomes. In addition, it will report on interviews with several crowdsourcing projects with regard to their workflow organization. The paper will conclude with recommendations for projects contemplating this tool as a way to reach project outcomes with limited project funds.

Literature Review
While no common definition of crowdsourcing exists (Holley, 2010), every crowdsourcing project shares several components. As seen in Figure 1, there must be an organization; a particular task to be completed to meet project goals and outcomes at a specified quality level; and a community, comprised of both experts and novices, that is willing to do the work for little or no money. These interactions are facilitated through an internet-based platform (Brabham, 2012; Schenk et al., 2011; Geiger et al., 2011; Tong et al., 2012). The interested organization must make a series of decisions regarding the type of expertise, qualifications and/or knowledge required, the presence of contributors, the mechanisms by which they will participate and contribute, project remuneration, motivators to keep participants engaged, and quality control mechanisms (Geiger et al., 2011; Corney et al., 2009; Rouse, 2010; Organisciak, 2011). The range of tasks and required expertise can be seen along the continuum in Figure 2.

Figure 1: Crowdsourcing Approach
(Adapted from Geiger et al., 2011; Corney et al., 2009; Rouse, 2010)

Figure 2: Crowdsourcing Task Continuum
(Adapted from Corney et al., 2009; Rouse, 2010)

Ultimately, projects select the appropriate interface to both solicit and receive contributions from the public in ways that keep the “crowd” motivated and participating (Tong et al., 2012; Organisciak, 2010, 2011).

Methods
This project uses a qualitative research approach with in-depth interviews with members of crowdsourcing projects. The interview questions focus on the participants’ use of the “crowd” to achieve tasks, ways to organize workflows, type of infrastructure in place to support the work, and challenges (Marshall et al., 1999; McCracken, 1988).

At the time of writing this proposal, final data analysis is being completed. The results will inform the decisions that projects must make in order to use crowdsourcing effectively and successfully.

This research will make several contributions to the knowledge base about ways to incorporate the public into academic projects. First, it builds on work already undertaken to engage the public in academic research through crowdsourcing (Causer et al., 2012; Holley, 2009; Organisciak, 2010). Second, it builds on earlier studies of participants’ motivations to contribute to these projects with an exploration of the organization of the tasks, process, and other components of the crowdsourcing process from the perspective of the project itself (Brabham, 2010; Organisciak, 2010; Wexler, 2011). Finally, it extends understanding of the nature of academic collaboration as projects expand relationships beyond the team to the public (Siemens, 2009).

References
Brabham, D. (2008). Crowdsourcing as a Model for Problem Solving: An Introduction and Cases. Convergence: The International Journal of Research into New Media Technologies, 14.1: 75-90.
Brabham, D. (2010). Moving the Crowd at Threadless: Motivations for Participation in a Crowdsourcing Application. Information, Communication & Society, 13.8: 1122-1145.
Brabham, D. (2012). The Myth of Amateur Crowds: A Critical Discourse Analysis of Crowdsourcing Coverage. Information, Communication & Society, 15.3: 394-410.
Causer, T., J. Tonra, and V. Wallace (2012). Transcription Maximized; Expense Minimized? Crowdsourcing and Editing the Collected Works of Jeremy Bentham. Literary and Linguistic Computing.
Corney, J. R., C. Torres-Sánchez, P. Jagadeesan, and W. Regli (2009). Outsourcing Labour to the Cloud. International Journal of Innovation and Sustainable Development, 4.4: 294-313.
Digital Fishers (nd). Digital Fishers. http://digitalfishers.net/ (accessed February 21, 2012).
DISH (2011). Theme: Co-Creation and Crowdsourcing. http://www.dish2011.nl/themes/crowdsourcing-and-co-creation (accessed February 21, 2012).
Galaxy Zoo (2010). Galaxy Zoo. http://www.galaxyzoo.org/ (accessed February 22, 2012).
Geiger, D., S. Seedorf, T. Schulze, R. Nickerson, et al. (2011). Managing the Crowd: Towards a Taxonomy of Crowdsourcing Processes. Seventeenth Americas Conference on Information Systems. Detroit, Michigan.
Holley, R. (2009). Many Hands Make Light Work: Public Collaborative OCR Text Correction in Australian Historic Newspapers. National Library of Australia.
Holley, R. (2010). Crowdsourcing: How and Why Should Libraries Do It? D-Lib Magazine.
Holley, R. (2012). Digital Cultural Heritage Awards for Crowdsourcing (and Thoughts on Gamification). http://rose-holley.blogspot.com/2012/02/digital-cultural-heritage-awards-for.html (accessed February 21, 2012).
Howe, J. (2006). The Rise of Crowdsourcing. Wired.
Marshall, C. and G. B. Rossman (1999). Designing Qualitative Research. Thousand Oaks, California: SAGE Publications.
McCracken, G. (1988). The Long Interview. Newbury Park, CA: SAGE Publications.
Organisciak, P. (2010). Why Bother? Examining the Motivations of Users in Large-Scale Crowd-Powered Online Initiatives. Humanities Computing — Library and Information Studies. Edmonton: University of Alberta.
Organisciak, P. (2011). When to Ask for Help: Evaluating Projects for Crowdsourcing. Digital Humanities 2011. Stanford.
Picture Australia (nd). Trove: Australia in Pictures. http://www.flickr.com/groups/pictureaustralia_ppe/
Pynchonwiki (nd). A Literary Wiki Exploring the Novels of Thomas Pynchon. http://pynchonwiki.com
Raddick, M. J., G. Bracey, P. L. Gay, C. J. Lintott, et al. (2010). Galaxy Zoo: Exploring the Motivations of Citizen Science Volunteers. Astronomy Education Review, 9.1.
Rouse, A. C. (2010). A Preliminary Taxonomy of Crowdsourcing. ACIS 2010. Brisbane, Australia.
Schenk, E. and C. Guittard (2011). Towards a Characterization of Crowdsourcing Practices. Journal of Innovation Economics, 7.1: 93-107.
Siemens, L. (2009). 'It's a Team If You Use "Reply All"': An Exploration of Research Teams in Digital Humanities Environments. Literary & Linguistic Computing, 24.2: 225-233.
Siemens, R. (2012). A Social Edition of the Devonshire MS (BL Add 17,492). http://en.wikibooks.org/wiki/The_Devonshire_Manuscript
SSHRC (2012). Knowledge Mobilization. http://www.sshrc-crsh.gc.ca/society-societe/community-communite/index-eng.aspx
The Bodleian Library (2012). Help Us to Describe the Libraries' Digitised Music Collections. http://www.whats-the-score.org
Tong, R. and K. R. Lakhani (2012). Public-Private Partnerships for Organizing and Executing Prize-Based Competitions. Berkman Center for Internet & Society.
Transcribe Bentham (2012). About Us. http://www.ucl.ac.uk/transcribe-bentham/about/.
Wexler, M. N. (2011). Reconfiguring the Sociology of the Crowd: Exploring Crowdsourcing. International Journal of Sociology and Social Policy, 31.1-2: 6-20.
Zou, J. J. (2011). Civil War Project Shows Pros and Cons of Crowdsourcing. http://chronicle.com/blogs/wiredcampus/civil-war-project-shows-pros-and-cons-of-crowdsourcing/31749 (accessed February 21, 2012).

Conference Info

ADHO - 2013
"Freedom to Explore"

Hosted at University of Nebraska–Lincoln

Lincoln, Nebraska, United States

July 16, 2013 - July 19, 2013

243 works by 575 authors indexed

Conference website: http://dh2013.unl.edu/

Series: ADHO (8)

Organizers: ADHO