AN EVALUATION OF CROWDSOURCING AND
PARTICIPATORY ARCHIVES PROJECTS FOR
ARCHIVAL DESCRIPTION AND
TRANSCRIPTION
Robert P. Spindler
rob.spindler@asu.edu
June, 2014
ARCHIVAL DESCRIPTION, DIGITAL LIBRARIES
MOVING IN DIFFERENT DIRECTIONS
Acquisitions by Repository (linear feet), 1990-2014
Department of Archives and Special Collections

[Year-by-year table, fiscal years 1990-91 through 2013-14; repository totals below.]

Arizona Collection       10,084.55
Chicano Collection        1,227.92
Child Drama               1,861.46
Labriola Center             322.62
Special Collections         223.50
University Archives       5,564.83
Visual Literacy             141.35
TOTAL                    19,505.43
MUSEUMS, THEN LIBRARIES, THEN ARCHIVES…
• What Happens After “Here Comes Everybody”: An Examination of Participatory Archives (Society of American Archivists Annual Meeting, 2011)
• Dr. Robert B. Townsend (Chair & Commentator), Deputy Director, American Historical Association
• Kate Theimer, ArchivesNext: “Exploring the Participatory Archives”
• Dr. Elizabeth Yakel, University of Michigan: “Credibility in the Participatory Archives”
• Alexandra Eveleigh, University College London: “Crowding Out the Archivist? A British Perspective on Participatory Archives”
CROWDSOURCING, CROWDFUNDING AND
SOCIAL BEHAVIOR
• Crowdsourcing has its origins in early 21st-century crowdfunding initiatives. There are important distinctions between crowdsourcing, social engagement, and participatory archives.
• While large numbers of individuals visit crowdsourcing projects, few make sustained and useful contributions. Powerful feelings of ownership, belonging, and connectedness derive from feedback provided by the crowdsourcing system or its associated community; these feelings, along with a sense of shared authority, motivate the most dedicated participants.
ISTO HUVILA AND SHARED AUTHORITY (2008)
• Huvila’s progressive view of participatory archives is characterized by
“decentralised curation, radical user orientation, and contextualization of both
records and the entire archival process”.
• “Rethinking the relationship between official and unofficial knowledge is
probably the main challenge that cultural institutions have to face when
undertaking a crowdsourcing process.”
• Huvila, Isto, “Participatory Archive: Towards Decentralised Curation, Radical User Orientation, and Broader Contextualisation of Records Management,” Archival Science, Volume 8, Number 1 (2008), 15-36.
• Carletti, Laura, Gabriella Giannachi, Dominic Price, and Derek McAuley, “Digital Humanities and Crowdsourcing: An Exploration,” MW2013: Museums and the Web 2013 Conference, April 2013.
THEIMER, EVELEIGH AND PARTICIPATORY
ARCHIVES
• Participatory archives seek public contributions of work or information that expand our useful knowledge of culture and history. This is more than the conversational social engagement seen on Facebook or Flickr.
METHODS – IMPROVING QUALITY
• Project developers have experimented with a number of methods to improve the quality of knowledge or metadata production by combining social participation with standards- or systems-based solutions.
• Projects seem to be moving toward separate professionally curated and socially curated spaces, although in most current applications the linkages between those spaces are clumsy and manual (see the sketch below).
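One way to picture that separation is to keep staff-authored description and user contributions in distinct stores that share only a record identifier. The following is a minimal, hypothetical Python sketch; the class names, field names, and sample data are invented for illustration and are not drawn from any particular project or system.

from dataclasses import dataclass
from typing import Dict, List

@dataclass
class CuratedRecord:
    """Authoritative description maintained by archives staff."""
    record_id: str
    title: str
    scope_note: str

@dataclass
class SocialAnnotation:
    """User-contributed tag or comment, stored outside the curated record."""
    record_id: str        # explicit link back to the curated record
    contributor: str
    text: str

curated: Dict[str, CuratedRecord] = {}          # professionally curated space
social: Dict[str, List[SocialAnnotation]] = {}  # socially curated space

def add_annotation(note: SocialAnnotation) -> None:
    """File a contribution in the social space without touching curated data."""
    social.setdefault(note.record_id, []).append(note)

def public_view(record_id: str) -> dict:
    """Join the two spaces for display while keeping provenance visible."""
    rec = curated[record_id]
    notes = social.get(record_id, [])
    return {
        "curated": {"title": rec.title, "scope_note": rec.scope_note},
        "contributed": [{"by": n.contributor, "text": n.text} for n in notes],
    }

if __name__ == "__main__":
    curated["ms-001"] = CuratedRecord("ms-001", "Smith family papers",
                                      "Correspondence and photographs, 1890-1940.")
    add_annotation(SocialAnnotation("ms-001", "volunteer42",
                                    "Photo 12 shows the old Mesa schoolhouse."))
    print(public_view("ms-001"))

Keeping contributions in their own store leaves the authoritative description untouched and makes later mediation or removal of individual posts straightforward.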
METHODS – IMPROVING QUALITY
• Mediation can improve quality, but it is work-intensive and can leave the host
institution vulnerable to claims of censorship, especially when the rules of
engagement are not clearly stated in advance.
• Participants may expect that their posts will be preserved permanently. Peer mediation can be more effective than professional mediation.
METHODS – IMPROVING QUALITY
• Several technologies can be used to improve quality, such as heat maps, transcription version comparisons, personalization features, and reward systems.
• Open source gaming solutions for improving metadata quality are now
available.
• “Computational techniques” can be applied to extract, normalize, and disambiguate terms used in social tags (see the sketch below).
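As an illustration of the kind of computational technique meant here, the sketch below normalizes free-text tags (lowercasing, stripping punctuation, collapsing whitespace) and folds known spelling variants onto a preferred term. It is a simplified, hypothetical example; the variant map is hard-coded only for demonstration, where a real project would draw it from an authority file or controlled vocabulary.

import re
from collections import Counter
from typing import Iterable

# Hypothetical variant-to-preferred-term map; real projects would build this
# from an authority file or controlled vocabulary rather than hard-coding it.
SYNONYMS = {
    "wwii": "world war ii",
    "world war 2": "world war ii",
    "asu": "arizona state university",
}

def normalize(tag: str) -> str:
    """Lowercase, strip punctuation, collapse whitespace, then map variants."""
    tag = re.sub(r"[^\w\s]", "", tag.lower())   # drop punctuation
    tag = re.sub(r"\s+", " ", tag).strip()      # collapse whitespace
    return SYNONYMS.get(tag, tag)

def tally(tags: Iterable[str]) -> Counter:
    """Count contributed tags after normalization so spelling variants merge."""
    return Counter(normalize(t) for t in tags if t.strip())

if __name__ == "__main__":
    raw = ["WWII", "World War 2", "world war II.", "ASU", "A.S.U."]
    print(tally(raw))
    # Counter({'world war ii': 3, 'arizona state university': 2})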