Mapping Life – Quality Assessment of Novice vs. Expert Georeferencers

Date: 
Friday, May 20, 2016
Resource Type:
Peer-reviewed article | Research Products
Environment Type: 
Media and Technology, Websites, Mobile Apps, and Online Media, Public Programs, Citizen Science Programs
Audience: 
Undergraduate/Graduate Students | General Public | Scientists
Discipline: 
Computing and information science | Ecology, forestry, and agriculture | Life science
Organization:
Florida State University; Tulane University Biodiversity Research Institute
Description: 

The majority of the world’s billions of biodiversity specimens are tucked away in museum cabinets with only minimal, if any, digital records of the information they contain. Global efforts to digitize specimens are underway, yet the scale of the task is daunting. Fortunately, many activities associated with digitization do not require extensive training and could benefit from the involvement of citizen science participants. However, the quality of the data generated in this way is not well understood. With the two experiments presented here, we examine the efficacy of citizen science participants in georeferencing specimen collection localities. In the absence of an online citizen science georeferencing platform and community, students served as a proxy for the larger citizen science population. At Tulane University and Florida State University, undergraduate students and experts used the GEOLocate platform to georeference fish and plant specimen localities, respectively. Our results provide a first approximation of what can be expected from citizen science participants with minimal georeferencing training and a benchmark for future innovations. After outliers were removed, the distance between student- and expert-georeferenced points ranged from <1.0 to ca. 40.0 km in both the fish and plant experiments, with overall means of 8.3 km and 4.4 km, respectively. Engaging students in the process improved results beyond those of GEOLocate’s algorithm alone. Calculating a median point from replicate points improved results further, as did recognizing good georeferencers (e.g., creating median points from the best 50% of contributors). We provide recommendations for further improving accuracy and call for the creation of an online citizen science georeferencing platform.
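
The median-point aggregation mentioned above can be sketched in a few lines. The example below is illustrative only and not taken from the paper: it computes the coordinate-wise median of hypothetical replicate student points for one locality and that point's great-circle (haversine) offset from a hypothetical expert point. All coordinates and helper names are assumptions for illustration.

# A minimal sketch (not from the paper) of aggregating replicate
# georeferenced points by their coordinate-wise median and measuring
# the offset from an expert reference point with the haversine formula.
import math
from statistics import median

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometers."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def median_point(points):
    """Coordinate-wise median of replicate (lat, lon) points."""
    lats = [p[0] for p in points]
    lons = [p[1] for p in points]
    return median(lats), median(lons)

# Hypothetical replicate student points for one locality, plus an
# expert-determined point for the same locality.
student_points = [(30.01, -84.35), (30.03, -84.33), (29.98, -84.40)]
expert_point = (30.02, -84.34)

m_lat, m_lon = median_point(student_points)
print(f"median point: ({m_lat:.4f}, {m_lon:.4f})")
print(f"offset from expert: {haversine_km(m_lat, m_lon, *expert_point):.2f} km")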

Funder(s):
NSF
Award Numbers:
1202953, 1115210, 1029808
Citation
ISSN:
2057-4991
DOI:
10.5334/cstp.30
Publication Name: 
Citizen Science: Theory and Practice
Volume: 
1
Number: 
1
Page Number: 
4

Team Members

Elizabeth Ellwood, Author
Henry Bart, Author
Michael Doosey, Author
Dean Jue, Author
Justin Mann, Author
Gil Nelson, Author
Nelson Rios, Author
Austin Mast, Author
