Citizen Science Conference: Full Evaluation Report

Date: 
Saturday, August 1, 2015
Resource Type:
Evaluation Reports | Research and Evaluation Instruments | Survey
Environment Type: 
Public Programs | Citizen Science Programs | Professional Development, Conferences, and Networks | Professional Development and Workshops | Conferences
Audience: 
Museum/ISE Professionals | Scientists | Evaluators | Learning Researchers
Discipline: 
General STEM | Life science
Organization:
Lifelong Learning Group
Description: 

Citizen Science 2015 was the inaugural conference of the Citizen Science Association (CSA). Held February 11th and 12th in San Jose, California, as a pre-conference of the American Association for the Advancement of Science's Annual Meeting, the conference offered two days of building connections and exchanging ideas across a wide spectrum of disciplines and experiences.

In addition to the other strands, a specific strand dedicated to education was held to identify opportunities and strategies to support the integration of citizen science into the Science, Technology, Engineering, and Math (STEM) learning ecosystem. The field of citizen science, broadly defined as public participation in scientific research, is growing rapidly. It is producing significant science and societal outcomes; however, its potential to transform STEM education has yet to be realized at scale. Innovations in diverse disciplines are improving the effectiveness and utility of citizen science in achieving STEM learning goals. Rarely, however, do these innovators have opportunities to share ideas and develop activities that will help spread and scale up their implementation. The goal was to facilitate growth and innovation in the use of citizen science in STEM education for diverse audiences.

The organizers of the Citizen Science Conference are committed to evaluation and have conducted evaluations of prior conferences and workshops. For this conference, the goal was to evaluate the overall conference and specifically focus on the education strand. The evaluation was driven by eight guiding questions and included a pre-measure; process observation and interviews; post measure; and a delayed-post measure.

Overall, the conference appears to have met its goals. Participants felt very positive about the conference and about the field of citizen science; there was a sense of reinvigoration and commitment moving forward; and there is evidence of follow-up activity by individuals after the conference. Specifically addressing the questions guiding the evaluation of the conference:

1. Who are the participants and do they represent the breadth of the field of citizen science?
The participants are somewhat diverse in that they cover a range of positions, purposes, and “homes” of citizen science programs. Geographic locations and types of program focus are also diverse.
2. How relevant were the strands for the participants?
The strands for the conference were used in the solicitation for papers and posters to ensure that topics the Conference Committee thought were important were addressed. The intention was that the strands would help individuals navigate the many concurrent sessions. In the findings, respondents saw the strands as relevant, but for many they were not meaningful. This is due in part to the complex, competing interests of the participants and the multiple roles and needs many participants bring to the conference and hope to address through the experience.
3. How important were the goals for the conference for the participants? For the field?
The goals were seen as important for the conference participants, but as uniformly more important for the field. This finding speaks to the breadth of motivation for participation in citizen science, the breadth of utility of citizen science work to science and to people, and the breadth of topics addressed through citizen science projects.
4. How important is presenting for conference participation?
Conferences vary on the need to present to support attendance. As this is the first measure for this conference and Association, it is important to note that over half (59%) of respondents noted that presenting was important for them to be able to attend. This information will be important in planning future conferences to ensure there are ample and different opportunities for individuals to make contributions to the conference.
5. What are expectations of participants and are those expectations satisfied?
Consistent with prior findings, the dominant expectations entering the conference, immediately after it, and several months later were networking, skill building, and obtaining new insights. The conference met these expectations.
6. Do perceptions of participants change during the course of participation?
Generally, perceptions did not change much during the course of the conference. Some individuals enter with broad goals or interests and gain clarity about what they need or want from the conference as it progresses. Those arriving with clear intentions or expectations seem to seek out the conference experiences that meet those expectations.
7. What is the process of engagement in a single strand of the conference?
The strand structure, specifically the Education strand, makes sense as an organizing tool for the conference. Approximately half of the interviewees from this strand intentionally engaged in the Education strand because the overall topic of education relates to their professional work. About a quarter attended based on individual curiosity about the strand, and the remaining quarter chose to attend based on the conference program description of a particular session.
8. Does coupling the conference with another benefit the participants?
Participants reported a slight benefit, but overall it was not very important to them that the conference be aligned with another conference.

Funder(s): 
NSF
Funding Program: 
AISL
Award Number: 
1501158
Funding Amount: 
$49,470

Team Members

Joe E Heimlich, Evaluator
Gary Timko, Evaluator
