On-the-Spot Assessment to Improve Scientist Engagement with the Public
As part of its overall strategy to enhance learning in informal environments, the Advancing Informal STEM Learning (AISL) program seeks to advance new approaches to, and evidence-based understanding of, the design and development of STEM learning in informal environments. This includes providing multiple pathways for broadening access to and engagement in STEM learning experiences, advancing innovative research on and assessment of STEM learning in informal environments, and developing understandings of deeper learning by participants. This Innovations in Development project will develop and test a model for audience assessment that STEM professionals can use during their public engagement efforts. These on-the-spot assessments will allow scientists to monitor audience understanding and use that feedback to make immediate improvements to their activities. The project fills a critical gap in the field of public science communication, as expressed both by outreach scientists and by the professional and academic organizations that train them. Project partners include the Astronomical Society of the Pacific, Oregon State University, Pacific Science Center, and the National Radio Astronomy Observatory. Research questions include: (1) How and under what conditions can scientists, regardless of discipline, learn to build assessment into outreach activities? (2) To what degree are scientists willing and able to change their outreach activities to include assessment that ascertains audience attentiveness or understanding? (3) To what degree will scientists be able to adjust their outreach and engagement efforts based on audience feedback, and what support do they need to do so? (4) Given constraints on their time and resources, how can the model help scientists conduct audience assessment on their own? (5) Do audience members find events that use assessment techniques more enjoyable and fruitful? The project will use a spiral design-based approach comprising several iterative rounds of research and development.
A group of Design Testers will begin each iterative cycle, carrying out the initial development and testing of the assessment techniques, the associated training, and supporting materials. Later phases will engage Field Testers, who will apply a rubric to evaluate the usability, validity, and reliability of the various strategies. The summative evaluation of the project will begin early and accumulate data over the life of the project. A mixed-methods approach will be used, including annual surveys of scientists after they have participated in the Prototype training. Descriptive and inferential statistics will be used to document the extent to which the project has achieved its intended outcomes. This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.