Most students readily get the idea that as an ecosystem’s lizard population increases, the population of the lizards’ prey (say, crickets) will decrease. Less obvious, and more difficult to grasp, is that having fewer lizards isn’t necessarily good for the crickets overall. Without any predator to limit their numbers, crickets might overpopulate, wipe out their food source (grass, in this case), and die out.

Increasingly, teachers are turning to computerized simulations to help them determine whether — or how well — their students understand such complex scientific concepts as the dynamic relationships and interactions among organisms in an ecosystem. According to Jodi Davenport, Deputy Director of WestEd’s STEM program, “Interactive assessments provide a better picture of how well students understand and apply scientific principles than traditional methods. Interactive simulations allow students to design and run experiments rather than just selecting from options on a multiple-choice test.”

WestEd’s focus on interactive assessments began in 2007 and led to the development of two simulation-based initiatives that aim to improve science assessment: SimScientists and ChemVLab+. SimScientists has produced assessment activities — 10 for middle school students, one for high school — focused on life, physical, and earth science concepts. ChemVLab+ has created virtual lab experiences that can be used for formative assessment in high school chemistry.

The National Science Foundation (NSF) and the U.S. Department of Education’s Institute of Education Sciences (IES) funded the development of SimScientists and ChemVLab+ activities as well as trainings to support teachers’ use of them in classrooms. Funding has also underwritten numerous research studies that have involved tens of thousands of students across the country and focused on how students master scientific concepts, what kinds of tools best monitor their progress, and how best to modify simulation-based assessment for students with disabilities.

Meeting the next generation of standards in science

In November 2017, WestEd received a three-year, $1.4-million IES grant to build on that work through a collaboration with researchers from Carnegie Mellon University to develop new online activities that revise and extend existing ChemVLab+ activities. The goal is to better align the activities with the Next Generation Science Standards (NGSS) — science education guidelines developed by a coalition of state education officials, scientists, and teachers. The standards, which include performance expectations that students must meet to demonstrate science proficiency, favor instruction that engages students in scientific practices and emphasizes the interconnected nature of science as applied to real-world situations. To date, 19 states and the District of Columbia have adopted the standards and are working to implement them.

According to WestEd Senior Research Associate Matt Silberglitt, who manages simulation-based assessment projects, the NGSS were “just what we’ve been waiting for . . . more complex, multidimensional learning goals that require students to not just know about science but also be able to do science: manipulate variables, make hypotheses, construct explanations, and make predictions.” And those kinds of expectations, he says, “call for exactly the kinds of understanding that interactive assessment tools such as SimScientists and ChemVLab+ are designed to measure.”

With the IES grant, Davenport’s team will spend three years working with 12 chemistry teachers and learning from the experiences of approximately 1,200 high school students in diverse California schools to analyze and improve the alignment of ChemVLab+ activities with the NGSS. The process, says Davenport, will involve incorporating “design principles that research tells us best support learning” into the formative assessment activities. Such principles include the use of computerized representations that are scientifically appropriate for various student populations, multimedia formats that use visual cues to guide students through activities, and interactive features that allow students to make connections between what they can see and measure and what is happening at the level of atoms.

Engaging students with interactive design

Silberglitt notes that because the simulations are interactive and include coaching and feedback, the SimScientists and ChemVLab+ assessment activities are often learning experiences in themselves. Teachers using the SimScientists module on ecosystems, for example, might ask students to design a system that will survive over the long term by observing its organisms, exploring population models, and creating food webs. “Students manipulate the data, changing the population of crickets and lizards and availability of grass, and then run simulations to determine if all the populations are still there after 20 years,” explains Silberglitt. “What this simulation does is make the material more accessible for a wider range of students because it’s an active investigation. Students see firsthand the results of the decisions they make. It’s a more meaningful, engaging, and concrete way of presenting content.”
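
To make concrete the kind of bookkeeping such a simulation performs, the sketch below steps a toy grass-cricket-lizard system forward one year at a time in Python. It is not the SimScientists model: the update rules, every parameter value, and the simplification of holding the lizard count fixed are illustrative assumptions, chosen only to show how removing the predator can let the prey boom, exhaust its food, and collapse.

```python
# A toy, hypothetical sketch of the bookkeeping a grass-cricket-lizard simulation
# might perform. It is NOT the SimScientists model; all rules and numbers below
# are illustrative assumptions, and the lizard count is held fixed for simplicity.

def run_ecosystem(grass, crickets, lizards, years=20):
    """Advance a simple three-population food chain one year at a time."""
    for _ in range(years):
        demand = crickets * 2.0                         # grass each cricket needs for the year
        eaten = min(grass, demand)                      # crickets can't eat grass that isn't there
        fed = eaten / demand if demand else 0.0         # share of the crickets' demand that was met
        grass = min((grass - eaten) * 1.5, 2000.0)      # surviving grass regrows, up to a carrying capacity
        crickets *= 0.4 + 1.2 * fed                     # well-fed crickets multiply; starving ones decline
        crickets = max(crickets - lizards * 30.0, 0.0)  # each lizard removes roughly 30 crickets a year
        if crickets < 1.0:                              # fewer than one individual counts as extinct
            crickets = 0.0
    return {"grass": round(grass), "crickets": round(crickets), "lizards": lizards}

# With lizards present, these starting values hold steady for all 20 years;
# remove the lizards and the crickets boom, strip the grass, and then starve.
print(run_ecosystem(grass=2000, crickets=250, lizards=5))   # {'grass': 2000, 'crickets': 250, 'lizards': 5}
print(run_ecosystem(grass=2000, crickets=250, lizards=0))   # {'grass': 0, 'crickets': 0, 'lizards': 0}
```

In the actual module, of course, students work through a richer, visual simulation rather than raw numbers; the sketch only illustrates why removing one population can ripple through the others.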

Nonetheless, the simulations are not meant to replace a school’s science curriculum. Rather, they’re meant to enhance the curriculum by helping teachers know where the students’ mastery of the material is robust or breaks down.

Other SimScientists modules call on students to predict events such as earthquakes and volcanic eruptions based on their understanding of the motion of tectonic plates, or assess students’ knowledge of how forces act on objects by asking them to design experiments that investigate how changes in mass influence an object in motion. ChemVLab+ modules cover concepts that include concentration and dilution, temperature and heat transfer, and chemical thermodynamics. In a module on the properties of acids and bases, for example, students are asked to demonstrate their knowledge by monitoring the water quality in a simulated neighborhood swimming pool.

Documenting evidence of success

Davenport contends that “using simulations may fundamentally transform science education,” pointing to the results of research studies that found small-to-moderate, but statistically significant, gains in learning for students who completed SimScientists modules as compared to the progress of peers who did not. Other findings suggest that “engaging students through interactive assessments may provide better estimates of their ability to apply complex science practices, such as the ability to conduct experiments and investigations, than assessing their progress with traditional formats,” she adds. Moreover, analyses of student engagement and learning have found that those who participated in ChemVLab+ activities became more efficient in completing tasks and less likely to pursue incorrect lines of inquiry.1

Silberglitt says such evidence is helping convince state-level policymakers, whose responsibility for assessment has grown enormously over the last 20 years, of the need to move away from single-test, end-of-year measures, which he says are not as helpful as more embedded, frequent assessment. Instead, he envisions new systems of assessment “built from the ground up,” the elements of which reinforce each other. “It’s gratifying to see recognition at the state level of the need for this kind of assessment at the classroom, school, and district levels and the efforts being made through avenues such as professional learning and additional funding to provide teachers with tools to build and use their own assessments.”

Linking learning and assessment

One of the key lessons learned from the research on interactive assessment, says Silberglitt, is the value of using simulations to improve not only assessment but also learning. “Learning and assessment are closely linked, and simulations can be effective tools for helping teachers monitor their students’ progress toward mastery, provide appropriate feedback, and adjust instruction as needed,” he says. “There’s no merit in waiting until the last day of the semester to find out how well students understand the material.”

Another takeaway from WestEd’s work in this area, according to Davenport, is that further research is needed so that future science assessments can more fully and effectively capture evidence of deep understanding of scientific thinking and active problem solving. She believes technology is key to such change: “It’s easy to see the potential that technology offers for helping kids to learn and demonstrate their science knowledge and skills. But not all simulations are equal, and there’s still a lot of work to be done as we explore new ways to capture what students know through both formative and summative assessment.”


Notes:
1For more information on the research base, see https://simscientists.org/downloads/Simulations-for-Supporting-and-Assessing-Science-Literacy.pdf.


This R&D Alert article includes reporting on research that was supported by grants to WestEd from the Institute of Education Sciences, U.S. Department of Education (R305A100069, R305A170049, R305A130160, R305A080225, R305A120390, R305A080614), and the National Science Foundation (0733345, 1020264, 1221614, 0814776, 1420386), and by a grant to the Nevada Department of Education from the U.S. Department of Education (09-2713-126). Any opinions, findings, conclusions, or recommendations expressed in this article do not necessarily reflect the views of the funders.