WMU News
KALAMAZOO, Mich.—A Western Michigan University education researcher has received a $799,665 grant from the National Science Foundation to develop resources to plan studies that can better assess how well professional development initiatives for science teachers are working.
Dr. Jessaca Spybrook, WMU associate professor of educational leadership, research and technology, is heading a team of researchers from WMU, the Biological Sciences Curriculum Study and Abt Associates that will mine a treasure trove of large-scale data sets, analyze them and come up with ways to inform the design of large evaluations of teacher development initiatives.
"In education, we're always asking and trying to answer questions about what works," Spybrook says. "What science program is most effective? What professional development program for teachers is going to be most effective and thereby lead to better student achievement? This study will help us to more completely answer those questions."
Too often, such large-scale randomized trials of teacher development programs are inconclusive, she notes, leaving it unknown how well efforts to improve teaching are actually working.
A major impediment to evaluating professional development programs is designing studies that have the capacity to determine "what works." Her research will provide guidance on how many schools need to be studied, and how many teachers should be tested within those schools, to answer those questions with greater confidence.
"In this particular study, the outcomes of interest are actually teacher outcomes," Spybrook says. "So we're interested in science teacher practice and science teacher content knowledge, because we know that by improving these things, ultimately that will lead to improved student achievement."
In order to figure out how many schools and teachers need to be studied, researchers will examine how much teacher content knowledge varies from one school to the next. Does it vary a lot? Do teachers with a high amount of content knowledge tend to be clustered in one school or is there a lot of variation within schools?
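The between-school versus within-school variation described above is commonly summarized by an intraclass correlation coefficient (ICC): the share of total outcome variance that lies between schools. A minimal sketch of a one-way ANOVA estimate, with entirely hypothetical content-knowledge scores (this is an illustration of the concept, not the project's actual analysis):

```python
# Hypothetical teacher content-knowledge scores, grouped by school.
# ICC = between-school variance / total variance; a value near 1 means
# knowledgeable teachers cluster in particular schools, a value near 0
# means most variation is among teachers within each school.
scores_by_school = {
    "A": [72, 75, 71, 74],
    "B": [60, 63, 58, 61],
    "C": [80, 78, 82, 79],
}

def icc_oneway(groups):
    """One-way ANOVA variance-component estimate of the ICC."""
    all_scores = [s for g in groups.values() for s in g]
    grand_mean = sum(all_scores) / len(all_scores)
    k = len(groups)              # number of schools
    n = len(all_scores) / k      # average teachers per school
    # Mean square between schools and mean square within schools
    msb = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
              for g in groups.values()) / (k - 1)
    msw = sum((s - sum(g) / len(g)) ** 2
              for g in groups.values() for s in g) / (len(all_scores) - k)
    sigma2_between = (msb - msw) / n
    return sigma2_between / (sigma2_between + msw)

print(round(icc_oneway(scores_by_school), 2))
```

With these made-up numbers the schools differ far more than the teachers within them do, so the estimated ICC is close to 1; real design-parameter work mines large data sets to pin down plausible values.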
"We're going to analyze these data sets to get an idea of what that variation looks like," Spybrook says. "After we analyze those data sets, we're going to write papers and disseminate them so that if you were a researcher and wanted to test the curriculum you set up for teachers, you would then go to our resource and pull these parameters out so you could design your study with the right number of schools and the right number of teachers."
Researchers don't have time to study every school district in advance of conducting a teacher development study, she says. By analyzing these existing data sets instead, researchers gain the tools to design their studies effectively.
More accurate prediction
By performing a meta-analysis of studies of science teacher professional development, Spybrook's project will also allow researchers to more accurately predict how big an impact the teacher development intervention should have.
"What we're trying to figure out is, what's a reasonable threshold to design these teacher studies at," she says. "You need to have a sense of what the true effect is going to be. So combined, someone who is going in to conduct one of these studies would have everything they need to design a good study."
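One standard way to arrive at such a threshold is to pool effect sizes from prior studies, weighting each by the inverse of its variance (a fixed-effect meta-analysis). The sketch below uses invented numbers purely to show the mechanics; it is not the project's actual synthesis:

```python
# Fixed-effect meta-analysis: pool standardized effect sizes from prior
# professional-development studies, weighting each study by the inverse
# of its squared standard error. The pooled estimate suggests what a
# "true effect" might plausibly be when planning a new study.
studies = [  # (effect size, standard error) -- hypothetical values
    (0.30, 0.10),
    (0.12, 0.08),
    (0.25, 0.15),
]

weights = [1 / se ** 2 for _, se in studies]
pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5
print(round(pooled, 3), round(pooled_se, 3))
```

The pooled estimate lands between the smallest and largest study effects, pulled toward the most precise studies; that number then feeds directly into the sample-size planning described next.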
Measuring the effectiveness of professional development programs is a gray area, Spybrook says.
"We're not sure how much growth is going on for teachers during the year and how much difference one of these PD programs could make," she says. "So if there's a big difference, it's easier to detect and we don't need as many schools. But if the difference is small, but still important, we're going to need more schools in order to determine that difference."
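The tradeoff she describes, in which a big difference needs few schools but a small one needs many, can be sketched with a textbook power calculation for a trial that randomizes whole schools. The function below uses a normal (z) approximation and hypothetical parameter values; it illustrates the general logic, not the project's actual planning tool:

```python
from math import ceil
from statistics import NormalDist

def schools_per_arm(delta, icc, teachers_per_school,
                    alpha=0.05, power=0.80):
    """Approximate number of schools needed per arm in a two-arm,
    school-randomized trial (normal approximation).
    delta: standardized effect size on the teacher outcome.
    icc:   intraclass correlation of the outcome across schools."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # two-sided significance
    z_beta = z.inv_cdf(power)            # desired power
    # Variance inflation from clustering teachers within schools
    design = icc + (1 - icc) / teachers_per_school
    return ceil(2 * (z_alpha + z_beta) ** 2 * design / delta ** 2)

# A large effect is cheap to detect; a small but important one is not.
for delta in (0.5, 0.2):
    print(delta, schools_per_arm(delta, icc=0.15, teachers_per_school=5))
```

Halving the expected effect size roughly quadruples the number of schools required, which is exactly why an honest effect-size threshold and good variance estimates matter before recruitment begins.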
Too often, researchers go into a study expecting to find a large difference in teacher outcomes, recruit too small a sample of schools and detect no difference, Spybrook says.
"At the end of the day, the difference may have been small," she says. "So you will have invested this money in the study, but you didn't actually have the capacity to detect the difference. We want people to be designing studies with the capacity to be able to say, 'Yes, this program works,' or 'No, it doesn't.' If the study is underpowered, if they can't detect a meaningful effect, then it's really a waste of money."
Spybrook's latest project builds on a previous one, also funded by the NSF, that was aimed at science student achievement. Her most recent NSF grant is her third from the agency.
"So, basically, when we got done with that study, we said, 'We've done this part, but now we need to do the same type of thing, only looking at teacher outcomes.'"
For more news, arts and events, visit wmich.edu/news.