I’m noticing a trend, and I wonder if anyone else is noticing it.
When I was working on my Master’s back in the oughts, the big push was for meaningful action research in schools and classrooms. If you are unfamiliar with the term, don’t feel bad; many of us at the time were as well. Action research, in short, is deliberate experimentation with teaching methods in the classroom or school, following typical scientific processes: building hypotheses, collecting data, reflecting on results, and so on. The great thing about this effort was that it really got teachers involved in the change process. They created their own evidence for what worked and what didn’t, and it was completely defensible when done correctly. “Why don’t you have students do poster projects?” “Because we did some action research on knowledge retention for visual learners, and found that our students retained the material far better when they built websites.” Easy. People collected this on-the-ground research on blogs, sites, and in books, and it informed best practices. Then it didn’t. At least, it doesn’t in the same ways it did some years ago.
I haven’t seen or heard much talk about deliberate action research by teachers or principals for a while. Passing trend? Perhaps. I suspect, though, that it was rendered obsolete by several factors. Here are a few that come to mind:
- Lots of action research was published and replicated and published again. I suspect that to some extent, potential researchers find nothing new under the sun, and are finding that many of the key questions have been pretty adequately answered already. I hope not, but there it is. Maybe we just aren’t wondering as much?
- Similar to the first reason, some researchers became so well associated with their research that schools have adopted methodologies for improvement as a packaged unit, therefore eliminating a lot of the need to experiment. As districts jump on the DuFour, Marzano, Danielson, or other bandwagons, we will likely see much less action research.
- The rise of Response to Intervention (RtI) programs, which rely heavily on standardized data collection, may de-emphasize or discourage action research. Data collection has become largely quantitative, and as these programs proliferate and prescribe their own intervention strategies, the need for experimentation may be lessened. Likewise, new positions like data coaches, and groups such as data teams whose job is to disseminate standardized data, have become commonplace, further devaluing action research.
- Common Core State Standards and other standardization efforts imply that there is one best method for teaching and achieving goals (even though most teachers would disagree), with the result that action research is not valued in districts as highly as it used to be.
- The concept of a PLC has changed. In the early days of PLC work, groups used this time for book study, action research, and evaluation of methods and materials. More common today is the PLC that is essentially an implementation team, in which small groups work collaboratively to advance an overarching initiative set at the building or district level. Shifting the focus of these groups has certainly had an effect on the ability of small teaching groups to engage in action research.
- Preservice and graduate teaching and leadership programs simply are not teaching action research like they used to, so fewer people are comfortable enough with the method to use it frequently.
If you know of teachers or schools who still engage in deliberate action research, rather than simply working with raw data acquired from other sources, I’d love to hear from you. I think this is a professional practice that is unfortunately on the way out, or at least changing significantly. Leave a comment and let me know your take on it.