Education research, so we are told, leads to good public policy — and every new policy initiative is still introduced with the simple claim that it is based upon “the best and latest research.” That may be so, but only if that research is sound, scientifically-based, and not ideologically-driven. And now even the leading international experts concede that most of what passes as “education research” falls short of those standards.
One of the leading critics, Jeffrey R. Henig, author of Spin Cycle (2008), has addressed the issue squarely with specific reference to the American “data war” over the effectiveness of Charter Schools. Studying the public debate over No Child Left Behind (NCLB), he called into question the “hype” about conclusive evidence based upon supposedly “randomized field trials.” On such politically-charged questions, he found “opposing cliques” ready and able to “muster their own stable of researchers and findings” to buttress their case and challenge the legitimacy of the other side. In the United States, matters have deteriorated to the point where policy researchers now jump into the fray challenging the motives and competence of their adversaries. This not only breeds public cynicism, it undermines the credibility of education research itself. http://www.ascd.org/publications/educational-leadership/dec08/vol66/num04/The-Spectrum-of-Education-Research.aspx
Research tends to be dismissed in Canadian provincial systems for different reasons. First and foremost, most of the research is small-scale, personal, and lacking in support from large-scale data sets. At Canadian faculties of education, some 300 to 400 academics claim to be conducting “university-based research.” Yet at the recent Canadian Society for the Study of Education (CSSE) Congress at UNB Fredericton, most of the papers were only given orally, focused mostly on “pet projects,” and were not subjected to proper peer review. The quality of the work is as much of a concern as the obvious biases inherent in most of the narrow, practical papers.
One former Deputy Minister of Education, Dr. Bernard Shapiro, put it best. Speaking in 1991 in Calgary, AB, he quipped: “All policy decisions are made by leaping over the data.” Educators are well known for paying little attention to “OISE” classroom-based research, and policy-makers simply prefer to consult the political opinion polls.
The new International Alliance of Leading Education Institutes is determined to change the situation. Since its founding in 2007, the IALEI has mobilized 10 different national education faculties, and much of its agenda is driven by the Ontario Institute for Studies in Education (OISE). http://www.intlalliance.org/ When the IALEI met on June 14-15, 2011 in Toronto, Dr. Ben Levin and the OISE researchers dominated the proceedings. The latest craze is “research knowledge mobilization,” and it’s not only Dr. Levin’s favourite topic but a 21st century mantra found in recent World Bank and Organisation for Economic Co-operation and Development (OECD) publications.
Mobilizing edu-research and transferring it into policy and practice is a project that faces formidable obstacles. OECD’s Dirk Van Damme stunned the IALEI audience of 150 educrats and researchers with the declaration that most of the current research was simply shoddy. “It’s mostly of low quality,” he said, “and we need to be more hygienic when using the word research.”
The 2011 IALEI Research Synthesis report by Jie Qi and Ben Levin itemized the familiar list of criticisms of university-based education research: Lack of rigour; failure to produce cumulative findings; theoretical incoherence; ideological bias; irrelevance to schools; lack of involvement of teachers; inaccessibility and poor dissemination; and poor cost-effectiveness. (pp. 5-6)
Given the glaring weaknesses, is it any wonder that education research has such a bad name? If the criticisms are sound, we may have an answer as to why it is safely ignored in policy councils as well as in the classroom.
Judging from the 2011 IALEI conference, Dr. Ben Levin is not easily deterred, nor should his influence be underestimated. As the sole Canadian representative among the 10 institutes, OISE is presumed to be speaking for Canada on every policy matter, including the state of research. While other national reports express concern about the narrowness of the research focus or bemoan the lack of “blue sky research,” the OISE researchers decry the “public skepticism” about “public spending on research” and the “pressure” to demonstrate “value for money.” It was left to the U.S. (U of Wisconsin at Madison) and England (University of London) to reference criticisms that too much research is “vacuous and obvious” in the eyes of the public.
Education research is improving, but most of it is hard to take seriously. OECD’s Van Damme was deadly accurate in describing most education “research” as lacking in credibility because most researchers “begin from fixed ideological positions” and limit themselves to “small scale” projects with limited broader applicability. He sees a “great urgency” for improvement because weak research is stalling reform and means we are “simply not preparing students for 21st century challenges.” What’s needed, Van Damme says, is serious research capable of “destabilizing” school systems.
Why does education research continue to have “a bad name” among policy-makers as well as teachers and engaged parents? In Canada’s provinces, why are so many of the studies conducted by “embedded researchers” dependent upon the existing system for their livelihood? How are independent think-tanks and researchers known as “boundary crossers” so effectively marginalized in the public arena? What would it take to have education research taken more seriously?